AVL tree
In computer science, an AVL tree is a self-balancing binary search tree, and the first such data structure to be invented. In an AVL tree the heights of the two child subtrees of any node differ by at most one; for this reason it is also said to be height-balanced. Lookup, insertion, and deletion all take O(log n) time in both the average and worst cases. Insertions and deletions may require the tree to be rebalanced by one or more tree rotations.
The AVL tree is named after its two inventors, G.M. Adelson-Velsky and E.M. Landis, who published it in their 1962 paper "An algorithm for the organization of information."
The balance factor of a node is the height of its right subtree minus the height of its left subtree. A node with balance factor +1, 0, or -1 is considered balanced. A node with any other balance factor is considered unbalanced and requires the tree to be rebalanced. The balance factor is either stored directly at each node or computed from the heights of its subtrees.
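For illustration, one common way to realize this is to store each node's height and derive the balance factor from it. The following C sketch shows such a layout; the struct and helper names are illustrative only, not taken from the original paper.

    struct avl_node {
        int key;
        int height;                  /* height of the subtree rooted here; a single node has height 1 */
        struct avl_node *left;
        struct avl_node *right;
    };

    static int node_height(const struct avl_node *n)
    {
        return n ? n->height : 0;    /* an empty subtree has height 0 */
    }

    /* Balance factor as defined above: height of the right subtree minus
       height of the left subtree. */
    static int balance_factor(const struct avl_node *n)
    {
        return n ? node_height(n->right) - node_height(n->left) : 0;
    }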
AVL trees are often compared with red-black trees because they support the same set of operations and because red-black trees also take O(log n) time for the basic operations. AVL trees perform better than red-black trees for lookup-intensive applications.[1] The AVL tree balancing algorithm appears in many computer science curricula.
Operations
The basic operations of an AVL tree involve carrying out the same algorithms as on an unbalanced binary search tree, but modifications are followed by one or more of the so-called "AVL rotations" that restore the height balance.
Insertion
Insertion into an AVL tree may be carried out by inserting the given value into the tree as if it were an unbalanced binary search tree, and then retracing one's steps toward the root, updating the balance factor of each node along the way. If a node's balance factor becomes 0, the height of its subtree is unchanged by the insertion and retracing stops; the insertion is finished. If the balance factor becomes +1 or -1, the subtree has grown taller by one and retracing must continue at the parent.
If the balance factor becomes +2 or -2 then the tree rooted at this node is unbalanced, and a tree rotation (single or double, depending on which grandchild subtree received the new node) is needed. After an insertion the rotation always restores the subtree to the height it had before the insertion, so no further retracing is required. The rotation can be done in constant time.
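As a rough sketch of one such rotation, reusing the illustrative avl_node struct and node_height helper above (the name rotate_left is an assumption for this example), a single left rotation that repairs a node whose right subtree has become too tall might look like this:

    /* Single left rotation around *root, applied when the balance factor of
       root is +2 and its right child is not left-heavy.  The mirror-image
       right rotation and the two double rotations follow the same pattern. */
    static struct avl_node *rotate_left(struct avl_node *root)
    {
        struct avl_node *pivot = root->right;

        root->right = pivot->left;    /* pivot's left subtree moves under root */
        pivot->left = root;           /* old root becomes pivot's left child   */

        /* Recompute stored heights bottom-up: first the old root, then pivot. */
        root->height  = 1 + (node_height(root->left) > node_height(root->right)
                             ? node_height(root->left) : node_height(root->right));
        pivot->height = 1 + (node_height(pivot->left) > node_height(pivot->right)
                             ? node_height(pivot->left) : node_height(pivot->right));

        return pivot;                 /* pivot is the new root of the subtree  */
    }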
As in an unbalanced binary search tree, insertion takes O(height) time: O(height) to find the insertion point plus O(height) to retrace the path and perform any needed rotation. The balanced nature of the AVL tree provides an upper bound on its height of about 1.44 lg(n + 2),[2] so the insertion process in total takes O(log n) time.
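The bound can be sketched by counting the minimum number of nodes N(h) that an AVL tree of height h can contain; this is the standard Fibonacci argument and only an outline of the result cited above:

    N(0) = 0, \quad N(1) = 1, \quad N(h) = N(h-1) + N(h-2) + 1
    N(h) = F_{h+2} - 1 \ge \varphi^{h} - 1, \qquad \varphi = \tfrac{1+\sqrt{5}}{2}
    n \ge N(h) \;\Longrightarrow\; h \le \log_{\varphi}(n+1) \approx 1.44\,\log_{2}(n+2)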
Deletion
If the node to be deleted is a leaf, remove it. Otherwise, replace it with either the largest node in its left subtree (its in-order predecessor) or the smallest node in its right subtree (its in-order successor), and remove that node instead. The node actually removed therefore has at most one child. After the deletion, retrace the path back up the tree toward the root, adjusting the balance factors as needed.
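Locating the replacement node is a plain binary-search-tree walk; a minimal sketch, again using the illustrative avl_node struct from above:

    /* Find the replacement for a node that has two children: the largest key
       in its left subtree, i.e. its in-order predecessor.  The symmetric
       choice, the smallest key in the right subtree, works equally well. */
    static struct avl_node *in_order_predecessor(struct avl_node *node)
    {
        struct avl_node *cur = node->left;   /* assumes node->left is non-null */
        while (cur->right != 0)
            cur = cur->right;
        return cur;
    }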
The retracing can stop if the balance factor becomes +1 or -1, indicating that the height of that subtree has remained unchanged. If the balance factor becomes 0 then the height of the subtree has decreased by one and the retracing needs to continue. If the balance factor becomes +2 or -2 then the subtree is unbalanced and needs to be rotated to fix it. If the rotation leaves the subtree's new root with a balance factor of 0 then the retracing toward the root must continue, since the height of this subtree has decreased by one. This is in contrast to an insertion, where a rotation resulting in a balance factor of 0 indicates that the subtree's height is unchanged.
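To make the case analysis concrete, the decision made at an ancestor whose left subtree has just lost a node could be sketched as follows; mirror-image signs apply for a deletion in the right subtree. This reuses the illustrative balance_factor helper from above, and the rotation itself is left out.

    enum retrace_step { STOP, CONTINUE_UP, ROTATE_THEN_CHECK };

    /* Decide what to do at a node whose LEFT subtree just became shorter. */
    static enum retrace_step after_left_delete(const struct avl_node *node)
    {
        switch (balance_factor(node)) {
        case +1:
            return STOP;               /* was 0: the subtree height is unchanged */
        case 0:
            return CONTINUE_UP;        /* was -1: the subtree shrank, keep going */
        default:
            return ROTATE_THEN_CHECK;  /* +2: rotate; if the new subtree root
                                          ends up with balance factor 0, the
                                          subtree shrank and retracing continues
                                          toward the root                        */
        }
    }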
The time required is O(h) for the lookup of the node to delete, plus at most O(h) constant-time rotations on the way back to the root; since the height h is O(log n), the operation can be completed in O(log n) time.
Lookup
Lookup in an AVL tree is performed exactly as in an unbalanced binary search tree, and thus takes O(log n) time, since an AVL tree is always kept balanced. No special provisions need to be taken, and the tree's structure is not modified by lookups. (This is in contrast to splay tree lookups, which do modify their tree's structure.)
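A lookup is therefore just the standard binary-search-tree search; a minimal sketch with the illustrative avl_node struct from above:

    /* Standard binary-search-tree search; the tree is not modified. */
    static struct avl_node *avl_find(struct avl_node *root, int key)
    {
        while (root != 0) {
            if (key < root->key)
                root = root->left;
            else if (key > root->key)
                root = root->right;
            else
                return root;      /* key found */
        }
        return 0;                 /* key not present */
    }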
References
- G. Adelson-Velskii and E.M. Landis, "An algorithm for the organization of information." Doklady Akademii Nauk SSSR, 146:263–266, 1962 (Russian). English translation by Myron J. Ricci in Soviet Math. Doklady, 3:1259–1263, 1962.
- Donald Knuth. The Art of Computer Programming, Volume 3: Sorting and Searching, Third Edition. Addison-Wesley, 1997. ISBN 0-201-89685-0. Pages 458–475 of section 6.2.3: Balanced Trees. Note that Knuth calls AVL trees simply "balanced trees".
- [1] Pfaff, Ben (June 2004). Performance Analysis of BSTs in System Software (PDF). Stanford University.
- [2] E. Horowitz, S. Sahni, and D. Mehta, Fundamentals of Data Structures in C++. Computer Science Press, 1995. ISBN 0-7167-8292-8.
External links
- Description from the Dictionary of Algorithms and Data Structures
- The AVL Tree Rotations Tutorial (RTF) by John Hargrove
- 5 Types of AVL Trees in C++
- Memory Allocation in Visual Basic using Balanced Binary AVL Trees for keys of arbitrary length
- Iterative Implementation of AVL Trees in C#
- Heavily documented fast Implementation in Linoleum (a cross-platform Assembler) by Herbert Glarner
- AVL Tree Traversal
- C++ AVL Tree Template and C AVL TREE "Generic Package" by Walt Karas
- A Visual Basic AVL Tree Container Class by Jim Harris
- AVL Trees: Tutorial and C++ Implementation by Brad Appleton
- Ulm's Oberon Library: AVLTrees
- The AVL TREE Data Type
- CNAVLTree Class Reference
- AVL-trees - balanced binary trees by Alex Konshin
- Simulation of AVL Trees
- AVL tree applet
- AVL, Splay and Red/Black Applet
- Visual Tutorial of AVL Tree operations
- Navl: threaded Avl-tree C# class to implement an ordered list of items which can be accessed by value and by index