Friday, September 28, 2012
Today, we talked about union-find, a data structure that supports the find and union operations. We gave an implementation with O(log n) cost for both operations. We also explained the idea of how to modify the find function to improve the running time to O(log* n).
If you are interested in a reference for the total number of atoms in the universe, here is the Wikipedia page:
http://en.wikipedia.org/wiki/Observable_universe
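Here is a minimal Python sketch of the two ideas (my code, not necessarily the version from class): union by size gives the O(log n) bound, and the second loop in find is the path-compression modification mentioned above.

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))   # parent[i] = parent of i; roots point to themselves
        self.size = [1] * n            # size[i] = size of i's tree (valid at roots)

    def find(self, x):
        # walk up to the root; with union by size the tree height is O(log n)
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        # path compression: point every node on the path directly at the root
        while self.parent[x] != root:
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        # union by size: attach the smaller tree under the larger one
        if self.size[rx] < self.size[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        self.size[rx] += self.size[ry]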
Wednesday, September 26, 2012
change of Siddharth's office hour
The office hour of our TA, Siddharth, is now 10:30-12:30 on Thursdays.
Lecture 16 Fibonacci heap (3)
Today, we finished the Fibonacci heap and proved that it has O(1) amortized cost for insertion, merge, find_min, and decrease_key. The only expensive operation is delete_min, which takes O(log n) amortized time.
As an application, we showed how to use a Fibonacci heap in Prim's algorithm for computing a Minimum Spanning Tree.
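A small sketch of Prim's algorithm in Python (my code, not the version from class). Python's heapq has no decrease_key, so this sketch pushes duplicate entries and skips stale ones; with a Fibonacci heap one would call decrease_key instead, which is exactly where the improved bound comes from. The parameter graph (an adjacency list of (neighbor, weight) pairs) is my own convention.

import heapq

def prim_mst_weight(graph, start=0):
    # graph[u] = list of (v, weight) pairs for an undirected graph
    visited = set()
    total = 0
    # the heap plays the role of the Fibonacci heap from class; since heapq
    # has no decrease_key, we push duplicates and discard stale entries
    heap = [(0, start)]
    while heap and len(visited) < len(graph):
        w, u = heapq.heappop(heap)
        if u in visited:
            continue          # stale entry (avoided by decrease_key in a Fibonacci heap)
        visited.add(u)
        total += w
        for v, wt in graph[u]:
            if v not in visited:
                heapq.heappush(heap, (wt, v))
    return total

# example: a triangle with edge weights 1, 2, 3 -> MST weight 3
graph = {0: [(1, 1), (2, 3)], 1: [(0, 1), (2, 2)], 2: [(0, 3), (1, 2)]}
print(prim_mst_weight(graph))  # prints 3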
Monday, September 24, 2012
Hint on homework 2
Some students asked me for a hint on problem 2 of homework 2. The suggestion is to first check your claim on simple examples. For example, can you prove your claim on trees of size 3?
Lecture 15. Fibonacci Heap (2)
Today we talked about how to implement decrease_key. This implementation gives us trees that are no longer binomial trees, but they still have the property that if the children of a node are sorted according to their degrees, then the i-th child has degree at least i-2. (An alternative definition of a binomial tree is that the i-th child of every node has degree exactly i-1.) We call any tree with the above property a Fibonacci tree.
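This property is exactly what keeps the degrees small. For reference, here is the standard counting argument (my sketch, using the convention F_1 = F_2 = 1; the constants in class may differ slightly): let s_d be the minimum number of nodes in a Fibonacci tree whose root has degree d. Counting the root, its first child, and the subtrees hanging off the remaining children (the i-th of which has degree at least i-2), we get
s_0 = 1, s_1 = 2, and s_d >= 2 + \sum_{i=2}^{d} s_{i-2} for d >= 2.
By induction, s_d >= F_{d+2} >= \phi^d, where \phi = (1+\sqrt{5})/2 is the golden ratio. So any node of degree d in a heap with n nodes satisfies d <= log_\phi n = O(log n), which is where the name "Fibonacci" comes from.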
Friday, September 21, 2012
Lecture 14 Binomial Heap and Fibonacci Heap
Today, we talked about the binomial heap and why it takes amortized time O(log n) for deletion and O(1) for all the other operations. We also (tried to) motivate the Fibonacci heap, which additionally supports decrease_key in O(1) amortized time.
If you are confused by the version of Dijkstra's algorithm I described at the end, take a look at the Wikipedia description.
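For context on why an O(1) decrease_key is worth the trouble, here is a back-of-the-envelope count (my own, with n vertices and m edges, not something we worked out in class): Dijkstra's algorithm performs n delete_min operations and at most m decrease_key operations. With a binary heap each of these costs O(log n), for a total of O((n + m) log n). With a Fibonacci heap the total becomes n \cdot O(log n) + m \cdot O(1) = O(m + n log n).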
Wednesday, September 19, 2012
Lecture 13 Binomial Heaps (2)
Today, we talked about how to implement the merge operation of a binomial heap in O(log n) time. This gives us insertion, delete_min, and merge, all in O(log n).
Surprisingly, if we are a little bit lazier in the implementation, we can make all the operations O(1) amortized, except for the delete_min operation.
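A minimal Python sketch of the eager merge (my code, not the version from class): roots are kept one per degree, and merging works like binary addition, with linking playing the role of the carry.

class Node:
    def __init__(self, key):
        self.key = key
        self.children = []        # degree of the tree = len(children)

def link(a, b):
    # combine two trees of equal degree: the larger root becomes a child
    if b.key < a.key:
        a, b = b, a
    a.children.append(b)
    return a

def merge(h1, h2):
    # h1, h2: lists of binomial-tree roots; returns a list with at most
    # one tree per degree, like adding two binary numbers with carries
    slot = {}                                 # degree -> tree of that degree
    for tree in h1 + h2:
        d = len(tree.children)
        while d in slot:                      # "carry": link equal-degree trees
            tree = link(slot.pop(d), tree)
            d += 1
        slot[d] = tree
    return [slot[d] for d in sorted(slot)]

Since each heap has only O(log n) trees and every link reduces the total number of trees by one, the whole merge takes O(log n) time.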
Monday, September 17, 2012
Lecture 12. Binomial Trees and clarifications
Today we talked about the construction of binomial heaps. The main motivation is to allow an efficient merge operation; as we know, merging two binary heaps takes \Omega(n) time.
At the end of the class, there was a mistake when I described the delete_min operation. For a binomial heap H, we first take the tree B_min with the smallest root and delete its root to get H' = (B_1, ..., B_{min-1}). In class, I mentioned that we then merge H' with H. This is wrong. As some students pointed out, we want to merge H' with H'', where H'' = H \ B_min (the notation means that H'' is the heap we get after removing B_min from H).
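As a small sketch of the corrected delete_min (reusing the Node and merge definitions from my sketch in the Lecture 13 post below; again my code, not the class version):

def delete_min(heap):
    # heap: a list of binomial-tree roots, as in the merge sketch below
    b_min = min(heap, key=lambda t: t.key)        # tree with the smallest root
    h2 = [t for t in heap if t is not b_min]      # H'' = H \ B_min
    h1 = b_min.children                           # H' = children of the removed root
    return b_min.key, merge(h1, h2)               # merge H' with H'', not with H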
Friday, September 14, 2012
Lecture 11 Splay Tree
Today we proved that for deletion and search, as well as for the splay operation itself, the amortized cost in a splay tree is O(log n). As homework, you will show that insertion also has an amortized cost of O(log n).
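For reference, the key bound stated with the standard rank function (the exact constants in lecture may have been slightly different): define r(x) = log_2 s(x), where s(x) is the number of nodes in the subtree rooted at x (the same quantity that appears in the potential from the Lecture 10 post below). Then the amortized cost of splaying a node x in a tree with root t is at most 3(r(t) - r(x)) + 1 = O(log n).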
Wednesday, September 12, 2012
Lecture 10 Splay Tree
Today, we talked about the implementation of Splay Trees. We also introduced the potential function for the analysis of Splay Trees.
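For reference, the standard choice of potential (which I believe is the one we used, up to constants) is \Phi(T) = \sum_{x \in T} log_2 s(x), where s(x) is the number of nodes in the subtree rooted at x. Cheap operations may increase the potential slightly, and the occasional expensive splay is paid for by a large drop in the potential.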
Monday, September 10, 2012
Lecture 9 Amortized Analysis and Splay Tree
Today we summarized three different techniques for analyzing amortized cost. We also introduced the basic idea of the splay tree. It is similar to a standard binary search tree; the difference is that after accessing an element via the standard BST operations, it brings that element to the root of the tree.
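A minimal bottom-up sketch of the splay step itself in Python (my code, leaving out the BST search/insert parts):

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

def rotate(x):
    # rotate x above its parent, preserving the BST order
    p, g = x.parent, x.parent.parent
    if x is p.left:
        p.left, x.right = x.right, p
        if p.left:
            p.left.parent = p
    else:
        p.right, x.left = x.left, p
        if p.right:
            p.right.parent = p
    p.parent, x.parent = x, g
    if g:
        if g.left is p:
            g.left = x
        else:
            g.right = x

def splay(x):
    # bring x to the root with zig / zig-zig / zig-zag steps
    while x.parent:
        p, g = x.parent, x.parent.parent
        if g is None:
            rotate(x)                          # zig
        elif (x is p.left) == (p is g.left):
            rotate(p); rotate(x)               # zig-zig: rotate the parent first
        else:
            rotate(x); rotate(x)               # zig-zag
    return x                                   # x is the new root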
Sunday, September 9, 2012
Good news(?)
Starting next Wednesday 9/12, the class will meet at LWSN B151.
Friday, September 7, 2012
Lecture 8. Amortized Analysis (2)
Today, we introduced three ways of analyzing the amortized cost of an algorithm, using the problem of simulating a queue with two stacks as an example.
These three methods are aggregate analysis, the banker's method, and the physicist's method. Aggregate analysis simply bounds the total (and hence average) cost of a sequence of n operations. The banker's method is a generalization of aggregate analysis that allows us to assign different charges to different operations, as long as the credit we collect can always cover the time we actually spend on the operations. The physicist's method introduces a potential function (which can be viewed as the total credit available in the banker's method); once the potential function is defined, it naturally induces an amortized cost for each operation.
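A small Python sketch of the two-stack queue from class (Python lists used as stacks), with one possible banker's-method accounting in the comments; the exact credit scheme we used in class may differ.

class QueueFromTwoStacks:
    def __init__(self):
        self.inbox = []     # stack that receives new elements
        self.outbox = []    # stack that serves dequeues

    def enqueue(self, x):
        # actual cost 1; in the banker's method, charge 3: one for this push,
        # two saved for the element's later move from inbox to outbox
        self.inbox.append(x)

    def dequeue(self):
        if not self.outbox:
            # expensive, but each element is moved at most once,
            # and the move is already paid for by its saved credits
            while self.inbox:
                self.outbox.append(self.inbox.pop())
        return self.outbox.pop()    # actual cost 1, charged to dequeue itself

For example, after enqueue(1), enqueue(2), enqueue(3), the first dequeue triggers one transfer and returns 1; later dequeues pop directly from the outbox.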
Wednesday, September 5, 2012
Lecture 7. Amortized Analysis (1)
Today, we talked about amortized analysis. One example we used is the dynamic array.
We showed that although a single insertion can cost \Omega(n) in the worst case, the amortized cost is actually constant.
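A minimal doubling dynamic-array sketch in Python (imitating what a lower-level language would do; Python's own list already does this internally):

class DynamicArray:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity

    def append(self, x):
        if self.size == self.capacity:
            # resize: copy everything into an array of twice the capacity.
            # this single append costs Theta(n), but it only happens after
            # about n/2 cheap appends, so the amortized cost per append is O(1).
            self.capacity *= 2
            new_data = [None] * self.capacity
            new_data[:self.size] = self.data
            self.data = new_data
        self.data[self.size] = x
        self.size += 1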