CS6841: Advanced Algorithms, Jan-May 2016
- Lecturers: Ravishankar Krishnaswamy, Rajsekar Manokaran
- TAs:
- Location: CS27, W 2:00-3:50, F 3:00-4:50
Announcements
- Scribing instructions
(Scribe notes are largely for students' benefit, as auxiliary material to their own notes; they have not been edited or verified by the instructors, so please use them at your discretion.)
Homeworks
Here are some common references for all the homeworks.
Completed Lectures, and Tentative Outline
Part 1: Dimension Reduction for Better Data-Structures and Algorithms
- (13/01) Lecture 1. Hashing, Universal Hashing, Perfect Hashing, Balls and Bins.
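- A minimal sketch (not from the course materials) of a Carter-Wegman style universal hash family; the prime P and the integer key universe are illustrative assumptions:
```python
import random

# Universal hashing sketch: h_{a,b}(x) = ((a*x + b) mod P) mod m, with P prime.
# For any fixed x != y, Pr[h(x) == h(y)] is about 1/m over the choice of (a, b).

P = 2_147_483_647  # a Mersenne prime; assumes keys are integers below P

def make_universal_hash(m):
    """Draw one function at random from the family."""
    a = random.randrange(1, P)
    b = random.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

h = make_universal_hash(m=10)
print(h(42), h(1337))  # two buckets in [0, 10)
```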
- (20/01) Lecture 2. Balls and Bins Lower Bound, Power of Two Choices.
- Anupam Gupta's comprehensive notes.
- The Balls and Bins lower bound used "negative correlation", and the power-of-two-choices proof used Bernstein's inequality; a small simulation is sketched below.
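- A toy simulation (an illustration, not the lecture's proof; sizes are arbitrary) of the drop in maximum load when each ball picks the less loaded of two random bins:
```python
import random

# With n balls in n bins, one random choice gives max load Theta(log n / log log n);
# picking the less loaded of two random bins drops it to Theta(log log n).

def max_load(n, choices):
    bins = [0] * n
    for _ in range(n):
        candidates = [random.randrange(n) for _ in range(choices)]
        best = min(candidates, key=lambda i: bins[i])  # least-loaded candidate
        bins[best] += 1
    return max(bins)

n = 100_000
print("one choice :", max_load(n, 1))
print("two choices:", max_load(n, 2))
```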
- (22/01) Lecture 3. Cuckoo Hashing.
- Scribe notes link.
- Went over examples of different concentration bounds (Markov, Chebyshev, and Bernstein).
- Described Cuckoo Hashing algorithm, introduced random graph view of analyzing the algorithm, intro to Branching Processes.
- Proof also used Wald's equation, we stated this without proof.
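- A minimal sketch of the cuckoo hashing insert loop; the hash functions below are built from Python's hash with random seeds, an illustrative stand-in for the random functions assumed in the analysis:
```python
import random

class CuckooTable:
    """Two tables; each key has one candidate slot per table."""
    def __init__(self, size=101, max_kicks=50):
        self.size, self.max_kicks = size, max_kicks
        self.tables = [[None] * size, [None] * size]
        self.seeds = [random.random(), random.random()]

    def _slot(self, key, i):
        return hash((self.seeds[i], key)) % self.size

    def lookup(self, key):  # O(1) worst case: only two probes
        return any(self.tables[i][self._slot(key, i)] == key for i in (0, 1))

    def insert(self, key):
        for _ in range(self.max_kicks):
            for i in (0, 1):
                s = self._slot(key, i)
                if self.tables[i][s] is None:
                    self.tables[i][s] = key
                    return True
                self.tables[i][s], key = key, self.tables[i][s]  # kick occupant out
        return False  # too many kicks; a real implementation would rehash

t = CuckooTable()
for k in range(50):
    t.insert(k)
print(t.lookup(7), t.lookup(999))  # True False
```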
- (27/01) Lecture 4. Min-hash for estimating Similarity, Locality Sensitive Hashing.
- Alex Andoni and Piotr Indyk's beautiful CACM article.
- Gionis, Indyk, Motwani's paper on LSH schemes.
- Scribe notes are here.
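- A minimal MinHash sketch: since Pr[minhash(A) == minhash(B)] equals the Jaccard similarity of A and B, averaging matches over independent hash functions estimates it (the hash functions are again Python's hash with random seeds, an illustrative assumption):
```python
import random

def minhash_signature(s, seeds):
    return [min(hash((seed, x)) for x in s) for seed in seeds]

def estimate_jaccard(a, b, num_hashes=200):
    seeds = [random.random() for _ in range(num_hashes)]
    sig_a = minhash_signature(a, seeds)
    sig_b = minhash_signature(b, seeds)
    return sum(x == y for x, y in zip(sig_a, sig_b)) / num_hashes

A, B = set(range(0, 80)), set(range(40, 120))
print(estimate_jaccard(A, B))  # true Jaccard is 40/120 = 0.33...
```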
- (29/01) Lecture 5. Dimensionality Reduction: Johnson Lindenstrauss Lemma.
- Introduced Gaussian random variables, studied some of their properties, and proved JL.
- We saw another LSH construction that used Gaussian projections.
- Useful notes are here.
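- A short sketch of the JL transform with a Gaussian projection matrix (dimensions chosen arbitrarily for illustration):
```python
import numpy as np

# Project n points from dimension d down to k with an i.i.d. Gaussian matrix
# scaled by 1/sqrt(k); JL says pairwise distances are preserved up to (1 +- eps)
# with high probability once k = O(log n / eps^2).

rng = np.random.default_rng(0)
n, d, k = 50, 1000, 200
X = rng.normal(size=(n, d))
G = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ G

i, j = 0, 1  # compare one pairwise distance before and after projection
print(np.linalg.norm(X[i] - X[j]), np.linalg.norm(Y[i] - Y[j]))
```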
- (03/02) Lecture 6. Data-Aware Dimensionality Reduction: Singular Value Decomposition.
- We followed Chapter 3 of Hopcroft-Kannan's book, up to Section 3.5.
- We saw how SVDs capture the best-fit subspaces for any dimension, and also how they capture the best low-rank matrix approximations.
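- A quick numerical check of the low-rank approximation fact, using numpy's SVD on a random matrix (an illustration, not the book's development):
```python
import numpy as np

# Truncating the SVD to the top k singular values gives the best rank-k
# approximation; its Frobenius error is exactly sqrt(sum of the squared
# discarded singular values).

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 15))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 5
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print(np.linalg.norm(A - A_k))         # Frobenius error of the truncation
print(np.sqrt(np.sum(s[k:] ** 2)))     # matches
```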
- (05/02) Lecture 7. SVDs continued. Algorithmic Application (Graph Partitioning)
- Introduced Stochastic Block Models, and saw how SVD (or eigenvector) approaches can be used to recover the blocks.
- Outlined the proof technique, which used matrix perturbation analysis (the Davis-Kahan sin-theta theorem) and matrix concentration.
- Roughly followed Dan Spielman's lecture notes here.
- For those interested, Terence Tao has some nice notes on inequalities relating the eigenvalues of A, B, and A+B.
- Scribe notes are here.
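- A toy stochastic block model experiment (the parameters p, q and sizes are illustrative): the eigenvector of the second-largest eigenvalue of the adjacency matrix roughly splits the two blocks by sign:
```python
import numpy as np

rng = np.random.default_rng(1)
n, p, q = 100, 0.6, 0.1  # 2n vertices; intra-block prob p, inter-block prob q
blocks = np.repeat([0, 1], n)
prob = np.where(blocks[:, None] == blocks[None, :], p, q)
A = (rng.random((2 * n, 2 * n)) < prob).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency matrix, no self-loops

vals, vecs = np.linalg.eigh(A)      # eigenvalues in ascending order
v2 = vecs[:, -2]                    # eigenvector of the 2nd-largest eigenvalue
guess = (v2 > 0).astype(int)
acc = max(np.mean(guess == blocks), np.mean(guess != blocks))
print("recovery accuracy:", acc)    # close to 1.0 for these parameters
```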
Part 2: Optimization with LPs and Convex Programming
- (10/02) Lecture 8. Intro to LPs and Duality.
- Completed the Stochastic Block Model proof.
- Introduced Linear Programming, showed that BFS optimal solutions always exist, and motivated Simplex algorithm.
- Introduced duality and mentioned the strong duality theorem.
- Scribe notes are here.
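- A tiny LP solved numerically (the LP itself is made up for illustration; the solver is SciPy's linprog):
```python
from scipy.optimize import linprog

# min c^T x  s.t.  A x <= b, x >= 0. The solver typically returns a vertex
# (BFS), matching the fact from lecture that BFS optimal solutions always exist.

c = [-1, -2]                       # maximize x1 + 2*x2 == minimize -x1 - 2*x2
A_ub = [[1, 1], [1, 3]]
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)              # vertex (3, 1), objective value -5
```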
- (12/02) Lecture 9. LP Duality: Max-Flow Min-Cut, Multiway Cut.
- Saw compact formulations for Max-Flow, and showed duality with Min-Cut.
- Motivated Ball-cut based rounding procedures.
- Applications in Multiway Cut Approximation Algorithms.
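- A toy check of max-flow min-cut duality on a four-vertex graph, using scipy.sparse.csgraph.maximum_flow (available in recent SciPy versions; the graph is made up for illustration):
```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import maximum_flow

# Capacities: 0->1 (3), 0->2 (2), 1->2 (1), 1->3 (2), 2->3 (2).
cap = np.array([[0, 3, 2, 0],
                [0, 0, 1, 2],
                [0, 0, 0, 2],
                [0, 0, 0, 0]])
res = maximum_flow(csr_matrix(cap, dtype=np.int32), 0, 3)
print(res.flow_value)  # 4, matching the min cut {(1,3), (2,3)} of capacity 4
```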
- (17/02) Lecture 10. LP Based Approximation Algorithms.
- Introduced Facility Location Problem.
- Showed a Primal-Dual inspired approximation algorithm.
- (19/02) Lecture 11. Facility Location Primal Dual Algorithm.
- Completed the proof of Jain-Vazirani.
- (24/02) Lecture 12. Unrelated Machine Scheduling.
- Showed that BFS = corner points = extreme points.
- Used this to design approximation algorithms for scheduling on unrelated machines.
- Scribe notes are here.
- (26/02) Lecture 13. Compressed Sensing.
- Motivated compressed sensing using the wine-bottle poison puzzle. We noticed in class that the puzzle is harder than compressed sensing, since we can only see whether people fall sick or not (boolean outputs). In fact, this turns out to be a well-studied problem called group testing. See this link for more information.
- Saw most of the Candes proof for using RIP-ness to bound error of recovery.
- Scribe notes are here.
- (02/03) Lecture 14. More Compressed Sensing.
- Finished the compressed sensing recovery proof.
- Saw how to prove that Gaussian matrices are RIP using concentration + union bound over all vectors.
- Handwritten notes of second part of Lecture 13 and Lecture 14 are here.
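- A small basis-pursuit experiment (sizes and the random instance are illustrative): recover a sparse x from y = Ax by minimizing ||x||_1, written as an LP via x = x+ - x- with x+, x- >= 0:
```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 60, 25, 3                       # ambient dim, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)  # Gaussian matrices are RIP w.h.p.
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true

c = np.ones(2 * n)                        # sum(x+) + sum(x-) = ||x||_1
A_eq = np.hstack([A, -A])                 # A (x+ - x-) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
print(np.linalg.norm(x_hat - x_true))     # ~0: exact recovery
```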
- (04/03) Lecture 15. Recap Lecture.
- Recap of all the important algorithmic techniques and proof techniques covered until now.
- Handwritten notes (without Hashing, JL, and LP Rounding) are here.
- (09/03) Lecture 16. Intro to Convex Optimization.
- Introduced notions of convex sets, convex functions, gradient, Hessian.
- (11/03) Lecture 17. Gradient Descent.
- Saw how to optimize convex functions using the Gradient Descent algorithm.
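- A minimal gradient descent sketch on a smooth convex function (least squares, an illustrative choice), with the 1/L step size from the analysis:
```python
import numpy as np

# x_{t+1} = x_t - (1/L) * grad f(x_t), where L is the smoothness constant.
# Here f(x) = ||Ax - b||^2, so grad f(x) = 2 A^T (Ax - b) and L = 2 ||A||_2^2.

rng = np.random.default_rng(0)
A, b = rng.normal(size=(30, 5)), rng.normal(size=30)
f = lambda x: np.sum((A @ x - b) ** 2)
grad = lambda x: 2 * A.T @ (A @ x - b)

L = 2 * np.linalg.norm(A, 2) ** 2
x = np.zeros(5)
for _ in range(500):
    x = x - (1 / L) * grad(x)

print(f(x), f(np.linalg.lstsq(A, b, rcond=None)[0]))  # close to the optimum
```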
- (16/03) Lecture 18. Online Optimization Using Multiplicative Weights Algorithm.
- Introduced notion of regret, and analyzed MW algorithm.
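- A minimal sketch of the MW (Hedge) update against arbitrary losses in [0, 1] (here random, an illustrative adversary):
```python
import numpy as np

# Keep a weight per expert, play the weighted distribution, and decay each
# weight by (1 - eta)^loss. With eta tuned appropriately, the regret against
# the best expert is O(sqrt(T log n)).

rng = np.random.default_rng(0)
n, T, eta = 10, 1000, 0.05
w = np.ones(n)
alg_loss, expert_loss = 0.0, np.zeros(n)

for t in range(T):
    p = w / w.sum()              # current distribution over experts
    losses = rng.random(n)       # this round's losses in [0, 1]
    alg_loss += p @ losses       # (expected) loss of the algorithm
    expert_loss += losses
    w *= (1 - eta) ** losses     # multiplicative update

print(alg_loss, expert_loss.min())
```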
- (18/03) Lecture 19. More MW: Applications in Boosting.
- Saw how to reproduce the Boosting algorithm from Machine Learning using the MW method.
Part 3: Online Models for Resource Allocation Problems
- (23/03) Lecture 20. Online Paging and Online Set Cover.
- Understood notion of Competitive Ratio.
- Motivated online models in several settings, and studied the Marking algorithm for paging. Introduced Online Set Cover problem.
- Handwritten notes are here; scribe notes are here.
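- A minimal sketch of the randomized Marking algorithm from this lecture (the request sequence below is arbitrary):
```python
import random

def marking(requests, k):
    """On a fault with a full cache, evict a uniformly random UNMARKED page;
    when every cached page is marked, unmark all (a new phase begins)."""
    cache, marked, faults = set(), set(), 0
    for p in requests:
        if p not in cache:
            faults += 1
            if len(cache) == k:
                if marked == cache:
                    marked = set()           # phase ends
                cache.remove(random.choice(list(cache - marked)))
            cache.add(p)
        marked.add(p)
    return faults

print(marking([1, 2, 3, 1, 4, 2, 1, 3, 2, 4] * 20, k=3))
```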
- (30/03) Lecture 21. Online Set Cover.
- Finished the online set cover algorithm using primal-dual analysis.
- Introduced the Virtual Circuit Routing problem.
- Handwritten notes are here; scribe notes are here.
- (01/04) Lecture 22. Virtual Circuit Routing; Secretary Problem.
- Presented the exponential-greedy algorithm and analysis using Potential Functions.
- Introduced "online stochastic" optimization using Secretary problem, and presented the 1/e-competitive algorithm. Here's an article expositing the rich history of the secretary problem.
- Handwritten notes are here; scribe notes are here.
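- A quick simulation of the 1/e rule (values and sizes are arbitrary): observe the first n/e candidates, then accept the first one beating all of them; the best candidate is chosen with probability tending to 1/e:
```python
import math
import random

def secretary_trial(n):
    vals = random.sample(range(10 * n), n)  # distinct values in random order
    cutoff = int(n / math.e)
    threshold = max(vals[:cutoff], default=-1)
    for v in vals[cutoff:]:
        if v > threshold:
            return v == max(vals)           # accepted; did we get the best?
    return False                            # never accepted anyone

n, trials = 100, 10_000
print(sum(secretary_trial(n) for _ in range(trials)) / trials)  # ~0.37
```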
Part 4: Expander Graphs and Applications
- (06/04) Lecture 23. AKS Sorting Networks.
- Introduced Sorting Networks and presented simple algorithms.
- Introduced the concept of Expander Graphs, useful in AKS Sorting Networks.
- Scribe notes are here.
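- One of the simple (non-AKS) networks as a sketch: the bubble-sort network with n(n-1)/2 comparators, whose comparisons are fixed in advance, independent of the data:
```python
def sorting_network(n):
    """Comparators (i, i+1) of the bubble-sort network, data-independent."""
    return [(i, i + 1) for rnd in range(n) for i in range(n - 1 - rnd)]

def apply_network(a, comparators):
    a = list(a)
    for i, j in comparators:   # each comparator swaps an out-of-order pair
        if a[i] > a[j]:
            a[i], a[j] = a[j], a[i]
    return a

print(apply_network([5, 2, 4, 1, 3], sorting_network(5)))  # [1, 2, 3, 4, 5]
```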
- (08/04) Lecture 24. Expander Mixing Lemma.
- More on expander graphs and how well they mix; the mixing lemma is stated below for reference.
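- For a d-regular graph on n vertices whose non-trivial eigenvalues are at most \lambda in absolute value, and any vertex sets S and T, the mixing lemma states
```latex
\left|\, e(S,T) - \frac{d\,|S|\,|T|}{n} \,\right| \;\le\; \lambda \sqrt{|S|\,|T|}
```
where e(S,T) counts the edges between S and T.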
- (13/04; 15/04) Lecture 25-26. Spectral Sparsifiers.
- Studied properties of the Graph Laplacian and Effective Resistances, and studied Spectral Sparsifiers.
- Scribe notes are here.
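- A small numerical check of effective resistance via the Laplacian pseudoinverse, R_eff(u,v) = (e_u - e_v)^T L^+ (e_u - e_v), on an illustrative path graph:
```python
import numpy as np

# Path 0 - 1 - 2 with unit-conductance edges: R_eff(0, 2) = 1 + 1 = 2
# (two unit resistors in series).

edges, n = [(0, 1), (1, 2)], 3
L = np.zeros((n, n))
for u, v in edges:
    L[u, u] += 1; L[v, v] += 1
    L[u, v] -= 1; L[v, u] -= 1

Lp = np.linalg.pinv(L)                 # Laplacian pseudoinverse
e = np.zeros(n); e[0], e[2] = 1, -1    # e_0 - e_2
print(e @ Lp @ e)                      # ~2.0
```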
- (22/04) Lecture 27. Recap Lecture.
- Recap of the course contents, in preparation for the final exam.
- Handwritten notes are here.
Thanks to Anupam Gupta for webpage design, and scribe template.