Meetings


**Theme 1 : Introduction and Overview** - 1 meeting

- Meeting 01 : Mon, Jul 31, 11:00 am-11:50 am

Meeting 01 : Administrative announcements. Three views about this course. Learning outcomes. Algorithms, pseudocode, programs, correctness, efficiency. Algorithm design paradigms.
References : <a href="https://pdfs.semanticscholar.org/9a41/16f84c836fe545dc9b569d434a6cbae634bb.pdf">Notes on Correctness Proofs</a> by James Aspnes.
Exercises : Write down a correctness statement for the array initialization loop that we wrote in class. Which of the following grows asymptotically faster: n/(log n) or sqrt(n)?
Reading : <a href="https://courses.cs.washington.edu/courses/cse331/13sp/conceptual-info/hoare-logic.pdf">Reasoning about code</a>.

**Theme 2 : Divide & Conquer Technique** - 16 meetings

- Meeting 02 : Tue, Aug 01, 10:00 am-10:50 am
- Meeting 03 : Wed, Aug 02, 09:00 am-09:50 am
- Meeting 04 : Thu, Aug 03, 12:00 pm-12:50 pm
- Meeting 05 : Mon, Aug 07, 11:00 am-11:50 am
- Meeting 06 : Tue, Aug 08, 10:00 am-10:50 am
- Meeting 07 : Wed, Aug 09, 09:00 am-09:50 am
- Meeting 08 : Thu, Aug 10, 12:00 pm-12:50 pm
- Meeting 09 : Wed, Aug 16, 09:00 am-09:50 am
- Meeting 10 : Thu, Aug 17, 12:00 pm-12:50 pm
- Meeting 11 : Mon, Aug 21, 11:00 am-11:50 am
- Meeting 12 : Tue, Aug 22, 10:00 am-10:50 am
- Meeting 13 : Thu, Aug 24, 12:00 pm-12:50 pm
- Meeting 14 : Mon, Aug 28, 11:00 am-11:50 am
- Meeting 15 : Tue, Aug 29, 10:00 am-10:50 am
- Meeting 16 : Wed, Aug 30, 09:00 am-09:50 am
- Meeting 17 : Thu, Aug 31, 12:00 pm-12:50 pm

Meeting 02 : A simple (hopefully new) example. Karatsuba multiplication. From Karatsuba to Strassen's method for matrix multiplication. Algorithmic upper bounds and lower bounds. Review of Big-O and Big-Omega notation.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0830.pdf">Sorting in the exchange model</a> - Notes by Avrim Blum.

Meeting 03 : Formal definition of Big-O notation. Dissecting the definition. Why a constant? Why for all n greater than n_0? What we hope to capture by the definition. Sorting as an example problem. Decrease and Conquer - selection sort. Analysis of selection sort. Upper bound of O(n^2). Can selection sort be analysed better? Better analysis of an existing algorithm vs designing faster (different) algorithms.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0901.pdf">Notes by Avrim Blum</a>.

Meeting 04 : Lower bounds for comparison-based sorting. The existential (counting) argument. Weakness of the theorem.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0913.pdf">Lower Bounds for Comparison Based Sorting</a> - Notes by Avrim Blum.
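Karatsuba's recurrence T(n) = 3T(n/2) + O(n) gives O(n^log2(3)), roughly O(n^1.585). A minimal Python sketch (an illustration, not the course's pseudocode), splitting the operands by decimal digits:

```python
def karatsuba(x, y):
    """Multiply non-negative integers x and y using three recursive
    half-size multiplications instead of four."""
    if x < 10 or y < 10:                      # base case: a single digit
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    xh, xl = divmod(x, p)                     # x = xh*p + xl
    yh, yl = divmod(y, p)                     # y = yh*p + yl
    a = karatsuba(xh, yh)                     # product of high parts
    b = karatsuba(xl, yl)                     # product of low parts
    c = karatsuba(xh + xl, yh + yl) - a - b   # cross terms from ONE product
    return a * p * p + c * p + b
```

Strassen's method applies the same idea to matrices: seven half-size multiplications instead of eight.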
Meeting 05 : Proving the stronger lower bound theorem. The adversarial argument.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0913.pdf">Lower Bounds for Comparison Based Sorting</a> - Notes by Avrim Blum.

Meeting 06 : More examples of lower bound arguments. The "swap" model for sorting.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0915.pdf">Sorting in the exchange model</a> - Notes by Avrim Blum.

Meeting 07 : Solving recurrences. The Master theorem.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0901.pdf">Asymptotics</a> - Notes by Avrim Blum.

Meeting 08 : Average-case lower bounds using the average depth of a tree.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0913.pdf">Lower Bounds for Comparison Based Sorting</a> - Notes by Avrim Blum.

Meeting 09 : Quicksort and randomization - the setup and the claims.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0906.pdf">Quicksort - Randomized</a> - Notes by Avrim Blum.

Meeting 10 : Random variables, expectation, examples, linearity of expectation.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0906.pdf">Quicksort - Randomized</a> - Notes by Avrim Blum.
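The randomized Quicksort that these meetings analyse can be sketched in a few lines (a minimal illustrative version, not the course's code); with a uniformly random pivot the expected number of comparisons is O(n log n):

```python
import random

def quicksort(a):
    """Sort a list by partitioning around a uniformly random pivot."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```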
Meeting 11 : Using linearity of expectation. Expected number of fixed points of a random permutation. Analysing a simple randomized 7/8 approximation for MaxSAT.
References : None.

Meeting 12 : Analysis of Quicksort using linearity of expectation. Connections to average-case analysis.
References : <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0906.pdf">Quicksort - Randomized</a> - Notes by Avrim Blum.

Meeting 13 : Finding the kth smallest element. Strategy. Observations from Quicksort. Randomized algorithm for selection. Expected worst-case number of comparisons is at most 4n.
References : Lecture 4 in these <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lects1-10.pdf">notes by Avrim Blum</a>.

Meeting 14 : Deterministic algorithms for selection. Median of medians of groups of 5.
References : Lecture 4 in these <a href="https://www.cs.cmu.edu/~avrim/451f11/lectures/lects1-10.pdf">notes by Avrim Blum</a>.
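The randomized selection of the kth smallest element discussed above can be sketched as follows (an illustration, not the course's code; expected linear time, matching the "at most 4n expected comparisons" claim up to constants):

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element of a (k = 1 gives the minimum).
    Expected O(n) time with a uniformly random pivot."""
    pivot = random.choice(a)
    less  = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    if k <= len(less):                    # answer lies among smaller elements
        return quickselect(less, k)
    if k <= len(less) + len(equal):       # the pivot itself is the answer
        return pivot
    greater = [x for x in a if x > pivot]
    return quickselect(greater, k - len(less) - len(equal))
```

Replacing the random pivot by the median of medians of groups of 5 (Meeting 14) makes the linear bound worst-case.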
Meeting 15 : Maximum subarray problem. Divide and conquer technique with two recursive calls and a variant of the problem as one of the subproblems to solve.
References : Section 4.1 in the CLRS textbook. The way it was done in class (the variants etc.) does not match what is given in the textbook; please use the class notes for that detail.

Meeting 16 : (Two-dimensional) closest pair problem. The one-dimensional case. Sorting-based method. Generalizability. A divide and conquer approach. Techniques to bound the work done outside the recursive subproblems in the simple case. Generalizing to the two-dimensional case.
References : Section 33.4 in the CLRS textbook. The way it was done in class does not match the textbook; please use the class notes.
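The divide-and-conquer maximum subarray algorithm can be sketched as below; this follows the CLRS Section 4.1 pattern, and since the class variant differs, treat it as an illustration only:

```python
def max_subarray(a, lo=0, hi=None):
    """Maximum sum over non-empty contiguous subarrays of a[lo:hi].
    Recurrence T(n) = 2T(n/2) + O(n), hence O(n log n)."""
    if hi is None:
        hi = len(a)
    if hi - lo == 1:
        return a[lo]
    mid = (lo + hi) // 2
    # best sum of a subarray ending exactly at mid-1 (suffix of left half)
    suffix = best_suffix = a[mid - 1]
    for i in range(mid - 2, lo - 1, -1):
        suffix += a[i]
        best_suffix = max(best_suffix, suffix)
    # best sum of a subarray starting exactly at mid (prefix of right half)
    prefix = best_prefix = a[mid]
    for i in range(mid + 1, hi):
        prefix += a[i]
        best_prefix = max(best_prefix, prefix)
    return max(max_subarray(a, lo, mid),        # entirely in the left half
               max_subarray(a, mid, hi),        # entirely in the right half
               best_suffix + best_prefix)       # crossing the midpoint
```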

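The one-dimensional closest pair case mentioned above reduces to sorting, since the closest pair must be adjacent in sorted order (a sketch, not the course's code; the two-dimensional divide and conquer builds on the same idea):

```python
def closest_pair_1d(points):
    """Smallest distance between any two of the given numbers, O(n log n):
    after sorting, only adjacent pairs can realize the minimum."""
    pts = sorted(points)
    return min(b - a for a, b in zip(pts, pts[1:]))
```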
Meeting 17 : A solution to the two-dimensional version of the closest pair problem. Importance of maintaining the data in a structured (sorted, in this case) way.
References : Section 33.4 in the CLRS textbook.

**Theme 3 : Greedy Technique** - 19 meetings

- Meeting 18 : Fri, Sep 01, 03:30 pm-04:30 pm
- Meeting 19 : Tue, Sep 05, 10:00 am-10:50 am
- Meeting 20 : Wed, Sep 06, 09:00 am-09:50 am
- Meeting 21 : Thu, Sep 07, 12:00 pm-12:50 pm
- Meeting 22 : Mon, Sep 11, 11:00 am-11:50 am
- Meeting 23 : Tue, Sep 12, 10:00 am-10:50 am
- Meeting 24 : Wed, Sep 13, 09:00 am-09:50 am
- Meeting 25 : Thu, Sep 14, 12:00 pm-12:50 pm
- Meeting 26 : Fri, Sep 15, 03:30 pm-04:30 pm
- Meeting 27 : Mon, Sep 18, 11:00 am-11:50 am
- Meeting 28 : Tue, Sep 19, 10:00 am-10:50 am
- Meeting 29 : Wed, Sep 20, 09:00 am-09:50 am
- Meeting 30 : Thu, Sep 21, 12:00 pm-12:50 pm
- Meeting 31 : Mon, Sep 25, 11:00 am-11:50 am
- Meeting 32 : Tue, Sep 26, 10:00 am-10:50 am
- Meeting 33 : Wed, Sep 27, 09:00 am-09:50 am
- Meeting 34 : Tue, Oct 03, 12:00 pm-12:50 pm
- Meeting 35 : Wed, Oct 04, 09:00 am-09:50 am
- Meeting 36 : Fri, Oct 06, 03:30 pm-04:30 pm

Meeting 18 : (Compensatory class for Aug 14th.) Greedy algorithms. Optimizations. Greedy choice. Motivating questions for further discussion. Minimum Spanning Tree problem. Growing a tree. Safe edges. Finding a safe edge. Cuts, cross edges, lightest crossing edge. Lemma: from "respecting cuts" to "safe edges".
References : Sections 23.1-23.2 in the CLRS textbook.

Meeting 19 : From "respecting cuts" to "safe edges". Kruskal's algorithm. Union-Find data structure. The naive implementation using arrays and analysis of the O(m log m + n^2) algorithm. Optimizations for array-based implementations.
References : Sections 23.1-23.2 in the CLRS textbook.

Meeting 20 : Various optimizations. Final array-based implementation giving an O(k log k) bound over the first k union operations. Amortized O(log n) bound for each union operation.
References : Section 4.6 in the <a href="https://github.com/haseebr/competitive-programming/blob/master/Materials/Algorithm%20Design%20by%20Jon%20Kleinberg%2C%20Eva%20Tardos.pdf">Kleinberg-Tardos textbook</a>.

Meeting 21 : Rank-based implementation of Union-Find. Worst-case O(log n) time for FindSet and Union. The idea of path compression.
References : Section 4.6 in the <a href="https://github.com/haseebr/competitive-programming/blob/master/Materials/Algorithm%20Design%20by%20Jon%20Kleinberg%2C%20Eva%20Tardos.pdf">Kleinberg-Tardos textbook</a>; Section 5.1.4 of the <a href="http://drona.csa.iisc.ernet.in/~arnabb/daa17/notes/DPV.pdf">DPV textbook</a>.
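Kruskal's algorithm with the naive array-based Union-Find can be sketched as follows (an illustration, not the course's code; each union relabels one whole component in O(n), which is where the O(m log m + n^2) bound discussed above comes from):

```python
def kruskal(n, edges):
    """Minimum spanning tree of an n-vertex graph.
    edges: list of (weight, u, v).  Naive Union-Find: component[v] is the
    label of v's component, and each union relabels one side in O(n)."""
    component = list(range(n))
    mst = []
    for w, u, v in sorted(edges):            # O(m log m) for the sort
        if component[u] != component[v]:     # edge joins two components
            mst.append((w, u, v))
            old, new = component[u], component[v]
            for i in range(n):               # relabel: O(n) per union
                if component[i] == old:
                    component[i] = new
    return mst
```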

Meeting 22 : Analysing path compression. Typical situations when amortized analysis is useful. The binary counter example. Aggregate method and "pocket money method" to analyse the binary counter. Analysing path compression: overview of the strategy.
References : Section 5.1.4 of the <a href="http://drona.csa.iisc.ernet.in/~arnabb/daa17/notes/DPV.pdf">DPV textbook</a>.

Meeting 23 : Dividing the range of ranks into buckets. The log*(n) function. Assigning pocket money to non-root trees. Cost of each find operation as log*(n) plus extra. Paying the extra from the pocket money. Argument that the pocket money suffices for this payoff. Complete analysis.
References : Section 5.1.4 of the <a href="http://drona.csa.iisc.ernet.in/~arnabb/daa17/notes/DPV.pdf">DPV textbook</a>.

Meeting 24 : Set-system-based problems. Example of maximum weight spanning tree. Equivalence with minimum weight spanning trees (when weights are positive). Formulating MaxST as a set system problem. Formulation of the generic greedy algorithm for set system problems. Maximum weight matching problem. Formulation as a set system problem. Counterexample to the greedy strategy. Question: what made the greedy strategy work for MaxST and not for MaxMatch?
References : Combinatorial Optimization by Papadimitriou and Steiglitz, Chapter 12, Section 4, for the different problems discussed.
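The rank-based Union-Find with path compression analysed above can be sketched as (an illustrative implementation, not the course's):

```python
class UnionFind:
    """Union by rank plus path compression.  Any sequence of m operations
    runs in O(m log*(n)) amortized time, as the bucket argument shows."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            # path compression: point x directly at the root
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:    # attach lower-rank root below
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:   # ranks tie: root's rank grows
            self.rank[rx] += 1
```

Union by rank alone already bounds tree height by log n; path compression is what brings the amortized cost per operation down to O(log*(n)).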

Meeting 25 : What made the greedy choice work for MaxST and not for MaxMatch? Properties of the independent sets in the MaxST problem. Defining a matroid. Plan: show that the matroid property of the set system is enough for the generic greedy strategy to work.
References : CLRS Section 16.4.

Meeting 26 : Being a matroid implies the greedy choice is correct and an optimal substructure exists. Implications for the correctness of the generic algorithm.
References : CLRS Section 16.4.

Meeting 27 : Task scheduling problem as an example. Structure of solutions for task scheduling. Formulation of the equivalent set system problem.
References : CLRS Section 16.5.

Meeting 28 : Showing that the set system formulation of task scheduling satisfies the three matroid properties. Characterization of independence of a set.
References : CLRS Section 16.5.

Meeting 29 : The single-source shortest path problem. Dijkstra's algorithm.
References : None.

Meeting 30 : Proof of correctness of Dijkstra's algorithm.
References : None.

Meeting 31 : Priority queues. Binary min-heaps. Implementation and analysis.
References : None.

Meeting 32 : Decrease-key is invoked more often, so we need to improve its cost. Discussion of ideas towards it. Mergeable heaps. Binomial trees and their properties.
References : None.

Meeting 33 : Fibonacci heaps. Insert, Union, Extract-min - algorithmic description and examples.
References : None.

Meeting 34 : Extract-min and Decrease-key operations in Fibonacci heaps. Cut and Cascade-cut operations.
References : None.

Meeting 35 : Proving the bound on the number of trees in Fibonacci heaps. Recall of amortized analysis. Binary counter. Aggregate method, pocket money method, potential function method.
References : None.

Meeting 36 : Potential function method. Enqueue-dequeue problem using stacks. Charging method. Amortized analysis of Fibonacci heap operations using the potential function method. Completing the analysis of Dijkstra's algorithm.
References : None.
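Dijkstra's algorithm with a binary heap can be sketched as below (an illustration, not the course's code; Python's `heapq` has no decrease-key, so this uses the common lazy-deletion workaround of pushing duplicates and skipping stale entries):

```python
import heapq

def dijkstra(adj, source):
    """Single-source shortest paths, non-negative edge weights.
    adj: {u: [(v, w), ...]}.  With a binary heap: O((n + m) log n)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                         # stale entry: skip it
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd                 # "decrease-key" by re-pushing
                heapq.heappush(heap, (nd, v))
    return dist
```

With a Fibonacci heap, decrease-key drops to O(1) amortized, improving the overall bound to O(m + n log n), which is the point of Meetings 32-36.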
**Theme 4 : Dynamic Programming Technique** - 9 meetings

- Meeting 37 : Mon, Oct 09, 11:00 am-11:50 am
- Meeting 38 : Tue, Oct 10, 10:00 am-10:50 am
- Meeting 39 : Wed, Oct 11, 09:00 am-09:50 am
- Meeting 40 : Thu, Oct 12, 12:00 pm-12:50 pm
- Meeting 41 : Mon, Oct 16, 11:00 am-11:50 am
- Meeting 42 : Tue, Oct 17, 10:00 am-10:50 am
- Meeting 43 : Thu, Oct 19, 12:00 pm-12:50 pm
- Meeting 44 : Mon, Oct 23, 11:00 am-11:50 am
- Meeting 45 : Tue, Oct 24, 10:00 am-10:50 am

Meeting 37 : Shortest paths in the presence of negative edge weights. Failure of a natural transformation into a problem with positive weights. Incrementally computing the shortest paths up to a given length. Formulation of an incremental computation of shortest paths based on path length.
References : None.

Meeting 38 : Bellman-Ford algorithm. Detecting negative cycles in the graph.
References : None.

Meeting 39 : Recursive formulation of the shortest path problem. Subproblem overlaps. View of the Bellman-Ford algorithm as subproblem solutions taking care of the overlapping.
References : None.

Meeting 40 : Cellphone tower problem. Trivial exponential time algorithm. Fibonacci recurrence. Overlapping subproblems. Dynamic programming solution.
References : None.

Meeting 41 : Maximum subarray problem from Quiz 2. Recursive formulation and arguments. One-variable DP. DPs requiring multiple variables. Longest Common Subsequence problem. Recursive formulation. Proof of correctness.
References : None.
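The Bellman-Ford algorithm from these meetings can be sketched as follows (an illustration, not the course's code; after n-1 rounds of relaxation the distances are correct unless a negative cycle is reachable, which one extra round detects):

```python
def bellman_ford(n, edges, source):
    """Shortest paths allowing negative weights.  edges: (u, v, w) triples.
    Returns (dist, has_negative_cycle).  O(n * m) time."""
    INF = float('inf')
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):                   # n-1 relaxation rounds
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # an n-th round that still improves something reveals a negative cycle
    has_neg_cycle = any(dist[u] + w < dist[v] for u, v, w in edges)
    return dist, has_neg_cycle
```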
Meeting 42 : Details of the proof of the recursive formulation of LCS. DP algorithm. Retrieving the longest common subsequence. Tree-based DPs.
References : None.

Meeting 43 : Harder combinatorial optimization problems. Vertex Cover, Independent Set. Trivial exponential time algorithm. Special case of line graphs (connecting to the mobile towers example). Special case of trees. DP formulation of the MaxIS problem on trees. Correctness proof.
References : None.

Meeting 44 : Review of DP formulations. Independent Set and Vertex Cover problems. Analysing running time in terms of input size. Example - primality checking. Knapsack problem. Trivial exponential time algorithm. DP formulation of knapsack with repetition. The input size. Pseudopolynomial running time. Proof of correctness. Running time analysis.
References : None.

Meeting 45 : Completed the details of the knapsack DP formulation. Knapsack without repetition. Memoization. New algorithm design paradigm - iterative improvement of a global solution rather than building partial solutions and improving them locally.
References : Knapsack was done from DPV Section 6.4. Iterative improvement was just an introduction and was not done from any particular source.
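The knapsack-with-repetition DP of DPV Section 6.4 can be sketched as below (an illustration, not the course's code; the running time O(W · n) is pseudopolynomial because the capacity W appears in the bound as a number, while the input encodes it in only log W bits):

```python
def knapsack_with_repetition(capacity, items):
    """items: list of (weight, value); each item may be used any number of
    times.  K[w] = best value achievable with total weight at most w."""
    K = [0] * (capacity + 1)
    for w in range(1, capacity + 1):
        # try every item that still fits as the last one added
        K[w] = max([K[w - wt] + val for wt, val in items if wt <= w],
                   default=0)
    return K[capacity]
```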

**Theme 5 : Iterative Improvement Technique** - 6 meetings

- Meeting 46 : Wed, Oct 25, 09:00 am-09:50 am
- Meeting 47 : Thu, Oct 26, 12:00 pm-12:50 pm
- Meeting 48 : Mon, Oct 30, 11:00 am-11:50 am
- Meeting 49 : Tue, Oct 31, 10:00 am-10:50 am
- Meeting 50 : Wed, Nov 01, 09:00 am-09:50 am
- Meeting 51 : Thu, Nov 02, 12:00 pm-12:50 pm

Meeting 46 : Flow networks, maximum flow problem. Examples. Improving flow values. Trivial strategies to improve flow. Counterexamples. The residual network and its capacities. Paths in the residual network.
References : None.

Meeting 47 : Formal definition of the residual network. Augmenting with flows in the residual network to improve the flow in the original network. The augmented function gives a flow.
References : None.

Meeting 48 : Augmentation always improves the flow value. Bound on the number of augmentation iterations. Plan of the proof that "no augmenting path" in the residual network implies the flow cannot be improved. Cuts, capacity of cuts, and net flow across cuts. Examples.
References : None.

Meeting 49 : For any flow, the net flow across any cut equals the flow value. If there is no augmenting path in the residual network of a flow, then there is a cut whose capacity equals the flow value, and hence the flow cannot be improved further. Pseudopolynomial time. Ideas for the improvement to a polynomial time algorithm due to Edmonds-Karp (without formal proof).
References : None.

Meeting 50 : Maximum bipartite matching problem. Transformation to an instance of the maximum flow problem. Using the maximum flow to solve the maximum cardinality matching problem. Integrality of flow values. Formal proof of correctness of the transformation: if M is a matching of size k, then there is an integral flow of value k.
References : None.

Meeting 51 : Continuing the correctness of the transformation: from an integral flow of value k to a matching of cardinality k in the bipartite graph. Minimum vertex cover problem on bipartite graphs. Konig's theorem. Minimum vertex cover size equals the minimum capacity of a cut. Using the maxflow-mincut theorem to complete the argument for Konig's theorem.
References : None.
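The augmenting-path method can be sketched as below (an illustrative Edmonds-Karp variant, not the course's code: BFS always picks a shortest augmenting path in the residual network, which is what yields the polynomial bound mentioned above):

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp.  capacity: dict-of-dicts {u: {v: cap}}, which this
    function mutates into residual capacities.  Returns the max flow value."""
    for u in list(capacity):                     # make sure every edge has a
        for v in list(capacity[u]):              # reverse entry (residual arc)
            capacity.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        parent = {s: None}                       # BFS for a shortest s-t path
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in capacity[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow                          # no augmenting path: maximum
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(capacity[u][v] for u, v in path)  # bottleneck capacity
        for u, v in path:
            capacity[u][v] -= b                  # push b units forward
            capacity[v][u] += b                  # allow b units to be undone
        flow += b
```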

**Theme 6 : Intractability** - 4 meetings

- Meeting 52 : Mon, Nov 06, 11:00 am-11:50 am
- Meeting 53 : Tue, Nov 07, 10:00 am-10:50 am
- Meeting 54 : Wed, Nov 08, 09:00 am-09:50 am
- Meeting 55 : Thu, Nov 09, 12:00 pm-12:50 pm

Meeting 52 : Formulating some algorithmic problems of relevance. Independent Set, Vertex Cover, Clique, MinCkt problem, Composites, Graph Isomorphism, satisfiability checking. Need for efficient algorithms. Trivial exponential time algorithms.
References : None.

Meeting 53 : Two questions: Do these problems have structural differences as computational problems? Do they compare with each other in difficulty - will having an algorithm for one imply one for another? After Question 1 - "easily" verifiable "short" certificates. Examples among the above problems. Formal definition of the class NP. Proof that P is contained in NP. The P vs NP problem. Discussion of the possibilities and their impact.
References : None.

Meeting 54 : Question 2 - the notion of reductions. The running time of the reduction. "Many-one" reductions. Examples: Clique vs Independent Set. Formal proof of correctness of the reduction. Independent Set vs Vertex Cover.
References : None.
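The Clique vs Independent Set reduction mentioned above is simply graph complementation: S is a clique in G iff S is an independent set in the complement of G. A small sanity check (the helper names are mine, purely illustrative):

```python
from itertools import combinations

def is_independent_set(edges, S):
    """True iff no edge of the graph joins two vertices of S."""
    return not any((u, v) in edges or (v, u) in edges
                   for u, v in combinations(S, 2))

def complement_edges(n, edges):
    """Edge set of the complement graph on vertices 0..n-1."""
    return {(u, v) for u, v in combinations(range(n), 2)
            if (u, v) not in edges and (v, u) not in edges}

# G: triangle 0-1-2 plus the edge 2-3
edges = {(0, 1), (1, 2), (0, 2), (2, 3)}
comp = complement_edges(4, edges)
assert is_independent_set(comp, {0, 1, 2})    # clique in G -> IS in complement
assert not is_independent_set(comp, {0, 3})   # non-clique in G -> not an IS
```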
Meeting 55 : The hardest problems in NP. Notions of NP-hardness and NP-completeness. Cook-Levin theorem (statement). 3SAT to Vertex Cover reduction and its correctness. 2SAT can be solved in polynomial time.
References : None.

**Evaluation Meetings** - 4 meetings

- Meeting 56 : Wed, Aug 23, 09:00 am-09:50 am
- Meeting 57 : Sat, Sep 09, 10:00 am-12:00 pm
- Meeting 58 : Thu, Oct 05, 12:00 pm-12:50 pm
- Meeting 59 : Sat, Oct 14, 10:00 am-12:00 pm

Meeting 56 : Short Exam - I.

Meeting 57 : Quiz - I.

Meeting 58 : Short Exam - II.

Meeting 59 : Quiz - II.