D. J. Bernstein
Data structures and program structures
MCS 401 and EECS 460, Fall 1998

# What I did in class

19980824: [First-day handout.] Topic of course: Solving problems quickly. Student goals: learning many standard algorithms; learning techniques to analyze speed of new algorithms; learning techniques to design faster algorithms. The sequential-search problem. The standard sequential-search algorithm. Number of comparisons performed by algorithm, as a function of the input. The worst-case number of comparisons, for all inputs of length n. The average number of comparisons, if each output is equally likely. The average number of comparisons, if the input is never found. Input distributions. Example of a uniform input distribution for this algorithm. Other factors affecting the time taken by an algorithm. An improved sequential-search algorithm.

19980826: Definitions of speedup, slowdown. Example of a constant speedup. Example of a nonconstant speedup: bubblesort vs. quicksort. Intuitive concept of order. List of asymptotic notations: O(), o(), Omega(), Theta(). Definition of O() as a set of functions. Restatement of the definition: condition for g to be in O(f). Examples. Common language for talking about O(). More examples.

19980828: [Homework assignment #1.] Review of O(). Graph of function in O(1) but without a limit. Definition of Omega(). Warning about different use of Omega(). Examples. Definition of Theta(). Examples. Definition of o(). Examples. Typical situation in course: improving Theta(n^3) to Theta(n^2) or better. Important criteria in selecting an algorithm: simplicity; speed. The myth that shorter algorithms are faster. Knuth quote on premature optimization.

19980831: [General comments on homework.] Algorithm correctness. Algorithm space. Sample memory hierarchy for a typical computer. How space affects time: algorithms that don't use much memory have faster memory access. The idea of space-time tradeoffs: using extra space to save time. The sorting problem. Example: sorting phone numbers by name for a phone book. Worst-case run time, average run time, and space for my ten favorite sorting algorithms.

19980902: Example of insertion sort. Straight insertion sort. Number of moves is at most number of comparisons in straight insertion sort. Number of comparisons for a reverse-order list. How to compute 1+2+...+k. Upper bound for number of comparisons. Average number of comparisons when all key orders are equally likely. Speed of binary insertion sort. Binary search.

19980904: [Homework assignment #2.] C code for straight insertion sort, with combined compare-move loop. Example of how code sorts M, A, T, H. Total run time is sum of time spent in each line of code. C code for straight insertion sort, with separate find() and shift() functions. Example of how code sorts M, A, T, H. Comments on constant-factor issues. C code for binary search. Example of searching for F in A, D, E, H, I, J, K. Time for binary search. Idea of proof. Comparison time for binary insertion sort. Move time for binary insertion sort.

19980907: No class.

19980909: Insertion sort is fast for sequences close to sorted order. C code for h-sorting. Warning that h-sorting does not necessarily put entire sequence into order; it only puts h-subsequences into order. 2-sort followed by 1-sort. Average time for 1-sort after 2-sort. Shell sort: h-sorting for any sequence of h's ending with 1. Problem of choosing best sequence of h's. Shell-Pratt sort. ``Combsort'': Shell sort using partial h-sorting.

19980911: Using binary search in Shell sort: bad idea. Many more variants; too many to analyze completely. Merging. Merging example. C code for merging. Merging takes time linear in input size. Merge sort picture. Merge sort example. Recursive statement of merge sort. Intuitive analysis of merge sort time, according to picture. Mathematical analysis of merge sort time.

19980914: [Homework assignment #3.] General divide-and-conquer approach: split, recurse, combine. Merge sort as example of divide-and-conquer. Drawing divide-and-conquer picture when size of problem is total size of subproblems. Divide-and-conquer time, in terms of picture, when splitting and combining take linear time. Example of bad splits: picture with too many rows. Merge sort results. Possibility of doing minimum-space merging. Switching over to (e.g.) insertion sort for (e.g.) 10 keys. Finding the best cutoff. C code for bubble sort. Bubble sort example.

19980916: Bubble sort time. Variants of bubble sort: slower than insertion sort in practice. Results of pivoting. C code for pivoting. Energy proof that pivoting takes linear time. Quick sort. Choosing the pivot as the first element; sorted input is worst case. Choosing the pivot as the middle element; best case, worst case, average case. Choosing the pivot as median-of-3; better average case. Choosing the pivot as median, using linear-time median algorithm. Radix exchange: choosing the pivot as middle of alphabet. Choosing the pivot as mean of smallest and largest input keys. Choosing the pivot as mean of keys. Historical notes. Difficulty of analyzing radix exchange.

19980918: Problems with quick sort. Selection sort. Properties of a heap: no extra space, O(lg n) add time (and O(n) build time), O(lg n) remove-maximum time. Heap sort time. Definition of a heap. Heap picture. Example. Many examples of removing maximum element from heap. Upper bound for number of comparisons in top-down and bottom-up heap sort. Improved upper bound for number of comparisons in binary heap sort.

19980921: [Homework assignment #4.] C code for bottom-up heap sort. Example of how code sorts S, O, R, T, I, N, G. Difference between bottom-up heap sort and top-down heap sort. Time for heap construction. Taking derivative of (x^n-1)/(x-1) to compute 1 + 2*2 + 3*2^2 + ... + (n-1)*2^{n-2}.

19980923: The number of permutations of n distinct keys. Stirling's formula for n!. The n lg n lower bound for bits of information necessary to sort n keys. Average for uniformly distributed keys; the 99.9% argument. Speed of LSD radix sort. Speed of MSD radix sort. Distribution sort for single-byte keys. MSD radix sort as a divide-and-conquer algorithm. Best sorting algorithm in practice.

19980925: [Solutions to homework assignments #1, #2, #3.] Fibonacci numbers. Computing F_n recursively: linear space, exponential number of additions. Example for n = 7. Table of F_n. Approximate size of F_n. Computing F_n by dynamic programming: linear space, n additions. Saving space by discarding unused information: constant space, n additions. Powers of the Fibonacci matrix. Proof that powers of the Fibonacci matrix involve Fibonacci numbers. Computing powers in O(log n) matrix multiplications.

19980928: Review for midterm.

19980930: [Midterm #1.]

19981002: [Homework assignment #5.] Comments on midterm. Definition of a digraph. Example of a digraph. Drawing picture of a digraph. Another example of a digraph. Common digraph assumptions: finite vertex set; no loops.

19981005: Example of digraph used by the make program. Definition of weighted digraph. Example of weighted digraph: flight costs. Example of weighted digraph: network capacities. Definition of graph. Example of digraph that is not a graph. Example of graph. Definitions of in-degree, out-degree, degree. Definition of adjacency relation for a graph. Definition of (simple) path, path length. Definition of (simple) cycle. Examples of paths. Definition of acyclic digraph (DAG): digraph with no cycles of length >= 1. Definition of tree: connected graph with no cycles of length >= 3.

19981007: Definition of connected. Examples of connected, non-connected digraphs. Matrix representation of digraph. Edge-list representation of digraph. Space used by matrix. Space used by edge list. Time needed to check for an edge between two vertices. Sorting the edge list. Time needed to find all edges to a vertex. Time needed to find all edges from a vertex; sorting edge list two ways. Adding vertex pointers into edge list. Shortest-path, single-source shortest path, all-pairs shortest path. Lowest-weight paths in weighted digraph. Time used by naive shortest-path algorithm. Time used by Dijkstra's algorithm for single-source shortest path.

19981009: [Homework assignment #6.] Example of finding lowest-weight paths by naive algorithm. Questions asked by Dijkstra's algorithm: what is lowest-weight vertex? second-lowest-weight vertex? etc. Example of Dijkstra's algorithm. General statement of Dijkstra's algorithm. Time taken by Dijkstra's algorithm: #V^2. Speeding up Dijkstra's algorithm with heaps and edge lists. Another example of Dijkstra's algorithm.

19981012: More examples of Dijkstra's algorithm. Time to solve all-pairs shortest path. Floyd's algorithm: reorganization of Dijkstra's algorithm for all-pairs shortest path, easier to program. Warshall's algorithm: Floyd's algorithm when all weights are 1. Run time of all-pairs shortest path using fast matrix multiplication. Definition of minimum spanning tree. Examples of spanning trees. Prim's algorithm for building a minimum spanning tree. Example of Prim's algorithm.

19981014: Connected component of a vertex in a graph. Example. Dijkstra's algorithm with all edge weights 0. Example. Speeding up Dijkstra's algorithm by changing heap to list. Depth-first search: use list as a stack. Examples. Breadth-first search: use list as a queue. Example.

19981016: [Homework assignment #7.] Scheduling in the make program. Topological sorting. Impossibility of topological sorting for cyclic digraphs. Recursive C code for attempted topological sorting by depth-first search. Examples. Result of code if digraph is acyclic: topological sorting. Result of code if digraph is cyclic: can easily verify that digraph is cyclic. Time and space for attempted topological sorting. Extra time for checking whether digraph is acyclic.

19981019: Definition of string containment. Example: all strings contained in xyzzy. String-matching problem. C code for obvious algorithm. Bound on number of character comparisons. Best case when string is not found. Worst case. Typical case. Results of Knuth-Morris-Pratt. Results of Boyer-Moore. Examples of the Boyer-Moore idea. Using a character-indexed array to avoid multiple comparisons. Simplest form of the Boyer-Moore idea. More complicated forms of the Boyer-Moore idea. Further improvements.

19981021: C code for Knuth-Morris-Pratt. Lack of backtracking in Knuth-Morris-Pratt. Potential energy in Knuth-Morris-Pratt; bound on number of comparisons. Examples.

19981023: [Solutions to homework assignments #5, #6.] Final comments on Knuth-Morris-Pratt. Review for midterm.

19981026: Fast matrix multiplication.

19981028: Using fast matrix multiplication for fast transitive closure.

19981030: [Midterm #2.]

19981102: Comments on midterm. Big questions: How to design fast algorithms? How to analyze speed of algorithms? Do fast algorithms always exist? Some answers coming up: caching; solving recurrences; NP-completeness. Caching generally. Examples of caching: memory caches, common subexpression elimination, strength reduction. Dynamic programming: caching all computed results. Dynamic-programming algorithms already seen in course.

19981104: The prefix sum problem. Left-to-right algorithm. Number of additions. Dynamic programming, step 1: saving all computed results. Dynamic programming, step 2: reusing previously computed results. Number of additions for left-to-right algorithm with dynamic programming. Right-to-left algorithm. Number of additions for right-to-left algorithm with dynamic programming. Dependence of dynamic programming results on the order of operations. The substring sum problem. Obvious cubic-time algorithm. Quadratic-time algorithm.

19981106: [Homework assignment #8.] Changing order of operations in quadratic-time algorithm for the substring sum problem. Computing best answer ending at each position. Reducing time to linear. Reducing space to constant. Examples. The paragraphing problem. The badness of a line. Examples.

19981109: Input and output of paragraphing. Final-line badness. Example of paragraphing: 3 1 4 1; 5; 9 2; 6. Improvement: 3 1 4; 1 5; 9 2; 6. Obvious exponential-time algorithm. Computing best sum of badnesses for line breaks ending at each position. Example. Total work for paragraphing n words: O(n^2) badness computations. Example of multiplying many matrices. Time for rectangular matrix multiplication.

19981111: Recursive definition of parenthesization of a product. Example. Statement of the many-matrix-multiplication problem. Recursive definition of time for a parenthesization, given the time for rectangular matrix multiplication. Straightforward exponential-time recursive algorithm. Examples of number of recursive calls. Modifying algorithm by caching the result for each input; cubic time. Examples.

19981113: [Homework assignment #9.] The set of vertices of distance n from a vertex in a digraph. Recursive statement of the definition of distance. Recursion for the distance-n sets. Dijkstra's algorithm as dynamic programming. Example. Generalization: weighted digraphs with positive integer weights. Example.

19981116: Greedy algorithms. Previous examples of greedy algorithms. Possibility that a greedy algorithm does not exist for a problem. Backtracking algorithms. Time for backtracking algorithms. Addition chains. Using addition chains for multiplication. Binary addition chain for 15. General upper bound on chain length from binary method. Lower bound. Better addition chain for 15. Problem of finding shortest addition chains. Another binary method as a greedy algorithm. Finding shortest addition chain by backtracking.

19981118: Backtracking heuristics. Heuristic for addition chains. Solving recurrences by guesswork. Example: T(n) = T(n-1)+T(n-2)+...+T(1). Recognizing patterns for small values of n. Many-matrix-multiplication example: T(n) = 2T(n-1)+2T(n-2)+...+2T(1)+2T(0)+2n. Estimating base for an exponential function by division. Difference between finding solution and proving solution. What division does to a polynomial example.

19981120: Guesswork with undetermined coefficients. Reducing recurrences. Strassen example.

19981123: [Homework assignment #10.] Simplifying a recurrence for T(n) by considering T(n)-T(n-1). Example. Reducing a recurrence for T(n) to a recurrence for T(n)-T(n-1). Examples. Theorems for divide-and-conquer recurrences. Proofs.

19981125: Asymptotics of balanced divide-and-conquer recurrences. Examples. Asymptotics of overbalanced divide-and-conquer recurrences. Examples. Asymptotics of underbalanced divide-and-conquer recurrences. Examples. Definition of P. ``Is this array sorted?'' is in P. ``Is there a path from x to y in this digraph of length under L?'' is in P. A problem that is not in P. Definition of NP.

19981127: No class.

19981130: Proving compositeness of 91. Proving compositeness of 1234321. ``Is the input composite?'' is in NP. How close we are to proving that ``Is the input composite?'' is in P: there is a nearly-polynomial-time algorithm that answers the question; there is a polynomial-time algorithm that is believed to answer the question; there is a random algorithm that takes polynomial time on average. ``Is the input k-colorable?'' is in NP. Example: 4-coloring a complete graph. Example: 2-coloring a bipartite graph. Relevance of graph coloring to code optimization. The question of whether there are fast algorithms for graph coloring. The P=NP question. NP-completeness of CNF-SAT. Definition of NP-completeness.

19981202: [Solutions to homework assignments #8, #9, #10.] Recognizing problems in NP. Example. Recognizing NP-complete problems. What to do if a problem is NP-complete. Review for final.

19981204: Review for final.

19981209: Final exam, 8-10.