
Resources for Algorithms


This page does not change much over time. It contains course notes and videos, as well as other useful information, organized by topic.


Course Topics (not necessarily in the order of the schedule; not all topics will be covered; refer to the schedule page)

  1. Things to review before starting this class
  2. big-O notation
  3. Recurrences
  4. Sorting
  5. Hashing
  6. Deterministic Selection (includes separate notes on partitioning)
  7. Indicator Random Variables
  8. Quicksort and RandSelect
  9. Binary search trees
  10. Augmenting data structures
  11. Amortized Analysis
  12. Dynamic programming
  13. Graphs: intro, BFS, DFS, topological sort, SCC
  14. Minimum spanning trees
  15. Single-source shortest paths
  16. P vs NP
Typically sections 7 and 16 are not covered, and 8 is partially covered.
Please refer to the schedule in the course forum.

Informal summary PDF

This PDF contains brief, informal descriptions of the course material. It's useful if you're looking for a quick summary, but it only complements the class notes; it does not replace them. If you spot any errors or particularly unclear descriptions, please let me know.



A note about videos

Most of the topics have recordings of live lectures from the Fall 2016 and Spring 2017 semesters at Tufts University; some were recorded in Spring 2018 at NYU. Since then I have refreshed the course notes a bit, without making huge changes, and in 2021 I created narrated presentations to go with the refreshed notes. You should be able to follow the course using either format.
Some of the older videos, particularly those for material in the first third of the course, may have occasional audio static. (At some point I was given a much better microphone.)
If you find a particular clip to be unclear, please let me know.




  1. Things to review before starting this class
    1. Discrete math
      • Common functions: do a light review of pages 53-59 in CLRS.
        See my discrete math notes.
      • Geometric series (sum). This is an important and useful fact that many running-time analyses rely on. I will recap what matters when necessary. The link above shows where to read about it in some common textbooks.
      • Proofs. In particular, by contrapositive, by contradiction, and by induction. This is critical for section 3.1.
      • Recursion.
      • Basic properties of graphs and trees (e.g., vertex degree, number of leaves in a full tree).
    2. Data structures
      • Arrays, linked lists, stacks, queues, heaps, binary search trees. We will quickly recap the last two.
        • You should understand that you can't access an arbitrary item in a linked list in constant time, or run binary search on one (see the short sketch after this list).
        • You should understand that you can't insert an element into the middle of a full array in constant time.
    3. Basic probability
        It will help if you already know what expected value is. This is defined in section 7 on IRVs, below.
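
    A tiny sketch (not from the notes, just an illustration in Python) of the point in 1.2: indexing into an array takes constant time, while reaching the i-th node of a linked list means following i pointers.

      # Array indexing vs. linked-list traversal (illustrative only).
      class Node:
          def __init__(self, value, next=None):
              self.value = value
              self.next = next

      def linked_list_get(head, i):
          """Return the i-th value; follows i 'next' pointers, so Theta(i) time."""
          node = head
          for _ in range(i):
              node = node.next
          return node.value

      array = [10, 20, 30, 40]
      head = Node(10, Node(20, Node(30, Node(40))))
      print(array[2])                  # constant-time indexing: 30
      print(linked_list_get(head, 2))  # walks 2 links to reach the same value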


  2. big-O notation (aka Theta notation)
    • Prerequisite knowledge: it helps to be familiar with common functions (see 1.1 above)


  3. Recurrences
    1. The recursion tree. Solving recurrences by induction.
    2. The Master method
    3. Examples of using recurrences; recursive algorithms. (Just browse; this is not exam material.)
      1. Divide and conquer, and binary search.
        • Book: Chapter 4, p.65-67
      2. Computing the power of a number (a small sketch follows this list)
      3. Computing Fibonacci numbers
      4. Matrix multiplication (Strassen's method)
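
    As a concrete illustration for 3.3.2 (computing the power of a number), here is a hedged Python sketch of recursive squaring; the running time satisfies T(n) = T(n/2) + Theta(1), which the recursion tree or the Master method shows is Theta(log n). This is only a sketch, not necessarily the version in the notes.

      def power(x, n):
          """Compute x**n for integer n >= 0 by recursive squaring.
          Recurrence: T(n) = T(n//2) + Theta(1), so T(n) = Theta(log n)."""
          if n == 0:
              return 1
          half = power(x, n // 2)   # one recursive call on a half-size exponent
          if n % 2 == 0:
              return half * half
          return half * half * x

      print(power(3, 10))  # 59049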


  4. Sorting
    1. Insertion sort (used as motivation for big-O; see section 2)
      • Book: Chapter 2, p.16-29
    2. Merge sort (used as motivation for studying recurrences; see section 3.1, and the small sketch after this list)
      • Book: Chapter 2, p.30-38
    3. Heap sort
    4. Decision trees. Sorting lower bound.
    5. Counting sort and radix sort
    6. Other
    7. Quicksort is covered in section 8.
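
    A minimal Python sketch of merge sort, relating to 4.2: two recursive calls on halves plus a linear-time merge give the recurrence T(n) = 2T(n/2) + Theta(n), i.e., Theta(n log n). This is only an illustration of the idea; see the notes and the book for the full treatment.

      def merge_sort(a):
          """Return a sorted copy of a. T(n) = 2T(n/2) + Theta(n) = Theta(n log n)."""
          if len(a) <= 1:
              return a
          mid = len(a) // 2
          left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
          merged, i, j = [], 0, 0
          while i < len(left) and j < len(right):   # linear-time merge
              if left[i] <= right[j]:
                  merged.append(left[i])
                  i += 1
              else:
                  merged.append(right[j])
                  j += 1
          return merged + left[i:] + right[j:]

      print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]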


  5. Hashing (basics)
    1. Recap of competing data structures, and introduction
    2. Chaining (a toy implementation is sketched after this list)
    3. Open addressing: linear and quadratic probing, double hashing, and analysis under the uniform hashing assumption.
    4. Further reading (advanced; not part of this course): Universal hashing, Perfect hashing, Cuckoo hashing (and more about the cuckoo mafia)
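
    A rough Python sketch of chaining (5.2), assuming a fixed number of slots and Python's built-in hash function: each slot holds a list of key-value pairs, and operations take expected Theta(1 + alpha) time under the simple uniform hashing assumption, where alpha is the load factor.

      class ChainedHashTable:
          """Hashing with chaining: each slot stores a list of (key, value) pairs."""
          def __init__(self, num_slots=8):
              self.slots = [[] for _ in range(num_slots)]

          def _chain(self, key):
              return self.slots[hash(key) % len(self.slots)]

          def insert(self, key, value):
              chain = self._chain(key)
              for i, (k, _) in enumerate(chain):
                  if k == key:              # key already present: overwrite
                      chain[i] = (key, value)
                      return
              chain.append((key, value))

          def search(self, key):
              for k, v in self._chain(key):
                  if k == key:
                      return v
              return None

      t = ChainedHashTable()
      t.insert("alice", 1)
      t.insert("bob", 2)
      print(t.search("alice"), t.search("carol"))  # 1 None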


  6. Deterministic Selection (aka order statistics; median-finding)
    1. Partitioning (a sample partition routine is sketched after this list)
    2. General approaches to getting linear-time algorithms (including "prune and search")
      • Prerequisite knowledge: Recurrences (by induction; section 3.1)
    3. Algorithm
      • Sections 6.1 and 6.2 are not critical for this, but they might help. You will need to understand section 3.1 though (recurrences by induction).
    4. Book: Chapter 9, p.220-222.
    5. A quote by Bernard Chazelle.
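
    A hedged sketch of one standard partitioning scheme (the Lomuto variant; the notes may use a different one), relating to 6.1: it rearranges the array around a pivot in Theta(n) time, and it is the subroutine that deterministic selection, RandSelect, and quicksort all build on.

      def partition(a, lo, hi):
          """Partition a[lo..hi] around the pivot a[hi]; return the pivot's final index.
          Afterwards everything left of that index is <= pivot and everything right
          of it is > pivot. Runs in Theta(hi - lo) time."""
          pivot = a[hi]
          i = lo
          for j in range(lo, hi):
              if a[j] <= pivot:
                  a[i], a[j] = a[j], a[i]
                  i += 1
          a[i], a[hi] = a[hi], a[i]
          return i

      a = [7, 2, 9, 4, 3, 8, 5]
      p = partition(a, 0, len(a) - 1)
      print(a, p)  # [2, 4, 3, 5, 9, 8, 7] with p == 3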


  7. Indicator Random Variables
    • New class notes: full slideshow and condensed
    • New video: Slideshow with audio
    • Old class notes: full slideshow and condensed
    • Old video:
    • Book: Chapter 5
    • Examples
      1. The hat check problem. (A small sanity check is sketched after this list.)
        • FYI: Link 1 and link 2 calculate the probability of an outcome for this problem, without IRVs.
      2. The hiring problem.
        • Link. Note that the second problem in this link, involving birthdays, is mentioned below.
        • Harmonic series (related to the hiring problem).
      3. Finding local maxima in permutations.
        • Jump to the 30:00 mark in this YouTube video, which is part of Harvard's Stat 110 course. This part of the lecture lasts 9 minutes. After that is an explanation of the St. Petersburg paradox, which is fun to think about. Here's the wiki on that.
      4. Counting inversions in permutations. (involves IRVs with 2 subscripts)
        • Look at problem 2 in this homework set from a course at WVU. This follows the hat check problem. Problem 3 is not related to IRVs, but is interesting.
      5. A birthday problem. (involves IRVs with 2 subscripts)
        • The 2nd example in this link is a variant of our good old birthday problem. I discuss this and one more variant here (note: this takes you to my NYU discrete math page).
      6. A problem with balls and bins.
        • See the second example in this link. Evaluation of the expected value of each IRV is a bit more complicated in this example than in previous ones. Note that the first example in this link is equivalent to the hat check problem. It deals with fixed points in permutations. In a permutation of the integers 1...n, a fixed point occurs when integer k is placed at position k.
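
    As a small sanity check on example 1 (the hat check problem): with indicators X_i = 1 if person i gets their own hat back, E[X_i] = 1/n, so by linearity of expectation the expected number of matches is n * (1/n) = 1. The Python sketch below (an illustration, not part of the notes) estimates this by simulation.

      import random

      def hat_check_matches(n):
          """One random return of n hats; count people who get their own hat back."""
          hats = list(range(n))
          random.shuffle(hats)
          return sum(1 for person, hat in enumerate(hats) if person == hat)

      n, trials = 20, 100_000
      average = sum(hat_check_matches(n) for _ in range(trials)) / trials
      print(average)  # close to 1, matching sum of E[X_i] = n * (1/n) = 1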


  8. Quicksort and RandSelect
    1. Quicksort
      1. Introduction and short derivation of expected runtime
      2. Analysis 1: expected runtime, with more detailed analysis
        • Recommendation: Look at the similar analysis for RandSelect first.
      3. Analysis 2: expected runtime via expected number of comparisons
    2. RandSelect (Randomized Selection)
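
    A hedged Python sketch of RandSelect (8.2): pick a random pivot, split around it, and recurse only into the side that can contain the k-th smallest element; the expected running time is Theta(n). (For clarity this version builds new lists rather than partitioning in place.)

      import random

      def rand_select(a, k):
          """Return the k-th smallest element of a (k is 1-based). Expected Theta(n)."""
          if len(a) == 1:
              return a[0]
          pivot = random.choice(a)
          smaller = [x for x in a if x < pivot]
          equal   = [x for x in a if x == pivot]
          larger  = [x for x in a if x > pivot]
          if k <= len(smaller):
              return rand_select(smaller, k)
          if k <= len(smaller) + len(equal):
              return pivot
          return rand_select(larger, k - len(smaller) - len(equal))

      print(rand_select([7, 2, 9, 4, 3, 8, 5], 4))  # 5, the 4th smallest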


  9. Binary search trees
    1. Basics (a bare-bones sketch follows this list)
    2. Building a BST randomly, average depth, and relation to Quicksort.
      • Prerequisite knowledge: Expected value.
    3. Red-black trees (an example of dynamic balanced BST)
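
    A bare-bones BST sketch for 9.1: insert and search each follow one root-to-leaf path, so both cost O(h), where h is the height of the tree; sections 9.2 and 9.3 are about how large h is for random insertions and for balanced (red-black) trees.

      class BSTNode:
          def __init__(self, key):
              self.key = key
              self.left = None
              self.right = None

      def bst_insert(root, key):
          """Insert key and return the (possibly new) root. Cost O(height)."""
          if root is None:
              return BSTNode(key)
          if key < root.key:
              root.left = bst_insert(root.left, key)
          else:
              root.right = bst_insert(root.right, key)
          return root

      def bst_search(root, key):
          """Return True if key is in the tree. Cost O(height)."""
          while root is not None:
              if key == root.key:
                  return True
              root = root.left if key < root.key else root.right
          return False

      root = None
      for k in [5, 2, 8, 1, 4]:
          root = bst_insert(root, k)
      print(bst_search(root, 4), bst_search(root, 7))  # True False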


  10. Augmenting data structures
    • Prerequisite knowledge: binary search trees (section 9)
    1. Intro, dynamic rank finding, and dynamic selection (a small sketch follows this list)
    2. Range counting (almost the same as dynamic rank queries)
    3. Interval trees
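
    A rough sketch of the augmentation idea in 10.1, written on top of the plain (unbalanced) BST from section 9, so it is only an illustration: store the size of each node's subtree, keep it up to date during insertion, and then the k-th smallest key can be found in O(h) time.

      class Node:
          def __init__(self, key):
              self.key = key
              self.left = None
              self.right = None
              self.size = 1  # augmentation: number of nodes in this subtree

      def insert(root, key):
          if root is None:
              return Node(key)
          root.size += 1  # the new key lands somewhere below this node
          if key < root.key:
              root.left = insert(root.left, key)
          else:
              root.right = insert(root.right, key)
          return root

      def select(root, k):
          """Return the k-th smallest key (1-based), using subtree sizes. O(height)."""
          left_size = root.left.size if root.left else 0
          if k == left_size + 1:
              return root.key
          if k <= left_size:
              return select(root.left, k)
          return select(root.right, k - left_size - 1)

      root = None
      for key in [5, 2, 8, 1, 4, 9]:
          root = insert(root, key)
      print(select(root, 3))  # 4, the 3rd smallest of {1, 2, 4, 5, 8, 9}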


  11. Amortized analysis
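
    A classic example often used for amortized analysis (not necessarily the one in the notes) is a dynamic array that doubles its capacity when full: a single append can cost Theta(n), but any sequence of n appends costs Theta(n) in total, so the amortized cost per append is Theta(1). A tiny Python sketch:

      class DynamicArray:
          """Append-only dynamic array that doubles when full.
          Occasional appends copy everything (Theta(n)), but the total work over
          n appends is Theta(n), i.e., amortized Theta(1) per append."""
          def __init__(self):
              self.capacity = 1
              self.length = 0
              self.data = [None] * self.capacity

          def append(self, x):
              if self.length == self.capacity:      # full: allocate twice the space
                  self.capacity *= 2
                  new_data = [None] * self.capacity
                  new_data[:self.length] = self.data
                  self.data = new_data
              self.data[self.length] = x
              self.length += 1

      arr = DynamicArray()
      for i in range(10):
          arr.append(i)
      print(arr.data[:arr.length])  # [0, 1, ..., 9]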

  12. Dynamic programming
    1. Counting paths
    2. Longest common subsequence
    3. Longest increasing subsequence
    4. Rod cutting
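
    A short bottom-up Python sketch for 12.4 (rod cutting), with a price table like the one in the book's example: best[j] is the maximum revenue for a rod of length j, computed from the best answers for shorter rods.

      def rod_cutting(prices, n):
          """prices[i] = price of a piece of length i (prices[0] unused).
          Returns the maximum revenue for a rod of length n. Theta(n^2) time."""
          best = [0] * (n + 1)
          for length in range(1, n + 1):
              best[length] = max(prices[cut] + best[length - cut]
                                 for cut in range(1, length + 1))
          return best[n]

      prices = [0, 1, 5, 8, 9, 10, 17, 17, 20]  # price table in the style of the book's example
      print(rod_cutting(prices, 8))  # 22 (cut into pieces of lengths 2 and 6)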


  13. Graphs: intro, BFS, DFS, topological sort, SCC
    1. Representation and types of graphs (and a little bit about planarity)
    2. Breadth-first search (BFS) and depth-first search (DFS); a BFS sketch follows this list
    3. Topological sort
    4. Strongly connected components
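
    A compact BFS sketch for 13.2 on an adjacency-list graph (an illustration, not the notes' pseudocode): it visits vertices in order of distance, measured in number of edges, from the source, in O(V + E) time.

      from collections import deque

      def bfs_distances(adj, source):
          """adj: dict mapping each vertex to a list of neighbors.
          Returns a dict of shortest edge-count distances from source. O(V + E)."""
          dist = {source: 0}
          queue = deque([source])
          while queue:
              u = queue.popleft()
              for v in adj[u]:
                  if v not in dist:          # first time we reach v: shortest distance
                      dist[v] = dist[u] + 1
                      queue.append(v)
          return dist

      adj = {
          'a': ['b', 'c'],
          'b': ['a', 'd'],
          'c': ['a', 'd'],
          'd': ['b', 'c'],
      }
      print(bfs_distances(adj, 'a'))  # {'a': 0, 'b': 1, 'c': 1, 'd': 2}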


  14. Minimum spanning trees
    • Prerequisite knowledge: graph basics, familiarity with priority queues (e.g. heaps).
    1. Intro and properties
    2. Kruskal's algorithm
    3. Prim's algorithm
      • Prerequisite knowledge: Familiarity with heaps as priority queues.
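
    A hedged sketch of Prim's algorithm (14.3) using Python's heapq module as the priority queue; it is a simple "lazy deletion" version rather than one with a decrease-key operation, so it may differ from the notes, but the idea is the same: repeatedly attach the cheapest vertex crossing the cut.

      import heapq

      def prim_mst_weight(adj, start):
          """adj: dict vertex -> list of (weight, neighbor) pairs (undirected graph).
          Returns the total weight of a minimum spanning tree. O(E log E) here."""
          visited = set()
          total = 0
          heap = [(0, start)]               # (cost to attach vertex, vertex)
          while heap and len(visited) < len(adj):
              cost, u = heapq.heappop(heap)
              if u in visited:              # stale entry: u was already attached
                  continue
              visited.add(u)
              total += cost
              for w, v in adj[u]:
                  if v not in visited:
                      heapq.heappush(heap, (w, v))
          return total

      adj = {
          'a': [(1, 'b'), (4, 'c')],
          'b': [(1, 'a'), (2, 'c'), (5, 'd')],
          'c': [(4, 'a'), (2, 'b'), (3, 'd')],
          'd': [(5, 'b'), (3, 'c')],
      }
      print(prim_mst_weight(adj, 'a'))  # 6 (edges a-b, b-c, c-d)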


  15. Single-source shortest paths
    • Prerequisite knowledge: graph basics.
    1. General properties, relaxing edges.
    2. Dijkstra's algorithm (a sketch follows this list)
      • Prerequisite knowledge: Familiarity with heaps as priority queues.
    3. Bellman-Ford algorithm
    4. Algorithm for DAGs
      • Note: the proof of correctness relies on the main lemma that comes before Bellman-Ford.
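
    A hedged sketch of Dijkstra's algorithm (15.2), again with heapq and lazy deletion instead of decrease-key; it assumes non-negative edge weights, which is exactly the assumption Bellman-Ford (15.3) lets you drop.

      import heapq

      def dijkstra(adj, source):
          """adj: dict vertex -> list of (weight, neighbor) pairs, weights >= 0.
          Returns a dict of shortest-path distances from source."""
          dist = {source: 0}
          heap = [(0, source)]
          done = set()
          while heap:
              d, u = heapq.heappop(heap)
              if u in done:                 # stale entry left over from a relaxation
                  continue
              done.add(u)
              for w, v in adj[u]:
                  if dist.get(v, float('inf')) > d + w:   # relax edge (u, v)
                      dist[v] = d + w
                      heapq.heappush(heap, (dist[v], v))
          return dist

      adj = {
          's': [(7, 'a'), (2, 'b')],
          'a': [(1, 't')],
          'b': [(3, 'a'), (8, 't')],
          't': [],
      }
      print(dijkstra(adj, 's'))  # distances: s=0, a=5, b=2, t=6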


  16. P vs NP
    1. Intro
      • Class notes: full slideshow and condensed
      • Video: My 2016 lecture until the 28:00 mark; it then continues with reductions (see 16.2 below).
      • Book: Chapter 34, p.1048-1053, and 1061-1069. However, I introduce the topic more informally, for instance, without mentioning "languages".
    2. Examples of reductions
      • Class notes: full slideshow and condensed
      • Video: My 2016 lecture part 1 (starting from 28:00, i.e., continuing from above), then part 2
      • Book: Chapter 34, from p.1070 and on. This goes into much more detail than I will. Just browse through accordingly.
    3. Other
      • Use your new-found knowledge to torment your friends.
      • Here are two papers about well-known video games that can be generalized to be really hard. __1__ __2__
      • A brief, mostly non-mathematical, introductory video about P vs NP, shown to me by a former Algo student.
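
    Not from the course materials, but one way to make the definition of NP in 16.1 concrete: for CNF satisfiability, a claimed satisfying assignment (the certificate) can be checked in time polynomial in the size of the formula, even though finding one may be hard. A small Python sketch:

      def verify_cnf(clauses, assignment):
          """clauses: list of clauses, each a list of literals (positive or negative
          variable numbers, e.g. -2 means 'not x2'). assignment: dict var -> bool.
          Returns True iff the assignment satisfies every clause. Polynomial time."""
          for clause in clauses:
              if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
                  return False   # this clause has no true literal
          return True

      # (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
      clauses = [[1, -2], [2, 3], [-1, -3]]
      print(verify_cnf(clauses, {1: True, 2: True, 3: False}))   # True
      print(verify_cnf(clauses, {1: True, 2: False, 3: False}))  # False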