Coursera

Week 1

Why study algorithms?

Integer Multiplication

Input: two $n$-digit integers $x$ and $y$

Output: the product $x \cdot y$

Primitive operation: add or multiply two single-digit numbers

Upshot: number of operations overall $\leq$ constant $\cdot\, n^2$.
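The $n^2$ bound comes from multiplying every digit of $y$ against every digit of $x$. A minimal Python sketch of the grade-school method (not from the lecture; the function name is my own):

```python
def grade_school_multiply(x: int, y: int) -> int:
    """Multiply x and y digit by digit, as in the grade-school algorithm.

    Each digit of y is paired with every digit of x, so the number of
    single-digit multiplications is about n * n for n-digit inputs.
    """
    xs = [int(d) for d in str(x)][::-1]  # least-significant digit first
    ys = [int(d) for d in str(y)][::-1]
    result = 0
    for i, dy in enumerate(ys):
        partial = 0
        for j, dx in enumerate(xs):
            partial += dx * dy * 10 ** j  # single-digit multiply, then shift
        result += partial * 10 ** i       # shift partial product into place
    return result
```

For example, `grade_school_multiply(5678, 1234)` reproduces the product used in the Karatsuba example below.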

The Algorithm Designer’s Mantra:

“Perhaps the most important principle for the good algorithm designer is to refuse to be content.”

– Aho, Hopcroft, and Ullman, The Design and Analysis of Computer Algorithms, 1974

Karatsuba Multiplication

Example

Let $x = 5678, y = 1234$

Let $a = 56, b = 78, c = 12, d = 34$.

  1. Compute $a \cdot c = 672$
  2. Compute $b \cdot d = 2652$
  3. Compute $(a + b)(c + d) = 134 \cdot 46 = 6164$
  4. Compute $(3) - (2) - (1) = 6164 - 2652 - 672 = 2840$.
  5. Pad (1) with 4 trailing zeros, (2) with none, and (4) with 2, then add:

6720000
   2652
 284000
-------
7006652 (answer)

A Recursive Algorithm

Rewrite $x = 10^{\frac{n}{2}}a + b, y = 10^{\frac{n}{2}}c + d$ where $a, b, c, d$ are $\frac{n}{2}$-digit numbers.

Then, $xy = 10^{n}ac + 10^{\frac{n}{2}}(ad + bc) + bd$.

Idea: recursively compute the four products $ac, ad, bc, bd$.

The algorithm

Recall: $xy = 10^{n}ac + 10^{\frac{n}{2}}(ad + bc) + bd$.

  1. Recursively compute $ac$
  2. Recursively compute $bd$
  3. Recursively compute $(a + b)(c + d) = ac + ad + bc + bd$.
  4. Gauss’s trick: $(3) - (1) - (2) = ad + bc$.

Upshot: only three recursive multiplications are needed (plus some additions).
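Steps 1–4 above can be sketched in Python as follows (a minimal sketch, not the lecture's code; the function name and the base case are my own choices):

```python
def karatsuba(x: int, y: int) -> int:
    """Karatsuba multiplication: three recursive multiplies per level."""
    if x < 10 or y < 10:              # base case: a single-digit operand
        return x * y
    n = max(len(str(x)), len(str(y)))
    half = n // 2
    p = 10 ** half
    a, b = divmod(x, p)               # x = 10^half * a + b
    c, d = divmod(y, p)               # y = 10^half * c + d
    ac = karatsuba(a, c)              # step 1
    bd = karatsuba(b, d)              # step 2
    gauss = karatsuba(a + b, c + d)   # step 3
    ad_plus_bc = gauss - ac - bd      # step 4: Gauss's trick
    return ac * 10 ** (2 * half) + ad_plus_bc * 10 ** half + bd
```

Running `karatsuba(5678, 1234)` reproduces the worked example's answer of 7006652.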

Merge Sort

Why study merge sort?

Running time of Merge

Upshot: running time of merge on array of $m$ numbers is $\leq 4m + 2$.

Claim: Merge Sort requires $\leq 6n\log(n) + 6n$ operations to sort $n$ numbers.

Total: $\leq 6n(\log(n) + 1)$ operations.
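The two routines behind these bounds can be sketched in Python (a minimal version, not the lecture's pseudocode; function names are my own):

```python
def merge(left: list, right: list) -> list:
    """Merge two sorted lists; roughly 4m + 2 operations on m total numbers."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])    # at most one of the two halves has leftovers
    out.extend(right[j:])
    return out

def merge_sort(xs: list) -> list:
    """Sort by splitting in half, recursively sorting, and merging."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    return merge(merge_sort(xs[:mid]), merge_sort(xs[mid:]))
```

The $6n(\log(n) + 1)$ bound follows from $\log_2(n) + 1$ levels of recursion with at most $6n$ operations per level.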

Guiding Principles for the Analysis of Algorithms

Worst-case analysis: a running-time bound that holds for every input of length $n$.

As opposed to: average-case analysis, or benchmarks on a suite of "typical" inputs (both require domain knowledge about which inputs are likely).

Worst-case is usually easier to analyze.

Won’t pay much attention to constant factors, lower-order terms.

Asymptotic analysis: focus on running time for large input sizes $n$.

“Fast” algorithm $\approx$ worst-case running time grows slowly with input size.

Quiz

Question 1

3-way-Merge Sort : Suppose that instead of dividing in half at each step of Merge Sort, you divide into thirds, sort each third, and finally combine all of them using a three-way merge subroutine. What is the overall asymptotic running time of this algorithm? (Hint: Note that the merge step can still be implemented in $\mathcal{O}(n)$ time).

Answer: $\Theta(n\log{n})$
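One way to see this (a sketch, not from the quiz solution): the running time satisfies the recurrence

```latex
T(n) \leq 3\,T\!\left(\frac{n}{3}\right) + cn
```

for some constant $c$. The recursion has $\log_3(n)$ levels, each doing $cn$ total work, so $T(n) \leq cn\log_3(n) = \Theta(n\log n)$; dividing into thirds instead of halves only changes the base of the logarithm, a constant factor.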

Question 2

You are given functions $f$ and $g$ such that $f(n) \in \mathcal{O}(g(n))$. Is $f(n)\log(f(n)^c) \in \mathcal{O}(g(n)\log(g(n)))$, where $c$ is some positive constant? You should assume that $f$ and $g$ are non-decreasing and always bigger than 1.

Answer: True
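A sketch of why (assuming, by definition of big-O, that $f(n) \leq K\,g(n)$ for all $n \geq n_0$ and some constant $K$):

```latex
f(n)\log\!\big(f(n)^c\big)
  = c\,f(n)\log(f(n))
  \leq cK\,g(n)\big(\log(K) + \log(g(n))\big)
```

Since $g$ is non-decreasing and always bigger than 1, $\log(g(n)) \geq \log(g(1)) > 0$, so the $\log(K)$ term is absorbed into a constant factor, giving $\mathcal{O}(g(n)\log(g(n)))$.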

Question 3

Assume again two (positive) nondecreasing functions $f$ and $g$ such that $f(n) \in \mathcal{O}(g(n))$. Is $2^{f(n)} \in \mathcal{O}(2^{g(n)})$?

Answer: False. For example, $f(n) = 2n$ and $g(n) = n$ satisfy $f(n) \in \mathcal{O}(g(n))$, but $2^{f(n)} = 4^n \notin \mathcal{O}(2^n)$, since $4^n / 2^n = 2^n$ is unbounded.

Question 4

k-way-Merge Sort. Suppose you are given $k$ sorted arrays, each with $n$ elements, and you want to combine them into a single array of $kn$ elements. Consider the following approach. Using the merge subroutine taught in lecture, you merge the first 2 arrays, then merge the $3^{rd}$ given array with this merged version of the first two arrays, then merge the $4^{th}$ given array with the merged version of the first three arrays, and so on until you merge in the final ($k^{th}$) input array. What is the running time taken by this successive merging algorithm, as a function of $k$ and $n$?

Answer: $\Theta(nk^2)$
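A sketch of the count: merging the $i^{th}$ array into the accumulated output of $(i-1)n$ elements costs $\Theta(in)$, so the total over all merges is

```latex
\sum_{i=2}^{k} c\,i\,n
  = cn\left(\frac{k(k+1)}{2} - 1\right)
  = \Theta(nk^2)
```

for some constant $c$; the successive merging wastes work by re-scanning the growing accumulated array at every step.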

Question 5

Arrange the following functions in increasing order of growth rate (with $g(n)$ following $f(n)$ in your list if and only if $f(n) = \mathcal{O}(g(n))$):

First case:

  1. $n^2\log(n)$
  2. $2^n$
  3. $2^{2^n}$
  4. $n^{\log(n)}$
  5. $n^2$

Answer: $n^2 \leq n^2\log(n) \leq n^{\log(n)} \leq 2^n \leq 2^{2^n}$

Details: Take logarithm (any base):

  1. $\log(n^2\log(n)) = \log(n^2) + \log(\log(n)) = 2\log(n) + \log(\log(n))$
  2. $\log(2^n) = n$
  3. $\log(2^{2^n}) = 2^n$
  4. $\log(n^{\log(n)}) = \log(n)\log(n) = (\log(n))^2$
  5. $\log(n^2) = 2\log(n)$

Since $\log$ is increasing, $f(x) \leq g(x)$ if and only if $\log(f(x)) \leq \log(g(x))$; comparing the logarithms above ($2\log(n) \leq 2\log(n) + \log(\log(n)) \leq (\log(n))^2 \leq n \leq 2^n$ for large $n$) gives the ordering.