Input: two $n$-digit integers $x$ and $y$
Output: the product $x \cdot y$
Primitive operation: add or multiply two single-digit numbers
Upshot: number of operations overall $\leq$ constant $\cdot\, n^2$.
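The quadratic bound can be sanity-checked with a sketch of the grade-school method: one row of partial products per digit of $y$, then shift and add. The function name `grade_school_multiply` is mine, and Python's built-in `*` stands in for the single-digit primitives.

```python
def grade_school_multiply(x: int, y: int) -> int:
    """Grade-school multiplication: one shifted partial product per digit of y."""
    result = 0
    shift = 0
    while y > 0:
        digit = y % 10                    # next (least significant) digit of y
        result += x * digit * 10**shift   # one row of partial products
        y //= 10
        shift += 1
    return result

# Each of the ~n rows costs a constant times n single-digit operations,
# giving the constant * n^2 upshot above.
```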
The Algorithm Designer’s Mantra:
“Perhaps the most important principle for the good algorithm designer is to refuse to be content.”
– Aho, Hopcroft, and Ullman, The Design and Analysis of Computer Algorithms, 1974
Let $x = 5678, y = 1234$
Let $a = 56, b = 78, c = 12, d = 34$.
$10^{4} \cdot ac = 6720000$
$10^{2} \cdot (ad + bc) = 284000$
$bd = 2652$
-------
$7006652$ (answer)
Rewrite $x = 10^{\frac{n}{2}}a + b, y = 10^{\frac{n}{2}}c + d$ where $a, b, c, d$ are $\frac{n}{2}$-digit numbers.
Then, $xy = 10^{n}ac + 10^{\frac{n}{2}}(ad + bc) + bd$.
Idea: continue to compute $ac, ad, bc, bd$ recursively.
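The four-product recursion above might be sketched as follows (the names `rec_multiply` and `half` are my own; for simplicity, $n$ is assumed to be a power of 2):

```python
def rec_multiply(x: int, y: int, n: int) -> int:
    """Recursive multiplication of two n-digit numbers via four recursive calls."""
    if n == 1:
        return x * y                  # base case: a single-digit product
    half = 10 ** (n // 2)
    a, b = divmod(x, half)            # x = 10^(n/2) * a + b
    c, d = divmod(y, half)            # y = 10^(n/2) * c + d
    ac = rec_multiply(a, c, n // 2)
    ad = rec_multiply(a, d, n // 2)
    bc = rec_multiply(b, c, n // 2)
    bd = rec_multiply(b, d, n // 2)
    return 10**n * ac + half * (ad + bc) + bd
```

On the running example, `rec_multiply(5678, 1234, 4)` recursively computes $ac = 672$, $ad + bc = 2840$, and $bd = 2652$, then combines them as in the sum above.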
Recall: $xy = 10^{n}ac + 10^{\frac{n}{2}}(ad + bc) + bd$.
Upshot: only three recursive multiplications are needed (plus some additions), since $ad + bc = (a + b)(c + d) - ac - bd$ (Gauss's trick).
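A sketch of the resulting three-multiplication (Karatsuba) recursion, recovering the middle term from $(a+b)(c+d) - ac - bd$. The function name `karatsuba` is mine, and this version handles inputs of any length rather than assuming $n$ is a power of 2.

```python
def karatsuba(x: int, y: int) -> int:
    """Karatsuba multiplication: three recursive calls instead of four."""
    if x < 10 or y < 10:
        return x * y                          # base case: single-digit factor
    n = max(len(str(x)), len(str(y)))
    half = 10 ** (n // 2)
    a, b = divmod(x, half)                    # x = half * a + b
    c, d = divmod(y, half)                    # y = half * c + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    ad_plus_bc = karatsuba(a + b, c + d) - ac - bd   # Gauss's trick
    return half * half * ac + half * ad_plus_bc + bd
```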
Why study merge sort?
Running time of Merge
Upshot: running time of merge on array of $m$ numbers is $\leq 4m + 2$.
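One way to sketch the merge subroutine (Python lists; the $\leq 4m + 2$ count refers to primitive operations such as comparisons, copies, and index increments):

```python
def merge(left: list, right: list) -> list:
    """Merge two sorted arrays into one sorted array in linear time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:               # copy the smaller front element
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                      # at most one of these two
    out.extend(right[j:])                     # leftover slices is nonempty
    return out
```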
Claim: Merge Sort requires $\leq 6n\log(n) + 6n$ operations to sort $n$ numbers.
Total: $\leq 6n(\log(n) + 1)$ operations at most.
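A compact Merge Sort sketch; here the standard-library `heapq.merge` plays the role of the linear-time merge subroutine, so the whole algorithm is a few lines:

```python
import heapq

def merge_sort(a: list) -> list:
    """Merge Sort: split in half, sort each half recursively, merge."""
    if len(a) <= 1:
        return a                              # base case: already sorted
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    return list(heapq.merge(left, right))     # linear-time merge
```

The recursion tree has about $\log_2 n$ levels, each doing linear work in the merges, matching the $6n(\log n + 1)$ bound above up to constants.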
Worst-case analysis: the running-time bound holds for every input of length $n$.
As opposed to: average-case analysis, or benchmarks on a small set of "typical" inputs.
Worst-case is usually easier to analyze.
Won’t pay much attention to constant factors, lower-order terms.
Asymptotic analysis: focus on running time for large input sizes $n$.
“Fast” algorithm $\approx$ worst-case running time grows slowly with input size.
3-way Merge Sort: Suppose that instead of dividing in half at each step of Merge Sort, you divide into thirds, sort each third, and finally combine all of them using a three-way merge subroutine. What is the overall asymptotic running time of this algorithm? (Hint: Note that the merge step can still be implemented in $\mathcal{O}(n)$ time.)
Answer: $\Theta(n\log n)$. The recursion depth drops to $\log_3 n = \log_2 n / \log_2 3$, but each level still does linear work, so the asymptotic running time is unchanged.
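A possible sketch of the 3-way variant, again leaning on the standard-library `heapq.merge` for the linear three-way merge (the split points and function name are my own choices):

```python
import heapq

def merge_sort_3way(a: list) -> list:
    """Merge Sort dividing into thirds; still O(n log n) overall."""
    if len(a) <= 1:
        return a
    t = len(a) // 3 or 1                      # third size (at least 1)
    parts = [a[:t], a[t:2 * t], a[2 * t:]]
    # heapq.merge performs the linear-time three-way merge
    return list(heapq.merge(*(merge_sort_3way(p) for p in parts)))
```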
You are given functions $f$ and $g$ such that $f(n) \in \mathcal{O}(g(n))$. Is $f(n)\log(f(n)^c) \in \mathcal{O}(g(n)\log(g(n)))$, where $c$ is some positive constant? You should assume that $f$ and $g$ are nondecreasing and always greater than 1.
Answer: True, since $\log(f(n)^c) = c\log f(n)$ and the constant $c$ is absorbed into the big-O (here the assumption $g(n) > 1$ keeps $\log g(n)$ bounded away from zero).
Assume again two (positive) nondecreasing functions $f$ and $g$ such that $f(n) \in \mathcal{O}(g(n))$. Is $2^{f(n)} \in \mathcal{O}(2^{g(n)})$?
Answer: False. For example, take $f(n) = 2n$ and $g(n) = n$: then $f(n) \in \mathcal{O}(g(n))$, but $2^{2n} = 4^{n}$ is not $\mathcal{O}(2^{n})$.
k-way-Merge Sort. Suppose you are given $k$ sorted arrays, each with $n$ elements, and you want to combine them into a single array of $kn$ elements. Consider the following approach. Using the merge subroutine taught in lecture, you merge the first 2 arrays, then merge the $3^{rd}$ given array with this merged version of the first two arrays, then merge the $4^{th}$ given array with the merged version of the first three arrays, and so on until you merge in the final ($k^{th}$) input array. What is the running time taken by this successive merging algorithm, as a function of $k$ and $n$?
Answer: $\Theta(nk^2)$
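The $\Theta(nk^2)$ bound can be checked by counting the elements each successive merge touches: the $i$-th merge combines $(i-1)n$ elements with $n$ more, so the total is $n(2 + 3 + \cdots + k) = \Theta(nk^2)$. A small counting sketch (the function name `successive_merge_cost` is mine):

```python
def successive_merge_cost(k: int, n: int) -> int:
    """Count elements touched by successively merging k sorted n-element arrays."""
    cost = 0
    size = n                  # size of the merged prefix so far
    for _ in range(2, k + 1):
        size += n             # merge in the next length-n array...
        cost += size          # ...touching every element of the result once
    return cost               # = n * (2 + 3 + ... + k)
```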
Arrange the following functions in increasing order of growth rate (with $g(n)$ following $f(n)$ in your list if and only if $f(n) = \mathcal{O}(g(n))$):
First case:
Answer: $n^2 \leq n^2\log(n) \leq n^{\log(n)} \leq 2^n \leq 2^{2n}$
Details: Take logarithms (any base); since $\log$ is increasing, $f(x) \leq g(x)$ implies $\log(f(x)) \leq \log(g(x))$ and vice versa. The logarithms of the five functions are $2\log n$, $2\log n + \log\log n$, $\log^2 n$, $n$ (times a constant), and $2n$ (times a constant), which are clearly in increasing order of growth. Hence we have the answer.