Recall the Partition subroutine that we used in both QuickSort and RSelect. Suppose the following array has just been partitioned around some pivot element:
$$ A = [3, 1, 2, 4, 5, 8, 7, 6, 9] $$
Which of these elements could have been the pivot element?
Answer: $4$, $5$, and $9$, since each of these has only smaller elements to its left and only larger elements to its right.
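To double-check mechanically, here is a minimal Python sketch (the helper name `possible_pivots` is ours, not from the course):

```python
def possible_pivots(a):
    """Return the elements that could have been the pivot:
    everything to the left must be smaller, everything to the right larger."""
    return [x for i, x in enumerate(a)
            if all(y < x for y in a[:i]) and all(y > x for y in a[i + 1:])]

print(possible_pivots([3, 1, 2, 4, 5, 8, 7, 6, 9]))  # [4, 5, 9]
```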
Here is an array of ten integers:
$$ A = [5, 3, 8, 9, 1, 7, 0, 2, 6, 4] $$
Suppose we run MergeSort on this array. What is the number in the 7th position of the partially sorted array after the outermost two recursive calls have completed (i.e., just before the very last Merge step)? (When we say “7th” position, we’re counting positions starting at 1; for example, the input array has a “0” in its 7th position.)
Answer: 2. After the two outermost recursive calls complete, each half of the array is sorted independently, giving $[1, 3, 5, 8, 9, 0, 2, 4, 6, 7]$; its 7th entry is $2$.
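Instead of drawing the recursion tree by hand, you can simulate it. Below is a minimal MergeSort sketch (our own code, not the lecture pseudocode); sorting each half separately reproduces exactly the array's state just before the final Merge:

```python
def merge(left, right):
    """Linear-time merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

A = [5, 3, 8, 9, 1, 7, 0, 2, 6, 4]
mid = len(A) // 2
partial = merge_sort(A[:mid]) + merge_sort(A[mid:])
print(partial)     # [1, 3, 5, 8, 9, 0, 2, 4, 6, 7]
print(partial[6])  # 2  (the 7th position, counting from 1)
```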
What is the asymptotic worst-case running time of Merge Sort, as a function of the input array length $n$?
Answer: $\Theta(n\log(n))$
What is the asymptotic running time of Randomized Quick Sort on arrays of length $n$, in expectation (over the choice of random pivots) and in the worst case, respectively?
Answer: $\Theta(n\log(n))$ and $\Theta(n^2)$, respectively. The worst case occurs when every random pivot happens to be a minimum or maximum element of its subarray.
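For reference, a minimal random-pivot QuickSort sketch (an out-of-place variant chosen for readability; the course's Partition subroutine works in place):

```python
import random

def quick_sort(a):
    """Random-pivot QuickSort: Theta(n log n) expected, Theta(n^2) worst case."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)  # pivot chosen uniformly at random
    left = [x for x in a if x < pivot]
    mid = [x for x in a if x == pivot]
    right = [x for x in a if x > pivot]
    return quick_sort(left) + mid + quick_sort(right)

print(quick_sort([3, 1, 2, 4, 5, 8, 7, 6, 9]))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```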
Let $f$ and $g$ be two increasing functions, defined on the natural numbers, with $f(1), g(1)\geq 1$. Assume that $f(n) \in \mathcal{O}(g(n))$. Is $2^{f(n)}\in \mathcal{O}(2^{g(n)})$?
Answer: Not necessarily. Take $f(n) = 2n$ and $g(n) = n$: then $f(n) \in \mathcal{O}(g(n))$, but $2^{f(n)} = 4^n \notin \mathcal{O}(2^n)$, since $4^n/2^n = 2^n$ grows without bound.
Let $0 < \alpha < 0.5$ be some constant. Consider running the Partition subroutine on an array with no duplicate elements and with the pivot element chosen uniformly at random (as in QuickSort and RSelect). What is the probability that, after partitioning, both subarrays (elements to the left of the pivot, and elements to the right of the pivot) have size at least $\alpha$ times that of the original array?
Answer: $1-2\alpha$. The rank of a uniformly random pivot is uniformly distributed, and both subarrays have size at least $\alpha n$ exactly when that rank lands in the middle $(1-2\alpha)$ fraction of the positions.
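A quick Monte Carlo sanity check (our own sketch; the array length and trial count are arbitrary):

```python
import random

def balanced_split_prob(n, alpha, trials=200_000):
    """Estimate P(both subarrays have size >= alpha * n) under a random pivot."""
    hits = 0
    for _ in range(trials):
        rank = random.randrange(n)        # 0-indexed rank of the random pivot
        left, right = rank, n - 1 - rank  # subarray sizes after Partition
        if left >= alpha * n and right >= alpha * n:
            hits += 1
    return hits / trials

print(balanced_split_prob(1000, 0.25))  # ~0.5 = 1 - 2 * 0.25
```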
Suppose that a randomized algorithm succeeds with probability $p\ (0 < p < 1)$. Let $\epsilon$ be a small positive number (less than $1$). How many independent times do you need to run the algorithm to ensure that, with probability at least $1 - \epsilon$, at least one trial succeeds?
Answer: Suppose we run the algorithm $n$ times independently. The probability that all $n$ runs fail is $(1-p)^n$, so the probability that at least one run succeeds is $1 - (1-p)^n$. We need this to be at least $1 - \epsilon$.
Hence,
$$ \begin{aligned} 1-(1-p)^n &> 1 - \epsilon \\ -(1-p)^n &> -\epsilon \\ (1 - p)^n &< \epsilon \\ n\log(1-p) &< \log(\epsilon) \\ n &> \frac{\log(\epsilon)}{\log(1-p)} \end{aligned} $$
Note that the final division flips the inequality, since $\log(1-p) < 0$.
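In code (a minimal sketch; the helper name `runs_needed` is ours):

```python
import math

def runs_needed(p, eps):
    """Smallest n with 1 - (1 - p)**n >= 1 - eps,
    i.e. with all-runs-fail probability (1 - p)**n <= eps."""
    return math.ceil(math.log(eps) / math.log(1 - p))

print(runs_needed(0.1, 0.01))  # 44, since 0.9**44 ~ 0.0097 <= 0.01
```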
Suppose you are given $k$ sorted arrays, each with $n$ elements, and you want to combine them into a single array of $kn$ elements. Consider the following approach: divide the $k$ arrays into pairs and use the Merge subroutine to combine each pair, yielding $k/2$ sorted arrays of $2n$ elements each; repeat until a single sorted array of $kn$ elements remains. What is the running time of this procedure?
Answer: $\Theta(nk\log(k))$. There are $\Theta(\log(k))$ rounds of merging, and each round touches all $kn$ elements, for $\Theta(nk)$ work per round.
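A sketch of this pairwise-merging approach (assuming, as in lecture, a linear-time Merge):

```python
def merge(left, right):
    """Linear-time merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def merge_k(arrays):
    """Merge pairs round by round: Theta(log k) rounds of Theta(nk) work each."""
    while len(arrays) > 1:
        nxt = [merge(arrays[i], arrays[i + 1])
               for i in range(0, len(arrays) - 1, 2)]
        if len(arrays) % 2:          # an odd array out advances unchanged
            nxt.append(arrays[-1])
        arrays = nxt
    return arrays[0]

print(merge_k([[1, 5], [2, 6], [3, 7], [4, 8]]))  # [1, 2, 3, 4, 5, 6, 7, 8]
```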
Running time of Strassen’s matrix multiplication algorithm: suppose that the running time of an algorithm is governed by the recurrence
$$ T(n)=7\times T\left(\frac{n}{2}\right) + \mathcal{O}(n^2) $$
What is the overall asymptotic running time?
Answer:
$$ a = 7, b=2, d=2, b^d=2^2=4<a \Rightarrow T(n) \in \mathcal{O}(n^{\log_b(a)})=\mathcal{O}(n^{\log_2{7}}) $$
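The case analysis can be done mechanically; here is a small illustrative helper (`master_method` is our own name, not library code):

```python
import math

def master_method(a, b, d):
    """Asymptotic bound for T(n) = a * T(n / b) + O(n^d), per the Master Method."""
    if a == b ** d:
        return f"O(n^{d} log n)"
    if a < b ** d:
        return f"O(n^{d})"
    return f"O(n^{math.log(a, b):.3f})"  # a > b^d case: O(n^(log_b a))

print(master_method(7, 2, 2))  # Strassen: O(n^2.807)
print(master_method(2, 2, 1))  # MergeSort: O(n^1 log n)
```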
Recall the Master Method and its three parameters $a, b, d$. Which of the following is the best interpretation of $b^d$, in the context of divide-and-conquer algorithms?
Answer: The rate at which the work-per-subproblem is shrinking (per level of recursion). Comparing $b^d$ against $a$, the rate at which the number of subproblems grows, determines which of the three cases applies.