Q: What are the best-case, worst-case, and average-case time complexities of quicksort? Briefly explain each case.

A: Quicksort has a worst-case complexity of O(n^2). This is because, in the worst case, it has to iterate over n, n-1, n-2, ..., 1 items. If the pivot is not picked randomly (e.g., the first or last element is always used), this is guaranteed to occur when the list is already sorted in either direction. If the pivot is picked randomly, there is still a chance that the pivot will be either the largest or the smallest element of the subarray in question.
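To make the worst case concrete, here is a minimal quicksort sketch in Python (illustrative only, not the assignment's code), using the first element as a deterministic pivot:

    # Minimal quicksort sketch with a deterministic pivot (illustrative only).
    def quicksort(items):
        if len(items) <= 1:
            return items
        pivot = items[0]            # first element as pivot: bad on sorted input
        rest = items[1:]
        smaller = [x for x in rest if x < pivot]
        larger = [x for x in rest if x >= pivot]
        return quicksort(smaller) + [pivot] + quicksort(larger)

On a list that is already sorted in ascending order, "smaller" is empty at every call, so each level of recursion peels off only one element and the depth becomes n instead of log(n).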
The best-case complexity is O(n*log(n)) because each recursive call cuts the size of the input in half. Since the total number of items processed at a particular depth is always n, and the depth is logarithmically related to the number of items, the complexity is O(n*log(n)).

On average, quicksort is also O(n*log(n)). It's quite difficult to consistently pick a pivot that is either the smallest or the largest. I am unfamiliar with proof techniques that help formalize this, but we can think of a case in which some non-half fraction (say j/k) of the elements is on the left of the pivot. In this case, the depth ends up being a constant multiple of log_k(n), meaning that the depth is still logarithmic and the complexity is still O(n*log(n)).
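To make the logarithmic-depth claim a little more concrete, here is a small, self-contained experiment (purely illustrative) that measures the maximum recursion depth of a randomized quicksort; with random pivots the depth comes out as a small multiple of log2(n), nowhere near n:

    import random

    # Illustrative: measure the maximum recursion depth of randomized quicksort.
    def quicksort_depth(items, depth=0):
        if len(items) <= 1:
            return depth
        pivot = random.choice(items)               # random pivot
        smaller = [x for x in items if x < pivot]
        larger = [x for x in items if x > pivot]   # pivot itself is dropped; fine for distinct data
        return max(quicksort_depth(smaller, depth + 1),
                   quicksort_depth(larger, depth + 1))

    data = list(range(10_000))
    random.shuffle(data)
    print(quicksort_depth(data))   # typically a few dozen; log2(10000) is about 13.3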
Q: What are the best-case, worst-case, and average-case time complexities? Briefly explain.

A: For the same reason as quicksort, the worst-case complexity is O(n^2). If the algorithm consistently places all the elements on one side of the pivot, it will need to keep processing n, n-1, n-2, ..., 1 items.

In the best case, the input size is halved at every step. This means that first n items are processed, then n/2, then n/4, and so on. We can write this as n*(1 + 1/2 + 1/4 + ...). The formula for the sum of the first k terms of a geometric series with first term n and ratio r is n*(1 - r^k)/(1 - r), which for r = 1/2 simplifies to 2n*(1 - (1/2)^k). Since 1 - (1/2)^k < 1, n*(1 + 1/2 + 1/4 + ...) < 2n. This means the complexity is O(n).
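A quick numerical sanity check of that bound (illustrative only): adding up n, n/2, n/4, ... never reaches 2n.

    # Illustrative check that n + n/2 + n/4 + ... stays below 2n.
    n = 1_000_000
    total, size = 0, n
    while size >= 1:
        total += size
        size //= 2       # each level halves the remaining input
    print(total, 2 * n)  # total comes out just under 2 * n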
Similarly to quicksort, we can assume that j/k of the elements end up on the left of the pivot. Then the longest possible computation looks at n*(j/k) elements, then n*(j/k)^2, and so on. This is effectively n times the sum of a geometric series with r = j/k, which converges to some constant c. The sum is therefore n*c, and thus the complexity is O(n).
Q: What are the time complexities for the operations implemented?

A: The complexity of sorted is O(n). Since I use an accumulator array, each append is O(1). Beyond that, all that's done is an in-order traversal of the tree, which is O(n), since it visits every element of the tree.
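The submitted code isn't reproduced here, but an accumulator-based in-order traversal might look roughly like the following sketch (the Node fields value, left, and right are assumed for illustration):

    # Sketch only; the actual assignment code may differ.
    class Node:
        def __init__(self, value, left=None, right=None):
            self.value, self.left, self.right = value, left, right

    def sorted_values(node, acc=None):
        """In-order traversal that appends each value to an accumulator list."""
        if acc is None:
            acc = []
        if node is not None:
            sorted_values(node.left, acc)
            acc.append(node.value)   # O(1) append onto the accumulator
            sorted_values(node.right, acc)
        return acc

    print(sorted_values(Node(2, Node(1), Node(3))))   # [1, 2, 3]

Every node is visited exactly once and each visit does O(1) work, which is where the O(n) bound comes from.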
Since insert and search both use _search and perform no additional steps above O(1), they have the same complexity as _search. _search itself is O(log n) in the average case and O(n) in the worst case, so the same is true for the other operations. These complexities hold because _search is a simple binary search down the tree.
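As a rough idea of what such a helper can look like (the name _search comes from the answer above, but the body below is an assumption, not the actual submission), a binary-search descent over the same Node structure might be:

    # Assumed sketch of a _search-style descent; the real code may differ.
    # Returns the node holding `value`, or the last node visited if the value
    # is absent, so insert and search can each finish with O(1) extra work.
    def _search(node, value):
        if node is None:
            return None
        while True:
            if value == node.value:
                return node
            child = node.left if value < node.value else node.right
            if child is None:
                return node      # the parent where an insert would attach
            node = child

Each iteration moves one level down the tree, so the cost is proportional to the tree's height: O(log n) on average for a reasonably balanced tree and O(n) in the degenerate, list-shaped worst case.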
Debriefing:

1. Approximately how many hours did you spend on this assignment?

~1 hour.

2. Would you rate it as easy, moderate, or difficult?

Very easy, especially since it followed from the slides on day 1.

3. Did you work on it mostly alone, or mostly with other people?

Just me, myself, and I.

4. How deeply do you feel you understand the material it covers (0%–100%)?

80%. Determining the actual complexities of recursive functions has not yet been taught in class, and I haven't consulted the books. For the best and worst cases, though, it's pretty simple.

5. Any other comments?

I'd prefer the code for "broken qsort" to be available on Canvas. Maybe I missed it :)