Course web page for Data Structures H343 Fall 2022
Definition (Big-O) For a given function g, we define O(g) as the set of functions that grow similarly or slower than g. More precisely, f ∈ O(g) iff ∃ k c, ∀ n ≥ k, f(n) ≤ c g(n).
Notation We write f ≲ g iff f ∈ O(g), and say that f is asymptotically less-or-equal to g.
Show that:
2 log n ≲ n/2
Hint: experiment with values of n that are powers of 2 because it is easy to take the log of them:
log(2) = 1
log(4) = 2
log(8) = 3
log(16) = 4
...
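Following the hint, the inequality can be checked numerically for powers of 2. This is a sketch, not a proof: the witnesses k = 16 and c = 1 are our candidate choices, and the loop only confirms the inequality for the tested values of n.

```java
public class BigODemo {
    // Base-2 logarithm; exact when n is a power of 2.
    static int log2(int n) {
        int r = 0;
        while (n > 1) { n /= 2; ++r; }
        return r;
    }

    public static void main(String[] args) {
        // Check f(n) = 2 log n against c * g(n) = 1 * (n/2)
        // for the candidate witnesses k = 16, c = 1.
        for (int n = 16; n <= (1 << 20); n *= 2) {
            if (2 * log2(n) > n / 2) {
                throw new AssertionError("inequality fails at n = " + n);
            }
        }
        System.out.println("2 log n <= n/2 holds for all tested n >= 16");
    }
}
```

At n = 16 the two sides are equal (2 · 4 = 8 = 16/2), and the right side grows faster from there, which is why k = 16 works.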
Two words are anagrams of each other if they contain the same characters, that is, they are a rearrangement of each other.
examples: mary <-> army, silent <-> listen, doctor who <-> torchwood
For the following algorithms, what’s the time complexity? space complexity?
Algorithm 0: Generate all permutations of the first word and check whether any of them equals the second word. This is O(n!) time and O(n) space.
Algorithm 1: For each character in the first string check it off in the second string. This is O(n²) time and O(n) space.
Algorithm 2: Sort both strings, then compare the sorted strings for equality. This is O(n lg n) time and O(1) space.
Algorithm 3: Count letter occurrences in both words and then compare the number of occurrences of each letter. This is O(n) time and O(k) space (where k is the number of characters in the alphabet).
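Algorithm 3 can be sketched as follows. This version assumes lowercase ASCII input (so k = 26), and the method name isAnagram is ours:

```java
public class AnagramCheck {
    // Algorithm 3: count letter occurrences, then compare counts.
    // Assumes both strings contain only lowercase ASCII letters.
    public static boolean isAnagram(String s, String t) {
        if (s.length() != t.length()) return false;
        int[] counts = new int[26];               // O(k) space
        for (int i = 0; i < s.length(); ++i) {    // O(n) time
            counts[s.charAt(i) - 'a'] += 1;       // count up for s
            counts[t.charAt(i) - 'a'] -= 1;       // count down for t
        }
        for (int c : counts) {                    // anagrams cancel out
            if (c != 0) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isAnagram("silent", "listen")); // true
        System.out.println(isAnagram("mary", "army"));     // true
        System.out.println(isAnagram("doctor", "who"));    // false
    }
}
```

Using one counts array with increments for one string and decrements for the other avoids a second array and a second comparison pass.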
public static void insertion_sort(int[] A) {
    for (int j = 1; j != A.length; ++j) {
        int key = A[j];
        int i = j - 1;
        while (i >= 0 && A[i] > key) {
            A[i+1] = A[i];
            i -= 1;
        }
        A[i+1] = key;
    }
}
What is the time complexity of insertion_sort?
Answer: O(n²) in the worst case (e.g., a reverse-sorted array), because the inner while loop shifts up to j elements for each j, for a total of 1 + 2 + ⋯ + (n−1) = n(n−1)/2 shifts. The best case (already sorted) is O(n).
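One way to see the quadratic worst case empirically is to count the inner-loop shifts on a reverse-sorted array. This is a sketch: countShifts is our instrumented copy of the insertion sort loop above.

```java
public class InsertionSortCount {
    // Instrumented insertion sort: returns the number of element
    // shifts performed by the inner while loop.
    static long countShifts(int[] A) {
        long shifts = 0;
        for (int j = 1; j != A.length; ++j) {
            int key = A[j];
            int i = j - 1;
            while (i >= 0 && A[i] > key) {
                A[i + 1] = A[i];
                i -= 1;
                ++shifts;
            }
            A[i + 1] = key;
        }
        return shifts;
    }

    public static void main(String[] args) {
        for (int n : new int[]{100, 200, 400}) {
            int[] A = new int[n];
            for (int i = 0; i < n; ++i) A[i] = n - i;  // reverse sorted
            // Worst case performs n(n-1)/2 shifts, so doubling n
            // roughly quadruples the work: 4950, 19900, 79800.
            System.out.println(n + ": " + countShifts(A));
        }
    }
}
```

Doubling the input size quadrupling the shift count is the signature of quadratic growth.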
Definition (Omega) For a given function g(n), we define Ω(g) as the set of functions that grow at least as fast as g(n):
f ∈ Ω(g) iff ∃ k c, ∀ n ≥ k, 0 ≤ c g(n) ≤ f(n).
Notation f ≳ g means f ∈ Ω(g).
Definition (Theta) For a given function g(n), Θ(g) is the set of functions that grow at the same rate as g(n):
f ∈ Θ(g) iff ∃ k c₁ c₂, ∀ n ≥ k, 0 ≤ c₁ g(n) ≤ f(n) ≤ c₂ g(n).
We say that g is an asymptotically tight bound for each function in Θ(g).
Notation f ≈ g means f ∈ Θ(g).
Symmetry: f ≈ g iff g ≈ f
Transpose symmetry: f ≲ g iff g ≳ f
Θ(g) ⊆ O(g)
Θ(g) ⊆ Ω(g)
Θ(g) = Ω(g) ∩ O(g)
Divide and conquer!
Split the array in half and sort the two halves.
Merge the two halves.
private static int[] merge(int[] left, int[] right) {
    int[] A = new int[left.length + right.length];
    int i = 0;
    int j = 0;
    for (int k = 0; k != A.length; ++k) {
        if (i < left.length
            && (j >= right.length || left[i] <= right[j])) {
            A[k] = left[i];
            ++i;
        } else if (j < right.length) {
            A[k] = right[j];
            ++j;
        }
    }
    return A;
}
// requires: import java.util.Arrays;
public static int[] merge_sort(int[] A) {
    if (A.length > 1) {
        int middle = A.length / 2;
        int[] L = merge_sort(Arrays.copyOfRange(A, 0, middle));
        int[] R = merge_sort(Arrays.copyOfRange(A, middle, A.length));
        return merge(L, R);
    } else {
        return A;
    }
}
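A quick usage check of the merge sort above. This is a self-contained sketch: the two methods are copied from the code above, and the wrapper class name MergeSortDemo is ours.

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Copy of merge from above.
    private static int[] merge(int[] left, int[] right) {
        int[] A = new int[left.length + right.length];
        int i = 0, j = 0;
        for (int k = 0; k != A.length; ++k) {
            if (i < left.length
                && (j >= right.length || left[i] <= right[j])) {
                A[k] = left[i];
                ++i;
            } else {
                A[k] = right[j];
                ++j;
            }
        }
        return A;
    }

    // Copy of merge_sort from above.
    public static int[] merge_sort(int[] A) {
        if (A.length > 1) {
            int middle = A.length / 2;
            int[] L = merge_sort(Arrays.copyOfRange(A, 0, middle));
            int[] R = merge_sort(Arrays.copyOfRange(A, middle, A.length));
            return merge(L, R);
        }
        return A;
    }

    public static void main(String[] args) {
        int[] A = {5, 2, 4, 7, 1, 3, 2, 6};
        System.out.println(Arrays.toString(merge_sort(A)));
        // prints [1, 2, 2, 3, 4, 5, 6, 7]
    }
}
```

Note that merge_sort returns a new array rather than sorting in place; that is what costs the extra O(n) space per level of recursion.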
What’s the time complexity?
Recursion tree:
                 c*n                       = c*n
                /   \
               /     \
           c*n/2     c*n/2                 = c*n
           /  \       /  \
          /    \     /    \
      c*n/4  c*n/4 c*n/4  c*n/4            = c*n
...
Height of the recursion tree is log(n).
So the total work is c·n·log(n).
Time complexity is O(n log n).
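The c·n-per-level picture can be checked by instrumenting merge to count element copies. This is a sketch using copies of the code above; for n a power of 2 there are exactly log₂(n) levels of merging, each copying n elements, so the total is exactly n·log₂(n).

```java
import java.util.Arrays;

public class MergeWorkCount {
    static long copies = 0;  // total elements copied across all merges

    private static int[] merge(int[] left, int[] right) {
        int[] A = new int[left.length + right.length];
        int i = 0, j = 0;
        for (int k = 0; k != A.length; ++k) {
            if (i < left.length
                && (j >= right.length || left[i] <= right[j])) {
                A[k] = left[i];
                ++i;
            } else {
                A[k] = right[j];
                ++j;
            }
            ++copies;  // every merge level copies n elements in total
        }
        return A;
    }

    static int[] merge_sort(int[] A) {
        if (A.length > 1) {
            int m = A.length / 2;
            return merge(merge_sort(Arrays.copyOfRange(A, 0, m)),
                         merge_sort(Arrays.copyOfRange(A, m, A.length)));
        }
        return A;
    }

    public static void main(String[] args) {
        int n = 1024;  // a power of 2, so log2(n) = 10
        int[] A = new int[n];
        for (int i = 0; i < n; ++i) A[i] = n - i;
        merge_sort(A);
        // Expect n * log2(n) = 1024 * 10 = 10240 copies.
        System.out.println(copies);  // prints 10240
    }
}
```

For n that is not a power of 2 the count is close to but not exactly n·log₂(n), since the halves are uneven; the O(n log n) bound still holds.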