
Algorithm Analysis

big-O session

TA Li Xin
Review: Big-O, Big-Omega
– Describe how long an algorithm takes using big-O and
big-Omega notation.
 T(N) = O(f(N)): the growth rate of T(N) is <= that of f(N)
– T(N) <= c*f(N) for some constant c > 0 and all large enough N
 T(N) = Ω(g(N)): the growth rate of T(N) is >= that of g(N)
– T(N) >= c*g(N) for some constant c > 0 and all large enough N
When we say that T(N) = O(f(N)), we are guaranteeing that the
function T(N) grows at a rate no faster than f(N). Thus, f(N) is
an upper bound on T(N).
Similarly, when T(N) = Ω(g(N)), we say g(N) is a lower bound on
T(N).

Review: Big-O, Big-Omega

[Figure: the blue line stands for the function cf(n) and the black
line for the function T(n); for large enough n the black line stays
below the blue one, illustrating a Big-O example.]
little-o notation
Definition: A theoretical measure of the execution of an algorithm, usually the time or
memory needed, given the problem size n, which is usually the number of items.
Informally, saying some equation f(n) = o(g(n)) means f(n) becomes insignificant
relative to g(n) as n approaches infinity. The notation is read, "f of n is little oh of g
of n".
 Formal Definition: f(n) = o(g(n)) means for all c > 0 there exists some k > 0 such
that 0 ≤ f(n) < cg(n) for all n ≥ k. The value of k must not depend on n, but may
depend on c.
 Generalization (I am a kind of ...): big-O notation.
 Note: As an example, 3n + 4 is o(n^2) since for any c > 0 we can choose
k > (3 + √(9 + 16c))/(2c). 3n + 4 is not o(n). o(f(n)) is an upper bound, but not an
asymptotically tight bound.
 Related definition (asymptotically optimal): an algorithm whose asymptotic
complexity exactly matches the theoretically proved asymptotic complexity of the
corresponding problem. Informally, such an algorithm solves the problem at the
theoretical minimum.

little-ω notation
 Definition: A theoretical measure of the execution of
an algorithm, usually the time or memory needed,
given the problem size n, which is usually the number
of items. Informally, saying some equation f(n) = ω(g(n))
means g(n) becomes insignificant relative to f(n)
as n goes to infinity.
 Formal Definition: f(n) = ω(g(n)) means that for any
positive constant c, there exists a constant k > 0 such that
0 ≤ cg(n) < f(n) for all n ≥ k. The value of k must not
depend on n, but may depend on c.

Examples
1. What is the big-O of 2n^2 + 1000n + 5 ?
2. T(n) = 2n^2 + 1000n + 5; f(n) = n^2
3. T(n) <= c f(n) ? 2n^2 + 1000n + 5 <= c n^2 ?
4. for c = 3, n0 = 1001, check: 2(1001^2) + 1000(1001) + 5
= 3005007; 3(1001^2) = 3006003
5. answer: O(n^2)
6. An algorithm that actually takes n^5 + n^3 + 7 steps has
complexity O(n^5): only the dominant term matters.
7. This is only for addition! Obviously if the algorithm
takes n^5 * n^3 steps its complexity is O(n^8)!

Question
 If an algorithm reads a whole list of length n,
it takes n steps.
– What is its complexity?
 If a modified algorithm reads through the whole
list twice.
– What is its complexity?

Answer:
Both are O(n).
For T(N) <= c*f(N), here c = 2 and f(N) = n.

Class O(1)

 Function/order
– Constant time
 Examples
– Find the ith element in an array.
– A[i]
 Remarks
– The running time of the algorithm doesn't depend on
the value of n.

Class O(log n)
 Function/order
– Logarithmic time
 Examples
– binary search
 Remarks
– Typically achieved by dividing the problem into smaller
segments and only looking at one input element in each
segment.
– Binary search: every time you go through the recursion
(binary search can be written recursively), the problem size is
halved, so the maximum number of recursive calls is log2 n.

Class O(n)

 Function/order
– Linear time
 Examples
– Find the minimum element, printing, listing
 Remarks
– Typically achieved by examining each element in
the input once.
– The running time of the algorithm is proportional to
the value of n.

Consider the following array-list operations

 add an element to the end of an array list
– O(1) (constant: write into the next free slot)
 add an element to the start of an array list
– O(n) (linear: every existing element must shift right)

Class O(n log n)
 Examples
– heapsort, mergesort
 Remarks
– Typically achieved by dividing the problem into subproblems, solving
the subproblems independently, and then combining the results.
Unlike the log N algorithms, each element in the subproblems must
be examined.

Class O(n^2)

 Function/order
– Quadratic time
 Examples
– bubble sort, insertion sort
 Remarks
– Typically achieved by examining all pairs of data elements
– Why is insertion sort quadratic?
– Comparisons in the worst case:
1 + 2 + ... + (n-1) = n(n-1)/2 = n^2/2 - n/2 = O(n^2)

Class O(n!)
 Function/order
– Factorial time
 Examples
– TSP
 Remarks
– TSP is the Travelling Salesman Problem. Say you are a
salesperson and you have to visit each of n cities to sell your
product. You want to visit all the cities, return to the city you
came from, and find the shortest possible route. Obviously, it
would be naive to try all the permutations: there are (n-1)!
permutations, and each permutation requires computation worth
n (adding n distances), for a total of roughly n!. Many have tried
their hand at this problem, but no efficient exact algorithm is known.

Algorithm Analysis(1)

sum = 0;
for (i = 0; i < 3; i++)
    for (j = 0; j < n; j++)
        sum++;

O(n): the outer loop runs a constant 3 times (O(1)), the inner loop is O(n)

Algorithm Analysis(2)

for (i = 0; i < n; i++) {
    for (j = 0; j < n; j++)
        A[j] = random(n);   // assume random() is O(1)
    sort(A, n);             // assume sort() is O(n log n)
}

O(n^2 log n): the outer loop is O(n); inside it, filling the array
is O(n) but sorting is O(n log n), so the complexity of the
algorithm is n(n + n log n) = O(n^2 log n)
Algorithm Analysis(3)
sum = 0;
for (i = 0; i < n; i++) {
    if (is_even(i)) {
        for (j = 0; j < n; j++)
            sum++;
    } else {
        sum = sum + n;
    }
}

O(n^2): the outer loop is O(n). Inside it, the "true" clause (an
O(n) inner loop) runs for half the values of i and the "false"
clause (O(1)) runs for the other half, so the complexity is
(n/2)(n + 1) = O(n^2)
Algorithm Analysis(4)
typedef struct List {
    int data;
    struct List *next;
} List;

List *SearchList(List *a, int key) {  // the list has n elements
    if (a == NULL)
        return NULL;                  // not found
    else if (a->data == key)
        return a;
    else
        return SearchList(a->next, key);
}

O(n): this is tail recursion, and each call makes at most one
further call. Draw a picture of the recursive calls, and you
will see that this is O(n).

Exercises
 1. Classify each of the following functions as O(1), O(n),
O(n^2), O(log n), or O(n log n):

(a) 5n + 7
(b) n^2 − 5n
(c) 5 log n + 10
(d) n(log n + n)
(e) n(log n + 1)
(f) 3 + sin(nπ)

Three algorithms for the same problem have complexity
functions as follows:
Algorithm A: 4n + 10
Algorithm B: 2n + 40
Algorithm C: n^2 + 5
 Illustrate the comparative performance of the three algorithms by
sketching their complexity functions together in a graph, and
hence determine, for each algorithm, the range of values of n for
which it performs more efficiently than the other two algorithms.

