
Q1: Write an algorithm that finds the greatest common divisor of two integers. Calculate the total number of basic operations in this algorithm and determine its worst-case time complexity.

Algorithm:
Step 1: GCD(x, y)
// x >= y, where x and y are non-negative integers
Step 2: if (y == 0)
            return x
Step 3: else
            return GCD(y, x % y)

Basic operation:
GCD(x, y)                       # operations
{
    if (y == 0)                 # 1 comparison per call
        return x
    else
        return GCD(y, x % y)    # 1 recursive call with a modulus
}
So in the GCD algorithm the basic operation is the comparison, performed once per
recursive call.

Worst case time complexity:

The worst case occurs when x and y are consecutive Fibonacci numbers; the number of
recursive calls is then proportional to the logarithm of the smaller argument, so the
worst-case complexity of GCD is O(log n), where n = min(x, y).
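The recursive algorithm above can be sketched in Python with a counter for the basic operation (the counter and the sample inputs are illustrative, not part of the original pseudocode):

```python
def gcd(x, y, ops):
    """Euclidean GCD for x >= y >= 0; ops[0] counts the basic operation
    (the comparison y == 0), performed once per recursive call."""
    ops[0] += 1                  # the basic operation: one comparison
    if y == 0:
        return x
    return gcd(y, x % y, ops)

ops = [0]
print(gcd(48, 18, ops))          # 6
print(ops[0])                    # 4 comparisons: (48,18) -> (18,12) -> (12,6) -> (6,0)
```

Worst-case inputs are consecutive Fibonacci numbers, e.g. gcd(13, 8) performs one call per Fibonacci step.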

Q2: Under what circumstances, when a searching operation is needed, would sequential
Search (Algorithm 1.1) not be appropriate?

Ans: Computer systems are often used to store large amounts of data from which
individual records must be retrieved according to some search criterion, so the
efficient storage of data to facilitate fast searching is an important issue.
Sequential search is not an appropriate technique when the collection is large and
is sorted (or can be kept sorted). For example, if we have 50,000 records and the
target record is stored at index 49,999, sequential search must examine nearly every
record before finding it, whereas binary search on the sorted data needs only about
log2(50,000) ≈ 16 comparisons. In general, for n records sequential search takes up
to n comparisons, so its worst-case time complexity is O(n); for large sorted
collections it is not an appropriate searching technique.
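A minimal Python sketch contrasting the two searches (the 50,000-element list of integers stands in for the records; it is illustrative, not taken from Algorithm 1.1):

```python
def sequential_search(a, key):
    """Scan a left to right; return the index of key, or -1 if absent. O(n)."""
    for i, v in enumerate(a):
        if v == key:
            return i
    return -1

def binary_search(a, key):
    """Search a sorted list a; return the index of key, or -1 if absent. O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(50000))
# Both find the last record, but binary search needs at most ~16 probes
# (2^16 > 50,000), while sequential search examines all 50,000 entries.
print(sequential_search(data, 49999), binary_search(data, 49999))
```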
Q. No. 3
Algorithm A performs 8n2 basic operations, and algorithm B performs 200 ln n basic
operations. For what value of n does algorithm B start to show its better performance?
Ans: Algorithm B starts to show better performance when 200 ln n < 8n^2. At n = 1 the
inequality holds trivially (ln 1 = 0), but for n = 2 through 6 algorithm A performs
fewer operations. From n = 7 onward (8·7^2 = 392 > 200 ln 7 ≈ 389.2) algorithm B
performs fewer basic operations than algorithm A, and because B's logarithmic growth
is slower than A's quadratic growth, B remains better for all larger n.
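The crossover point can be checked numerically (a quick sketch; the operation counts come straight from the problem statement):

```python
import math

def ops_a(n):
    return 8 * n * n             # algorithm A: 8 n^2 basic operations

def ops_b(n):
    return 200 * math.log(n)     # algorithm B: 200 ln n basic operations

# Find the smallest n >= 2 where B performs fewer operations than A
# (n = 1 is a degenerate win for B, since ln 1 = 0).
n = 2
while ops_b(n) >= ops_a(n):
    n += 1
print(n)                         # 7
```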

Q. No. 4
There are two algorithms called Alg1 and Alg2 for a problem of size n. Alg1 runs in n^2
microseconds and Alg2 runs in 100n log n microseconds. Alg1 can be implemented using 4
hours of programmer time and needs 2 minutes of CPU time. On the other hand, Alg2
requires 15 hours of programmer time and 6 minutes of CPU time. If programmers are paid
20 dollars per hour and CPU time costs 50 dollars per minute, how many times must a
problem instance of size 500 be solved using Alg2 in order to justify its development cost?
Ans:
Algorithm 1: To run it once.

Fixed cost: 4 hrs * $20/hr = $80

Runtime cost per run at n = 500:

n^2 microseconds = (500)^2 microseconds = 250,000 microseconds = 0.25 seconds

We convert seconds into minutes to apply the $50/min CPU rate:

(0.25 s / 60 s per min) * $50/min ≈ $0.2083 per run

Run the algorithm x times:

C₁(x) = 80 + 0.2083x

Algorithm 2: To run it once.

Fixed cost: 15 hrs * $20/hr = $300

Runtime cost per run at n = 500 (taking log as base 10, as in the working below; with
log base 2 the running time would be 100·500·log₂500 ≈ 448,300 microseconds, which
exceeds Alg1's at n = 500, and Alg2 would never pay off):

100 n log n = 100 * 500 * log 500 ≈ 134,949 microseconds ≈ 0.1349 seconds

(0.1349 s / 60 s per min) * $50/min ≈ $0.1125 per run

Run the algorithm x times:

C₂(x) = 300 + 0.1125x

Break-even: C₁(x) = C₂(x)

80 + 0.2083x = 300 + 0.1125x

0.2083x − 0.1125x = 300 − 80

0.0959x ≈ 220

x ≈ 2295

The problem instance of size 500 must be solved about 2,295 times using Alg2 in order
to justify its development cost.
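The break-even computation can be sketched in Python (this assumes, as in the working above, that log in Alg2's running time is base 10):

```python
import math

CPU_RATE = 50 / 60                   # dollars per second of CPU time ($50/min)

dev1 = 4 * 20                        # Alg1 development cost: 4 hrs at $20/hr = $80
dev2 = 15 * 20                       # Alg2 development cost: 15 hrs at $20/hr = $300

n = 500
t1 = n**2 / 1e6                      # Alg1: n^2 microseconds -> seconds per run
t2 = 100 * n * math.log10(n) / 1e6   # Alg2: 100 n log10(n) microseconds -> seconds

# Solve dev1 + t1*CPU_RATE*x = dev2 + t2*CPU_RATE*x for x
x = (dev2 - dev1) / ((t1 - t2) * CPU_RATE)
print(math.ceil(x))                  # 2295
```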

Q. No. 5
Show directly that f(n) = n^2 + 3n^3 ∈ Θ(n^3). That is, use the definitions of Big O
and Big Omega to show that f(n) is in both O(n^3) and Ω(n^3).
First: we use the definition of Big O.

f(n) ≤ c·g(n) for all n ≥ k

Here f(n) = n^2 + 3n^3 and g(n) = n^3.
By the definition of Big O we need
n^2 + 3n^3 ≤ c·n^3
Choose c = 4 and k = 1. For all n ≥ 1 we have n^2 ≤ n^3, so
n^2 + 3n^3 ≤ n^3 + 3n^3 = 4n^3   // c = 4, k = 1
For example, at n = 2:
28 ≤ 32
So f(n) is O(n^3).

Second: we use the definition of Big Omega.

f(n) ≥ c·g(n) for all n ≥ k

Here f(n) = n^2 + 3n^3 and g(n) = n^3. We need
n^2 + 3n^3 ≥ c·n^3
Choose c = 1 and k = 1. For all n ≥ 1,
n^2 + 3n^3 ≥ 3n^3 ≥ n^3   // c = 1, k = 1
For example, at n = 1:
4 ≥ 1
So f(n) is Ω(n^3).

Because we have shown that f(n) is O(n^3) and f(n) is Ω(n^3), it must be that f(n) is Θ(n^3).
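Both bounds can be spot-checked numerically (a sanity check over a finite range, not a substitute for the proof above):

```python
def f(n):
    return n**2 + 3 * n**3

c_upper, c_lower, k = 4, 1, 1        # the constants chosen in the proof
for n in range(k, 1000):
    assert f(n) <= c_upper * n**3    # f(n) = O(n^3) with c = 4, k = 1
    assert f(n) >= c_lower * n**3    # f(n) = Omega(n^3) with c = 1, k = 1
print("both bounds hold for n in [1, 999]")
```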
Q. No. 6
Suppose you have a computer that requires 1 minute to solve problem instances of size n=
100. Suppose you buy a new computer that runs 100 times faster than the old one. What
instance sizes can be run in 1 minute, assuming the following time complexities T(n) for our
algorithm?
a) T(n) = n,  b) T(n) = n^3,  c) T(n) = 10n^2

Ans:
The old machine does T(100) units of work in 1 minute; the new machine is 100 times
faster, so in 1 minute it does 100·T(100) units of work. For each complexity we solve
T(n_new) = 100·T(100).

a) T(n) = n
The old machine does 100 units of work in 1 minute, so the new machine does
100 · 100 = 10,000 units in 1 minute. The answer is n = 10,000, i.e., 100 times the
old instance size.

b) T(n) = n^3
The old machine does 100^3 units in 1 minute, so the new machine does
100 · 100^3 = 10^8 units in 1 minute. Solving n^3 = 10^8 gives
n = 100 · 100^(1/3) ≈ 464, i.e., only about 4.64 times the old instance size.

c) T(n) = 10n^2
The old machine does 10 · 100^2 = 10^5 units in 1 minute, so the new machine does
10^7 units. Solving 10n^2 = 10^7 gives n^2 = 10^6, so n = 1,000, i.e., 10 times the
old instance size.

Note that the answer for T(n) = 10n^2 is the same as it would be for T(n) = n^2, and
in fact for T(n) = C·n^2 for any constant C > 0: the constant cancels when we solve
C·n^2 = 100·C·100^2.

This is why we talk about O(n), O(n^2), O(n^3), etc., regardless of the constant
hidden inside the O.
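The three answers can be found by brute force (a small sketch following the reasoning above):

```python
def largest_n(T, budget):
    """Largest n such that T(n) fits in the given work budget."""
    n = 1
    while T(n + 1) <= budget:
        n += 1
    return n

# The new machine's one-minute budget is 100 times the old machine's T(100).
budget_a = 100 * 100             # T(n) = n
budget_b = 100 * 100**3          # T(n) = n^3
budget_c = 100 * 10 * 100**2     # T(n) = 10 n^2

print(largest_n(lambda n: n, budget_a))            # 10000
print(largest_n(lambda n: n**3, budget_b))         # 464
print(largest_n(lambda n: 10 * n**2, budget_c))    # 1000
```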
Q8:

The complexity of the for loop is O(n).

For the while loop, the control variable takes the values
n, n/2, n/2^2, n/2^3, n/2^4, …, n/2^k
The loop condition becomes false once n/2^k ≤ 1:
n/2^k ≤ 1
n ≤ 2^k
Taking log base 2 of both sides:
log n ≤ log 2^k
log n ≤ k · log 2
Since log 2 = 1 (base 2):
log n ≤ k
So the while loop runs O(log n) times, and the total complexity is O(n · log n).
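The algorithm itself is not reproduced above; this sketch assumes the structure the analysis describes, an O(n) for loop containing a while loop that halves its control variable each pass, and counts the inner iterations:

```python
import math

def count_ops(n):
    """Count inner-loop iterations for a for loop of n passes,
    each containing a while loop that halves its variable."""
    ops = 0
    for _ in range(n):       # O(n) iterations
        j = n
        while j > 1:         # j halves each pass: ~log2(n) iterations
            j //= 2
            ops += 1
    return ops

n = 1024
print(count_ops(n), n * int(math.log2(n)))   # 10240 10240
```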

Q9:
For the first while loop, the control variable takes the values
n, n/2, n/2^2, n/2^3, n/2^4, …, n/2^k
The condition becomes false once n/2^k ≤ 1:
n/2^k ≤ 1
n ≤ 2^k
Taking log base 2 of both sides:
log n ≤ log 2^k
log n ≤ k · log 2 = k
so the first loop runs O(log n) times.
For the second while loop, the control variable takes the values
2^1, 2^2, 2^3, …, 2^i with 2^i = n
Taking log base 2 of both sides:
log 2^i = log n
i · log 2 = log n
i = log n
so the second loop also runs O(log n) times.
Since the second loop is nested inside the first, the total complexity of the given
algorithm is
O(log^2 n)
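As with Q8, the algorithm is not reproduced above; this sketch assumes the two while loops are nested as the O(log^2 n) total implies, the outer one halving and the inner one doubling its control variable:

```python
def count_ops(n):
    """Count inner-loop iterations for an outer while loop that halves i
    (~log2 n passes) nested with an inner loop that doubles j up to n
    (~log2 n passes each time)."""
    ops = 0
    i = n
    while i > 1:             # ~log2(n) iterations (i halves)
        j = 1
        while j < n:         # ~log2(n) iterations (j doubles)
            j *= 2
            ops += 1
        i //= 2
    return ops

print(count_ops(1024))       # 10 * 10 = 100
```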

Q10:
Rem(int x, int y, int p)
{
    if (y == 1)
        return x % p;
    else
        return Rem(x, y / 2, p);
}

S-ar putea să vă placă și