
Lab 1 Report

Author: Jorge Berumen


CS2302 – Data Structures

Introduction
Algorithm analysis allows a programmer to quantify the running time of particular code without needing to implement or run it. However, to truly appreciate the significance of this powerful tool, we must in fact run experimental trials that attempt to confirm or refute the analysis. This paper primarily focuses on analyzing multiple algorithms that obtain the same result by different means – namely the Fibonacci sequence and a selection problem.

Proposed solution design and implementation


-Exercise 1.5
There are two possible approaches to this problem. To count the number of ones in the binary representation of any integer, one can use the bit-wise operators, or one can follow the hint provided in the textbook:

Approach No.1
Since shifting the digits to the right accomplishes division by the base of the number system, raised to the number of digits shifted, we can count how many ones have been seen. For example, if the digits of 1000₂, the equivalent of 8₁₀, are shifted one place to the right, the resulting number is 100₂, or 10₂ if shifted twice.
In conjunction with the shifting operator, the bit-wise AND operator (&) allows the algorithm to systematically count each bit until the number has been shifted down to zero. To accomplish this recursively, as the textbook requires, one supplies a single base case: if n = 0, return 0; otherwise return 1 or 0 (depending on whether n AND 1 = 1) plus the result of the recursive call with n shifted once to the right as the argument.
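
A minimal Java sketch of this recursion (the method name onesShift is illustrative, not the lab's actual Part1.java code) might look like:

    // Counts the 1-bits in n recursively using the shift and AND operators.
    // The unsigned shift (>>>) is used so the sign bit is not propagated for negative inputs.
    public static int onesShift(int n) {
        if (n == 0) {                        // base case: no bits left to inspect
            return 0;
        }
        // (n & 1) contributes 1 when the right-most bit is set, 0 otherwise;
        // the recursive call receives n shifted once to the right.
        return (n & 1) + onesShift(n >>> 1);
    }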

Approach No.2
Since the number of ones in n is equal to the number of ones in n/2, plus one (provided the number is odd), we can simply return the value of the recursive call with n/2 as the argument, plus one. However, if the number is even, we instead return the recursive call with n+1 as the argument, minus one. Adding one to n in the recursion turns the right-most bit to 1, meaning the result will be one unit too large; we subtract 1 from the result to account for this discrepancy. Finally, when n reaches 1, we simply return 1 – the base case.
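
A comparable sketch of Approach No.2 (again illustrative rather than the lab's actual code; it assumes n >= 1, matching the base case above):

    // Counts the 1-bits in n using division instead of shifting. Assumes n >= 1.
    public static int onesDivide(int n) {
        if (n == 1) {                        // base case described above
            return 1;
        }
        if (n % 2 != 0) {                    // odd: ones(n) = ones(n / 2) + 1
            return onesDivide(n / 2) + 1;
        }
        // even: adding one sets the right-most bit, so subtract one to compensate
        return onesDivide(n + 1) - 1;
    }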

Because dividing n by 2 accomplishes the same operation as shifting the bits to the right once, Approach No.2 rests on the same premise as No.1 and differs only in semantics: the shifting operator is the equivalent of dividing n by two, and the additional conditional statements replace the functionality of the AND operator. Approach No.1 should be more efficient because a shift is generally cheaper for the processor than an integer division.

-Fibonacci numbers and complexity


The Fibonacci implementations do not require much elaboration, as they are very straightforward. The closed form relies on the formula derived from solving the recurrence relation of the Fibonacci sequence: Fib(n) = (φ^n − (−1/φ)^n) / √5, where φ = (1 + √5) / 2. Since this formula assumes the first term is one, the closed-form algorithm subtracts 1 from n so that it can be compared to the rest of the algorithms, which assume the first term is zero. The loop form calculates the first three terms directly, thus performing at O(1) when n < 3 and at O(n) when n ≥ 3. The recursive method is written similarly to the recurrence relation in the sense that it depends on previous terms, but returns 0 when n = 0 (Olac Fuente's version). The recursive method performs at O(2^n).
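
For reference, hedged sketches of the three variants (method names and details are illustrative assumptions, not the lab's actual Part2.java code; all three use the zero-based convention Fib(0) = 0, so the index adjustment mentioned above is not shown):

    // Closed form (Binet's formula): O(1) arithmetic, subject to double rounding error for large n.
    static long fibClosed(int n) {
        double phi = (1 + Math.sqrt(5)) / 2;
        return Math.round((Math.pow(phi, n) - Math.pow(-1 / phi, n)) / Math.sqrt(5));
    }

    // Iterative/loop form: O(n).
    static long fibLoop(int n) {
        if (n < 2) return n;                 // first terms handled directly
        long prev = 0, curr = 1;
        for (int i = 2; i <= n; i++) {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return curr;
    }

    // Recursive form, mirroring the recurrence relation: O(2^n).
    static long fibRecursive(int n) {
        if (n < 2) return n;                 // Fib(0) = 0, Fib(1) = 1
        return fibRecursive(n - 1) + fibRecursive(n - 2);
    }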

-Selection problem and complexity


In order to find the kth largest element in an ordered list, one can simply access the element at the kth position. If the list is unordered, however, the programmer must determine which element happens to be the kth largest.

Selection sort – implement a selection-sort algorithm that orders the entire list so that the kth element can be accessed. Performs at O(n²).
Quick sort – implement a quick-sort algorithm that orders the list so that the kth element can be accessed. Performs at O(n log n).

Modified quick-sort – implement a variant of the quick-sort algorithm that keeps track of the size of the sub-arrays constructed during the partitioning of the list. Once the size is known, the algorithm can zero in on the kth element without needing to order the remaining elements (a sketch of this idea follows below).
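
The quickselect idea behind the modified quick-sort could be sketched in Java roughly as follows (the method name, the Lomuto-style partition, and the descending ordering are my assumptions, not the lab's actual Part3.java code):

    // Returns the kth largest element of a[lo..hi]; k is 1-based and assumed within range.
    static int kthLargest(int[] a, int lo, int hi, int k) {
        int pivot = a[hi];
        int i = lo;
        for (int j = lo; j < hi; j++) {      // partition: elements >= pivot move to the left side
            if (a[j] >= pivot) {
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                i++;
            }
        }
        int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;   // place the pivot at index i

        int rank = i - lo + 1;               // the pivot is the rank-th largest in this sub-array
        if (rank == k) return a[i];          // found the kth largest
        if (k < rank)  return kthLargest(a, lo, i - 1, k);
        return kthLargest(a, i + 1, hi, k - rank);
    }

For example, kthLargest(arr, 0, arr.length - 1, 3) would return the 3rd largest element; only the side of each partition that contains the kth element is ever recursed into.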
The entire program is structured in multiple classes that encapsulate each aspect of the lab. There are classes for sorting, for solving the selection problem, for the Fibonacci sequence, for exception handling, and for the benchmarks: the methods that tie everything together.
Each benchmark is organized into parts. Exercise 1.5 is in Part1.java, the Fibonacci algorithms are in Part2.java, and the selection problem is in Part3.java. The file Main.java organizes these parts into a menu that adequately handles exceptions and invalid input.

Both Part 2 and Part 3 display the running times of the algorithms. The user can choose to receive either the number of steps the algorithm completes or the number of milliseconds that elapse during execution.
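
For illustration only (not the lab's actual benchmark code), the elapsed-millisecond measurement described above could be taken along these lines:

    long start = System.currentTimeMillis();
    long result = fibRecursive(40);                  // hypothetical call to the algorithm under test
    long elapsedMs = System.currentTimeMillis() - start;
    System.out.println("Result: " + result + " in " + elapsedMs + " ms");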

To run the program, execute Run.bat in the root folder. For a demo session, execute Demo.bat instead. The demo version of the program allows the user to enter the elements of the array (Part 3) instead of the program randomly generating them.

Experimental Results

-Fibonacci Algorithms
There were three algorithms to test: the closed-form, iterative, and recursive algorithms. Because the computer works too quickly, both the closed form and the iterative/loop form terminate virtually instantly. To keep track of the complexity of the algorithms, the number of steps each algorithm took was recorded. Below are the results of all three algorithms at several inputs:

Figure 1.1 – Illustrates the running time of all three algorithms for comparison.

Table 1.1 – Theoretical running times for Fibonacci algorithms

Fibonacci Variant    Theoretical Running Time
Closed-Form          O(1)
Iterative/Loop       O(n)
Recursive            O(2^n)

Table 1.2 – Experimental running "steps"

N     Closed    Loop    Recursive
10    1         7       177
15    1         12      1973
20    1         17      21891
25    1         22      242785
30    1         27      2692537
35    1         32      29860703
40    1         37      331160281
45    1         42      3672623805
50    1         47      40730022147
55    1         52      451702867433

-Selection Problem Algorithms
The selection problem was solved using three algorithms: quick sort, selection sort, and a quick-sort variant modified for this particular problem. In the visual below, the quick-sort variant terminated the quickest. The reader should note that this graph displays the results in milliseconds, in contrast to the number of steps in the previous example.

Figure 1.2 – Illustrates the running time of all three algorithms.

Table 1.3 – Theoretical running times for selection problem algorithms

Algorithm              Theoretical Running Time
Modified Quick-Sort    O(log n)
Normal Quick-Sort      O(n log n)
Selection Sort         O(n²)

Table 1.4 – Experimental running times (milliseconds)

N          Modified Quick-Sort    Normal Quick-Sort    Selection Sort
1000       1                      4                    18
5500       1                      1                    74
10000      3                      19                   246
50000      3                      13                   6189
80000      25                     23                   31788
100000     9                      33                   65135
300000     17                     113                  1248752
500000     54                     256                  3452731
800000     126                    617                  8038124
1000000    88                     565                  14360686
Conclusion
The purpose of this laboratory was to learn how algorithm analysis can serve as a tool to avoid inefficient, resource-demanding code. In addition, this tool allows the programmer to estimate an algorithm's running time without needing to implement it. In this project, however, we took it a step further.

The algorithms were implemented and compared to each other based on running times or the number of steps needed to complete. Big-O notation proved to be an effective tool for predicting how fast an algorithm could execute. For instance, the quick sort terminated considerably sooner than the selection sort, hence O(n log n) < O(n²). So the moral of the story is that, while composing an algorithm, one must not only focus on how to resolve the problem, but must also be concerned with the resources at hand and how much of them it is reasonable to invest.
