Data Structures

Definition, classification of data structures, operations on data structures, Abstract Data Type (ADT), preliminaries of algorithms, time and space complexity. Searching — linear search, binary search, Fibonacci search. Sorting — insertion sort, selection sort, exchange sorts (bubble sort, quick sort), distribution sort (radix sort) and merging (merge sort) algorithms.

Pravallika Devireddy
About Data Structures
9 min read · Feb 21, 2021


DATA STRUCTURES: Data may be organized in many different ways. The logical or mathematical model of a particular organization of data is called a data structure.

Example: Arrays, Linked lists, Trees, Structures, Stacks, Queues, Graphs, Algebraic expressions

PERFORMANCE ANALYSIS

  1. Space Complexity
  2. Time Complexity

Space Complexity: The space complexity of an algorithm is the amount of memory it needs to run to completion.

Time Complexity: The time complexity of an algorithm is the amount of computer time it needs to run to completion.

Space Complexity: The space needed by the algorithms is seen to be the sum of the following components:

1) A fixed part that is independent of the characteristics of the inputs and outputs. This part typically includes the instruction space, space for simple variables, space for constants, etc.

2) A variable part that consists of the space needed by component variables whose size depends on the particular problem instance being solved, the space needed by reference variables, and the recursion stack space.

The space requirement S(P) of any algorithm P may therefore be written as S(P) = c + Sp(instance characteristics), where c is the fixed part.

Consider a simple algorithm abc that computes the sum a + b + c. The problem instance is characterized by the specific values of a, b and c. Making the assumption that one word is adequate to store the values of each of a, b and c, and the result, we see that the space needed by abc is independent of the instance characteristics, so Sp(instance characteristics) = 0.

Space requirement S(P) = 3 + 0 = 3, one word for each of a, b and c. The space complexity is O(1).
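The abc listing itself is not shown here; a minimal Python sketch (names assumed) of the algorithm being analysed is:

```python
def abc(a, b, c):
    # One word each for a, b, c and the result, regardless of the
    # input values, so Sp = 0 and the space complexity is O(1).
    return a + b + c
```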

Now consider an algorithm addition(a, n) that sums the n elements of an array a. The space needed by n is one word; the space needed by a is n words; and the space needed by i and sum is one word each.

S(addition) = 3 + n

Space complexity O(n)
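A minimal Python sketch of such an addition algorithm (names assumed to match the analysis above) is:

```python
def addition(a, n):
    # Space: n words for the array a, plus one word each for
    # n, i and sum, so S(addition) = 3 + n, i.e. O(n).
    sum = 0
    for i in range(n):
        sum = sum + a[i]
    return sum
```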

Now consider a recursive version, add(a, n). The instances are characterized by n. The recursion stack space includes space for the formal parameters, the local variables, and the return address. Assume that the return address requires only one word of memory. Each call to add requires at least three words (n, the return address, and a pointer to a[]). Since the depth of recursion is n + 1, the recursion stack space needed is >= 3(n + 1).

Recursion stack space >= 3(n + 1) = 3n + 3, which is O(n).
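A hypothetical Python sketch of the recursive add being analysed:

```python
def add(a, n):
    # Recursive sum of the first n elements of a. Each activation
    # needs at least three words (n, a reference to a, and a return
    # address); the recursion depth is n + 1, so the stack space
    # is >= 3(n + 1), i.e. O(n).
    if n <= 0:
        return 0
    return add(a, n - 1) + a[n - 1]
```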

Time Complexity: The time taken by a program P is the sum of the compile time and the run time. The compile time does not depend on the instance characteristics. Also we may assume that a compiled program will run several times without recompilation. Consequently we concern ourselves with just the run time of a program. This runtime is denoted by tp(instance characteristics).

A program step is loosely defined as a syntactically or semantically meaningful segment of a program that has an execution time that is independent of the instance characteristics. For example, consider the statement return a + b + c in the abc algorithm.

The entire statement can be regarded as one step, since its execution time is independent of the instance characteristics. The number of steps any program statement is assigned depends on the kind of statement. For example, comments count as 0 steps, and an assignment statement that does not involve any calls to other algorithms is counted as one step. In an iterative statement such as the for, while and repeat-until statements, we consider the step counts only for the control part of the statement.

The number of steps per execution (s/e) and the frequency of each statement in the addition algorithm can be tabulated, and the total number of steps required by the algorithm works out to 2n + 3. It is important to note that the frequency of the for statement is n + 1 and not n. This is because i has to be incremented to n + 1 before the for loop can terminate.
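The frequency table can be reconstructed as comments on a Python sketch of the addition algorithm (the for-loop control part is counted n + 1 times, following the convention above):

```python
def addition(a, n):        # s/e   frequency   total steps
    sum = 0                #  1        1            1
    for i in range(n):     #  1      n + 1        n + 1
        sum = sum + a[i]   #  1        n            n
    return sum             #  1        1            1
                           # Total steps: 2n + 3
```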

For the recursive add algorithm, notice that under the s/e column the else clause is given a count of 1 + t_add(a, n-1): it includes all steps that get executed as a result of the invocation of add() from the else clause. The frequency and total steps columns are split into two parts: one for the case n = 0 and the other for the case n > 0.

For an algorithm with two nested for loops, the frequency of the first (outer) for loop is m + 1. Similarly, the frequency of the second (inner) for loop is m(n + 1).
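The nested-loop algorithm these frequencies refer to is not shown here; a matrix-addition sketch (assumed, as it is the standard two-loop example) illustrates the counts:

```python
def matrix_add(a, b, m, n):
    # Adds two m x n matrices element by element.
    c = [[0] * n for _ in range(m)]
    for i in range(m):        # control part executed m + 1 times
        for j in range(n):    # control part executed m(n + 1) times
            c[i][j] = a[i][j] + b[i][j]   # executed m * n times
    return c
```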

When you have obtained sufficient experience in computing step counts, you can avoid constructing the frequency table and obtain the step count as in the following example.

Example: The Fibonacci sequence of numbers starts as 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, …

Each new term is obtained by taking the sum of the two previous terms.

To analyse the time complexity of this algorithm, we need to consider two cases: (1) n = 0 or 1 and (2) n > 1. The total step count for the case n > 1 is 4n + 1, which is O(n).
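The Fibonacci listing being analysed is not shown here; an iterative Python sketch consistent with the two cases above is:

```python
def fibonacci(n):
    # Iterative Fibonacci. For n > 1 the loop body contributes a
    # constant number of steps per iteration, giving a step count
    # linear in n (4n + 1 under the counting convention above).
    if n <= 1:
        return n          # case (1): n = 0 or 1
    fnm2, fnm1 = 0, 1     # F(0), F(1)
    for _ in range(2, n + 1):
        fnm2, fnm1 = fnm1, fnm2 + fnm1
    return fnm1
```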


BEST CASE, WORST CASE AND AVERAGE CASE EFFICIENCIES

1) Best Case: It is the minimum number of steps that can be executed for a given parameter.

2) Worst Case: It is the maximum number of steps that can be executed for a given parameter.

3) Average Case: It is the average number of steps executed for a given parameter.

Ex 1) Linear search (Sequential search)

Best Case:

Suppose the element 2 is stored at A[1]. If we want to search for the element 2, whether it is present in the array or not, A[1] is first compared with 2 and a match occurs, so the number of comparisons is only one. The search takes the minimum number of comparisons, so it comes under the best case.

Time complexity is O(1).

Average Case: If we want to search for an element 8 stored at A[5], A[1] is first compared with 8 and no match occurs. Then A[2], A[3] and A[4] are compared with 8, with no match. Up to now 4 comparisons have taken place. Now A[5] is compared with 8 and a match occurs, so the number of comparisons is 5. The search takes an average number of comparisons, so it comes under the average case. If there are n elements, then we require about n/2 comparisons on average. The time complexity is O(n/2), which is O(n), since constant factors are neglected.

Worst Case: If we want to search for an element 13 stored at A[9], A[1] is first compared with 13 and no match occurs. This process continues until the element is found or the list is exhausted. Here the element is found at the 9th comparison, so the number of comparisons is 9. The search takes the maximum number of comparisons, so it comes under the worst case.

Time complexity is O(n)

Note: If the element is not found in the list, then we have to search the entire list, so that also comes under the worst case.
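Linear search as described above can be sketched in Python (0-based indexing, unlike the 1-based A[1]…A[9] in the discussion):

```python
def linear_search(a, key):
    # Compare key with each element in turn; return the 0-based
    # index of the first match, or -1 if the list is exhausted.
    # Best case: 1 comparison, O(1); worst case: n comparisons, O(n).
    for i, value in enumerate(a):
        if value == key:
            return i
    return -1
```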

Ex 2) Binary Search:

Assume that the number of elements is n = 2^m, since every time the list is divided into two halves. After m halvings only one element remains, so the worst case requires m = log2 n comparisons, and the time complexity of binary search is O(log n).
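A minimal Python sketch of binary search over a sorted array:

```python
def binary_search(a, key):
    # Repeatedly halve the sorted list a; at most about log2(n)
    # halvings are possible, so the worst case is O(log n).
    low, high = 0, len(a) - 1
    while low <= high:
        mid = (low + high) // 2
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            low = mid + 1     # key can only be in the right half
        else:
            high = mid - 1    # key can only be in the left half
    return -1
```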

Fibonacci Search Technique

The Fibonacci search technique is a method of searching a sorted array using a divide and conquer algorithm that narrows down possible locations with the aid of Fibonacci numbers. Compared to binary search, Fibonacci search examines locations whose addresses have lower dispersion. Therefore, when the elements being searched have non-uniform access memory storage (i.e., the time needed to access a storage location varies depending on the location previously accessed), the Fibonacci search has an advantage over binary search in slightly reducing the average time needed to access a storage location. The typical example of non-uniform access storage is that of a magnetic tape, where the time to access a particular element is proportional to its distance from the element currently under the tape’s head. Note, however, that large arrays not fitting in cache or even in RAM can also be considered as non-uniform access examples. Fibonacci search has a complexity of O(log n).
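The original listing is not shown here; a sketch of the standard Fibonacci search over a sorted array, in Python, might look like:

```python
def fibonacci_search(arr, target):
    # Locate target in sorted arr using Fibonacci numbers to split
    # the range, instead of halving as in binary search. O(log n).
    n = len(arr)
    fib2, fib1 = 0, 1          # F(k-2), F(k-1)
    fib = fib2 + fib1          # F(k): smallest Fibonacci number >= n
    while fib < n:
        fib2, fib1 = fib1, fib
        fib = fib2 + fib1
    offset = -1                # everything up to offset is eliminated
    while fib > 1:
        i = min(offset + fib2, n - 1)   # probe position
        if arr[i] < target:
            # Eliminate the left part; drop one Fibonacci level.
            fib, fib1, fib2 = fib1, fib2, fib1 - fib2
            offset = i
        elif arr[i] > target:
            # Eliminate the right part; drop two Fibonacci levels.
            fib, fib1, fib2 = fib2, fib1 - fib2, fib2 - (fib1 - fib2)
        else:
            return i
    # One element may remain to be checked.
    if fib1 and offset + 1 < n and arr[offset + 1] == target:
        return offset + 1
    return -1
```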


Refer to the sorting techniques listed in the syllabus above.
