Time Complexity


The amount of time needed by a program to complete its execution is known as time complexity.
Time is measured in terms of the number of instructions executed by the program during its execution.
Thus, time complexity depends on the size of the input and the type of algorithm being used.
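As a minimal sketch of this idea (assuming we treat each comparison as one "instruction"), running time can be estimated by counting operations instead of wall-clock time:

```python
def linear_search(items, target):
    """Return (index of target or -1, number of comparisons executed)."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1          # one comparison per loop iteration
        if value == target:
            return i, comparisons
    return -1, comparisons

# A larger input executes more instructions, so it needs more time:
_, small = linear_search(list(range(10)), 9)      # 10 comparisons
_, large = linear_search(list(range(1000)), 999)  # 1000 comparisons
print(small, large)
```

Counting operations this way makes the dependence on input size explicit, independent of how fast the machine is.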

Space Complexity


The amount of memory needed by a program during its execution is known as space complexity.
The total memory space needed by the program is the sum of the following two parts:
(1) Fixed-size memory: the space required for simple variables, constants, instructions, and fixed-size structured variables such as arrays.
(2) Variable-size memory: the space required for structured variables to which memory is allocated at run time. It also includes the stack space required when a function calls itself (recursion).
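A small illustrative sketch of the two kinds of memory (the names `fixed` and `variable` are just illustrative):

```python
def sum_recursive(n):
    """Each recursive call adds one frame to the call stack (variable-size memory)."""
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

# Fixed-size memory: known when the program is written, independent of input.
fixed = [0] * 100          # always 100 slots

# Variable-size memory: allocated at run time, proportional to the input.
n = 500
variable = list(range(n))  # n slots
print(sum_recursive(n))    # also uses about n stack frames while running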

     

Best case Time Complexity


The minimum time required by an algorithm to complete its execution is known as the Best Case Time Complexity.
The time complexity of a particular algorithm can be measured by supplying different input values to the algorithm.
Consider sorting N elements: if we supply input that is already sorted, the algorithm requires the least time to sort it. This is known as the best case time complexity.
However, the best case time complexity does not guarantee that the algorithm will execute within this time for other input values.
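As a sketch of the best case, bubble sort with an early-exit check needs only a single pass over input that is already sorted (counting passes here as a proxy for time):

```python
def bubble_sort(items):
    """Bubble sort with early exit; returns (sorted list, number of passes)."""
    data = list(items)
    passes = 0
    while True:
        passes += 1
        swapped = False
        for i in range(len(data) - 1):
            if data[i] > data[i + 1]:
                data[i], data[i + 1] = data[i + 1], data[i]
                swapped = True
        if not swapped:          # no swaps: the data is already sorted, stop early
            break
    return data, passes

# Best case: already-sorted input needs just one confirming pass.
_, best = bubble_sort([1, 2, 3, 4, 5])
print(best)   # 1 pass
```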

Average case Time Complexity


The average time required by an algorithm to complete its execution is known as the Average Case Time Complexity.
The time complexity of a particular algorithm can be measured by supplying different input values to the algorithm.
Consider sorting N elements: the average case time complexity can be estimated by measuring the time required to sort many different arrangements of N elements and taking the average of those times.
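A sketch of the averaging idea, using linear search (simpler to enumerate than sorting): averaging the comparison count over every possible target position gives (n + 1) / 2 comparisons, i.e. still O(n):

```python
def comparisons_to_find(items, target):
    """Count comparisons made by a linear search before finding target."""
    count = 0
    for value in items:
        count += 1
        if value == target:
            break
    return count

n = 100
items = list(range(n))
# Average over every possible input: the target at each position once.
average = sum(comparisons_to_find(items, t) for t in items) / n
print(average)   # (1 + 2 + ... + n) / n = (n + 1) / 2 = 50.5
```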

Worst case Time Complexity


The maximum time required by an algorithm to complete its execution is known as the Worst Case Time Complexity.
The time complexity of a particular algorithm can be measured by supplying different input values to the algorithm.
Consider sorting N elements: if we supply input that is in reverse order, the algorithm requires the maximum time to sort it. This is known as the worst case time complexity.
Thus, the worst case time complexity guarantees that the algorithm will complete within this time for any input values.
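The worst case can be sketched with insertion sort, where reverse-sorted input forces the maximum number of element shifts, n(n-1)/2:

```python
def insertion_sort(items):
    """Insertion sort; returns (sorted list, number of element shifts)."""
    data = list(items)
    shifts = 0
    for i in range(1, len(data)):
        key = data[i]
        j = i - 1
        while j >= 0 and data[j] > key:
            data[j + 1] = data[j]   # shift the larger element one slot right
            shifts += 1
            j -= 1
        data[j + 1] = key
    return data, shifts

# Worst case: reverse-sorted input of n = 5 needs 4 + 3 + 2 + 1 = 10 shifts.
_, worst = insertion_sort([5, 4, 3, 2, 1])
print(worst)   # 10 shifts
```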

Asymptotic Notations


Asymptotic notations are used to describe the complexity of an algorithm. The complexity of an algorithm indicates how much time it needs to complete its execution for a given set of input data.
The same problem can be solved using different algorithms. To select the best algorithm for a problem, we determine how much time the different algorithms will take to run and then choose the better one.
The following asymptotic notations are available to describe the complexity of an algorithm:
1. Big-Oh Notation
2. Big-Omega Notation
3. Big-Theta Notation
4. Little-oh Notation
5. Little-omega Notation
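For reference, these five notations have standard formal definitions (not stated in the text above; f and g are functions of the input size n, c is a positive constant):

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : f(n) \le c\,g(n) \ \ \forall n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c\,g(n) \ \ \forall n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
f(n) = o(g(n))      \iff \forall\, c > 0\ \exists\, n_0 : f(n) < c\,g(n) \ \ \forall n \ge n_0
f(n) = \omega(g(n)) \iff \forall\, c > 0\ \exists\, n_0 : f(n) > c\,g(n) \ \ \forall n \ge n_0
```

Informally: Big-O is an upper bound, Big-Omega a lower bound, Big-Theta a tight bound, and the little-oh/little-omega forms are the strict (non-tight) versions.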

Big ‘O’ Notation


Big-O notation is used to describe the time complexity of an algorithm: how much time the algorithm needs to complete its execution for an input of size N. For example, a sorting algorithm takes longer to sort 5000 elements than 50.
The following are commonly used orders of an algorithm.
(1) O(1): An algorithm that always executes in the same time regardless of the size of the input data has complexity O(1).
(2) O(n): An algorithm whose running time is directly proportional to the size of the input data has complexity O(n), also known as linear complexity. An algorithm that uses a single loop over the data has linear complexity. Linear Search is an example of O(n).
(3) O(n²): An algorithm whose running time is directly proportional to the square of the size of the input data has complexity O(n²), known as quadratic complexity. An algorithm that uses nested loops over the data has quadratic complexity. Bubble Sort and Selection Sort are examples of O(n²).
(4) O(log n): An algorithm in which each iteration partitions the input data set into two sub-parts and discards one of them has complexity O(log n). Binary Search is an example of O(log n). (Quick Sort also partitions the data at each step, but because it must process both partitions, its overall complexity is O(n log n) on average.)
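The O(log n) behaviour can be sketched with binary search: each iteration halves the remaining search space, so the number of iterations grows with log₂(n) rather than n.

```python
def binary_search(items, target):
    """Binary search on a sorted list; returns (index or -1, iterations used)."""
    low, high = 0, len(items) - 1
    iterations = 0
    while low <= high:
        iterations += 1
        mid = (low + high) // 2
        if items[mid] == target:
            return mid, iterations
        elif items[mid] < target:
            low = mid + 1     # discard the left half
        else:
            high = mid - 1    # discard the right half

    return -1, iterations

# Searching the leftmost element of 1024 items: each step halves the range.
items = list(range(1024))
_, steps = binary_search(items, 0)
print(steps)   # 10 steps, since log2(1024) = 10
```

Compare this with linear search, which would need up to 1024 comparisons on the same input.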
