Time and space complexity let us analyze how efficiently an algorithm uses time and memory, but on their own they have a limitation: for a single algorithm, the measured complexity can vary with the input. If we analyze a linear search, for example, its running time ranges from a best case to a worst case. To describe this varying behaviour of an algorithm, we use asymptotic analysis.
Asymptotic Analysis
In asymptotic analysis, we use mathematical functions to bound the complexity of an algorithm according to its run-time behaviour. Once an algorithm is bound by such limits, we can reason about its time complexity easily. Naturally, as the number of input elements grows, the time and space requirements of the algorithm grow as well. For example, suppose one algorithm's cost is described by the function f(n) = n and another's by g(n) = 2n. For small inputs g(n) is noticeably larger than f(n), but both grow linearly, so for a huge number of elements their growth curves look the same and asymptotically the two algorithms are treated as equivalent. We usually divide the complexity of an algorithm into three cases:
- Worst Case
- Average Case
- Best Case
In the worst case, the algorithm takes the maximum amount of time to produce the result. In the average case, it takes an average amount of time. In the best case, it takes the minimum amount of time to produce the correct result.
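As a concrete illustration of the three cases, below is a minimal linear search sketch in Python; the function name and sample data are only for this example. The best case finds the target at the first position, the average case somewhere in the middle, and the worst case scans every element.

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent."""
    for i, value in enumerate(arr):
        if value == target:
            return i        # early return: fewer comparisons when the target is near the front
    return -1               # target absent: every element was examined

data = [4, 8, 15, 16, 23, 42]

print(linear_search(data, 4))    # best case: target is the first element, 1 comparison
print(linear_search(data, 42))   # worst case (present): target is the last element, n comparisons
print(linear_search(data, 99))   # worst case (absent): n comparisons, returns -1
```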
Asymptotic Notation
Asymptotic notation is used to express the running time complexity of an algorithm. There are three common asymptotic notations:
- Big Oh Notation (O)
- Omega Notation (Ω)
- Theta Notation (Θ)
Big Oh Notation (O)
Big Oh notation is used to express the worst-case time complexity of an algorithm. In programming, it is the standard notation for stating an algorithm's time complexity because, when we write a program, we are mostly concerned with its worst-case behaviour.
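Formally, f(n) = O(g(n)) means there exist constants c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0. The sketch below spot-checks that condition numerically over a finite range; `is_big_oh`, the cost functions, and the constants are illustrative names chosen for this example, not part of any standard library.

```python
# Numerically spot-check the Big Oh condition f(n) <= c * g(n) for n0 <= n < limit.
# A finite check like this illustrates the definition; it is not a proof.
def is_big_oh(f, g, c, n0, limit=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, limit))

f = lambda n: 3 * n + 10   # an example cost function (say, 3n + 10 basic operations)
g = lambda n: n            # the claimed upper bound: f(n) = O(n)

print(is_big_oh(f, g, c=4, n0=10))   # True: 3n + 10 <= 4n once n >= 10
```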
Omega Notation (Ω)
Omega notation works as the opposite of Big Oh notation: it is used to represent the best-case time complexity of an algorithm. We need both the worst-case and the best-case complexity of an algorithm to judge its efficiency.
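Analogously, f(n) = Ω(g(n)) means there exist constants c > 0 and n0 such that f(n) ≥ c·g(n) for all n ≥ n0, i.e. g(n) is a lower bound. The helper below is again only an illustrative finite spot-check with example names and constants.

```python
# Numerically spot-check the Omega condition f(n) >= c * g(n) for n0 <= n < limit.
def is_big_omega(f, g, c, n0, limit=10_000):
    return all(f(n) >= c * g(n) for n in range(n0, limit))

f = lambda n: 3 * n + 10   # the same example cost function as above
g = lambda n: n            # the claimed lower bound: f(n) = Ω(n)

print(is_big_omega(f, g, c=3, n0=1))   # True: 3n + 10 >= 3n for every n >= 1
```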
Theta Notation (Θ)
With Θ notation, we can represent both the best and the worst time complexity of an algorithm, and we can also describe its average behaviour. In practice, programmers use Ω and Θ notation far less often than Big Oh.
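f(n) = Θ(g(n)) holds exactly when f(n) is both O(g(n)) and Ω(g(n)): g(n) bounds f(n) from above and below up to constant factors, c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. A minimal sketch combining the two checks above, again with illustrative names and constants:

```python
# Spot-check c1 * g(n) <= f(n) <= c2 * g(n) for n0 <= n < limit.
def is_big_theta(f, g, c1, c2, n0, limit=10_000):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, limit))

f = lambda n: 3 * n + 10
g = lambda n: n

print(is_big_theta(f, g, c1=3, c2=4, n0=10))   # True: f(n) = Θ(n)
```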
Big Oh Notation for common complexities
| Complexity | Big Oh Notation |
| --- | --- |
| Constant time complexity (the best time complexity) | O(1) |
| Linear time complexity | O(n) |
| Logarithmic time complexity | O(log n) |
| n log n time complexity | O(n log n) |
| Exponential time complexity | 2^O(n) |
| Cubic time complexity | O(n^3) |
| Polynomial time complexity | n^O(1) |
| Quadratic time complexity | O(n^2) |
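To connect the table to code, here are small illustrative Python functions (names chosen only for this example) whose running times fall into some of the classes above.

```python
def constant(arr):            # O(1): work independent of input size
    return arr[0]

def linear(arr):              # O(n): a single pass over the input
    total = 0
    for x in arr:
        total += x
    return total

def logarithmic(n):           # O(log n): the problem size halves each step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def quadratic(arr):           # O(n^2): a nested pass for every element
    pairs = []
    for a in arr:
        for b in arr:
            pairs.append((a, b))
    return pairs

print(constant([7, 8, 9]), linear([7, 8, 9]), logarithmic(8), len(quadratic([7, 8, 9])))
# 7 24 3 9
```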