TIME COMPLEXITY | An Important Concept of DS and Algorithms...



The Time Complexity of an algorithm estimates how much time the algorithm will take to run for a given input size.

Now that we know what Time Complexity means, we should also learn some rules, or we can say some Calculation Rules, for finding the time complexity of an algorithm.

Calculation Rules:- 


The Time Complexity of an algorithm is denoted by "O(...)", for example "O(n)". Here 'n' represents the input size, and the expression inside the parentheses is a function of it.

1. Loops:-

Now, for calculating Time Complexity, the first thing we need to check for is "Loops". Loops are the most common reason for a slow algorithm: the more nested loops the algorithm contains, the slower it becomes.

*Note:- If there are 'k' nested loops, each running up to 'n' times, the Time Complexity is O( n^k ).

Example:- Here we have a code snippet with only 1 for loop.

    for (int i=1; i<=n; i++)
    {
    	//Some sample code here
    	//Type your code here
    } 

Now, for this code the Time Complexity is O(n^1), i.e. O(n), and if we add another loop inside this loop, then,

    for (int i=1; i<=n; i++)
    {
    	for (int j=1; j<=n; j++)
        {
    		//Some sample code here
    		//Type your code here
        }
    }

for the above code, Time Complexity is O(n^2).

So, the gist is: the more nested loops there are in a program, the higher the power of n. If we have 3 nested loops, then the T.C. will be O(n^3), and so on.

2. Phases:-

Let's continue learning more about Time Complexity. The next thing we have to watch out for is the Phases in a code or an Algorithm.

Now, what does the term "Phases" mean? A phase is one particular section of the program/code. Let's say we have one function in a code; then everything up to the end of that function is a Phase.

And it is not only functions; a phase can be anything like a loop, a class, etc. For a better understanding, let's have a glimpse of this example:-


If the Algorithm consists of consecutive phases, the total complexity is the Time Complexity of the largest phase. It is very similar to the rate of a chemical reaction in chemistry, which is determined by the rate-determining (slowest) step of the complete reaction; in the same way, the Time Complexity of the largest phase is the Time Complexity of the Algorithm/Code/Program.

The reason behind this is that the small phases execute quickly, so their contribution to the Time Complexity is negligible; they do contribute, but not significantly. The large phases, which take longer to execute, dominate and end up determining the Time Complexity of the program.

For instance, suppose Phase 1 is a single loop, so its Time Complexity is O(n); Phase 2 has a for loop nested inside a while loop, so its Time Complexity is O(n^2); and Phase 3 is again a single loop with Time Complexity O(n). The total Time Complexity is then O(n^2), the complexity of the largest phase.

3. Several Variables:-

As we have learnt how the phases of a program/algorithm matter for Time Complexity, there are some more rules on which Time Complexity relies.

Now, what do we mean by several variables, and how do they matter for Time Complexity?

Several Variables means that different loops are bounded by different variables. When the loop bounds are different input sizes, say n and m, the Time Complexity depends on both of them, not on a single variable alone. Let's have a look at the example below:-

    for (int i=1; i<=n; i++)
    {
    	for (int j=1; j<=m; j++)
        {
    		//Some sample code here
    		//Type your code here
        }
    }

In the above example the Time Complexity is O(n*m), as the two loops are bounded by different variables.


4. Order of Magnitude:- 

The Time Complexity does not tell us the exact number of times the code inside a loop is executed; it only shows the order of magnitude.


    for (int j=1; j<=3*n; j++)
    {
    	//Some sample code here
    	//Type your code here
    }

    for (int j=1; j<=n+5; j++)
    {
    	//Some sample code here
    	//Type your code here
    }

    for (int j=1; j<=n; j+=2)
    {
    	//Some sample code here
    	//Type your code here
    }
    

In the above examples, the code inside the loops is executed 3n, n+5, and about n/2 times respectively, but the Time Complexity of each code is O(n).

5. Recursion:- 

The Time Complexity of a recursive function depends on the number of times the function is called and the Time Complexity of a single call. The total Time Complexity is the product of these values.

For example, consider the following function:

    void f(int n)
    {
    	if (n==1) return;
    	f(n-1);
    }
    

The call f(n) causes n function calls, and the Time Complexity of each call is O(1). Thus, the total time complexity is O(n).

As we have learnt a few rules about how to manage Time Complexity in our algorithms, let's have a look at some common Time Complexities of algorithms:-

1. O(1):- A constant-time algorithm does not depend on the input size; a direct formula calculates the answer.

2. O(logn):- A logarithmic algorithm often halves the input size at each step. It is an efficient and fast Time Complexity.

3. O(√n):- A square root algorithm is slower than O(logn) but faster than O(n).

4. O(n):- A linear algorithm goes through the input a constant number of times. This is often the best possible Time Complexity because it is necessary to access each input at least once.

5. O(nlogn):- This is an efficient Time Complexity, and it often indicates that the algorithm sorts the input.

6. O(n^2):- A quadratic algorithm often contains two nested loops.

7. O(n^3):- A cubic algorithm often contains three nested loops.

8. O(2^n):- This Time Complexity often indicates that the algorithm iterates through all subsets of the input elements. For example, the subsets of {1,2,3} are {}, {1},{2}, {3}, {1,2}, {1,3}, {2,3} and {1,2,3}.

9. O(n!):- This time complexity often indicates that the algorithm iterates through all permutations of the input elements. For example, the permutations of {1,2,3} are (1,2,3), (1,3,2), (2,1,3), (2,3,1), (3,1,2) and (3,2,1).


In this post, we came across many concepts regarding Time Complexity. Below are some points to keep in mind while coding, so that we don't run into Time Complexity issues in Competitive Programming.

*Note:- 
1. Try to use fewer nested loops, or avoid them where possible.
2. Try not to make complex phases.
3. Try to keep codes and algorithms as simple as possible.

#KeepLearningKeepGrowing
