TERMINOLOGIES RELATED TO DATA STRUCTURES AND ALGORITHMS PART-2 | TIME AND SPACE COMPLEXITY | BIG O | BIG THETA | BIG OMEGA

 


In this post we will look at some more terminologies related to data structures and algorithms. As I mentioned in the previous post, you need not memorize any terminology here, but you should know about these concepts in case your interviewer asks about them or you simply have to explain them to someone.

So let's get started.


I briefly explained time & space complexity while concluding the previous post. Now let's dig deeper into these topics and their Big O notations, along with some simple real-life analogies.

Let's start with an example. Suppose a cook has enough resources and raw materials and has to cook a meal for 5 people; he is fast enough to prepare it in just half an hour. But if the same cook is now told to prepare a meal for 500 people, he will have to gather a much larger quantity of ingredients and then cook for that many people, which will surely take much more time. Or take an example from a programming perspective: if a computer has to count how many times a word is repeated on a single page, it may do so in a fraction of a second, but if it is asked to perform the same task on a document of 1000 pages, it will take much longer.

This is time complexity, i.e., how the processing time of an algorithm is affected as the input size increases.
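The word-counting example above can be sketched in a few lines of Python. This is a minimal illustration, not code from the post; the sample text is made up, and the point is that the same algorithm does roughly 1000× more work on 1000× more input:

```python
# Counting occurrences of a word: the work grows with the input size,
# because the loop visits every word in the text once -> O(n).

def count_word(text, word):
    count = 0
    for w in text.split():
        if w == word:
            count += 1
    return count

page = "the quick brown fox jumps over the lazy dog " * 100
document = page * 1000           # a "1000-page" document

# Same algorithm, ~1000x more input -> ~1000x more comparisons.
print(count_word(page, "the"))       # 200
print(count_word(document, "the"))   # 200000
```

The counts scale exactly with the input here because the document is just the page repeated; the running time scales the same way.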

Space complexity is also a similar topic for example...

If you want to store 1 kg of rice, you can easily store it in a container of the proper size and keep it in your home. Now if you want to store 10 kg or 100 kg of rice, you will require larger containers or sacks to store it in. And suppose you want to store 100 quintals of rice; in that case you will need a large room or a godown. In a similar way, how the memory (or space) required to store the data grows as the size of the input data increases is called space complexity.
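We can see the same "bigger container" effect in a program by measuring how much memory a list takes as it holds more elements. A small sketch (the exact byte counts are CPython-specific details, not part of the post):

```python
import sys

# A list holding more elements occupies more memory, just like rice
# needing bigger containers as the quantity grows.
small = list(range(10))
large = list(range(10_000))

print(sys.getsizeof(small) < sys.getsizeof(large))  # True
```

The space an algorithm needs is analyzed with the same Big O notation as time: a list of n elements takes O(n) space.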

Now that we know what time & space complexity are, let's look at some notations:


Consider that Raj and Rohan each created an algorithm to sort n numbers.

When run for input size n, the following results were recorded.

Sr. No.   Number of Elements   Raj's Algorithm    Rohan's Algorithm
1.        10 Elements          80 milliseconds    101 milliseconds
2.        50 Elements          112 milliseconds   125 milliseconds
3.        100 Elements         170 milliseconds   133 milliseconds
4.        1000 Elements        2.3 seconds        810 milliseconds

We can see that Raj's algorithm shines for smaller inputs, but as the number of elements increases, Rohan's algorithm performs better.
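The post does not say which sorting algorithms Raj and Rohan used, but a classic pair that behaves exactly like the table is insertion sort (very fast on tiny inputs, O(n^2) scaling) versus merge sort (more overhead per step, but O(n log n) scaling). A hedged sketch of both:

```python
import random

def insertion_sort(a):
    # O(n^2): little overhead, great for tiny inputs, poor scaling.
    a = a[:]
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    # O(n log n): more bookkeeping per step, much better scaling.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

data = [random.randint(0, 10**6) for _ in range(1000)]
assert insertion_sort(data) == merge_sort(data) == sorted(data)
```

Timing these two on growing inputs (e.g. with Python's `timeit` module) reproduces the crossover in the table: the "simple" algorithm wins small, the "scalable" one wins big.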


Time complexity: the game-sending example

Let us say you have a friend living 5 km away from your place. You want to send him a game.

Final exams are over and you want him to get this 60 GB file from you. How will you send it to him?

Note that both of you are using 4G mobile data with a 1.5 GB/day data limit.

The best way to send him the game is by delivering it to his house: copy the game to a hard disk and send it.

Would you do the same for sending a game like Minesweeper, which is only a few KB in size? No, because you can easily send it via the internet.

As the file size grows, the time taken by online sending increases linearly: O(n).

As the file size grows, the time taken by physical sending remains constant: O(n^0) or O(1).

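The two options can be modelled as toy functions. The transfer speed and courier trip time below are made-up numbers purely for illustration, not figures from the example:

```python
# Online transfer: time grows linearly with file size -> O(n).
def online_transfer_hours(size_gb, speed_gb_per_hour=1.5):
    return size_gb / speed_gb_per_hour

# Courier trip: (roughly) the same time regardless of file size -> O(1).
def courier_hours(size_gb, trip_hours=1.0):
    return trip_hours

# Tiny file: the internet wins. Huge file: the courier wins.
print(online_transfer_hours(0.001) < courier_hours(0.001))   # True
print(online_transfer_hours(60.0) > courier_hours(60.0))     # True
```

This is the essence of comparing growth rates: which option is "better" depends on the input size, and Big O tells you who wins as n gets large.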

Calculating the time complexity order (O) in terms of n:
Assume the expression for the time required by an algorithm looks like this:

  t = a×(n^2) + b×n + c

  {where a, b, c are constants}

The term of highest order is n^2. 

To calculate O in terms of n, we assume the worst-case scenario, where n is a very large value. In that case n^2 will be much larger than n, and thus we can ignore the terms b×n and c.

The expression can be now written as:

   t ≈ a×(n^2); now, ignoring the constant a, we say that

   t ≈ n^2 

Thus time complexity is O(n^2).
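We can verify numerically that the n^2 term dominates. The constants a, b, c below are chosen arbitrarily; whatever their values, the fraction of t contributed by the n^2 term approaches 1 as n grows:

```python
# For t = a*n^2 + b*n + c, the n^2 term dominates as n grows,
# so the lower-order terms b*n and c barely matter.
a, b, c = 2, 100, 1000

shares = []
for n in (10, 1000, 1_000_000):
    t = a * n**2 + b * n + c
    shares.append((a * n**2) / t)    # fraction of t from the n^2 term
    print(n, round(shares[-1], 4))
```

For n = 10 the n^2 term contributes under 10% of t, but for n = 1,000,000 it contributes over 99.9%, which is exactly why we drop the lower-order terms when writing O(n^2).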

Consider one more expression:

  t = a×(b^3) + c^2 + d

  {where a, b, c, d are constants}

Here the highest-order term seems to be a×(b^3) {assuming a & b > c, d}.

But in fact the expression is independent of n: it contains only constants, so t does not change as the input size n changes. Thus

the algorithm runs in constant time, and its time complexity is O(n^0) or O(1).
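To make the contrast concrete, here is a minimal sketch (my own illustrative functions, not from the post) of an O(1) operation next to an O(n) one:

```python
# O(1): the work does not depend on the input size.
def first_element(items):
    return items[0]              # one step no matter how long the list is

# O(n): the work grows with the input size, for contrast.
def total(items):
    s = 0
    for x in items:              # one step per element
        s += x
    return s

print(first_element([7, 8, 9]))        # 7
print(total(list(range(1, 101))))      # 5050
```

Doubling the list length doubles the work in `total` but changes nothing for `first_element`; that is the whole difference between O(n) and O(1).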

If we plot the graphs of O(1) and O(n), we see that for small inputs O(n) can be efficient as it takes less time, but for larger inputs and worst-case scenarios, O(1) works best.



#KeepLearningKeepGrowing
