Time Complexity Analysis

How to analyze Time Complexity?                       

The running time of an algorithm depends on multiple factors:
  • Single vs. multiple processors (with a single processor, we cannot run our program in parallel)
  • Read/write speed to memory and disk
  • 32-bit vs. 64-bit architecture
  • Configuration of the machine
  • Input

Consider a Model Machine with:

  • Single Processor
  • 32-bit architecture
  • Sequential Execution
  • 1 unit of time for arithmetic and logical operations
  • 1 unit of time for assignment and return statements
Let's define a function that calculates the sum of two numbers:
sum(n1, n2):
    return (n1 + n2)
Now, we know that an arithmetic operation takes 1 unit of time and a return statement takes 1 unit of time. Therefore, irrespective of the inputs, the above program executes in two units of time.

Let's define a function that calculates the sum of a list:
sum(list, n):
    total = 0                   ---> assignment: 1 unit of time ---> executes once
    for i in range(n):          ---> each iteration increments i and compares it: 2 units of time ---> executes n+1 times
        total = total + list[i] ---> arithmetic + assignment: 2 units of time ---> executes n times
    return total                ---> return: 1 unit of time ---> executes once


Time Taken = 1(1) + 2(n+1) + 2n + 1
                    = 1 + 2n + 2 + 2n + 1
                    = 4n + 4
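The unit counting above can be sketched in runnable form. Here is a small Python check (not part of the original notes) that tallies units while summing a list, using the model machine's costs:

```python
def sum_with_cost(values):
    """Sum a list while tallying time units on the model machine:
    1 unit per assignment/return, 2 units per loop check, 2 per body step."""
    units = 0
    total = 0
    units += 1                  # total = 0 : one assignment
    n = len(values)
    for i in range(n):
        units += 2              # increment i and compare: runs n times here
        total = total + values[i]
        units += 2              # addition + assignment
    units += 2                  # final loop check that exits the loop
    units += 1                  # return statement
    return total, units

total, units = sum_with_cost([3, 1, 4, 1, 5])
# for n = 5: units = 4*5 + 4 = 24, matching Time Taken = 4n + 4
```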

Consider:
T1(n) = 6(n^2) + 3(n) + 4
T2(n) = 6(n^2) + 8

What will be the growth rate of both functions as n tends to infinity?
Let's evaluate when n = 10000 and n = 100000

Now, after substituting the values of n, we see that the lower-order terms have almost no effect on the result. So, neglecting all terms other than (n^2), we conclude that as n tends to infinity, T1(n) is approximately equal to T2(n).
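A quick numerical check (a Python sketch, with the two functions as defined above) shows the ratio T1(n)/T2(n) approaching 1:

```python
def T1(n):
    return 6 * n**2 + 3 * n + 4   # T1(n) = 6n^2 + 3n + 4

def T2(n):
    return 6 * n**2 + 8           # T2(n) = 6n^2 + 8

for n in (10_000, 100_000):
    print(n, T1(n) / T2(n))
# the ratio gets closer to 1 as n grows, since only the n^2 term matters
```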

In the same way, there are other functions that behave identically as n tends to infinity, so we group all such functions into the same set.

Let's define the first category of these sets, called Big-O notation (one of the asymptotic notations).

O(g(n)) = { f(n): there exist positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0 }

Let's say:
f(n) = 5(n^2) + 2(n) + 1
g(n) = (n^2)

We can verify that with c = 8 and n0 = 1, f(n) <= 8(n^2) for all n >= n0.

O -> upper bound on the rate of growth of the running time


Let's define the second category of the sets: Omega notation (Ω)
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that c*g(n) <= f(n) for all n >= n0 }

f(n) = 5(n^2) + 2(n) + 1
g(n) = (n^2)
c = 5 and n0 = 0

-> f(n) >= 5(n^2) for all n >= 0


Ω -> lower bound on the rate of growth of the running time


Let's discuss the third category of sets, Θ (Theta) notation:

Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }


f(n) = 5(n^2) + 2(n) + 1
g(n) = (n^2)

c1 = 5, c2 = 8, n0 = 1

5(n^2) <= f(n) <= 8(n^2) for n >= 1

Θ(n^2) = tight bound
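All three bounds can be checked numerically on a range of n with the witnesses above (c = 8 for O, c = 5 for Ω, c1 = 5 and c2 = 8 for Θ); a small Python sketch:

```python
def f(n):
    return 5 * n**2 + 2 * n + 1   # f(n) = 5n^2 + 2n + 1

def g(n):
    return n**2                   # g(n) = n^2

# Big-O: f(n) <= 8*g(n) for all n >= 1
assert all(f(n) <= 8 * g(n) for n in range(1, 10001))

# Omega: f(n) >= 5*g(n) for all n >= 0
assert all(f(n) >= 5 * g(n) for n in range(0, 10001))

# Theta: 5*g(n) <= f(n) <= 8*g(n) for all n >= 1
assert all(5 * g(n) <= f(n) <= 8 * g(n) for n in range(1, 10001))
```

Of course, checking a finite range is not a proof, but it is a useful sanity check on the chosen constants.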

Note: It is always a good idea to express the running time of an algorithm in Θ notation.


We analyze time complexity for:

  • very large input size
  • Worst-Case scenario
Consider t(n) = n^3 + 3(n^2) + 4n + 2 = O(n^3)
                t(n) ~ n^3 when n -> infinity
RULE:
  • drop lower order terms
  • drop constant multipliers

Running Time = Σ Running time of all Time Fragments

Consider:
 int a = 5;
 a++;  ---> Time complexity is O(1)

Consider:
for (i=0;i<=n;i++){
//simple statements
}

Time complexity is O(n)

Nested loops:
for (i = 0; i <= n; i++) {
    for (j = 0; j <= n; j++) {
        //simple statements
    }
}

Time complexity is O(n^2)
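Counting body executions makes these bounds concrete; here is a small Python sketch mirroring the `i <= n` loops above:

```python
def single_loop_count(n):
    count = 0
    for i in range(n + 1):          # mirrors for (i = 0; i <= n; i++)
        count += 1
    return count

def nested_loop_count(n):
    count = 0
    for i in range(n + 1):
        for j in range(n + 1):      # mirrors the inner j-loop
            count += 1
    return count

n = 100
print(single_loop_count(n))   # n + 1 executions -> O(n)
print(nested_loop_count(n))   # (n + 1)^2 executions -> O(n^2)
```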


Consider a program:

#include <stdio.h>

int main()
{
    int i, j, n = 100;   /* n assumed for illustration */
    int a = 5;
    a++;                              ---> O(1)
    for (i = 0; i <= n; i++) {
        //simple statements
    }                                 ---> O(n)
    for (i = 0; i <= n; i++) {
        for (j = 0; j <= n; j++) {
            //simple statements
        }
    }                                 ---> O(n^2)
    return 0;
}

Running time of the program = O(1) + O(n) + O(n^2)
For very large values of n, the lower-order terms become insignificant; therefore, the running time is O(n^2).


Consider another example:

#include <stdio.h>

int main()
{
    int i, j, n = 100, condition = 1;   /* values assumed for illustration */
    if (condition) {
        for (i = 0; i <= n; i++) {
            //simple statements
        }
    }                                   ---> O(n)
    else {
        for (i = 0; i <= n; i++) {
            for (j = 0; j <= n; j++) {
                //simple statements
            }
        }
    }                                   ---> O(n^2)
    return 0;
}


Worst-case time complexity: O(n^2)  (We always consider the worst case when calculating time complexity)
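As a sketch of why the worst case is what matters, here is a Python version (the `flag` argument stands in for the if condition, which the original program leaves unspecified) that counts how many simple statements run in each branch:

```python
def work(n, flag):
    """Count 'simple statement' executions; flag stands in for the if condition."""
    count = 0
    if flag:
        for i in range(n + 1):          # the O(n) branch
            count += 1
    else:
        for i in range(n + 1):          # the O(n^2) branch
            for j in range(n + 1):
                count += 1
    return count

n = 1000
best = work(n, True)      # n + 1 simple statements
worst = work(n, False)    # (n + 1)^2 simple statements
# we report the worst case: O(n^2)
```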


Space Complexity:

This measures the efficiency of the code in terms of the memory used. The analysis of space complexity is similar to time complexity.

vector<int> V;
for (int i = 0; i < N; i++) V.push_back(i);
The above code creates a vector of size N. So space complexity is O(N).
Even here, we consider the worst case space complexity.























Knowledge Source : InterviewBit and CodewithHarry








