Thursday, July 10, 2014

Array - III - Complexity


Complexity plays a big role in data structures, including arrays.

What is COMPLEXITY?

Complexity is the total amount of time or memory space an algorithm needs to execute.

There are two types of complexity –

1.   Time complexity – Time complexity is the total amount of time an algorithm takes to run.

2.   Space complexity – Space complexity is the total amount of memory space an algorithm needs. The space occupied by the instructions is known as instruction space, and the space occupied by the data is known as data space. However, some space can be reused during the execution of the algorithm.
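As a small illustration of the time/space distinction above, here is a sketch (the function names are my own) of two ways to sum an array: one uses constant extra data space, the other stores every partial sum and therefore uses space proportional to n.

```python
def total(arr):
    """O(n) time, O(1) extra data space: one running variable."""
    s = 0
    for x in arr:   # n additions
        s += x
    return s

def prefix_sums(arr):
    """O(n) time, O(n) extra data space: keeps every partial sum."""
    sums = []
    s = 0
    for x in arr:
        s += x
        sums.append(s)
    return sums

print(total([1, 2, 3, 4]))        # 10
print(prefix_sums([1, 2, 3, 4]))  # [1, 3, 6, 10]
```

Both run in linear time; they differ only in how much data space they occupy.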


To denote complexity we use three ASYMPTOTIC notations –

Big oh (O) – This notation is used for time complexity as an upper bound: the algorithm takes at most this much time to execute. It describes the WORST case for comparisons.

Big omega (Ω) – This notation is used for time complexity as a lower bound: the algorithm takes at least this much time to execute. It describes the BEST case for comparisons.

Big theta (Ѳ) – This notation is used for time complexity as a tight bound (both upper and lower). It is commonly associated with the AVERAGE case for comparisons.
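The best-case/worst-case distinction above can be seen concretely in linear search over an array. In this sketch (my own illustration, not from the original post) the function also returns how many comparisons it made: finding the first element takes 1 comparison (best case, Ω(1)); finding the last element or missing entirely takes n comparisons (worst case, O(n)).

```python
def linear_search(arr, target):
    """Return (index, comparisons) for target in arr, or (-1, comparisons)."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

arr = [7, 3, 9, 1, 5]
print(linear_search(arr, 7))   # (0, 1)  -> best case: one comparison
print(linear_search(arr, 5))   # (4, 5)  -> last element: n comparisons
print(linear_search(arr, 8))   # (-1, 5) -> not found: n comparisons
```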






Big oh (O) – Let f(n) and g(n) be two nonnegative functions such that

                          f(n) ≤ c·g(n)      for all n ≥ n0,

where c is a positive constant and n0 is a natural number. Then we can write –

                          f(n) = O[g(n)]

Ex – If f(n) = n² + n + 1 and f(n) = O[g(n)], then g(n) = ?

Sol. –         n² + n + 1 ≤ c·n²       [taking c = 3, this holds for all n ≥ 1]

                 Or, f(n) = O(n²) for all n ≥ 1.

                g(n) = n²
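The inequality above can be checked numerically. With c = 3 the bound n² + n + 1 ≤ 3n² holds from n0 = 1 (c = 2 also works, but only from n0 = 2, since at n = 1 we would need 3 ≤ 2):

```python
def f(n):
    """The function from the example: f(n) = n^2 + n + 1."""
    return n**2 + n + 1

# Verify f(n) <= 3*n^2 for a range of n, confirming f(n) = O(n^2)
# with constants c = 3 and n0 = 1.
for n in range(1, 1001):
    assert f(n) <= 3 * n**2

print("n^2 + n + 1 <= 3n^2 holds for n = 1..1000")
```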

f(n) will normally represent the computing time of some algorithm. When we say that the computing time of some algorithm is O[g(n)], we mean that its execution takes no more than a constant times g(n), where n is a parameter that characterizes the input/output (n might be the number of inputs/outputs, their sum, or the magnitude of one of them).

The common computing functions are: n, n², log₂n, n·log₂n, 2ⁿ, etc.

These functions grow at very different rates; for large n, log₂n < n < n·log₂n < n² < 2ⁿ.
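One way to compare these growth rates is to tabulate the functions for a few values of n; this small sketch prints such a table:

```python
import math

# Values of the common computing functions for increasing n.
print(f"{'n':>6} {'log2 n':>8} {'n log2 n':>10} {'n^2':>10} {'2^n':>22}")
for n in [2, 8, 32, 64]:
    print(f"{n:>6} {math.log2(n):>8.0f} {n * math.log2(n):>10.0f} "
          f"{n**2:>10} {2**n:>22}")
```

Even at n = 64 the exponential 2ⁿ has already dwarfed every polynomial column, which is why exponential-time algorithms are impractical for all but tiny inputs.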

