Big O, Omega and Theta Notations

Big O, Omega and Theta notations are used to describe not only how an algorithm performs on a given input, but how it scales as the input grows. They measure the efficiency of an algorithm in terms of the time it takes to run as a function of the input size, and they are used to express worst-case, best-case and average-case complexity.

Big Omega notation describes a lower bound on an algorithm's running time and is typically used for the best case. In contrast, Big O notation describes an upper bound and is typically used for the worst case.

Big O notation is written O(n), read as "Order of n" or "O of n".

Big Omega notation is written Ω(n), read as "Omega of n".

Apart from Big O (O) and Big Omega (Ω), there are the Big Theta (Θ), Little o (o) and Little omega (ω) notations used in computer science. You can find more information at http://en.wikipedia.org/wiki/Big_O_notation#Big_Omega_notation.

O(1) – Constant Time

This means that the algorithm requires the same fixed number of steps regardless of the size of the task.

Examples:

1). Finding an element in a HashMap is usually constant time, i.e. O(1). It is constant time because a hash function is used to locate the element, and computing a hash value does not depend on the number of elements in the HashMap (a sketch follows this list).

2). Push and Pop operations for a stack containing n elements.

3). Insert and remove operations on a queue.
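
To illustrate, here is a minimal Java sketch of constant-time operations on a HashMap and a stack (the method and variable names are just for illustration):

import java.util.ArrayDeque;
import java.util.HashMap;

void constantTimeDemo() {
    // HashMap lookup: hashing the key does not depend on the map's size
    HashMap<String, Integer> ages = new HashMap<>();
    ages.put("alice", 30);              // O(1) on average
    int age = ages.get("alice");        // O(1) on average

    // Stack push and pop: no traversal, each is O(1)
    ArrayDeque<Integer> stack = new ArrayDeque<>();
    stack.push(42);                     // O(1)
    int top = stack.pop();              // O(1)

    System.out.println(age + " " + top);
}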

O(log n) – Logarithmic Time

This means that the algorithm's running time grows in proportion to the logarithm of the input size.

Examples:

1). Binary search in a sorted array or ArrayList of n elements (see the sketch after this list).

2). Insert and find operations for a balanced binary search tree with n nodes.

3). Insert and Remove operations for a heap with n nodes.

4). Insertion, removal and lookup in a TreeMap are O(log n), because a TreeMap keeps its key/value pairs in sorted order using a red-black tree, a self-balancing binary search tree.
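
As an illustration of example 1, here is a minimal iterative binary search sketch in Java; each comparison halves the remaining search range, which is what makes it O(log n):

// Returns the index of target in a sorted array, or -1 if absent
int binarySearch(int[] sorted, int target) {
    int lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;          // avoids overflow of (lo + hi)
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target) lo = mid + 1; // discard the left half
        else hi = mid - 1;                      // discard the right half
    }
    return -1;
}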

O(n) – Linear Time

This means that the algorithm requires a number of steps directly proportional to the size of the task.

Examples:

1). Traversal or search of a list (a linked list or an array) with n elements. This is linear because, in the worst case, you have to examine the entire list: if the list is twice as big, searching takes twice as long.

2). Finding the maximum or minimum element in a list, or sequential search in an unsorted list of n elements.

3). Traversal of a tree with n nodes.

4). Computing n! iteratively, or finding the nth Fibonacci number iteratively (sketched below).
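
As a sketch of example 4, here is the nth Fibonacci number computed iteratively in Java; a single loop of n steps makes it O(n):

// nth Fibonacci number in one O(n) pass (fib(0) = 0, fib(1) = 1)
long fib(int n) {
    long prev = 0, curr = 1;
    for (int i = 0; i < n; i++) {
        long next = prev + curr;   // each step uses only the previous two values
        prev = curr;
        curr = next;
    }
    return prev;
}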

O(n log n) – N times Logarithmic Time

This means that the algorithm's running time grows in proportion to n times the logarithm of n.

Examples:

1). Efficient sorting algorithms such as merge sort, and quick sort in the average case (a merge sort sketch follows).
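
Here is a minimal merge sort sketch in Java: the array is halved O(log n) times, and each level of recursion does O(n) work merging, giving O(n log n) overall:

import java.util.Arrays;

void mergeSort(int[] a) {
    if (a.length < 2) return;                           // base case: already sorted
    int mid = a.length / 2;
    int[] left = Arrays.copyOfRange(a, 0, mid);
    int[] right = Arrays.copyOfRange(a, mid, a.length);
    mergeSort(left);                                    // sort each half recursively
    mergeSort(right);
    // Merge the two sorted halves back into a: O(n) work per level
    int i = 0, j = 0, k = 0;
    while (i < left.length && j < right.length)
        a[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
    while (i < left.length)  a[k++] = left[i++];
    while (j < right.length) a[k++] = right[j++];
}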

O(n^2) – Quadratic Time

Examples:

1). Simple sorting algorithms, for example a selection sort of n elements.

2). Comparing two two-dimensional arrays of size n by n.

3). Finding duplicates in an unsorted list of n elements.

Note: If a solution to a problem is achieved with a single level of iteration over the input, that is, one for loop, while loop or do-while loop (or a single recursive call per element), then that algorithm performs in O(n). If the solution requires 2 nested loops over the input, the algorithm performs in O(n^2); with 3 nested loops, O(n^3); and so on. (A selection sort sketch of the nested-loop case follows.)
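
As a sketch of that nested-loop pattern, here is a selection sort in Java; the inner loop runs inside the outer loop, performing roughly n^2/2 comparisons in total, i.e. O(n^2):

// Selection sort: two nested loops over the same array => O(n^2)
void selectionSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {        // outer loop: n passes
        int minIdx = i;
        for (int j = i + 1; j < a.length; j++)      // inner loop: up to n steps
            if (a[j] < a[minIdx]) minIdx = j;
        int tmp = a[i]; a[i] = a[minIdx]; a[minIdx] = tmp;  // swap into place
    }
}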

O(n^3) – Polynomial Time

Examples:

1). Given an algorithm whose step count is 23n^3 + 12n^2 + 9, for large n the n^3 term grows drastically faster than the others, so the running time is O(n^3).
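
For example, at n = 1000 the terms are 23n^3 = 2.3 × 10^10, 12n^2 = 1.2 × 10^7, and 9, so the cubic term accounts for virtually all of the running time.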

O(a^n) for a > 1 – Exponential Time

Examples:

1). The naive recursive Fibonacci implementation (sketched after this list).

2). Solving the Towers of Hanoi problem.

3). Generating all permutations of n symbols.
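
Here is a sketch of the naive recursive Fibonacci from example 1; each call spawns two further calls, so the number of calls grows exponentially with n (bounded by O(2^n)):

// Naive recursion: the same subproblems are recomputed exponentially often
long fibRecursive(int n) {
    if (n < 2) return n;                               // base cases: fib(0)=0, fib(1)=1
    return fibRecursive(n - 1) + fibRecursive(n - 2);  // two recursive calls per call
}

Compare this with the iterative O(n) version shown earlier: the two compute the same values, but the recursive one redoes work instead of reusing it.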

Here is the ordering of these running times, from fastest to slowest; the further right a bound sits, the worse the worst-case behavior:

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(a^n)

Constant Time < Logarithmic Time < Linear Time < N times of Logarithmic Time < Quadratic Time < Polynomial Time < Exponential Time.

If an algorithm is __, then its performance is __:

algorithm    performance
o(n)         < n  (strictly less than n)
O(n)         ≤ n  (at most n)
Θ(n)         = n  (exactly n, up to constant factors)
Ω(n)         ≥ n  (at least n)
ω(n)         > n  (strictly greater than n)
4 important Big O rules:

1). If your algorithm performs 2 different steps in sequence, add up those steps.

Example:

void something() {
    doStep1();    // O(a)
    doStep2();    // O(b)
}

Overall runtime: O(a+b)

2). Drop Constants

// Two separate passes over the array: O(n) + O(n) = O(2n) steps
void minMax1(int[] array) {
    int min = Integer.MAX_VALUE, max = Integer.MIN_VALUE;
    for (int e : array)
        min = Math.min(e, min);    // O(n)
    for (int e : array)
        max = Math.max(e, max);    // O(n)
}

Another example:

// One pass that updates both values: still about 2n operations in total
void minMax2(int[] array) {
    int min = Integer.MAX_VALUE, max = Integer.MIN_VALUE;
    for (int e : array) {
        min = Math.min(e, min);
        max = Math.max(e, max);
    }
}

Overall runtime in both versions: O(2n)
So, dropping the constant, the overall runtime equals O(n)

3). Different arrays imply (=>) different variables.

// Counts matching pairs across two different arrays
int getSize(int[] arrayA, int[] arrayB) {
    int count = 0;
    for (int a : arrayA) {          // runs arrayA.length times
        for (int b : arrayB) {      // runs arrayB.length times per outer pass
            if (a == b) {
                count++;            // bug fix: "count = count++" never increments
            }
        }
    }
    return count;
}

Overall runtime is NOT O(n^2); it is O(a * b), where a and b are the lengths of the two arrays.

Note that the two loops iterate over 2 different arrays.

4). Drop non-dominant terms.

void printSize(int[] array) {
    int max = Integer.MIN_VALUE;

    // runtime for the below loop is O(n)
    for (int a : array) {
        max = Math.max(a, max);
    }

    // runtime for the below nested loop is O(n^2)
    for (int a : array) {
        for (int b : array) {
            System.out.println(a + ", " + b);
        }
    }
}

Overall runtime: O(n + n^2)

However, the O(n) term grows far more slowly than O(n^2) and can be neglected.
In other words,

O(n^2) <= O(n + n^2) <= O(n^2 + n^2)
O(n^2) <= O(n + n^2) <= O(2 * n^2)

Now, dropping the constants based on Rule 2,

O(n^2) <= O(n + n^2) <= O(n^2)

Since the left and right sides are equivalent, the middle is too,

i.e. O(n + n^2) is equivalent to O(n^2)

Hence, the overall runtime is O(n^2)


Sources:

BIG-O.pdf


http://bigocheatsheet.com/

This post was largely drawn from the aforementioned sources, with some changes based on my own knowledge and observations. I will add more examples to the time complexities above as I come across them in my programming career.