Oct 9, 2017

Measuring the Performance of an Algorithm

Asymptotic performance of an algorithm refers to the expression or curve that describes how its execution time grows. It only considers the significant part of the expression, for example f(n) or f(n²). If an algorithm executes 4 + n(n + 5) instructions, we take f(n²) as its asymptotic performance; we simply keep the highest-order term in the expression.
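
As a rough, hypothetical illustration (the function name and body are made up for this post), the sketch below does a few constant-cost setup steps and then n iterations of roughly n + 5 steps each, so its total instruction count is on the order of 4 + n(n + 5), which we summarize as f(n²).

  // Hypothetical sketch: a few constant setup steps, then for each of the
  // n outer iterations about n inner steps plus a handful of extra work,
  // i.e. on the order of 4 + n(n + 5) instructions overall => f(n²).
  var pairwiseWork = function (list) {
    var n = list.length;
    var total = 0;
    var rowSum = 0;
    for (var i = 0; i < n; i++) {
      rowSum = 0;
      for (var j = 0; j < n; j++) {
        rowSum += list[i] * list[j];
      }
      total += rowSum;
    }
    return total;
  };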

Refer to the following for an example of f(n):

  var printListItem = function (list) {
    // One pass over the list: n iterations => f(n)
    for (var i = 0; i < list.length; i++) {
      console.log(list[i]);
    }
  };


Refer to the following for an example of f(n²):

  var combination = function (list1, list2) {
    // Nested iteration: every item of list1 paired with every item of list2 => f(n²)
    list1.forEach((item1) => {
      list2.forEach((item2) => {
        console.log(item1 + ' ' + item2);
      });
    });
  };

Refer to the following for an example of f(log(n)):

  var binarySearchRecursiveStyle = function (sortedList, item, start, end) {
    // Empty search range: the item is not present
    if (start > end) {
      return false;
    }

    var mid = Math.floor((start + end) / 2);

    if (sortedList[mid] === item) {
      return true;
    }

    if (sortedList[mid] > item) {
      // Item, if present, lies in the lower half
      return binarySearchRecursiveStyle(sortedList, item, start, mid - 1);
    }

    // Item, if present, lies in the upper half
    return binarySearchRecursiveStyle(sortedList, item, mid + 1, end);
  };
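
As a quick usage example, the initial call passes the first and last indexes of the (already sorted) list:

  var sorted = [1, 3, 5, 7, 9, 11];
  console.log(binarySearchRecursiveStyle(sorted, 7, 0, sorted.length - 1)); // true
  console.log(binarySearchRecursiveStyle(sorted, 4, 0, sorted.length - 1)); // false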

Refer to the following curve comparing the growth of f(n²), f(n), and f(log(n)).

[Figure: growth curves for f(n²), f(n), and f(log(n))]

Big Theta
This represents the actual (tight) performance of an algorithm: the running time is bounded both above and below by the same order of growth. Refer to the f(n), f(n²), and f(log(n)) examples above.

Big O
This represents the performance in the worst case. Take the following code as an example. It may be that we have to loop to the end of the list without finding the item, in which case the performance is f(n). This worst-case bound is what is referred to as Big O.
    

  var contains = function (list, item) {
    for (var i = 0; i < list.length; i++) {
      if (list[i] === item) {
        return true;
      }
    }
    return false;
  };


Big Omega
This represents the performance in the best case. In the same example above, the best case is finding the item at the very first position, in which case the performance is f(1). This best-case bound is referred to as Big Omega.

Amortized complexity
There are some algorithms where you have to do housekeeping at certain intervals. For example, List<T> in C#: you don't define its size, it grows dynamically. Once it reaches the limit of its internal array, it creates a new array with double the size of the original and copies each item from the original array to the new one. Because the expensive resize happens only occasionally, the cost averaged over many Add calls is still O(1); this averaged cost is the amortized complexity. A sketch of this doubling behaviour follows.
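
As a minimal sketch (in JavaScript rather than C#, with hypothetical names), a dynamic list that doubles its backing array could look like this:

  // Minimal sketch of a dynamic array that doubles its backing storage.
  // Names (DynamicList, add) are hypothetical; C#'s List<T> behaves similarly.
  function DynamicList() {
    this.capacity = 4;
    this.count = 0;
    this.items = new Array(this.capacity);
  }

  DynamicList.prototype.add = function (item) {
    if (this.count === this.capacity) {
      // Occasional O(n) resize: double the capacity and copy everything over.
      this.capacity = this.capacity * 2;
      var bigger = new Array(this.capacity);
      for (var i = 0; i < this.count; i++) {
        bigger[i] = this.items[i];
      }
      this.items = bigger;
    }
    // Usual O(1) path: write into the next free slot.
    this.items[this.count] = item;
    this.count++;
  };

Most calls hit the O(1) path; the doubling copy is rare enough that n add operations cost O(n) in total, i.e. O(1) amortized per call.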

List -
Add - O(1), with the amortized cost of resizing (and copying).
Remove - O(n), since removing the first element requires shifting all remaining elements.
Go To Index - O(1)
Find - O(n)

Linked List
Add/Remove/Merge - O(1)
Find - O(n)
Go To Index - O(n)
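
As a minimal sketch (hypothetical names, singly linked for brevity), adding at the head is O(1) while finding a value walks the list in O(n):

  // Minimal sketch of a singly linked list: O(1) add at head, O(n) find.
  function LinkedList() {
    this.head = null;
  }

  LinkedList.prototype.add = function (value) {
    // O(1): link a new node in front of the current head.
    this.head = { value: value, next: this.head };
  };

  LinkedList.prototype.find = function (value) {
    // O(n): walk node by node until the value is found or the list ends.
    var node = this.head;
    while (node !== null) {
      if (node.value === value) {
        return true;
      }
      node = node.next;
    }
    return false;
  };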

Dictionary
When we store an object in a dictionary, it calls the GetHashCode method on the object's key to calculate the hash. The hash is then reduced to the size of the internal array to calculate the index at which the object is stored. Later, when we look up an object by its key, GetHashCode is used again to calculate the hash and the index. The key should be unique (see the sketch below).
Add/Remove/Find - O(1)
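
As a minimal sketch (in JavaScript, with hypothetical names; a toy hash stands in for GetHashCode, and collision handling is omitted), the hash is reduced to a bucket index with a modulo:

  // Minimal sketch of hashing a key to a bucket index.
  var bucketCount = 16;
  var buckets = new Array(bucketCount);

  var hashCode = function (key) {
    // Toy string hash standing in for GetHashCode.
    var hash = 0;
    for (var i = 0; i < key.length; i++) {
      hash = (hash * 31 + key.charCodeAt(i)) | 0;
    }
    return hash;
  };

  var put = function (key, value) {
    // The hash is reduced to an index within the array.
    var index = Math.abs(hashCode(key)) % bucketCount;
    buckets[index] = { key: key, value: value };
  };

  var get = function (key) {
    var index = Math.abs(hashCode(key)) % bucketCount;
    var entry = buckets[index];
    return entry && entry.key === key ? entry.value : undefined;
  };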

HashSet
It represents a set of values. Every value must be unique, and uniqueness is determined by the value returned from GetHashCode.
Add/Remove/Contains - O(1)
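
As a quick illustration of set semantics (using JavaScript's built-in Set here rather than C#'s HashSet):

  var set = new Set();
  set.add(3);
  set.add(3);                 // duplicate; the set still holds a single 3
  console.log(set.size);      // 1
  console.log(set.has(3));    // true (Contains is O(1) on average)
  set.delete(3);
  console.log(set.has(3));    // false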

Stack
Last In First Out
Pop - O(1)
Contains - O(n)
Push - O(1), with the amortized cost of resizing (and copying).

Queue
First In First Out
Dequeue - O(1)
Contains - O(n)
Enqueue - O(1), with the amortized cost of resizing (and copying).