Complexity analysis: asymptotic notation
Asymptotic notation is used to describe how the (time) complexity of an algorithm grows when the size of the problem becomes asymptotically large.
Say we need to find the number of characters in a string. One way to implement a solution is to count the characters until we have reached the end of the string. An implementation in C could be:
#include <stdio.h>

int main()
{
    char message[] = "hello";
    int i = 0;
    for (;; i++) // notice the empty clauses: we have no loop
                 // initialization or condition check in this for loop;
                 // the loop is exited with a `break;` statement
    {
        if (!message[i]) // the string is stored as a char array with the
                         // end denoted by `\0`; we keep checking each
                         // character until we have reached `\0`
        {
            printf("Length = %d\n", i);
            break;
        }
    }

    // we can implement the same thing using a `while` loop as well
    i = 0;
    while (message[i])
    {
        i++;
    }
    printf("Length (while loop implementation) = %d\n", i);
    return 0;
}
The above algorithm is linear in time. The complexity of the problem grows linearly with the size of the string. We denote it by $O(N)$.
We could have another implementation, where we store the string and its length together in an object (a struct). Here we are not interested in how we determined the length of the string in the first place.
#include <stdio.h>
#include <string.h> // strlen

struct message_struct
{
    char text[10]; // can hold a maximum of 9 characters, plus the null `\0`
                   // character to signify the end of the string
    int length;
};

int main()
{
    struct message_struct message = {"hello", 5};
    printf("Message: %s\nLength = %d\n", message.text, message.length);
    // strlen returns size_t, so use the `%zu` format specifier
    printf("Length using built-in method = %zu\n", strlen(message.text));
    return 0;
}
In the above implementation, no matter how big the string is, we can always get its length in constant time, i.e., $O(1)$.
Let's consider another example: looking up a contact in a phone book. We have contacts sorted alphabetically, and we have to find the phone number of a person. One way could be to start from the beginning and check each entry until we have found the person, i.e., a linear search algorithm. That would be an $O(N)$ algorithm.
We could have another algorithm, where we first look at the middle entry of the (alphabetically sorted) list; if the name matches, we stop there. Otherwise, if the name we are looking for comes alphabetically before the middle entry, we repeat the lookup on the first half of the list (or on the second half if it comes after), and continue this process until we have found the person's name. This algorithm has $O(\log N)$ complexity.
To be precise, here the logarithm has base 2. However, all logarithmic functions with base greater than 1 are asymptotically equivalent:
$\log_a N = \log_b N / \log_b a$, as $\log_b a$ is a constant.
In the best-case scenario, we can find the person in the first step, if the name happens to be exactly in the middle of the list. We denote the best-case scenario by $\Omega(1)$.
In cases where the best case and the worst case have the same complexity, we denote it by $\Theta$, a tight asymptotic bound. For example, in our second implementation of the string length example above, where we store the length as a data member, the best-case and worst-case scenarios both have the same complexity, $\Theta(1)$.
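A minimal C sketch of this halving lookup, assuming the contacts are stored as a sorted array of name strings (the names, array size, and the `find_contact` helper here are illustrative):

#include <stdio.h>
#include <string.h>

/* Returns the index of `target` in the sorted array `contacts`,
   or -1 if it is not present. */
int find_contact(const char *contacts[], int n, const char *target)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2; /* middle of the current range */
        int cmp = strcmp(target, contacts[mid]);
        if (cmp == 0)
            return mid;     /* the name matches, stop here */
        else if (cmp < 0)
            high = mid - 1; /* target sorts before the middle entry:
                               keep searching the first half */
        else
            low = mid + 1;  /* target sorts after the middle entry:
                               keep searching the second half */
    }
    return -1;
}

int main()
{
    const char *contacts[] = {"Alice", "Bob", "Carol", "Dave", "Eve"};
    printf("Dave is at index %d\n", find_contact(contacts, 5, "Dave"));
    return 0;
}

Each iteration halves the remaining range, so at most about $\log_2 N$ comparisons are needed.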
Nested loops:
for (int i = 0; i < N; i++)
{
    for (int j = 0; j < M; j++)
    {
        arr[i][j] = i * M + j; // fill an N-by-M array; the body runs N * M times
    }
}
In such cases, the complexity grows as $O(N \cdot M)$. If $N$ and $M$ are of the same order, we can say the complexity is $O(N^2)$.
Mathematical definition
$f(N)$ has complexity $O(g(N))$ if
$f(N) \le c \cdot g(N)$ for all $N \ge N_0$; $c$ and $N_0$ are constants.
$f(N)$ has complexity lower bound $\Omega(g(N))$ if $f(N) \ge c \cdot g(N)$ for all $N \ge N_0$.
$f(N)$ has complexity tight bound $\Theta(g(N))$ if and only if
(1) $f(N)$ is $O(g(N))$, and
(2) $f(N)$ is $\Omega(g(N))$.
Mathematically, if $f(N)$ is $O(N)$, it is also $O(N^2)$ or $O(N^3)$. However, we are only interested in the tightest possible upper bound.
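As a quick check of the definition (the function and constants here are chosen purely for illustration): $f(N) = 3N + 5$ is $O(N)$, since picking $c = 4$ and $N_0 = 5$ gives $3N + 5 \le 3N + N = 4N = c \cdot N$ for all $N \ge N_0$.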
Complexity of recursive functions
Recursion can be a very useful and elegant way to solve certain types of problems. Using recursion, the problem is reduced or divided into smaller parts, and once we reach the base case, the problem has constant time complexity.
In the case of Fibonacci series calculation, a minimal sketch of the usual naive recursive implementation in C++ could be:
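#include <iostream>

// Naive recursive Fibonacci: for n > 1, each call spawns two further
// recursive calls, so the call tree grows exponentially with n.
long long fib(int n)
{
    if (n <= 1) // base case: constant time
        return n;
    return fib(n - 1) + fib(n - 2);
}

int main()
{
    std::cout << "fib(10) = " << fib(10) << std::endl;
    return 0;
}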
The running time follows the recurrence $T(n) = T(n-1) + T(n-2) + c$, where $c$ is constant, for $n > 1$.
In this program, the number of function calls grows exponentially with $n$, giving complexity $O(2^n)$.
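To see this growth concretely, here is a small variant that counts the calls (the `calls` counter is added purely for illustration):

#include <iostream>

static long long calls = 0; // incremented on every invocation of fib()

long long fib(int n)
{
    ++calls;
    if (n <= 1)
        return n;
    return fib(n - 1) + fib(n - 2);
}

int main()
{
    for (int n = 10; n <= 30; n += 10)
    {
        calls = 0;
        fib(n);
        std::cout << "fib(" << n << ") made " << calls << " calls\n";
    }
    return 0;
}

Running it shows the call count exploding as $n$ grows.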
Master theorem
Let $a \ge 1$ and $b > 1$ be constants, let $f(n)$ be a function, and let $T(n)$ be a function over the positive numbers defined by the recurrence:
$T(n) = a\,T(n/b) + f(n)$.
If $f(n) = \Theta(n^d)$, where $d \ge 0$, then
$T(n) = \Theta(n^d)$ if $a < b^d$,
$T(n) = \Theta(n^d \log n)$ if $a = b^d$,
$T(n) = \Theta(n^{\log_b a})$ if $a > b^d$.
Example: The complexity of the binary search problem is $O(\log n)$, and the complexity of merge sort is $O(n \log n)$.
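As a quick check with the theorem: binary search satisfies $T(n) = T(n/2) + \Theta(1)$, so $a = 1$, $b = 2$, $d = 0$ and $a = b^d$, giving $T(n) = \Theta(\log n)$. Merge sort satisfies $T(n) = 2\,T(n/2) + \Theta(n)$, so $a = 2$, $b = 2$, $d = 1$ and $a = b^d$, giving $T(n) = \Theta(n \log n)$.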