Big O notation is central to the analysis of algorithms, and we need a solid understanding of it and how to use it going forward. We will discuss big O notation throughout this section.
Our algorithm for finding the books and placing them works on a list of n items. The first book search compares, in the worst case, all n books. If we denote the time complexity by T, then for the first book the time complexity will be:
T(1) = n
Since we remove the found book from the list, the list now holds n - 1 books. The second book search therefore compares, in the worst case, n - 1 books, so its time complexity is n - 1. Combining both time complexities, for the first two books it will be:
T(2) = n + (n - 1)
If we continue like this, after n - 1 steps the last book search has only one book left to compare. So, the total complexity will look like this:
T(n) = n + (n - 1) + (n - 2) + ... + 3 + 2 + 1
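The comparison count above can be sketched in Python. The helper name and the numbered "books" are illustrative stand-ins, not from the original text; the point is only that removing each found book shrinks the next worst-case search by one:

```python
def worst_case_comparisons(n):
    """Search for each of n books in turn, removing each one after it
    is found, and count the worst-case comparisons made overall."""
    shelf = list(range(n))  # n books on the shelf
    total = 0
    for _ in range(n):
        # Worst case: the wanted book is the last one we check,
        # so we compare against every book still on the shelf.
        total += len(shelf)
        shelf.pop()         # remove the found book from the list
    return total

# worst_case_comparisons(5) counts 5 + 4 + 3 + 2 + 1 = 15 comparisons
```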
Now, if we look at the preceding series, doesn't it look familiar? It is the well-known sum of the first n natural numbers:

1 + 2 + ... + n = n(n + 1)/2

So we can write:
T(n) = n(n + 1)/2
Or:
T(n) = n^2/2 + n/2
For asymptotic analysis, we ignore lower-order terms and constant multipliers. Since we have an n^2 term, we can drop the lower-order n term, and the constant multiplier 1/2 can also be dropped. Now we can express the time complexity with big O notation as order n squared:
T(n) = O(n^2)
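As a quick sanity check on the derivation, we can compare the summed series against the closed-form n(n + 1)/2 (a small Python sketch, not part of the original text):

```python
n = 100
series_sum = sum(range(1, n + 1))  # n + (n - 1) + ... + 2 + 1
closed_form = n * (n + 1) // 2     # the formula derived above
print(series_sum, closed_form)     # both print 5050 for n = 100
```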
Throughout the book, we will use this big O notation to describe the complexity of algorithms and operations. Here are some common big O notations:
| Type | Notation |
| --- | --- |
| Constant | O(1) |
| Linear | O(n) |
| Logarithmic | O(log n) |
| n log n | O(n log n) |
| Quadratic | O(n^2) |
| Cubic | O(n^3) |
| Exponential | O(2^n) |
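To get a feel for how differently these classes grow, the following Python sketch (illustrative values of n, not from the original text) prints the cost of each function as n increases:

```python
import math

# Growth of the common complexity classes for a few values of n.
print(f"{'n':>6} {'O(1)':>6} {'O(log n)':>9} {'O(n)':>7} {'O(n log n)':>11} {'O(n^2)':>9}")
for n in (8, 64, 1024):
    log_n = math.log2(n)
    print(f"{n:>6} {1:>6} {log_n:>9.0f} {n:>7} {n * log_n:>11.0f} {n ** 2:>9}")
```

Even at n = 1024 the gap is striking: O(log n) costs 10 steps while O(n^2) costs over a million, which is why the asymptotic class matters far more than constant factors.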