It is a well-established fact that merge sort runs faster than insertion sort on large inputs. Using asymptotic analysis we can show that merge sort runs in O(n log n) time while insertion sort takes O(n^2). This follows from their designs: merge sort uses a divide-and-conquer approach, recursively splitting the problem, whereas insertion sort follows an incremental approach.
If we scrutinize the time complexity analysis further, we find that insertion sort is not so bad after all. Surprisingly, insertion sort beats merge sort on small input sizes. The reason is the constant factors that we ignore while deriving the time complexity. On larger input sizes, of the order of 10^4 and above, these constants barely influence the behavior of the function. But when the input size falls below, say, 40, the constants in the equation dominate the input size 'n'.
So far, so good. But I wasn’t satisfied with the mathematical analysis alone. As computer science undergrads, we must believe in writing code. I wrote a C program to get a feel for how the algorithms compete against each other for various input sizes, and to see why such rigorous mathematical analysis is done to establish the running times of these sorting algorithms.
I have compared the running times of the following algorithms:
- Insertion sort: The traditional algorithm with no modifications or optimisations. It performs very well for small input sizes. And yes, it does beat merge sort there.
- Merge sort: Follows the divide-and-conquer approach. For input sizes of the order of 10^5 this algorithm is the right choice; it renders insertion sort impractical for such large inputs.
- Combined version of insertion sort and merge sort: I have tweaked the logic of merge sort a little to achieve considerably better running times for small inputs. As we know, merge sort keeps splitting its input into two halves until the pieces are trivial to sort. Here, when the subarray size falls below a threshold such as n < 40, the hybrid algorithm calls the traditional insertion sort procedure instead of recursing further. Since insertion sort runs faster on small inputs and merge sort runs faster on large ones, this algorithm makes the best of both worlds.
- Quick sort: I have not implemented this one myself; it is the library function qsort() declared in <stdlib.h>. I included it in order to show the significance of implementation quality. It takes a great deal of programming expertise to minimize the number of steps and make the most of the underlying language primitives when implementing an algorithm. This is the main reason why library functions are recommended: they are written to handle anything and everything, and they are optimized to the maximum extent possible. And before I forget, from my analysis qsort() runs blazingly fast on virtually any input size!
- Input: The user supplies the number of test cases, i.e. how many times he/she wants to test the algorithms. For each test case the user enters two space-separated integers: the input size 'n' and 'num_of_times', the number of runs to average over. (Clarification: if 'num_of_times' is 10, each of the algorithms above runs 10 times and the average is taken. This is done because the input array is generated randomly for the given size; it could happen to be already sorted, or it could be the worst case, i.e. descending order. To keep such inputs from skewing the measurement, each algorithm is run 'num_of_times' times and the average is taken.)
The clock() routine and the CLOCKS_PER_SEC macro from <time.h> are used to measure the time taken.
Compilation: I wrote the code above in a Linux environment (Ubuntu 16.04 LTS). Copy the code snippet, compile it using gcc, key in the inputs as specified, and admire the power of sorting algorithms!
- Results: As you can see, for small input sizes insertion sort beats merge sort by about 2 * 10^-6 sec, though a difference this small is not significant. The hybrid algorithm and the qsort() library function both perform as well as insertion sort.
The input size is now increased roughly 30-fold, from n = 30 to n = 1000. The difference is now tangible: merge sort runs 10 times faster than insertion sort. There is again a tie between the hybrid algorithm and the qsort() routine, which suggests that qsort() is implemented in a way more or less similar to our hybrid algorithm, i.e., switching between algorithms to get the best out of each.
Finally, the input size is increased to 10^5 (one lakh!), which is close to the sizes seen in practical scenarios. Compared to n = 1000, where merge sort beat insertion sort by a factor of 10, the gap is now even wider: merge sort beats insertion sort by a factor of 100!
The hybrid algorithm we have written does in fact outperform traditional merge sort, running 0.01 sec faster. And lastly, qsort(), the library function, proves that implementation also plays a crucial role when measuring running times meticulously, by running 3 milliseconds faster still! 😀
This article is contributed by Aditya Ch.