Here are some of the most popular sorting algorithms. We use them to arrange data in ascending or descending order. Usually we need to sort arrays, so all examples and explanations assume that we are sorting an array.
The first three algorithms below are simple and relatively easy to implement, which makes them good for learning purposes. Some of them are also used to complement the more sophisticated sorters from the O(n*log n) group.
Bubble sort
Bubble sort is a stable comparison sorting algorithm. It has an average complexity of O(n²), where n is the number of elements to be sorted, and a best case of O(n) when the input array is already sorted or nearly sorted. It is relatively easy to implement, but too slow to be used in practice.
Selection sort
Selection sort is an unstable sorter. Its best, average and worst complexities are all quadratic, so it is too slow for sorting big arrays. Selection sort has one major advantage: it makes only n swaps, where n is the number of elements to be sorted. This makes it useful when a swap is a very expensive operation (for instance, when the elements are large objects with small keys, or when writing to flash memory).
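As an illustration, here is a minimal Python sketch of the two quadratic sorters above; the function names and the in-place, ascending-order convention are choices made for this example, not part of any standard library.

    def bubble_sort(a):
        """In-place bubble sort; stops early if a pass makes no swaps (O(n) best case)."""
        n = len(a)
        for i in range(n - 1):
            swapped = False
            for j in range(n - 1 - i):
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swapped = True
            if not swapped:          # already sorted: best case O(n)
                break

    def selection_sort(a):
        """In-place selection sort; performs at most n - 1 swaps in total."""
        n = len(a)
        for i in range(n - 1):
            smallest = i
            for j in range(i + 1, n):
                if a[j] < a[smallest]:
                    smallest = j
            if smallest != i:        # at most one swap per position
                a[i], a[smallest] = a[smallest], a[i]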
Insertion sort
Insertion sort is one of the most efficient of the O(n²) sorters. Its worst and average time complexity is O(n²); its best case is O(n), when the input is already sorted. It is stable and adaptive, and it is very efficient for sorting small data sets. For that reason it is used in practice to complement advanced sorters like quicksort and shell sort.
Merge sort
Merge sort uses the "divide and conquer" technique. It is very efficient: it runs in O(n*log n) and makes a low number of comparisons. One disadvantage is the amount of extra space it requires. Merge sort is normally stable, meaning that it preserves the order of equal elements. It has various implementations and applications in practice.
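Below is a short Python sketch of both algorithms; the top-down merge sort returns a new list (the O(n) extra space mentioned above), and the names are again just illustrative.

    def insertion_sort(a):
        """In-place insertion sort; nearly sorted input needs few shifts (adaptive)."""
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]      # shift larger elements one position to the right
                j -= 1
            a[j + 1] = key

    def merge_sort(a):
        """Returns a new sorted list; uses O(n) extra space for the merge step."""
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])
        right = merge_sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:  # "<=" keeps equal elements in order (stable)
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged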
Heap sort
Heap sort is a very fast sorting algorithm. It runs in O(n*log n) time and requires only constant additional space, O(1). It is a kind of evolved selection sort that uses a complete binary tree structure (a heap), which reduces the time needed to find the maximum element and thus makes the routine very efficient. Heap sort is unstable, because rebuilding the heap can change the order of elements with equal keys. One downside is that it is not adaptive to nearly sorted input.
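A possible in-place Python sketch follows: it builds a max-heap inside the array, then repeatedly swaps the maximum to the end and restores the heap. The function and variable names are assumptions made for this example.

    def heap_sort(a):
        """In-place heap sort: build a max-heap, then repeatedly move the max to the end."""
        n = len(a)

        def sift_down(start, end):
            # Push a[start] down until the subtree rooted at start is a max-heap again.
            root = start
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                if child + 1 <= end and a[child] < a[child + 1]:
                    child += 1       # pick the larger of the two children
                if a[root] < a[child]:
                    a[root], a[child] = a[child], a[root]
                    root = child
                else:
                    return

        for start in range(n // 2 - 1, -1, -1):   # build the max-heap
            sift_down(start, n - 1)
        for end in range(n - 1, 0, -1):           # extract the max n - 1 times
            a[0], a[end] = a[end], a[0]
            sift_down(0, end - 1)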
Quicksort
Quicksort is probably the most used sorting algorithm today. The idea is to exchange elements over large distances: the algorithm chooses a pivot element and puts all smaller elements before the pivot and all bigger ones after it, then it is called recursively for the two parts.
At first glance quicksort seems to be of quadratic complexity, and indeed its worst case is O(n²), but that case is rare. Although its average time complexity is harder to prove, it is O(n*log n), and the constant factor is about two times better than those of heap sort and merge sort.
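To make the pivot-and-partition idea concrete, here is a compact in-place Python sketch using a Hoare-style partition around the middle element; the middle-element pivot is just one common choice, not the only one.

    def quicksort(a, lo=0, hi=None):
        """In-place quicksort: partition around a pivot, then recurse on the two parts."""
        if hi is None:
            hi = len(a) - 1
        if lo >= hi:
            return
        pivot = a[(lo + hi) // 2]
        i, j = lo, hi
        while i <= j:                # move smaller elements left, bigger ones right
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1
                j -= 1
        quicksort(a, lo, j)          # left part: elements <= pivot
        quicksort(a, i, hi)          # right part: elements >= pivot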