The Sorting Algorithm Visualizer is a dynamic, educational tool that brings the mechanics of different sorting algorithms to life. Its primary aim is to help users understand how various sorting techniques work, how they differ from one another, and how efficient they are in different scenarios.
- Clone this repository onto your machine: `git clone https://github.com/Dylan-Burns/Sorting-Algorithm-Visualizer.git`
- Change into the project's root directory: `cd Sorting-Algorithm-Visualizer`
- Install the node packages: `npm install`
- Start the application: `npm start`
These visualizations don't show the efficiency of the algorithms; they only show the movement of elements within the array. However, the array-access and comparison counters are indicators of each algorithm's time complexity. For more information about time complexity, take a look at the Big O table below.
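For a feel of how such counters work, here is a rough sketch (not the visualizer's actual instrumentation) that counts the comparisons made by JavaScript's built-in sort:

```js
// Illustrative sketch: counting comparisons during a sort hints at its
// time complexity. This is NOT the project's own instrumentation code.
function countingComparator() {
  let comparisons = 0;
  const cmp = (a, b) => { comparisons++; return a - b; };
  return { cmp, count: () => comparisons };
}

const { cmp, count } = countingComparator();
const n = 1000;
const data = Array.from({ length: n }, () => Math.random());
data.sort(cmp); // built-in sort makes roughly O(n log n) comparisons
console.log(`${count()} comparisons for n = ${n}`);
```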
A list of potential updates to improve the sorting visualizer:
- Add variant sorting algorithms
- Add new sorting visualizations
- Add Array Access counter
- Add Comparisons counter
- Increase efficiency allowing for larger Array sorting
- Add Audibilization to the sorting algorithms
Bubble sort is one of the most straightforward sorting algorithms; it makes multiple passes through a list (a minimal sketch follows the steps below).
- Starting with the first element, compare the current element with the next element of the array.
- If the current element is greater than the next element of the array, swap them.
- If the current element is less than the next element, move to the next element.
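As an illustration (not the visualizer's own implementation), a minimal bubble sort over a plain array of numbers might look like this:

```js
// Minimal bubble sort sketch: repeatedly swap adjacent out-of-order pairs.
function bubbleSort(arr) {
  const a = [...arr];                        // copy so the input stays untouched
  for (let i = 0; i < a.length - 1; i++) {
    // After pass i, the largest i + 1 elements are already in place at the end.
    for (let j = 0; j < a.length - 1 - i; j++) {
      if (a[j] > a[j + 1]) {
        [a[j], a[j + 1]] = [a[j + 1], a[j]]; // swap the adjacent pair
      }
    }
  }
  return a;
}

console.log(bubbleSort([5, 1, 4, 2, 8])); // [1, 2, 4, 5, 8]
```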
Like bubble sort, the insertion sort algorithm is straightforward to implement and understand (a sketch follows the steps below).
- Iterate from arr[1] to arr[n-1] over the array.
- Compare the current element (key) to its predecessor.
- If the key is smaller than its predecessor, compare it to the elements before that.
- Move the greater elements one position up to make space for the key.
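Under the same assumptions as above, a minimal insertion sort sketch might look like this:

```js
// Minimal insertion sort sketch: grow a sorted prefix one element at a time.
function insertionSort(arr) {
  const a = [...arr];
  for (let i = 1; i < a.length; i++) {
    const key = a[i];
    let j = i - 1;
    // Shift elements greater than the key one position to the right.
    while (j >= 0 && a[j] > key) {
      a[j + 1] = a[j];
      j--;
    }
    a[j + 1] = key; // drop the key into its correct slot
  }
  return a;
}

console.log(insertionSort([12, 11, 13, 5, 6])); // [5, 6, 11, 12, 13]
```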
Merge sort uses the divide-and-conquer approach to sort the elements, and it is one of the most popular and efficient sorting algorithms.
The list is divided into halves again and again until each sub-list contains a single element.
Pairs of one-element lists are then combined into two-element lists, sorting them in the process.
The sorted two-element lists are merged into four-element lists, and so on, until the whole list is sorted.
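A minimal recursive merge sort sketch in JavaScript (illustrative, not the project's implementation) might look like this:

```js
// Minimal merge sort sketch: split the array, sort each half, merge the results.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;          // a single element is already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  // Merge two sorted halves into one sorted array.
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}

console.log(mergeSort([38, 27, 43, 3, 9, 82, 10])); // [3, 9, 10, 27, 38, 43, 82]
```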
Like Merge Sort, Quick Sort is a divide-and-conquer algorithm. It picks an element as the pivot and partitions the given array around that pivot. There are many different versions of Quick Sort that pick the pivot in different ways:
- Always pick the first element as the pivot.
- Always pick the last element as the pivot.
- Pick a random element as the pivot.
- Pick the median as the pivot. (implemented below)
The key process in quick sort is partition(). Given an array and an element x of the array as the pivot, partition() puts x at its correct position in the sorted array, places all elements smaller than x before it, and places all elements greater than x after it.
All of this should be done in linear time.
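For illustration, here is a minimal quick sort sketch using the Lomuto partition scheme with the last element as the pivot; note that this is a simplification, not the median-pivot version the visualizer implements:

```js
// Minimal quick sort sketch (Lomuto partition, last-element pivot).
function quickSort(a, lo = 0, hi = a.length - 1) {
  if (lo >= hi) return a;
  const p = partition(a, lo, hi);
  quickSort(a, lo, p - 1);   // sort elements smaller than the pivot
  quickSort(a, p + 1, hi);   // sort elements greater than the pivot
  return a;
}

// One linear pass: put the pivot at its final index, smaller elements
// before it, greater elements after it.
function partition(a, lo, hi) {
  const pivot = a[hi];
  let i = lo;
  for (let j = lo; j < hi; j++) {
    if (a[j] < pivot) {
      [a[i], a[j]] = [a[j], a[i]];
      i++;
    }
  }
  [a[i], a[hi]] = [a[hi], a[i]]; // move the pivot into place
  return i;
}

console.log(quickSort([10, 80, 30, 90, 40, 50, 70])); // [10, 30, 40, 50, 70, 80, 90]
```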
Big O notation is the language we use for talking about how long an algorithm takes to run.
It's how we compare the efficiency of different approaches to a problem.
With Big O notation we express the runtime in terms of how quickly it grows relative to the input, as the input gets arbitrarily large.
Let's break that down (a small code illustration follows the list):
- How quickly the runtime grows: it's hard to pin down the exact runtime of an algorithm. It depends on the speed of the processor, what else the computer is running, etc. So instead of talking about the runtime directly, we use big O notation to talk about how quickly the runtime grows.
- Relative to the input: if we were measuring our runtime directly, we could express our speed in seconds. Since we're measuring how quickly our runtime grows, we need to express our speed in terms of... something else. With Big O notation, we use the size of the input, which we call "n". So we can say things like the runtime grows "on the order of the size of the input" (O(n)) or "on the order of the square of the size of the input" (O(n^2)).
- As the input gets arbitrarily large: our algorithm may have steps that seem expensive when "n" is small but are eclipsed eventually by other steps as "n" gets huge. For big O analysis, we care most about the stuff that grows fastest as the input grows, because everything else is quickly eclipsed as "n" gets very large.
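As a quick, hypothetical illustration (not code from this project), compare how the work done by each of these functions grows with the input size "n":

```js
// Illustrative only: three functions with different growth rates.

function getFirst(items) {
  return items[0]; // O(1): one step, no matter how large the input is
}

function printAll(items) {
  for (const item of items) console.log(item); // O(n): work grows linearly with n
}

function printAllPairs(items) {
  // O(n^2): for each of the n items, we loop over all n items again
  for (const a of items) {
    for (const b of items) console.log(a, b);
  }
}
```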
These are the asymptotic notations used to describe the time complexity of sorting algorithms:
- Big O Notation, O: measures the upper limit of an algorithm's running time, i.e. the worst-case time complexity. It is written O(n) for input size "n".
- Omega Notation, Ω: measures the minimum time taken by an algorithm to execute, i.e. the best-case time complexity. It is written Ω(n) for input size "n".
- Theta Notation, θ: measures the average time taken by an algorithm to execute; it bounds the running time from both below and above. It is written θ(n) for input size "n".
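Formally, for a running time T(n), a comparison function g(n), and positive constants c and n₀, these notations can be stated as:

$$
\begin{aligned}
T(n) = O(g(n)) &\iff \exists\, c > 0,\ n_0 > 0 : T(n) \le c \cdot g(n) \text{ for all } n \ge n_0,\\
T(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 > 0 : T(n) \ge c \cdot g(n) \text{ for all } n \ge n_0,\\
T(n) = \Theta(g(n)) &\iff T(n) = O(g(n)) \text{ and } T(n) = \Omega(g(n)).
\end{aligned}
$$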
In applications such as radix sort, a bound on the maximum key value "k" will be known in advance and can be assumed to be part of the input to the algorithm.
However, if the value of "k" is not already known, it may be computed as a first step by an additional loop over the data to determine the maximum key value.
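As a sketch (assuming non-negative integer keys; not code from this project), a counting sort can find "k" with one extra pass before tallying:

```js
// Minimal counting sort sketch: when the max key k isn't known up front,
// find it with one additional O(n) pass over the data.
function countingSort(arr) {
  if (arr.length === 0) return [];
  const k = Math.max(...arr);               // extra pass to determine max key
  const counts = new Array(k + 1).fill(0);
  for (const x of arr) counts[x]++;         // tally each key value
  const out = [];
  for (let v = 0; v <= k; v++) {
    for (let c = 0; c < counts[v]; c++) out.push(v);
  }
  return out;
}

console.log(countingSort([4, 2, 2, 8, 3, 3, 1])); // [1, 2, 2, 3, 3, 4, 8]
```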
Algorithm | Time Complexity (Best) | Time Complexity (Average) | Time Complexity (Worst) | Space Complexity (Worst)
---|---|---|---|---
Quick Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(log(n))
Merge Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(n)
Tim Sort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n)
Heap Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(1)
Bubble Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1)
Insertion Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1)
Selection Sort | Ω(n^2) | Θ(n^2) | O(n^2) | O(1)
Tree Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(n)
Shell Sort | Ω(n log(n)) | Θ(n(log(n))^2) | O(n(log(n))^2) | O(1)
Bucket Sort | Ω(n+k) | Θ(n+k) | O(n^2) | O(n)
Radix Sort | Ω(nk) | Θ(nk) | O(nk) | O(n+k)
Counting Sort | Ω(n+k) | Θ(n+k) | O(n+k) | O(k)
Cube Sort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n)