Quick Sort Algorithm is used to sort a given list or array into ascending or descending order depending on user preference. This sorting algorithm is very efficient and easy to implement.
Just like merge sort, this sorting algorithm is based on the principle of divide and conquer.
FOR EXAMPLE:
INPUT Array: 38 27 43 3 9 82 10
OUTPUT Array: 3 9 10 27 38 43 82
In the following article, we will learn what the Quick Sort Algorithm is and how to implement it using Python. We will first see how the sorting algorithm works, then study its algorithm and pseudocode, and lastly look at its actual implementation in Python.
Table of Contents:
- How does the quicksort algorithm work?
- Algorithm of QuickSort
- Python code for the quicksort algorithm
- The time complexity of the quick sort algorithm
- The space complexity of the quick sort algorithm
- Is quick sort an in-place sorting algorithm?
- Does quicksort show a stable sorting algorithm behavior?
- Is quick sort an adaptive sorting algorithm?
- Properties
- Applications of the quick sort
Working of Quick Sort
Quick Sort is based on the principle of divide and conquer. This algorithm divides the given list or array into two parts and then solves each part. The partitioning of the list is done using a pivot element. A pivot element is an element of the list itself.
There are many ways to decide on a pivot element:
- The very first element of the Array is chosen as the pivot.
- Choosing the last element as pivot.
- Picking middle element as pivot.
- Randomly choosing any element as a pivot.
The array is divided such that all the elements to the right of the pivot element are greater than or equal to the pivot and the elements to the left are smaller than the pivot. We implement this sorting algorithm using recursive calls. Using recursion is far simpler and more convenient than doing it by iteration. A minimal sketch of this idea follows.
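Before looking at the in-place version used later in this article, here is a minimal sketch (the helper name simple_quick_sort is ours, not part of the final implementation) that shows only the divide step: pick the first element as the pivot and split the rest into a smaller and a greater list.

# A simplified quicksort sketch: picks the first element as the pivot
# and builds the two partitions as new lists. This only illustrates the
# divide-and-conquer idea; the in-place version used in this article
# appears in the "Python code" section below.
def simple_quick_sort(items):
    if len(items) <= 1:          # a list of 0 or 1 elements is already sorted
        return items
    pivot = items[0]             # pivot choice: first element
    smaller = [x for x in items[1:] if x < pivot]    # left part: smaller than pivot
    greater = [x for x in items[1:] if x >= pivot]   # right part: greater than or equal to pivot
    return simple_quick_sort(smaller) + [pivot] + simple_quick_sort(greater)

print(simple_quick_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]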
To understand this further, let's take an example:
Input: 7, 8, 5, 4, 6, 9
Step 1: Take 7 as the pivot element and divide the array into two subarrays: the elements smaller than 7 (5, 4, 6) go to the left and the elements greater than 7 (8, 9) go to the right.
Step 2: Partition the subarrays further by taking 4 as the pivot for the left subarray and 9 as the pivot for the right subarray.
Step 3: Once again the partition takes place, this time using 6 as the pivot element for the remaining subarray (5, 6).
Step 4: After the sorting is done the final output will be:
Output: 4, 5, 6, 7, 8, 9
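To make the recursion visible, the simplified sketch from above can be extended to print the pivot and the two partitions at every call. The helper name traced_quick_sort is ours; it always picks the first element as the pivot, so the intermediate splits can differ slightly from the hand-worked steps above, but the final output is the same.

# A traced version of the divide step (ours, for illustration): it
# prints the pivot and the two partitions at every call.
def traced_quick_sort(items, depth=0):
    if len(items) <= 1:
        return items
    pivot = items[0]
    left = [x for x in items[1:] if x < pivot]
    right = [x for x in items[1:] if x >= pivot]
    print("  " * depth + f"pivot {pivot}, left {left}, right {right}")
    return traced_quick_sort(left, depth + 1) + [pivot] + traced_quick_sort(right, depth + 1)

print(traced_quick_sort([7, 8, 5, 4, 6, 9]))
# pivot 7, left [5, 4, 6], right [8, 9]
#   pivot 5, left [4], right [6]
#   pivot 8, left [], right [9]
# [4, 5, 6, 7, 8, 9]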
Algorithm of quick sort
Step 1: Choose a pivot element.
Step 2: Partition the given list/array around the pivot element.
Step 3: Recursively apply the same steps to the smaller subarrays produced by the partition.
We will use two functions to implement this sorting algorithm, namely partition() and quickSort(). The quickSort() function will first call the partition() function on the given array and then call itself recursively on the two parts of the array produced by the partition.
Python code
# CODE BY COPYASSIGNMENT

# The partition function takes the last element as the pivot, places
# the pivot element at its correct position in the sorted array, and
# places all smaller elements (smaller than the pivot) to the left of
# the pivot and all greater elements to the right of the pivot
def partition(arr, low, high):
    i = low - 1          # index of smaller element
    pivot = arr[high]    # pivot

    for j in range(low, high):
        # If the current element is smaller than the pivot
        if arr[j] < pivot:
            # increment index of smaller element
            i = i + 1
            arr[i], arr[j] = arr[j], arr[i]

    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

# Quick sort function
def quickSort(arr, low, high):
    if low < high:
        # pi is the partitioning index, arr[pi] is now at the right place
        pi = partition(arr, low, high)

        # Separately sort elements before the partition and after the partition
        quickSort(arr, low, pi - 1)
        quickSort(arr, pi + 1, high)

# Driver code
arr = [10, 7, 8, 9, 1, 5]
n = len(arr)
quickSort(arr, 0, n - 1)
print("Sorted array is:")
for i in range(n):
    print("%d" % arr[i])
Output:
Sorted array is:
1
5
7
8
9
10
Time Complexity of Quick Sort Algorithm
Time Complexity: In programming, time complexity is defined as the total time taken by a program to execute and completely run all its functions. When it comes to calculating the time complexity of quick sort, we consider three main cases:
- Worst case: The worst-case scenario occurs when one of the partitioned arrays is of size n-1. This only happens when the pivot element is the largest or the smallest element of the list. In this case, the time complexity is O(n^2).
- Best case: The best-case scenario occurs when the partitions are equally balanced or differ in size by just one. The first case occurs when the pivot element lands right in the middle of the list, so each partition has (n-1)/2 elements. The second case occurs when one partition has one element more than the other. In either case, each partition has at most n/2 elements, so the best-case time complexity of the quick sort algorithm comes out to be O(n log n).
- Average case: The average time complexity of the quick sort algorithm is also O(n log n). A short sketch that makes these cases visible follows this list.
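As a rough illustration of the worst case versus the typical case, the sketch below instruments the same last-element-pivot quicksort to count comparisons (the helper name count_comparisons is ours, not part of the article's implementation). On an already sorted input the last element is always the largest, so every partition is maximally unbalanced and the count grows roughly like n^2/2, while a shuffled input stays close to n log n.

import random

# Quicksort with a comparison counter, used only to illustrate the
# O(n^2) worst case (already sorted input, last element as pivot)
# versus the O(n log n) typical case (shuffled input).
def count_comparisons(arr):
    comparisons = 0

    def partition(low, high):
        nonlocal comparisons
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            comparisons += 1          # one comparison per loop iteration
            if arr[j] < pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        return i + 1

    def quick_sort(low, high):
        if low < high:
            pi = partition(low, high)
            quick_sort(low, pi - 1)
            quick_sort(pi + 1, high)

    quick_sort(0, len(arr) - 1)
    return comparisons

n = 500
print("sorted input:  ", count_comparisons(list(range(n))), "comparisons")   # n*(n-1)/2 = 124750
shuffled = list(range(n))
random.shuffle(shuffled)
print("shuffled input:", count_comparisons(shuffled), "comparisons")         # on the order of n log n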
Space complexity of Quick Sort Algorithm
Space Complexity: In programming, space complexity is defined as the memory space occupied by a program to run and execute all its functions. Similar to the time complexity, there are three cases to consider while evaluating the space complexity:
- Worst case: The worst-case space complexity of the quicksort algorithm is O(n), because the recursion can go n levels deep when every partition is maximally unbalanced.
- Best case: The best-case space complexity of the quicksort algorithm is O(log n).
- Average case: Similar to the best case, the average-case space complexity of the quicksort algorithm is also O(log n). The sketch below shows where this space actually goes.
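Quicksort's extra space is the recursion stack, so one way to see these bounds is to track how deep the recursion actually goes. The sketch below (the helper name max_recursion_depth is ours, purely for illustration) reports the deepest nesting reached for a sorted input versus a shuffled one, assuming the same last-element pivot.

import random

# Measure how deep the quicksort recursion goes: the recursion stack is
# the extra space the algorithm uses, so this depth illustrates the
# O(n) worst case and the O(log n) typical case.
def max_recursion_depth(arr):
    deepest = 0

    def quick_sort(low, high, depth):
        nonlocal deepest
        deepest = max(deepest, depth)
        if low < high:
            pivot = arr[high]
            i = low - 1
            for j in range(low, high):
                if arr[j] < pivot:
                    i += 1
                    arr[i], arr[j] = arr[j], arr[i]
            arr[i + 1], arr[high] = arr[high], arr[i + 1]
            pi = i + 1
            quick_sort(low, pi - 1, depth + 1)
            quick_sort(pi + 1, high, depth + 1)

    quick_sort(0, len(arr) - 1, 1)
    return deepest

n = 500
print("sorted input depth:  ", max_recursion_depth(list(range(n))))   # close to n
shuffled = list(range(n))
random.shuffle(shuffled)
print("shuffled input depth:", max_recursion_depth(shuffled))         # much smaller, on the order of log n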
In-place sorting algorithm?
Yes, quick sort is an in-place sorting algorithm, as it does not require any other data structure such as a second array to perform its operations. Furthermore, it does not create copies of its subarrays; all the swapping happens within the original array, and the only extra memory used is the recursion stack.
Stable sorting algorithm?
No, quick sort is an unstable sorting algorithm. A stable sorting algorithm is one in which equal elements appear in the same relative order in the sorted array as they did in the initial unsorted array. The quick sort algorithm does not guarantee that this relative order is maintained, because partitioning can swap an element past an equal element. A small demonstration follows.
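As a small demonstration (these records and the helper quick_sort_by_key are ours, not part of the article's code), sort pairs of (key, label) by key only and watch the labels of equal keys change order:

# A small demonstration of instability: each record is a (key, label)
# pair, and we partition on the key only. The labels show the original
# order of equal keys.
def quick_sort_by_key(arr, low, high):
    if low < high:
        pivot_key = arr[high][0]          # compare keys only, last element as pivot
        i = low - 1
        for j in range(low, high):
            if arr[j][0] < pivot_key:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        pi = i + 1
        quick_sort_by_key(arr, low, pi - 1)
        quick_sort_by_key(arr, pi + 1, high)

records = [(2, 'a'), (1, 'x'), (2, 'b'), (1, 'y')]
quick_sort_by_key(records, 0, len(records) - 1)
print(records)   # [(1, 'y'), (1, 'x'), (2, 'a'), (2, 'b')] - 'y' now comes before 'x'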
Adaptive sorting algorithm?
The running time of quick sort does depend on the initial input sequence, but not in a helpful way: unlike insertion sort and bubble sort, it does not finish faster on an input that is already (or nearly) sorted. With a first- or last-element pivot, a presorted input is in fact the worst case, so quick sort is generally not considered an adaptive sorting algorithm.
Properties
- It uses the principle of divide and conquer.
- It does not require any extra space to perform its operations, apart from the recursion stack.
- The second recursive call is a tail call, so there is room for optimization (see the sketch after this list).
- It is a cache-friendly sorting algorithm, since it works on contiguous parts of the array.
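One common way to exploit the tail-call property, shown below as a sketch rather than the article's implementation, is to recurse only into the smaller partition and turn the tail call on the larger partition into a loop; the helper name quick_sort_optimized is ours. This keeps the recursion depth at O(log n) even in the worst case.

# Tail-call style quicksort: recurse on the smaller partition and loop
# on the larger one, so the recursion depth stays O(log n) even when
# the partitions are badly unbalanced.
def quick_sort_optimized(arr, low, high):
    while low < high:
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] < pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        pi = i + 1

        if pi - low < high - pi:
            quick_sort_optimized(arr, low, pi - 1)   # smaller side: recurse
            low = pi + 1                             # larger side: handled by the loop
        else:
            quick_sort_optimized(arr, pi + 1, high)  # smaller side: recurse
            high = pi - 1                            # larger side: handled by the loop

data = [10, 7, 8, 9, 1, 5]
quick_sort_optimized(data, 0, len(data) - 1)
print(data)   # [1, 5, 7, 8, 9, 10]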
Applications of quick sort
- Whenever users don't need a stable sorting algorithm, they prefer quicksort.
- The divide and conquer approach enables the use of parallelization.
- Its partitioning step is also used for selection problems, such as finding the k largest or smallest elements of a list (a sketch of this idea follows).
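As an illustration of that last application (not code from the article), the sketch below uses the same partition idea to find the k-th smallest element without fully sorting the list; the helper name quick_select is ours. Once the k-th smallest element is in place, everything at or before that position is the set of k smallest elements.

# Quickselect: repeatedly partition, but only descend into the side
# that contains the k-th smallest element (index k-1 after sorting).
# Assumes 1 <= k <= len(arr); the list is partially reordered in place.
def quick_select(arr, k):
    low, high = 0, len(arr) - 1
    target = k - 1                       # index the k-th smallest would occupy
    while low <= high:
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] < pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        pi = i + 1
        if pi == target:
            return arr[pi]
        elif pi < target:
            low = pi + 1                 # k-th smallest is in the right part
        else:
            high = pi - 1                # k-th smallest is in the left part

data = [38, 27, 43, 3, 9, 82, 10]
print(quick_select(data, 3))             # 10, the 3rd smallest element
print(sorted(data[:3]))                  # the 3 smallest elements: [3, 9, 10]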
Thanks for reading my post.
Comment if you have any queries or if you found something wrong in the post or on our website.
Keep learning
Also Read:
Selection Sort Algorithm In Data Structures and Algorithms using Python
Introduction to Searching Algorithms: Linear Search Algorithm
Bubble Sort Algorithm In Data Structures & Algorithms using Python
Merge Sort Algorithm in Python
Sorting Algorithms and Searching Algorithms in Python