QuickSort Algorithm: Working, Time Complexity & Advantages


In the world of computer science, sorting algorithms are essential for arranging data in a particular order. Whether it is arranging names alphabetically or improving search results, sorting algorithms play an important role.

We have already covered several algorithms such as Bubble Sort, Merge Sort, and Insertion Sort. In this blog we will cover the QuickSort algorithm: how it works, and its time and space complexity.

Let’s dive into this comprehensive guide to understanding the QuickSort algorithm.


What is the QuickSort Algorithm?

The QuickSort algorithm is a widely used sorting algorithm that follows the divide-and-conquer principle. It was developed by British computer scientist Tony Hoare in 1959.

The algorithm is known for its fast average performance and its efficiency on large datasets. It works by selecting a pivot element from the array and then partitioning the remaining elements into two sub-arrays:

  1. One with elements less than or equal to the pivot.
  2. Another with elements greater than the pivot.

These sub-arrays are then recursively sorted using the same process. The result is a sorted array, built by combining the sorted sub-arrays with the pivot element.

Unlike algorithms such as Merge Sort, which needs an auxiliary array, QuickSort performs its sorting in place, meaning it requires minimal additional memory. This makes it particularly useful when memory usage is a concern.

How Does the QuickSort Algorithm Work?

The QuickSort algorithm is a classic example of the divide-and-conquer method: it sorts an array by breaking it down into smaller parts and sorting those parts. Here is a detailed step-by-step explanation of how it works.

Choosing the Pivot Element

The first step is to choose the pivot, the key element that divides the array into two parts. There are multiple strategies for choosing it:

  • First element: Easy but sometimes leads to poor performance on sorted or nearly sorted arrays.
  • Last element: Commonly used because it simplifies implementation.
  • Random element: Helps avoid worst-case performance by making the pivot selection unpredictable.
  • Median element: Choosing the median (often the median of three sampled elements) as the pivot yields balanced partitions but requires extra effort. A short sketch of the random and median-of-three strategies follows this list.
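
As an illustration, here is a minimal sketch of two of these strategies in Python. The helper names random_pivot_index and median_of_three_index are our own for this example; each simply returns the index that would be used as the pivot.

import random

def random_pivot_index(arr, low, high):
    # Pick a random position between low and high (inclusive) to act as the pivot
    return random.randint(low, high)

def median_of_three_index(arr, low, high):
    # Look at the first, middle, and last elements and return the index of the median value
    mid = (low + high) // 2
    candidates = [(arr[low], low), (arr[mid], mid), (arr[high], high)]
    candidates.sort()              # only three items, so this is constant work
    return candidates[1][1]        # index of the middle (median) value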

Partitioning the Array Around the Pivot

Once the pivot is selected, the array is reorganized (partitioned) so that:

  • All elements less than or equal to the pivot are placed before it.
  • All elements greater than the pivot are placed after it.

This step ensures that the pivot is in its correct final position in the sorted array. Partitioning is done in-place, meaning elements are swapped within the original array to avoid extra memory usage.
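
To make this concrete, here is a minimal sketch of an in-place partition step in Python (the Lomuto scheme, using the last element as the pivot). It is one common way to partition; Hoare's original scheme instead uses two pointers moving toward each other.

def partition(arr, low, high):
    # Partition arr[low..high] around the pivot arr[high] and return the pivot's final index
    pivot = arr[high]
    i = low - 1                                    # boundary of the "less than or equal" region
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]        # grow the "less than or equal" region by one
    arr[i + 1], arr[high] = arr[high], arr[i + 1]  # place the pivot between the two regions
    return i + 1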

Recursively Sorting the Sub-Arrays

After partitioning, the array is split into two sub-arrays:

  • The left sub-array contains elements less than or equal to the pivot.
  • The right sub-array contains elements greater than the pivot.

The QuickSort algorithm is then recursively applied to these sub-arrays. Each recursive call repeats the process of selecting a pivot, partitioning, and further dividing, until the sub-arrays have zero or one element, at which point they are already sorted.
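
Here is a sketch of how the recursion can be driven in place, reusing the partition() helper sketched above. The simpler list-based implementation in the next section follows the same idea without swapping inside the original array.

def quick_sort_in_place(arr, low=0, high=None):
    # Assumes the partition() function sketched earlier is in scope
    if high is None:
        high = len(arr) - 1
    if low < high:                             # sub-arrays of size 0 or 1 are already sorted
        p = partition(arr, low, high)          # the pivot lands at its final index p
        quick_sort_in_place(arr, low, p - 1)   # sort the elements before the pivot
        quick_sort_in_place(arr, p + 1, high)  # sort the elements after the pivot

data = [45, 67, 12, 8, 89, 25]
quick_sort_in_place(data)
print(data)   # [8, 12, 25, 45, 67, 89]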

Implementation of Quicksort Algorithm

def quick_sort(arr):
    # Base case: if the list has 0 or 1 elements, it's already sorted
    if len(arr) <= 1:
        return arr

    # Choosing the first element of the list as the pivot
    pivot = arr[0]

    # Partitioning the remaining elements into two parts
    left = [x for x in arr[1:] if x <= pivot]   # Elements less than or equal to the pivot
    right = [x for x in arr[1:] if x > pivot]   # Elements greater than the pivot

    # Recursively applying quick sort on the left and right parts
    return quick_sort(left) + [pivot] + quick_sort(right)


# Test the function with a sample array
arr = [45, 67, 12, 8, 89, 25]
sorted_arr = quick_sort(arr)
print("Sorted Array:", sorted_arr)

Start with pivot 45

Split into: less than or equal to 45: [12, 8, 25], greater than 45: [67, 89]

Recurse on [12, 8, 25] with pivot 12, and on [67, 89] with pivot 67

Continue until every sub-array has zero or one element

Combine the sorted parts: [8, 12, 25] + [45] + [67, 89] gives [8, 12, 25, 45, 67, 89], which is what the program prints

Note that this list-based version builds new lists at every step for clarity, so it is not the in-place variant described earlier; the in-place version partitions by swapping elements within the original array.

Complexity Analysis of QuickSort Algorithm

Best Case: O(n log n)

  1. Happens when the pivot divides the array into two equal parts.
  2. Each level of recursion works on half the array, and there are log n levels.
  3. Total work = n (per level) × log n (levels) = O(n log n)

Average Case: O(n log n)

  1. This is the most common case when elements are randomly distributed.
  2. The pivot usually divides the array fairly evenly.
  3. Performance remains efficient and close to the best case.

Worst Case: O(n²)

  1. Happens when the pivot is always the smallest or largest element, for example on an already sorted array when the first or last element is chosen as the pivot.
  2. The array is divided into one large part and one empty part at each step.
  3. This requires n levels of recursion, each doing O(n) work, so the total is O(n × n) = O(n²).
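
The following sketch (not part of the implementation above) counts one comparison per element partitioned by the first-element-pivot version, showing how a sorted input degrades to quadratic work:

import random

def quick_sort_count(arr):
    # Return (sorted list, comparison count) using the first element as the pivot
    if len(arr) <= 1:
        return arr, 0
    pivot, rest = arr[0], arr[1:]
    left = [x for x in rest if x <= pivot]
    right = [x for x in rest if x > pivot]
    comparisons = len(rest)                      # count one comparison for each remaining element
    sorted_left, c_left = quick_sort_count(left)
    sorted_right, c_right = quick_sort_count(right)
    return sorted_left + [pivot] + sorted_right, comparisons + c_left + c_right

n = 300
print("random input:", quick_sort_count(random.sample(range(n), n))[1])  # roughly proportional to n log n
print("sorted input:", quick_sort_count(list(range(n)))[1])              # n(n-1)/2 = 44850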

Algorithm               | Best Case Time | Average Case Time | Worst Case Time | Space Complexity | Stability
QuickSort               | O(n log n)     | O(n log n)        | O(n²)           | O(log n)         | No
Merge Sort              | O(n log n)     | O(n log n)        | O(n log n)      | O(n)             | Yes
Heap Sort               | O(n log n)     | O(n log n)        | O(n log n)      | O(1)             | No
Bubble Sort             | O(n)           | O(n²)             | O(n²)           | O(1)             | Yes
Insertion Sort          | O(n)           | O(n²)             | O(n²)           | O(1)             | Yes
Selection Sort          | O(n²)          | O(n²)             | O(n²)           | O(1)             | No
Timsort (Python’s sort) | O(n)           | O(n log n)        | O(n log n)      | O(n)             | Yes

Comparison of the QuickSort algorithm with other sorting algorithms

Advantages of QuickSort

  1. QuickSort is very fast on average and works efficiently even for large lists of data.
  2. It uses very little extra memory because it sorts the list in the same place without needing another array.
  3. It handles big data well and usually performs better than other sorting methods like Bubble Sort or Insertion Sort.
  4. It works faster on modern computers because it makes good use of the system’s cache memory.
  5. It follows a divide-and-conquer approach, which makes the problem easy to solve step by step and even allows parts of the work to run in parallel.

Disadvantages of Quicksort

  1. QuickSort can be slow in the worst case, especially if the pivot is always the smallest or largest element.
  2. It may not be stable, which means the original order of equal elements might not be preserved.
  3. It uses recursion, so deeply unbalanced partitions on large lists can cause a stack overflow error (a mitigation sketch follows this list).
  4. Choosing a bad pivot can lead to poor performance and unbalanced partitions.
  5. It is harder to implement correctly compared to simpler sorting algorithms like Bubble Sort or Insertion Sort.
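
One common way to keep the recursion shallow, sketched here assuming the partition() helper from earlier is in scope: recurse only into the smaller partition and loop over the larger one, which keeps the stack depth at O(log n) even in the worst case.

def quick_sort_limited_depth(arr, low=0, high=None):
    # Assumes the partition() function sketched earlier is in scope
    if high is None:
        high = len(arr) - 1
    while low < high:
        p = partition(arr, low, high)
        if p - low < high - p:                          # the left part is smaller
            quick_sort_limited_depth(arr, low, p - 1)   # recurse on the smaller part
            low = p + 1                                 # loop on the larger right part
        else:                                           # the right part is smaller (or equal)
            quick_sort_limited_depth(arr, p + 1, high)  # recurse on the smaller part
            high = p - 1                                # loop on the larger left part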

Applications of QuickSort

Here are some common and practical applications of QuickSort Algorithm:

  1. Sorting large datasets in computer programs, because QuickSort is fast and memory-efficient.
  2. Used in standard library sorting routines: C’s qsort() is named after QuickSort and C++’s std::sort() typically uses an introsort variant of it, while Python’s sort() uses Timsort, a different hybrid algorithm.
  3. Database systems use QuickSort to quickly sort query results before displaying them.
  4. In file systems, for sorting names or files alphabetically or by size.
  5. Used in competitive programming because of its high speed and efficiency on average.

FAQs on QuickSort Algorithm

What is QuickSort?

QuickSort is a divide-and-conquer sorting algorithm that selects a pivot element and partitions the array around the pivot, placing smaller elements before it and larger elements after it. The process is recursively repeated for sub-arrays.

Why is it called “Quick” sort?

It is called “Quick” because it performs well on average and typically sorts large datasets faster than other algorithms like Bubble Sort or Insertion Sort.

How does QuickSort work?

QuickSort involves:

  • Choosing a pivot.
  • Partitioning the array such that elements less than the pivot come before it and greater after.
  • Recursively applying the same logic to sub-arrays.

What is a pivot in QuickSort?

A pivot is an element chosen from the array to divide it into two partitions. Common strategies include:

  • First element
  • Last element
  • Random element
  • Median-of-three

What is the time complexity of QuickSort?

  • Best Case: O(n log n)
  • Average Case: O(n log n)
  • Worst Case: O(n²), usually when the smallest or largest element is always picked as the pivot.

What is the space complexity of QuickSort?

QuickSort is an in-place sorting algorithm, meaning it does not need an auxiliary array; its space complexity is O(log n) on average because of the recursive stack calls.

Is QuickSort stable?

No, QuickSort is not a stable sort. Equal elements may not retain their original order unless a stable variant is implemented.