
The Secret of the Priority Queue: Heap Sort


Chris was bothered by the 'Worst Case (O(N^2))' of Quick Sort.

"Is there no perfect sorting algorithm that is always fast like Merge Sort (O(N log N)) but uses no extra memory like Quick Sort (O(1))?"

There is exactly one algorithm that satisfies Greedy Chris's demands. It is Heap Sort.

However, to understand this algorithm, you first need to know a unique data structure called the 'Heap'.

1. Heap: Only Remembering the Boss

A Heap is a "Complete Binary Tree designed to quickly find the Maximum (or Minimum) value."

Simply put, it is a tree that strictly follows the rule: "The Parent Node is always larger than the Child Node (Max Heap)." (The order between siblings doesn't matter. Only the hierarchy between parent and child is strict.)

Why use a Heap?

The Root (the very top) of the Heap always holds the largest value.

The hardest part of sorting is "Finding the biggest element," but with a Heap, this process is O(1). You just have to grab the one at the top!
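As a tiny sketch (the `heap` array here is my own example, laid out root-first the way the next section explains):

```javascript
// A Max Heap laid out as an array (root at index 0)
const heap = [10, 5, 3, 4, 1];

console.log(heap[0]);           // grabbing the top: O(1)
console.log(Math.max(...heap)); // scanning an unsorted array instead costs O(N)
```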

2. Making a Tree with an Array

"Do I need to create Node objects and pointers to implement a tree?"

No. The charm of a Heap is that it can perfectly mimic a tree using just a single Array.

  • Parent Index: i
  • Left Child: 2i + 1
  • Right Child: 2i + 2

With just these formulas, you can traverse between parents and children using only array indices.
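As a quick sketch (the helper names are my own, not from the article — and the inverse formula, parent(i) = ⌊(i − 1) / 2⌋, is the standard counterpart of the two above):

```javascript
// Hypothetical helpers wrapping the index formulas
const parent = (i) => Math.floor((i - 1) / 2);
const leftChild = (i) => 2 * i + 1;
const rightChild = (i) => 2 * i + 2;

const heap = [10, 5, 3, 4, 1]; // tree:      10
                               //           /  \
                               //          5    3
                               //         / \
                               //        4   1
console.log(leftChild(0));  // 1 → value 5
console.log(rightChild(0)); // 2 → value 3
console.log(parent(3));     // 1 → node 4's parent is 5
```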

3. The Process of Heap Sort

Heap Sort is largely divided into two steps.

  • Heapify (Heap Creation): Convert the jumbled array into a Max Heap structure. Now, the front of the array (Index 0) is guaranteed to be the Maximum value.
  • Sort: Swap the Root (Maximum) with the last element of the heap, shrink the heap by one, and repair the heap (heapify) again. Repeat until only one element remains.

4. Implementing in JavaScript

The code might look a bit long, but the core logic is simply: "If my child is bigger than me, swap places."

```javascript
// Function to repair the heap property (Sift Down)
function heapify(arr, length, i) {
  let largest = i;       // Assume parent (i) is the largest for now
  const left = 2 * i + 1;
  const right = 2 * i + 2;

  // If left child is larger than parent? -> Change target
  if (left < length && arr[left] > arr[largest]) {
    largest = left;
  }

  // If right child is larger than the (changed) target? -> Change target
  if (right < length && arr[right] > arr[largest]) {
    largest = right;
  }

  // If one of the children is larger? -> Swap
  if (largest !== i) {
    [arr[i], arr[largest]] = [arr[largest], arr[i]];

    // The sub-tree that got swapped might have broken the heap rule, so recurse
    heapify(arr, length, largest);
  }
}

function heapSort(arr) {
  const n = arr.length;

  // 1. Initial Heap Build (Convert array to Heap structure)
  // Perform heapify starting from the last parent node upwards.
  for (let i = Math.floor(n / 2) - 1; i >= 0; i--) {
    heapify(arr, n, i);
  }

  // 2. Extract elements one by one to sort
  for (let i = n - 1; i > 0; i--) {
    // Send the Root of the Heap (Max Value) to the back (i).
    [arr[0], arr[i]] = [arr[i], arr[0]];

    // The element at the back is sorted. Re-heapify only the remainder (0 ~ i).
    heapify(arr, i, 0);
  }

  return arr;
}

heapSort([4, 10, 3, 5, 1]);
```
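To see the two phases separately, here is the same logic condensed into a trace (an illustrative sketch that prints the intermediate heap):

```javascript
// Same sift-down repair step as above
function heapify(arr, length, i) {
  let largest = i;
  const left = 2 * i + 1, right = 2 * i + 2;
  if (left < length && arr[left] > arr[largest]) largest = left;
  if (right < length && arr[right] > arr[largest]) largest = right;
  if (largest !== i) {
    [arr[i], arr[largest]] = [arr[largest], arr[i]];
    heapify(arr, length, largest);
  }
}

const arr = [4, 10, 3, 5, 1];

// Phase 1: build the Max Heap from the last parent upwards
for (let i = Math.floor(arr.length / 2) - 1; i >= 0; i--) heapify(arr, arr.length, i);
console.log(arr); // [10, 5, 3, 4, 1] — the root now holds the maximum

// Phase 2: repeatedly move the root to the back and shrink the heap
for (let i = arr.length - 1; i > 0; i--) {
  [arr[0], arr[i]] = [arr[i], arr[0]];
  heapify(arr, i, 0);
}
console.log(arr); // [1, 3, 4, 5, 10]
```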

5. Performance Analysis: Perfect but Unpopular?

Pros: The Symbol of Reliability

  • Time Complexity: Worst, Best, and Average are all O(N log N). It has no fluctuations.
  • Space Complexity: O(1). It does not use extra memory (In-place).
  • It is loved in embedded systems and the Linux kernel, where memory is scarce and performance guarantees are essential.

Cons: Why it Loses to Quick Sort

  • Speed: Although the Big-O is the same, in actual measurements it is slower than Quick Sort. Because it accesses array indices by jumping around (i → 2i), the Cache Hit Rate is low.
  • Unstable Sort: The relative order of equal elements gets shuffled.

Key Takeaways

  • Heap: A Complete Binary Tree structure designed to find the Max or Min value in O(1).
  • Principle: Convert the array to a Heap, extract the Root (Max), send it to the back, and repeat.
  • Performance: Time O(N log N), Space O(1). Theoretically the most efficient, but practically slower than Quick Sort.
  • Usage: It is essentially used when implementing a "Priority Queue" rather than for sorting itself.
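Since the Priority Queue is the heap's real home turf, here is a minimal sketch of one (a hypothetical `MaxPriorityQueue` class, not part of the article's code) built on the same array-based Max Heap:

```javascript
// Minimal Max Priority Queue on an array-based Max Heap (illustrative only)
class MaxPriorityQueue {
  constructor() { this.heap = []; }

  push(value) {
    this.heap.push(value);
    // Sift up: climb while the new value beats its parent
    let i = this.heap.length - 1;
    while (i > 0) {
      const parent = Math.floor((i - 1) / 2);
      if (this.heap[parent] >= this.heap[i]) break;
      [this.heap[parent], this.heap[i]] = [this.heap[i], this.heap[parent]];
      i = parent;
    }
  }

  pop() {
    const top = this.heap[0];
    const last = this.heap.pop();
    if (this.heap.length > 0) {
      this.heap[0] = last;
      // Sift down: the same repair step as heapify
      let i = 0;
      while (true) {
        const left = 2 * i + 1, right = 2 * i + 2;
        let largest = i;
        if (left < this.heap.length && this.heap[left] > this.heap[largest]) largest = left;
        if (right < this.heap.length && this.heap[right] > this.heap[largest]) largest = right;
        if (largest === i) break;
        [this.heap[i], this.heap[largest]] = [this.heap[largest], this.heap[i]];
        i = largest;
      }
    }
    return top;
  }
}

const pq = new MaxPriorityQueue();
[4, 10, 3, 5, 1].forEach((v) => pq.push(v));
console.log(pq.pop()); // 10
console.log(pq.pop()); // 5
```

Both `push` and `pop` cost O(log N), which is exactly why heaps back priority queues in practice.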

All the sorting algorithms we have learned so far (Bubble, Selection, Insertion, Merge, Quick, Heap) share a common trait: they "Compare" two pieces of data.

However, it has been mathematically proven that comparison-based sorts cannot break the limit of O(N log N), no matter how fast they are.

"Then can't we just NOT compare?"

Let's go meet the magic that sorts data without comparison, using the properties of digits: Radix Sort.
