Summary of Complexity for Various Data Structures and Algorithms

A summary of time and space complexity for common data structures and algorithms, including selection sort, insertion sort, merge sort, quicksort, heapsort, tree sort, heaps, binary search trees, and quickselect.

Sorting Algorithms

| Sorting Algorithm | Worst Time | Average Time | Best Time | Space | Stable | In-Place |
| --- | --- | --- | --- | --- | --- | --- |
| Selection Sort | n^2 | n^2 | n^2 | 1 | No | Yes |
| Insertion Sort | n^2 | n^2 | n | 1 | Yes | Yes |
| Merge Sort | nlogn | nlogn | nlogn | n | Yes | No |
| Quick Sort | n^2 | nlogn | nlogn | logn | No | Yes |
| Heap Sort | nlogn | nlogn | nlogn | 1 | No | Yes |
| Tree Sort | n^2 | nlogn | nlogn | n | No | No |
  • The logn space complexity of quicksort comes from the recursion stack.
  • Whether quicksort is stable and in-place depends on the partition function; common implementations are in-place but unstable.
  • Building a heap from an unsorted array takes n time.
  • For Tree Sort, building the tree takes nlogn and an in-order traversal takes n. The tree itself requires n extra space, so Tree Sort is not in-place. If the tree becomes highly unbalanced, it degenerates into a linked list, and the running time degrades to n^2.
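To make the quicksort notes concrete, here is a minimal in-place sketch using a Lomuto-style partition (function names and pivot choice are illustrative, not a specific library's API). The swaps inside `partition` can reorder equal elements, which is why this common form is in-place but not stable, and the recursion depth is what produces the logn (worst case n) stack space.

```python
def partition(a, lo, hi):
    """Lomuto partition: last element as pivot; returns its final index."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]  # long-range swaps break stability
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    """Sort a in place. Space comes only from the recursion stack."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)
```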

Data Structures

Time complexity of operations for common data structures.

Heap

  • Insert element: logn
  • Remove top element: logn

Note: The core logic of both insertion and removal is heapify (sifting an element up or down). Its time complexity is proportional to the tree height, hence logn. Building a heap from scratch is n, not nlogn.
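A minimal sketch of the heapify idea for a max-heap (the names `sift_down` and `build_heap` are illustrative). Each `sift_down` walks at most one root-to-leaf path, hence logn per insert/removal; building the heap sifts every internal node, and the work summed over all levels comes to n.

```python
def sift_down(a, i, n):
    """Restore the max-heap property at index i; O(log n), one path down."""
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def build_heap(a):
    """Sift down every internal node, bottom-up; total work is O(n)."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)
```

Python's standard `heapq` module does the same thing for min-heaps: `heapq.heapify` is n, `heappush`/`heappop` are logn each.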

Binary Search Tree (BST)

  • Fast lookup: logn, worst case n
  • Insert: logn, worst case n
  • Delete: logn, worst case n
  • Traversal: n

Note: In the worst case a BST degenerates into a linked list, so operations take n; when the tree is balanced, they take logn.
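A bare-bones BST sketch (no balancing, so the worst case above applies). Lookup and insert each follow one path from the root, which is why their cost is the tree height: logn when balanced, n when degenerate.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert key and return the (possibly new) root; duplicates are ignored."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Walk one root-to-leaf path; cost is the tree height."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None
```

Inserting sorted input (1, 2, 3, …) into this tree produces the degenerate linked-list shape; self-balancing variants such as AVL or red-black trees keep the height at logn.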

Quickselect

Uses the partitioning idea of quicksort to find the k-th element (k-th smallest or largest) in Θ(n) average time. The worst case is n^2, though a randomized pivot makes that unlikely; unlike quicksort, only one side of each partition is recursed into.

Licensed under CC BY-NC-SA 4.0
Last updated on Sep 04, 2024 20:55 +1000
This is Yumi’s blog where I share technology and life.