Asymptotic Notation: Basics
- Definition
- Big O
- Common Data Structure Operations
- Array Sorting Algorithms
Aspect | Definition | Symbol | Formal Definition | Representation | Interpretation | Common Use | Relationship |
---|---|---|---|---|---|---|---|
Big O – Upper Bound | Represents the upper bound or worst-case growth of a function. It denotes the maximum rate of growth of the function | O | f(n) = O(g(n)) if f(n) <= c·g(n) for some constant c > 0 and all n >= n₀ | Upper bound | It's like (<=): the function grows no faster than g(n) for large enough inputs | Analyzing worst-case time complexity of algorithms | Θ(g) is a subset of O(g) |
Big Omega (Ω) – Lower Bound | Represents the lower bound or best-case growth of a function. It denotes the minimum rate of growth of the function | Ω | f(n) = Ω(g(n)) if f(n) >= c·g(n) for some constant c > 0 and all n >= n₀ | Lower bound | It's like (>=): the function grows at least as fast as g(n) for large enough inputs | Analyzing best-case time complexity of algorithms | Θ(g) is a subset of Ω(g) |
Big Theta (Θ) – Tight Bound | Represents both upper and lower bounds of a function. It denotes the exact growth rate of the function | Θ | f(n) = Θ(g(n)) if c₁·g(n) <= f(n) <= c₂·g(n) for some constants c₁, c₂ > 0 and all n >= n₀ | Tight bound | It's like (==): the function grows at the same rate as g(n) for large enough inputs | When both best-case and worst-case scenarios are identical | Θ(g) is a subset of both O(g) and Ω(g) |
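The Θ definition from the table can be checked on a concrete function. A minimal sketch (the function f and the constants below are illustrative, not from the notes): f(n) = 3n² + 2n is Θ(n²), witnessed by c₁ = 3, c₂ = 4, n₀ = 2.

```python
# Illustrative check that f(n) = 3n^2 + 2n is Theta(n^2):
# verify c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.

def f(n):
    return 3 * n**2 + 2 * n

def g(n):
    return n**2

c1, c2, n0 = 3, 4, 2

for n in range(n0, 1000):
    # lower bound: 3n^2 <= 3n^2 + 2n always holds
    # upper bound: 3n^2 + 2n <= 4n^2 holds once n >= 2
    assert c1 * g(n) <= f(n) <= c2 * g(n)
```

Since f is sandwiched between constant multiples of n² for all large n, it is simultaneously O(n²) and Ω(n²), hence Θ(n²).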
(Complexity chart legend, from worst to best: Horrible, Bad, Fair, Good, Excellent.)
Big-O Notation is a mathematical concept used in computer science to describe the performance or complexity of an algorithm.
Complexity
Time
Time complexity measures the time an algorithm takes to execute: not the actual wall-clock time, but how the running time scales with the input size. It is key to evaluating an algorithm's efficiency and usefulness, and it depends on the size of the data being processed.
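One way to see "scaling, not wall-clock time" is to count operations instead of timing them. A rough sketch (helper names are illustrative): counting comparisons shows linear search doing roughly n steps in the worst case while binary search does roughly log₂(n).

```python
# Count comparisons to compare how two search strategies scale.

def linear_search_steps(arr, target):
    """Scan left to right; worst case touches every element: O(n)."""
    steps = 0
    for x in arr:
        steps += 1
        if x == target:
            break
    return steps

def binary_search_steps(arr, target):
    """Halve the sorted range each step: O(log n)."""
    steps, lo, hi = 0, 0, len(arr) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            break
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1000))
# Worst case for the linear scan: target at the very end.
print(linear_search_steps(data, 999))  # ~n steps
print(binary_search_steps(data, 999))  # ~log2(n) steps
```

Doubling the input roughly doubles the linear count but adds only one step to the binary count, which is exactly what the O(n) vs O(log n) rows in the table below predict.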
Space
Space complexity refers to the total amount of memory used by an algorithm or program, including both the space used by the input values and auxiliary space (temporary or extra space). All else being equal, algorithms with lower space complexity are preferable. Note that the underlying hardware, OS, and CPU can affect measured space and time, but for simplicity these factors are usually disregarded when analyzing an algorithm's performance.
Space Complexity = Auxiliary space + Space used by input values
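The auxiliary-space term in the formula is what usually differs between algorithms. A small sketch (function names are illustrative): two ways to reverse a list with the same time complexity but different auxiliary space.

```python
def reversed_copy(arr):
    """O(n) auxiliary space: allocates a second list of size n."""
    return arr[::-1]

def reverse_in_place(arr):
    """O(1) auxiliary space: only two index variables, regardless of n."""
    lo, hi = 0, len(arr) - 1
    while lo < hi:
        arr[lo], arr[hi] = arr[hi], arr[lo]
        lo += 1
        hi -= 1
    return arr
```

Both take O(n) time and both count the O(n) input in total space, but only the first adds O(n) auxiliary space on top of it.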
Importance
- Helps compare different algorithms, evaluate their efficiency, and make an informed choice of the right algorithm for a specific task
- Provides insights into the worst-case scenarios of an algorithm
Analyzing Algorithms
1. Identify the main operations involved in the algorithm (e.g., loops, arithmetic operations, function calls)
2. Determine the number of times each operation is executed as a function of the input size
3. Combine the results from step 2 to obtain the overall time complexity using Big O notation
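The steps above can be walked through on a small example (the function below is illustrative): identify the main operation, count how often it runs as a function of n, and read off the Big O class.

```python
def pair_sums(nums):
    """Step 1: the main operation is the addition in the inner loop.
    Step 2: it executes n * n times for an input of length n.
    Step 3: n * n operations -> O(n^2)."""
    total_ops = 0
    for i in range(len(nums)):          # outer loop: n iterations
        for j in range(len(nums)):      # inner loop: n per outer iteration
            _ = nums[i] + nums[j]       # main operation
            total_ops += 1
    return total_ops

assert pair_sums(list(range(10))) == 100  # n = 10 -> 10 * 10 operations
```

The operation count (n²) matches the quadratic row in the comparison table that follows.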
Comparison Table
Big-O Notation | Time Complexity | Description | Example | Computations for 10 elements | Computations for 100 elements |
---|---|---|---|---|---|
O(1) | constant | constant, regardless of input size | accessing an array element by index | 1 | 1 |
O(log n) | logarithmic | increases logarithmically with input size | binary search | 3 | 6 |
O(n) | linear | increases linearly with input size | iterating through whole array | 10 | 100 |
O(n log n) | linearithmic | linearly with input size * logarithmic factor | merge sort | 30 | 600 |
O(n²) | quadratic | increases quadratically with input size | 2 (double) nested loops | 100 | 10 000 |
O(n³) | cubic | increases cubically with input size | 3 (triple) nested loops | 1000 | 1 000 000 |
O(2ⁿ) | exponential | increases exponentially with input size | naive recursive Fibonacci | 1024 | 1.27e+30 |
O(n!) | factorial | increases factorially with input size | generating all possible permutations. Iterating through all subsets of a set | 3 628 800 | 9.3e+157 |
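The table's exponential row can be made concrete with its own example. A sketch (the call-counting style is illustrative): naive recursive Fibonacci makes a number of calls that grows roughly like 2ⁿ, while caching the results brings it down to O(n).

```python
from functools import lru_cache

def fib_naive(n, counter):
    """O(2^n): recomputes the same subproblems over and over."""
    counter[0] += 1
    if n < 2:
        return n
    return fib_naive(n - 1, counter) + fib_naive(n - 2, counter)

@lru_cache(maxsize=None)
def fib_memo(n):
    """O(n): each subproblem is computed once and cached."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

calls = [0]
fib_naive(20, calls)
print(calls[0])        # call count grows roughly like 2^n
print(fib_memo(20))    # same answer with only ~n distinct computations
```

This is why the exponential row jumps from about a thousand computations at n = 10 to an astronomically large number at n = 100.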
Data Structure | Access (Avg) | Lookup (Avg) | Insertion (Avg) | Deletion (Avg) | Access (Worst) | Lookup (Worst) | Insertion (Worst) | Deletion (Worst) | Space (Worst) |
---|---|---|---|---|---|---|---|---|---|
Array | O(1) | O(n) | O(n) | O(n) | O(1) | O(n) | O(n) | O(n) | O(n) |
Stack | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Queue | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Singly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Doubly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Skip List | O(log n) | O(log n) | O(log n) | O(log n) | O(n) | O(n) | O(n) | O(n) | O(n log n) |
Hash Table | N/A | O(1) | O(1) | O(1) | N/A | O(n) | O(n) | O(n) | O(n) |
Binary Search Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(n) | O(n) | O(n) | O(n) | O(n) |
Cartesian Tree | N/A | O(log n) | O(log n) | O(log n) | N/A | O(n) | O(n) | O(n) | O(n) |
B-Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(n) |
Red-Black Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(n) |
Splay Tree | N/A | O(log n) | O(log n) | O(log n) | N/A | O(log n) | O(log n) | O(log n) | O(n) |
AVL Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(n) |
KD Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(n) | O(n) | O(n) | O(n) | O(n) |
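The hash-table row above shows up directly in Python, where `list` membership is a linear scan and `set`/`dict` membership is a hash lookup. A rough timing sketch (exact numbers will vary by machine):

```python
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)

target = n - 1  # worst case for the linear scan: last element

# O(n) average lookup: scans the list until it finds the target.
list_time = timeit.timeit(lambda: target in as_list, number=100)
# O(1) average lookup: hashes the target and jumps to its bucket.
set_time = timeit.timeit(lambda: target in as_set, number=100)

print(f"list: {list_time:.4f}s  set: {set_time:.6f}s")
```

On any realistic machine the set lookup is orders of magnitude faster at this size, and the gap widens as n grows, which is the practical meaning of O(n) vs O(1).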
Algorithm | Time (Best) | Time (Average) | Time (Worst) | Space (Worst) |
---|---|---|---|---|
Quicksort | O(n log n) | O(n log n) | O(n²) | O(log n) |
Mergesort | O(n log n) | O(n log n) | O(n log n) | O(n) |
Timsort | O(n) | O(n log n) | O(n log n) | O(n) |
Heapsort | O(n log n) | O(n log n) | O(n log n) | O(1) |
Bubble Sort | O(n) | O(n²) | O(n²) | O(1) |
Insertion Sort | O(n) | O(n²) | O(n²) | O(1) |
Selection Sort | O(n²) | O(n²) | O(n²) | O(1) |
Tree Sort | O(n log n) | O(n log n) | O(n²) | O(n) |
Shell Sort | O(n log n) | O(n (log n)²) | O(n (log n)²) | O(1) |
Bucket Sort | O(n+k) | O(n+k) | O(n²) | O(n) |
Radix Sort | O(nk) | O(nk) | O(nk) | O(n+k) |
Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(k) |
Cubesort | O(n) | O(n log n) | O(n log n) | O(n) |
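As an illustration of the table's mergesort row (O(n log n) time in every case, O(n) auxiliary space), here is a minimal sketch of the algorithm, not a production implementation:

```python
def merge_sort(arr):
    """Split in half, sort each half recursively, merge the sorted halves.
    O(log n) levels of recursion, O(n) merge work per level -> O(n log n).
    The merged output list accounts for the O(n) space column."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge step: repeatedly take the smaller front element.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # one of these two is already empty
    merged.extend(right[j:])
    return merged
```

Unlike quicksort, the split is always exactly in half, which is why mergesort's worst case stays O(n log n) while quicksort's degrades to O(n²).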