Computer Science • 1/15/2024
Understanding Big O Notation
Big O notation is a mathematical notation that describes the limiting behavior of a function as its argument tends toward infinity. In computer science, it classifies algorithms by how their runtime (or memory use) grows with input size.
What is Time Complexity?
Time complexity describes how the runtime of an algorithm increases as the input size grows.
- O(1) – Constant time (e.g., indexing into an array)
- O(log n) – Logarithmic time (e.g., binary search)
- O(n) – Linear time (e.g., a single pass over a list)
- O(n log n) – Linearithmic time (e.g., merge sort)
- O(n²) – Quadratic time (e.g., comparing every pair of elements)
- O(2ⁿ) – Exponential time (e.g., enumerating all subsets of a set)
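The classes above can be made concrete with short sketches. These are illustrative Python functions (the names are made up for this example, not taken from any library): one O(1) access, one O(log n) binary search, and one O(n²) pairwise comparison.

```python
def get_first(items):
    # O(1): a single indexed access, independent of input size.
    return items[0]

def binary_search(sorted_items, target):
    # O(log n): each comparison halves the remaining search range.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # target not present

def has_duplicates(items):
    # O(n²): nested loops compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Counting the dominant operation in each function (one access, halvings of the range, pairwise comparisons) is what places it in its complexity class.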
What is Space Complexity?
Space complexity measures how much memory an algorithm requires relative to the size of the input.
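The same input can be processed with very different memory footprints. As a hedged sketch (function names are invented for illustration), here are two ways to sum a list: one using O(1) extra space, one using O(n) extra space because it allocates an intermediate list.

```python
def sum_in_place(items):
    # O(1) extra space: only one accumulator, regardless of input size.
    total = 0
    for x in items:
        total += x
    return total

def sum_via_copy(items):
    # O(n) extra space: builds a new list as large as the input
    # before reducing it.
    doubled = [x * 2 for x in items]
    return sum(doubled) // 2
```

Both return the same result; the difference is only in how much auxiliary memory grows with the input, which is exactly what space complexity measures.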