Big O Notation is a way to describe how an algorithm's running time grows as the size of its input grows.
Time complexity describes how long an algorithm takes to execute relative to the size of its input.
Space complexity describes how much memory an algorithm uses relative to the size of its input.
Big O Notation helps determine how scalable an algorithm is.
The growth rate of an algorithm's execution time is expressed using Big O Notation.
Constant time, O(1), describes operations that take the same amount of time regardless of the input size.
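As a minimal sketch (Python is assumed here for illustration; the function name is hypothetical), indexing into a list costs the same whether it holds three elements or a million:

```python
def get_first_item(items):
    # Indexing a Python list takes the same time regardless of its length: O(1).
    return items[0]

print(get_first_item([7, 3, 9]))            # 7
print(get_first_item(list(range(10**6))))   # 0 -- same cost despite a much larger list
```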
Linear time, O(n), means the execution time grows in direct proportion to the size of the input.
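A short sketch, again assuming Python, of a linear scan whose cost grows with the length of the input:

```python
def contains(items, target):
    # The loop may visit every element once, so the work grows
    # in direct proportion to len(items): O(n).
    for item in items:
        if item == target:
            return True
    return False

print(contains([4, 8, 15, 16, 23, 42], 16))  # True
```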
Logarithmic time, O(log n), means that as the input size grows linearly, the execution time grows only logarithmically.
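Binary search over a sorted list is a classic example; a minimal Python sketch is below (the function name is an illustrative choice):

```python
def binary_search(sorted_items, target):
    # Each comparison discards half of the remaining range, so the number
    # of steps grows logarithmically with len(sorted_items): O(log n).
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```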
Linearithmic (quasilinear) time, O(n log n), is a moderately growing time complexity that typically arises when an O(log n) operation is performed n times, as in efficient sorting algorithms.
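Merge sort is a common O(n log n) example; the sketch below, assuming Python, halves the list recursively and merges the sorted halves:

```python
def merge_sort(items):
    # The list is halved O(log n) times, and each level of halving
    # merges all n elements, giving O(n log n) overall.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```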
Quadratic time, O(n²), is when the execution time grows quadratically with the input size. It typically appears when iterating over a matrix or when loops are nested.
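A minimal Python sketch of a nested-loop pairwise comparison, which performs roughly n × n comparisons in the worst case (the function name is hypothetical):

```python
def has_duplicate(items):
    # The nested loops compare every pair of elements,
    # roughly n * n comparisons in the worst case: O(n^2).
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

print(has_duplicate([3, 1, 4, 1, 5]))  # True
print(has_duplicate([2, 7, 8, 9]))     # False
```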