Big O notation
Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. It is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann–Landau notation or asymptotic notation.
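The "limiting behavior" mentioned above can be made precise. A standard formal statement, consistent with the Bachmann–Landau definition (the constant M and threshold x₀ are the conventional symbols, chosen here for illustration), is:

```latex
f(x) = O\bigl(g(x)\bigr) \text{ as } x \to \infty
\quad\iff\quad
\exists\, M > 0,\ \exists\, x_0 \ \text{such that}\ \lvert f(x)\rvert \le M\,\lvert g(x)\rvert \ \text{for all } x \ge x_0 .
```

Informally: beyond some point x₀, f never exceeds a fixed constant multiple of g. For example, 3x² + 7x = O(x²), since for x ≥ 7 we have 3x² + 7x ≤ 4x².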
Contents

- Formal definition
- Usage
- Properties
- Matters of notation
- Related asymptotic notations
- Generalizations and related usages
- History (Bachmann–Landau, Hardy, and Vinogradov notations)