Visual Definition of Big O notation

  • Explanation
    • f(n) = O(g(n)) (f(n) is upper bounded by g(n))
    • f(n) = Θ(g(n)) (f(n) is tightly bounded by g(n))
    • f(n) = Ω(g(n)) (f(n) is lower bounded by g(n))
  • Terminology
    • n₀ - the first input value where the bound starts to hold; we only care about what happens after this
    • f(n) - some function we’re analyzing
    • g(n) - the “benchmark” function you’re comparing to
      • c₁·g(n) & c₂·g(n) - the same function but scaled up/down by constants
      • Multiplying g(n) by a constant tilts the curve up/down, but it doesn’t change the fundamental structure or shape of g(n)
  • Explaining the graph
    • Tightly bounded relationship
      • If you can find constants c₁ and c₂ such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for large n, then f(n) “grows like” g(n) and we say f(n) = Θ(g(n)) (stated formally in the sketch after this list).
      • When graphs are tightly bounded, they share the same growth rate
      • So if we can prove that 2 graphs are tightly bounded, we know they share the same growth rate, hence we can say f(n) = Θ(g(n))
    • This is why we ignore constants in asymptotic notation: they just stretch or shrink the graph vertically but don’t change the fundamental growth rate
      • We’re interested in the dominant term, not the coefficients.
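
Stated formally, the tight-bound picture above is the standard definition of Θ (using the c₁, c₂, and n₀ from the Terminology list); a minimal LaTeX statement:

    \[
      f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2, n_0 > 0 \text{ s.t. } c_1 \, g(n) \le f(n) \le c_2 \, g(n) \text{ for all } n \ge n_0
    \]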

Small Example

f(n) = 3n + 50
g(n) = n
  • Question: “Does f(n) grow similarly to g(n)?”
  • If f(n) always stays between c₁·g(n) and c₂·g(n) after some n₀, then it’s Θ(g(n)) (worked out in the sketch after this list).
    • Constants like 3 and 50 in f(n) don’t affect asymptotic growth — they just shift or scale the curve.
    • This is why in Big-O/Theta/Ω we drop constants and lower-order terms.
      • (Example: f(n) = 3n + 50 becomes Θ(n))
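
A short derivation for this example (the constants below are one valid choice among many):

    \[
      3n \;\le\; 3n + 50 \;\le\; 4n \quad \text{for all } n \ge 50
    \]

So picking c₁ = 3, c₂ = 4, n₀ = 50 satisfies the Θ definition, i.e. 3n + 50 = Θ(n).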

Asymptotic Notation

O, Θ, and Ω are all different ways of doing an asymptotic comparison of functions (formal set definitions are sketched after the list below).

  • O(g(n))
    • The set of functions with asymptotic behavior less than or equal to g(n)
    • Upper-bounded by a constant times g(n) for large enough values of n
      • Eventually, c·g(n) will become and stay bigger than f(n) after n₀ ⇒ an algorithm whose running time is f(n) will eventually do fewer operations than an algorithm whose running time is g(n)
      • An algorithm whose running time is f(n) is faster than an algorithm whose running time is g(n)
  • Θ(g(n))
    • “Tightly” within a constant factor of g(n) for large n
    • You satisfy both the upper and the lower bound
  • Ω(g(n))
    • The set of functions with asymptotic behavior greater than or equal to g(n)
    • Lower-bounded by a constant times g(n) for large enough values of n
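
The three set descriptions above correspond to the standard formal definitions; a minimal LaTeX sketch:

    \[
    \begin{aligned}
      O(g(n))      &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } f(n) \le c \, g(n) \text{ for all } n \ge n_0 \,\} \\
      \Omega(g(n)) &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } f(n) \ge c \, g(n) \text{ for all } n \ge n_0 \,\} \\
      \Theta(g(n)) &= O(g(n)) \cap \Omega(g(n))
    \end{aligned}
    \]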

Examples

Example 1

  • We’re proving that f(n) is smaller than g(n) (i.e., that it belongs to the set O(g(n)))
  • Since the definition only requires that some c > 0 and n₀ > 0 exist, we can pick arbitrary values greater than 0 for c and n₀, as long as the inequality holds (a numeric spot check follows this list)
  • What this means
    • Eventually, f(n) will grow slower than some constant multiple of g(n), starting from n₀.
    • The relationship doesn’t flip after that: f(n) never overtakes c·g(n) again.
      • We have n ≥ n₀, so once we find an n₀ that works, it will work for all n ≥ n₀
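
A rough way to sanity-check a chosen c and n₀ numerically (a spot check over a finite range, not a proof). The functions and constants below are illustrative stand-ins, since the expressions for this example aren’t shown in these notes:

    # Spot-check a Big-O witness: does f(n) <= c * g(n) hold for all n in [n0, n_max]?
    # This only tests a finite range, so it can refute a bad choice of c and n0,
    # but it cannot prove the asymptotic claim.
    def satisfies_big_o(f, g, c, n0, n_max=10_000):
        return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

    # Illustrative stand-in: f(n) = 3n + 50 against g(n) = n.
    print(satisfies_big_o(lambda n: 3 * n + 50, lambda n: n, c=4, n0=50))  # True
    print(satisfies_big_o(lambda n: 3 * n + 50, lambda n: n, c=4, n0=10))  # False (e.g. n = 10: 80 > 40)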

Example 2

Example 3

  • It’s a contradiction because the inequality must hold for all n ≥ n₀ while c is a constant, yet n grows without bound (a sketch of this style of argument follows)
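
The exact claim being disproved isn’t shown in these notes; as an assumed illustration of this style of contradiction argument, here is the classic one showing n² ≠ O(n):

    % Suppose, for contradiction, that n^2 = O(n): then some c, n_0 > 0 satisfy
    \[
      n^2 \le c \, n \ \text{for all } n \ge n_0
      \;\Longrightarrow\;
      n \le c \ \text{for all } n \ge n_0,
    \]
    % which is impossible: c is a fixed constant while n grows without bound.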

Gaining intuition

  • When doing asymptotic analysis of functions (worked simplifications in the sketch after this list)
    • If multiple expressions are added together, ignore all but the biggest
    • Ignore all multiplicative constants
    • Ignore bases of logs
    • Do NOT ignore
      • non-multiplicative and non-additive constants (e.g., constants in exponents, bases of exponents)
      • logarithms themselves
  • Examples
    • O(n)
    • O(n log n)
    • O(2^(3n))
    • O(n log n)
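
A few more illustrative simplifications that exercise each rule (the expressions here are assumed examples, not the originals behind the list above):

    \[
    \begin{aligned}
      5n^3 + 2n^2 + 7 &= \Theta(n^3) && \text{(keep only the biggest added term; drop constants)} \\
      \log_3 n &= \frac{\log_2 n}{\log_2 3} = \Theta(\log n) && \text{(the base of a log is just a multiplicative constant)} \\
      n \log_2 n + 100n &= \Theta(n \log n) && \text{(the logarithm itself is kept)} \\
      2^{3n} &= 8^{n} \ne \Theta(2^{n}) && \text{(constants in exponents and bases of exponents are kept)}
    \end{aligned}
    \]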