# Asymptotic Complexity
## Prerequisite
Asymptotic notations: assuming \(f(n)\), \(g(n)\), and \(h(n)\) are asymptotically positive functions, the mathematical definitions are given below; a numeric sanity check follows the list.
- If \(f(n) = Θ(g(n))\), then there exist positive constants \(c_1, c_2, n_0\) such that \(0 ≤ c_1 g(n) ≤ f(n) ≤ c_2 g(n)\) for all \(n ≥ n_0\)
- If \(f(n) = O(g(n))\), then there exist positive constants \(c, n_0\) such that \(0 ≤ f(n) ≤ c\,g(n)\) for all \(n ≥ n_0\)
- If \(f(n) = Ω(g(n))\), then there exist positive constants \(c, n_0\) such that \(0 ≤ c\,g(n) ≤ f(n)\) for all \(n ≥ n_0\)
- If \(f(n) = o(g(n))\), then for every positive constant \(c\) there exists \(n_0 > 0\) such that \(0 ≤ f(n) < c\,g(n)\) for all \(n ≥ n_0\)
- If \(f(n) = ω(g(n))\), then for every positive constant \(c\) there exists \(n_0 > 0\) such that \(0 ≤ c\,g(n) < f(n)\) for all \(n ≥ n_0\)
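The Big-Oh definition above can be sanity-checked numerically for concrete functions. Below is a minimal sketch (the choices \(f(n) = 3n + 2\), \(g(n) = n\), \(c = 4\), \(n_0 = 2\) are illustrative assumptions, not part of the original text); checking a finite range is no substitute for a proof, but it is a useful way to test candidate witnesses:

```python
# Minimal sketch: numerically test a Big-Oh witness (c, n0) on a finite range.

def holds_big_o(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c * g(n) for every n in [n0, n_max]."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Assumed example: f(n) = 3n + 2 is O(n), witnessed by c = 4, n0 = 2,
# since 3n + 2 <= 4n exactly when n >= 2.
f = lambda n: 3 * n + 2
g = lambda n: n

print(holds_big_o(f, g, c=4, n0=2))  # True
```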
## Properties
### Reflexivity
If \(f(n)\) is given, then \(f(n) = O(f(n))\), \(f(n) = Ω(f(n))\), and \(f(n) = Θ(f(n))\).
Example: if \(f(n) = n^3\), then \(f(n) = O(n^3)\). Similarly, \(f(n) = Ω(n^3)\) and \(f(n) = Θ(n^3)\).
### Symmetry

\(f(n) = Θ(g(n))\) if and only if \(g(n) = Θ(f(n))\).

Example: if \(f(n) = n^2\) and \(g(n) = n^2\), then \(f(n) = Θ(n^2)\) and \(g(n) = Θ(n^2)\).
Proof:
- Necessity: \(f(n) = Θ(g(n)) ⇒ g(n) = Θ(f(n))\). By the definition of \(Θ\), there exist positive constants \(c_1, c_2, n_0\) such that \(c_1 g(n) ≤ f(n) ≤ c_2 g(n)\) for all \(n ≥ n_0\). Dividing through, \(g(n) ≤ (1/c_1) f(n)\) and \(g(n) ≥ (1/c_2) f(n)\), i.e. \((1/c_2) f(n) ≤ g(n) ≤ (1/c_1) f(n)\). Since \(c_1\) and \(c_2\) are positive constants, \(1/c_1\) and \(1/c_2\) are well defined. Therefore, by the definition of \(Θ\), \(g(n) = Θ(f(n))\).
- Sufficiency: \(g(n) = Θ(f(n)) ⇒ f(n) = Θ(g(n))\). By the definition of \(Θ\), there exist positive constants \(c_1, c_2, n_0\) such that \(c_1 f(n) ≤ g(n) ≤ c_2 f(n)\) for all \(n ≥ n_0\). Dividing through, \(f(n) ≤ (1/c_1) g(n)\) and \(f(n) ≥ (1/c_2) g(n)\), i.e. \((1/c_2) g(n) ≤ f(n) ≤ (1/c_1) g(n)\). By the definition of \(Θ\), \(f(n) = Θ(g(n))\).
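To make the constant bookkeeping in this proof concrete, the sketch below picks \(f(n) = 2n^2\) and \(g(n) = n^2\) with witnesses \(c_1 = c_2 = 2\) for \(f(n) = Θ(g(n))\) (all of these are assumed values for illustration) and checks that the derived witnesses \(1/c_2\) and \(1/c_1\) work for \(g(n) = Θ(f(n))\):

```python
# Sketch: flipping Theta witnesses, assuming f(n) = 2n^2 and g(n) = n^2.

f = lambda n: 2 * n * n
g = lambda n: n * n

c1, c2, n0 = 2, 2, 1
# Forward direction: c1*g(n) <= f(n) <= c2*g(n).
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))

# Derived witnesses for the symmetric direction: (1/c2)*f <= g <= (1/c1)*f.
d1, d2 = 1 / c2, 1 / c1
assert all(d1 * f(n) <= g(n) <= d2 * f(n) for n in range(n0, 1000))
print("both directions hold on the sampled range")
```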
### Transitivity
Example: if \(f(n) = n\), \(g(n) = n^2\), and \(h(n) = n^3\), then \(n\) is \(O(n^2)\) and \(n^2\) is \(O(n^3)\), and therefore \(n\) is \(O(n^3)\).
Proof:
\(f(n) = O(g(n))\) and \(g(n) = O(h(n))\) \(⇒\) \(f(n) = O(h(n))\). By the definition of Big-Oh, there exist positive constants \(c_1, c_2, n_0\) such that \(f(n) ≤ c_1 g(n)\) and \(g(n) ≤ c_2 h(n)\) for all \(n ≥ n_0\). Chaining the two bounds gives \(f(n) ≤ c_1 c_2 h(n) = c\,h(n)\), where \(c = c_1 c_2\).

By the definition, \(f(n) = O(h(n))\); a numeric illustration follows the list below. Similarly:
- \(f(n) = Θ(g(n))\) and \(g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))\)
- \(f(n) = Ω(g(n))\) and \(g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))\)
- \(f(n) = o(g(n))\) and \(g(n) = o(h(n)) ⇒ f(n) = o(h(n))\)
- \(f(n) = ω(g(n))\) and \(g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))\)
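The composed constant \(c = c_1 c_2\) from the transitivity proof can be checked numerically. This sketch assumes the example functions \(f(n) = n\), \(g(n) = n^2\), \(h(n) = n^3\) with witnesses \(c_1 = c_2 = 1\) and \(n_0 = 1\):

```python
# Sketch: transitivity composes witnesses, assuming f(n) = n, g(n) = n^2, h(n) = n^3.

f = lambda n: n
g = lambda n: n * n
h = lambda n: n ** 3

c1, c2, n0 = 1, 1, 1   # f <= c1*g and g <= c2*h for all n >= n0
c = c1 * c2            # composed witness for f = O(h)

for n in range(n0, 1000):
    assert f(n) <= c1 * g(n)
    assert g(n) <= c2 * h(n)
    assert f(n) <= c * h(n)   # follows by chaining the two bounds
print("f = O(h) witnessed by c =", c)
```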
### Transpose Symmetry

\(f(n) = O(g(n))\) if and only if \(g(n) = Ω(f(n))\).
Example: If \(f(n) = n\) and \(g(n) = n^2\) then \(n\) is \(O(n^2)\) and \(n^2\) is \(Ω(n)\)
Proof:
- Necessity: \(f(n) = O(g(n)) ⇒ g(n) = Ω(f(n))\). By the definition of Big-Oh, \(f(n) ≤ c\,g(n)\) for some positive constant \(c\), so \(g(n) ≥ (1/c) f(n)\). By the definition of Omega, \(g(n) = Ω(f(n))\).
- Sufficiency: \(g(n) = Ω(f(n)) ⇒ f(n) = O(g(n))\). By the definition of Omega, \(g(n) ≥ c\,f(n)\) for some positive constant \(c\), so \(f(n) ≤ (1/c) g(n)\). By the definition of Big-Oh, \(f(n) = O(g(n))\).
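The same flip of the constant can be observed numerically. In this sketch the concrete choices \(f(n) = n\), \(g(n) = n^2\), and \(c = 1\) are assumptions made for illustration:

```python
# Sketch: O/Omega duality, assuming f(n) = n and g(n) = n^2 with witness c = 1.

f = lambda n: n
g = lambda n: n * n
c = 1

for n in range(1, 1000):
    assert f(n) <= c * g(n)        # f = O(g)
    assert g(n) >= (1 / c) * f(n)  # equivalently, g = Omega(f)
print("O and Omega witnesses agree on the sampled range")
```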
## Analogies
Since these properties hold for asymptotic notations, an analogy can be drawn between two functions \(f(n)\) and \(g(n)\) and two real numbers \(a\) and \(b\), with \(g(n)\) playing the role of \(a\) and \(f(n)\) playing the role of \(b\):
- \(g(n) = O(f(n))\) is similar to \(a ≤ b\)
- \(g(n) = Ω(f(n))\) is similar to \(a ≥ b\)
- \(g(n) = Θ(f(n))\) is similar to \(a = b\)
- \(g(n) = o(f(n))\) is similar to \(a < b\)
- \(g(n) = ω(f(n))\) is similar to \(a > b\)
## Observations

Observation 1: \(max(f(n), g(n)) = Θ(f(n) + g(n))\).

Proof:
Without loss of generality, assume \(f(n) ≤ g(n)\), so \(max(f(n), g(n)) = g(n)\).

Then \(g(n) ≤ max(f(n), g(n)) ≤ f(n) + g(n)\). Writing the lower bound as \(g(n)/2 + g(n)/2 ≤ max(f(n), g(n))\) and using the assumption \(f(n) ≤ g(n)\) gives \(f(n)/2 + g(n)/2 ≤ max(f(n), g(n))\), i.e. \((f(n) + g(n))/2 ≤ max(f(n), g(n)) ≤ f(n) + g(n)\).

By the definition of \(Θ\) (with constants \(c_1 = 1/2\) and \(c_2 = 1\)), \(max(f(n), g(n)) = Θ(f(n) + g(n))\).
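The sandwich inequality in this proof is easy to spot-check. The function choices \(f(n) = n\) and \(g(n) = n^2\) below are assumptions for illustration:

```python
# Sketch: (f + g)/2 <= max(f, g) <= f + g, assuming f(n) = n and g(n) = n^2.

f = lambda n: n
g = lambda n: n * n

for n in range(1, 1000):
    lower = (f(n) + g(n)) / 2
    upper = f(n) + g(n)
    assert lower <= max(f(n), g(n)) <= upper
print("max(f, g) is sandwiched between (f + g)/2 and f + g")
```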
Observation 2: \(O(f(n)) + O(g(n)) = O(max(f(n), g(n)))\).

Proof:
Without loss of generality, assume \(f(n) ≤ g(n)\). A term in \(O(f(n)) + O(g(n))\) is bounded by \(c_1 f(n) + c_2 g(n)\) for some positive constants \(c_1, c_2\). Using the assumption \(f(n) ≤ g(n)\), \(c_1 f(n) + c_2 g(n) ≤ c_1 g(n) + c_2 g(n) = (c_1 + c_2)\,g(n) = c · max(f(n), g(n))\), where \(c = c_1 + c_2\).

By the definition of Big-Oh, \(O(f(n)) + O(g(n)) = O(max(f(n), g(n)))\).
Note:
- If \(\lim_{n\rightarrow \infty} f(n)/g(n) = c\) for some constant \(c > 0\), then \(f(n) = Θ(g(n))\)
- If \(\lim_{n\rightarrow \infty} f(n)/g(n) = c\) for some finite constant \(c ≥ 0\) (\(c\) can be \(0\)), then \(f(n) = O(g(n))\)
- If \(\lim_{n\rightarrow \infty} f(n)/g(n) = 0\), then \(f(n) = o(g(n))\); in particular \(f(n) = O(g(n))\) but \(g(n) ≠ O(f(n))\)
- If \(\lim_{n\rightarrow \infty} f(n)/g(n) = c\) for some constant \(c > 0\) (\(c\) can be \(\infty\)), then \(f(n) = Ω(g(n))\)
- If \(\lim_{n\rightarrow \infty} f(n)/g(n) = \infty\), then \(f(n) = ω(g(n))\); in particular \(f(n) = Ω(g(n))\) but \(g(n) ≠ Ω(f(n))\)
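These limit rules are convenient to experiment with symbolically. The sketch below uses SymPy's `limit`; the concrete function choices are assumptions for illustration:

```python
# Sketch: classifying growth rates via limits, using SymPy.
from sympy import symbols, limit, log, oo

n = symbols('n', positive=True)

print(limit((3 * n**2 + 5) / n**2, n, oo))  # 3  -> 3n^2 + 5 = Theta(n^2)
print(limit(log(n) / n, n, oo))             # 0  -> log n = o(n)
print(limit(n**3 / n, n, oo))               # oo -> n^3 = omega(n)
```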
## Conclusion
The three main asymptotic notations used in complexity analysis of algorithms are Big O, Omega, and Theta. Here are the properties of each notation:
### Big O Notation
- \(O(f(n))\) represents an upper bound on the growth rate of a function \(f(n)\).
- For a function \(g(n)\), \(g(n) = O(f(n))\) means that the growth rate of \(g(n)\) is no faster than the growth rate of \(f(n)\) asymptotically.
- \(O\) notation is commonly used to state a guarantee that holds even in the worst case for the running time or space of an algorithm.
### Omega Notation
- \(Ω(f(n))\) represents a lower bound on the growth rate of a function \(f(n)\).
- For a function \(g(n)\), \(g(n) = Ω(f(n))\) means that the growth rate of \(g(n)\) is no slower than the growth rate of \(f(n)\) asymptotically.
- \(Ω\) notation is commonly used to state a lower bound, i.e. a cost the algorithm cannot beat, even in the best case.
### Theta Notation
- \(Θ(f(n))\) represents an upper and lower bound on the growth rate of a function \(f(n)\).
- For a function \(g(n)\), \(g(n) = Θ(f(n))\) means that the growth rate of \(g(n)\) is bounded both above and below by the growth rate of \(f(n)\) asymptotically.
- \(Θ\) notation is used when the running time or space of an algorithm is bounded tightly, i.e. the same function serves as both the upper and the lower bound.
In general, these notations have the following properties:
- Reflexive: \(f(n) = O(f(n))\), \(f(n) = Ω(f(n))\), \(f(n) = Θ(f(n))\)
- Transitive: if \(f(n) = O(g(n))\) and \(g(n) = O(h(n))\), then \(f(n) = O(h(n))\)
- Symmetric: if \(f(n) = Θ(g(n))\), then \(g(n) = Θ(f(n))\)
- Addition: \(f(n) + g(n) = O(max(f(n), g(n)))\), \(f(n) + g(n) = Θ(max(f(n), g(n)))\)
- Multiplication: if \(f_1(n) = O(g_1(n))\) and \(f_2(n) = O(g_2(n))\), then \(f_1(n) · f_2(n) = O(g_1(n) · g_2(n))\); the analogous rules hold for \(Ω\) and \(Θ\)
Understanding these properties is crucial in analyzing the efficiency of algorithms and selecting the right algorithm for a given problem. The sketch below illustrates the addition and multiplication rules symbolically.
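As a closing illustration, SymPy's `Order` objects model the addition and multiplication rules directly. This is a minimal sketch; it assumes SymPy's `Order` arithmetic at \(n \rightarrow \infty\) absorbs lower-order terms as documented:

```python
# Sketch: addition and multiplication properties via SymPy Order objects.
from sympy import Order, symbols, oo

n = symbols('n', positive=True)

# Addition: O(n^2) + O(n) collapses to O(n^2), i.e. O(max(f, g)).
print(Order(n**2, (n, oo)) + Order(n, (n, oo)))  # O(n**2, (n, oo))

# Multiplication: O(n) * O(n^2) gives O(n^3).
print(Order(n, (n, oo)) * Order(n**2, (n, oo)))  # O(n**3, (n, oo))
```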
Important
This article comes from properties-of-asymptotic-notations. If there is any infringement, please contact me to delete it.