Asymptotic Notation (Bachmann–Landau notation) is a way of comparing the behavior of functions at extreme values, both very large and very small. This is a convenient way to compare the performance of algorithms on large inputs, as well as the accuracy of approximations. This page will focus on behavior at very large values, which is useful for analyzing algorithms (both their time and space requirements), rather than on behavior at very small values.
The asymptotic behavior of a function is its behavior as its input grows large.
For example, as x grows large, the three functions above all start to take on the same asymptotic behavior. Asymptotic notation gives a way to express the relationship between two functions' asymptotic behaviors.
Suppose we have two functions $f(x)$ and $g(x)$. If, for every constant $c > 0$, there exists an $x_0$ such that
$$|f(x)| \le c\,|g(x)| \quad \text{for all } x > x_0,$$
we say that $f(x) = o(g(x))$ (spoken as "f is little oh of g"). In other words, no matter how much we scale $f(x)$ up, $g(x)$ will eventually overtake it. One can also say that $g(x)$ dominates $f(x)$, because, in the long run, $f(x)$ will become arbitrarily small compared to $g(x)$.
An equivalent definition is that $f(x) = o(g(x))$ is true if (and only if)
$$\lim_{x \to \infty} \frac{f(x)}{g(x)} = 0.$$
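For example, this limit criterion immediately gives $x = o(x^2)$, since
$$\lim_{x \to \infty} \frac{x}{x^2} = \lim_{x \to \infty} \frac{1}{x} = 0.$$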
While the standard notation uses an equal sign (i.e. $f(x) = o(g(x))$), a more precise notation might be $f(x) \in o(g(x))$, where $o(g(x))$ is the set of functions that are, in some sense, asymptotically smaller than $g(x)$. Nonetheless, using an equal sign is customary, and is the notation that will be used here.
The first three properties are essentially the properties required of a strict ordering, which means that this notation gives us a way to order functions by how quickly they grow as $x$ approaches infinity (see the "Common Functions" section).
The fourth and fifth properties show that asymptotic notation does not care about adding or multiplying by constants. This is useful in computer science, where constant factors vary with hardware, programming language, and other details that are hard to pin down precisely; asymptotic notation gives computer scientists a way to analyze algorithms independently of the hardware running them.
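To see why multiplying by a constant changes nothing, suppose $f(x) = o(g(x))$ and let $k > 0$ be any constant. Using the limit definition,
$$\lim_{x \to \infty} \frac{k\,f(x)}{g(x)} = k \lim_{x \to \infty} \frac{f(x)}{g(x)} = k \cdot 0 = 0,$$
so $k\,f(x) = o(g(x))$ as well.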
The last property suggests a powerful way of comparing the asymptotic behavior of two functions; in fact, combined with L'Hôpital's Rule, it is often all one needs.
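As a reminder, L'Hôpital's Rule states that if $\lim_{x \to \infty} \frac{f(x)}{g(x)}$ has the indeterminate form $\frac{0}{0}$ or $\frac{\infty}{\infty}$, then
$$\lim_{x \to \infty} \frac{f(x)}{g(x)} = \lim_{x \to \infty} \frac{f'(x)}{g'(x)},$$
provided the limit on the right exists.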
Consider $f(x) = \ln x$ and $g(x) = x$. Prove that $\ln x = o(x)$.
We need to evaluate $\lim_{x \to \infty} \frac{\ln x}{x}$. Though this limit doesn't seem particularly approachable, L'Hôpital's Rule makes short work of it:
$$\lim_{x \to \infty} \frac{\ln x}{x} = \lim_{x \to \infty} \frac{1/x}{1} = \lim_{x \to \infty} \frac{1}{x} = 0.$$
Therefore, by property 7, $\ln x = o(x)$.
Let $f(x) = x^2$ and $g(x) = e^x$. Again, we will use property 7, combined with L'Hôpital's Rule:
$$\lim_{x \to \infty} \frac{x^2}{e^x} = \lim_{x \to \infty} \frac{2x}{e^x}.$$
Unlike the previous example, applying L'Hôpital's Rule doesn't seem to have reduced the fraction to an easily evaluable limit. Fortunately, we are welcome to apply L'Hôpital's Rule as many times as we like, without changing the value of the limit:
$$\lim_{x \to \infty} \frac{2x}{e^x} = \lim_{x \to \infty} \frac{2}{e^x} = 0.$$
Therefore $x^2 = o(e^x)$.
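For intuition, the two ratios can also be checked numerically; here is a minimal Python sketch (using the example functions above) that prints the ratios at a few sample points:

```python
import math

# Sample the ratios f(x)/g(x) for the two examples above:
#   ln(x)/x   and   x^2/e^x
# Both should shrink toward 0 as x grows, consistent with f = o(g).
for x in (5, 10, 20, 50, 100):
    print(f"x = {x:>3}   ln(x)/x = {math.log(x) / x:.6f}   x^2/e^x = {x**2 / math.exp(x):.3e}")
```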
Functions that are frequently seen while analyzing algorithms are of particular use and interest. Because little-o notation provides a strict ordering, the following functions are listed from smallest to largest, asymptotically.
Given any particular function, it's generally pretty easy to find some function that grows asymptotically faster. For instance, given the function $f(x) = x^2$, one could claim $f(x) = o(x^{10})$. Nonetheless, the goal is to represent something useful about the function, which generally means bounding its asymptotic growth rate as tightly as possible. $f(x) = o(x^4)$ is better, while $f(x) = o(x^3)$ is even better than that.
The expression $f(x) = \Theta(g(x))$ states that $f(x)$ and $g(x)$ grow at an asymptotically similar rate. In other words,
$$0 < \lim_{x \to \infty} \frac{f(x)}{g(x)} < \infty.$$
A simple example is $f(x) = 2x^2 + x$ and $g(x) = x^2$. Here,
$$\lim_{x \to \infty} \frac{2x^2 + x}{x^2} = \lim_{x \to \infty} \left(2 + \frac{1}{x}\right) = 2,$$
which, as promised, is between zero and infinity.
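The same kind of numerical check works here as well (a small Python sketch, again using the example pair above): the ratio settles near a finite, nonzero constant instead of heading to zero or infinity.

```python
# Ratio f(x)/g(x) for f(x) = 2x^2 + x and g(x) = x^2.
# Unlike the little-o examples, this ratio approaches the constant 2.
for x in (10, 100, 1000, 10000):
    print(f"x = {x:>5}   ratio = {(2 * x**2 + x) / x**2:.6f}")
```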