# Big Omega notation (algorithm analysis)

Learn the full details of the Big Omega notation: the second of the three main notations used in algorithm analysis to describe performance.

## Introduction to Big Omega notation

In the previous article we met the first notation used in computer science to describe the asymptotic behavior of algorithms: the Big O notation. In this article we present the second notation used in algorithm analysis: the Big Omega (or simply Omega) notation, represented by the Greek letter Ω.

If you have not read the article about Big O notation, follow the link to that article, where we explain it in detail. Some knowledge of Big O notation is highly recommended before reading this one!

Put as objectively as possible, the Ω notation gives us a simplified symbology to represent a lower bound on the performance of an algorithm: a minimum amount of time the algorithm takes to execute. In that sense, the Ω notation is the counterpart of the Big O notation.

Here are some practical examples to get you started with the meaning and use of the Ω notation.

## Example for linear function

When we say that the function 4n-3 is Ω(n), we are saying that the asymptotic behavior of a linear function is less than or equal to its own. This means that 4n-3 will never grow more slowly than a function with linear behavior. Thus, Ω(n) acts as a lower bound, and we know the function will never exhibit growth that falls below this bound.

Example: for the function 4n-3, there is another function of linear behavior that bounds it from below: for values of n ≥ 1, the function 4n-3 stays above the function n.

Similarly, it is also correct to say that 4n-3 is Ω(1), because 4n-3 will never exhibit growth that is surpassed by constant behavior; thus Ω(1) is also an asymptotic lower bound.

On the other hand, it would be wrong to say that 4n-3 is Ω(n²), because 4n-3 will never grow to the point of exceeding quadratic behavior, so Ω(n²) is not an asymptotic lower bound.

### Summing up

4n-3 is not Ω(n²), because a quadratic function cannot bound it from below.

4n-3 is Ω(n), because a linear function bounds it from below.

4n-3 is Ω(1), because a constant function bounds it from below.
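These claims can be sanity-checked numerically with a short script. This is only a sketch: the helper `is_lower_bound`, the constant c = 1, and the starting point n0 = 1 are illustrative choices (a finite check is evidence, not a proof).

```python
# Numerically check candidate lower bounds for f(n) = 4n - 3.
# c = 1 and n0 = 1 are illustrative choices, not the only valid ones.

def is_lower_bound(f, g, c=1, n0=1, n_max=1000):
    """Return True if f(n) >= c * g(n) for every integer n in [n0, n_max]."""
    return all(f(n) >= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 4 * n - 3

print(is_lower_bound(f, lambda n: n))       # linear bound: True
print(is_lower_bound(f, lambda n: 1))       # constant bound: True
print(is_lower_bound(f, lambda n: n * n))   # quadratic "bound": False (fails at n = 4)
```

Note that the quadratic check even holds for n = 1, 2, 3 before failing at n = 4, which is why a finite scan over many values of n is needed before trusting the algebra.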

## Example for quadratic function

When we say that the function 5n²-n-3 is Ω(n²), we are saying that the asymptotic behavior of a quadratic function is less than or equal to its own. This means that 5n²-n-3 will never grow more slowly than a function with quadratic behavior. Thus, Ω(n²) acts as a lower bound, and we know the function will never exhibit growth that falls below this bound.

Example: for the function 5n²-n-3, there is another function of quadratic behavior that bounds it from below: for values of n ≥ 1, the function 5n²-n-3 stays above the function n².

Similarly, it is also correct to say that 5n²-n-3 is Ω(n), because 5n²-n-3 will never exhibit growth that is surpassed by linear behavior; thus Ω(n) is also an asymptotic lower bound.

On the other hand, it would be wrong to say that 5n²-n-3 is Ω(n³), because 5n²-n-3 will never grow to the point of exceeding cubic behavior, so Ω(n³) is not an asymptotic lower bound.

### Summing up

5n²-n-3 is not Ω(n³), because a cubic function cannot bound it from below.

5n²-n-3 is Ω(n²), because a quadratic function bounds it from below.

5n²-n-3 is Ω(n), because a linear function bounds it from below.
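The same kind of numerical sanity check can be run for 5n²-n-3. Again a sketch: c = 1 and n0 = 1 are arbitrary illustrative choices, and a finite scan is only evidence for the algebra.

```python
# Check candidate lower bounds for f(n) = 5n^2 - n - 3 with c = 1, n0 = 1.
f = lambda n: 5 * n * n - n - 3

for name, g in [("n^3", lambda n: n ** 3),
                ("n^2", lambda n: n * n),
                ("n",   lambda n: n)]:
    holds = all(f(n) >= g(n) for n in range(1, 1001))
    print(name, holds)  # n^3 fails (first at n = 5); n^2 and n hold
```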

## Formal mathematical definition of Big Omega notation

Given two time-complexity functions of algorithms, f(n) and g(n), the formal mathematical definition states that:

• f(n) = Ω(g(n)) if there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0.

This definition may seem complicated at first glance, so let’s work through some practical examples.

## First example of mathematical use of Big Omega notation

Let’s try to demonstrate by definition that 4n+1 = Ω(n) (read “is Omega of n”).

By definition, all we have to do is show that there exist a positive constant c and a positive initial value n0 such that, for all values of n greater than or equal to n0, 4n+1 is always greater than or equal to c multiplied by n.

In summary: we need to prove that 4n+1 ≥ c·n for some positive constant c and positive initial value n0.

To do so, the first decision we need to make is to choose a value for the constant c; here, let’s take c = 4. For c = 4, the function 4n+1 is always one unit greater than c·n, for every value of n. Plotting both functions makes it even clearer that 4n+1 grows above the c·n function.

This means that Ω(n) represents an asymptotic lower bound for the function 4n+1; that is, there exist a constant c = 4 and an n0 = 1 such that 4n+1 ≥ c·n for all values of n greater than or equal to n0. Thus it is proved that, in fact, 4n+1 = Ω(n).
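The witness pair from this proof (c = 4, n0 = 1) can also be checked numerically over a finite range. A script like this is evidence for the inequality, not a replacement for the algebraic argument:

```python
# Check the witness from the proof: 4n + 1 >= c*n with c = 4, n0 = 1.
c, n0 = 4, 1
f = lambda n: 4 * n + 1

# Collect every n in the tested range where the inequality would fail.
violations = [n for n in range(n0, 100_001) if f(n) < c * n]
print(violations)  # [] — no counterexample, since 4n+1 - 4n = 1 > 0 always
```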

## Second example of the mathematical use of Big Omega notation

Let us try to demonstrate by definition that 5n²-n-3 = Ω(n²) (read “is Omega of n squared”).

By definition, all we have to do is show that there exist a positive constant c and a positive initial value n0 such that, for all values of n greater than or equal to n0, 5n²-n-3 is always greater than or equal to c multiplied by n².

In summary, we need to prove that 5n²-n-3 ≥ c·n² for some positive constant c and initial value n0.

Again, we need to choose a value for the constant c; let’s again take c = 4. For c = 4, the inequality 5n²-n-3 ≥ 4n² simplifies to n²-n-3 ≥ 0, which holds for all n ≥ 3, and the gap between the two functions only widens as n grows. This means that Ω(n²) represents an asymptotic lower bound for the function 5n²-n-3; that is, there exist a constant c = 4 and an n0 = 3 such that 5n²-n-3 ≥ c·n² for all values of n greater than or equal to n0. Thus it is proved that, in fact, 5n²-n-3 = Ω(n²).

## Third example: a failed attempt

Let’s try to demonstrate by definition that 2n+1 = Ω(n²).

By definition, we would have to show that there exist a positive constant c and an initial value n0 such that, for all values of n greater than or equal to n0, 2n+1 is always greater than or equal to c multiplied by n².

In summary, we would need to prove that 2n+1 ≥ c·n² for some positive constant c and initial value n0.

At first glance, we can see that 2n+1 has linear behavior while c·n² has quadratic behavior, so we are tempted to claim immediately that the demonstration fails: a quadratic function (c·n²) cannot be an asymptotic lower bound for a linear function (2n+1). But let’s take a closer look to see why this is impossible.

Let’s perform some algebraic manipulation to isolate the constant c in the inequality: dividing both sides of 2n+1 ≥ c·n² by n² gives c ≤ (2n+1)/n². The resulting inequality already shows that no positive constant c can keep the expression valid for large values of n (n → ∞): no matter how small c is chosen, the denominator n² makes (2n+1)/n² smaller and smaller, and it eventually drops below c.

Notice how the value of (2n+1)/n² gets smaller and smaller when compared with the value of c (taking c = 1 as an example). The values of the function approach zero as n increases; that is, even for a small positive constant c, the function eventually decreases to the point of becoming smaller than the constant.

Therefore, we cannot complete the demonstration: there are no constants c and n0 such that 2n+1 ≥ c·n² for all values of n greater than or equal to n0. Thus it is proved that, in fact, 2n+1 ≠ Ω(n²) (read “is not Omega of n squared”).
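The failure can be made concrete in code: for any positive c we try, there is some n beyond which 2n+1 < c·n². The helper below (an illustrative sketch) scans forward to the first such n; it terminates because (2n+1)/n² tends to zero.

```python
# For any positive c, find the first n where 2n + 1 < c * n^2 — the
# lower-bound inequality fails, so 2n+1 is not Omega(n^2).
def first_failure(c, n0=1):
    """Return the smallest n >= n0 with 2n + 1 < c * n^2."""
    n = n0
    while 2 * n + 1 >= c * n * n:
        n += 1
    return n

for c in [1, 0.5, 0.01]:
    print(f"c={c}: inequality fails at n={first_failure(c)}")
# c=1 fails at n=3, c=0.5 at n=5, c=0.01 at n=201:
# shrinking c only postpones the failure, it never prevents it.
```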

## Exercises

Try to demonstrate mathematically that n²+800 = Ω(n).

Try to demonstrate mathematically that 2n+10 = Ω(n).

Try to demonstrate mathematically that n+10 = Ω(n²).

Try to demonstrate mathematically that 7n-2 = Ω(1).

Try to demonstrate mathematically that n²+20n+5 = Ω(n³).

#### David V. Santiago

MSc in Systems and Computing, Professor of Computer Programming, Algorithms and Data Structures. 