Asymptotic behavior (algorithm analysis)

Deepen your knowledge of algorithm analysis and understand the importance of the asymptotic behavior of the functions produced by the analysis process.

Introduction to asymptotic behavior

Put simply, asymptotic behavior can be understood as the growth curve of the function produced by the algorithm analysis process.

It is known that the goal of algorithm analysis is to find a closed formula that represents the number of instructions executed by the algorithm as a function of its input n. For example, the Bubble Sort sorting algorithm (in the worst case) has complexity T(n) = 5n² - n + 1. This means that if its input n is equal to 4 (an array of size 4), the number of instructions executed by the algorithm to sort this array will be T(4) = 5·4² - 4 + 1 = 77.
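To make the link between the code and the formula concrete, here is a minimal sketch in Python (not the article's original implementation) that runs Bubble Sort on its worst case and counts a rough number of operations. The counter grows quadratically, like T(n); the exact constants depend on which instructions you decide to count.

```python
# A rough operation counter for Bubble Sort on its worst case
# (a reverse-sorted array). Illustrative only: the constants in
# T(n) = 5n² - n + 1 depend on how instructions are counted.
def bubble_sort_count(arr):
    count = 0
    n = len(arr)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            count += 1                              # one comparison
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                count += 1                          # one swap
    return count

print(bubble_sort_count([4, 3, 2, 1]))              # prints 12 for n = 4;
                                                    # T(4) = 77 counts finer-grained steps
```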

If you still do not know how the algorithm analysis process is done, click this link and visit the article where we walk you through the steps to find the complexity formula.

The formulas found by the analysis process are very useful for measuring the efficiency of algorithms. However, computer science uses a more formal methodology to separate algorithms into specific performance classes. This separation is based on a metric called asymptotic behavior.

To understand the concept of asymptotic behavior, we need to analyze each function in a graph and observe how the number of instructions grows as a function of the value of the input n. To do so, let’s look at three algorithms covered in the previous article (where we talked about the analysis process) and compare their complexity formulas to understand some very interesting things. They are:

  • Algorithm that reads a user-entered age. Complexity: T(n) = 7
  • Algorithm that prints all numbers in an array. Complexity: T(n) = 3n + 2
  • Bubble Sort algorithm. Complexity: T(n) = 5n² - n + 1

Function curve growth analysis

For each of the algorithms, we will create a graph: (value of n) vs (number of instructions executed).
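As a hedged illustration, the snippet below evaluates and plots the three complexity functions (it assumes matplotlib is installed); it reproduces the shape of the curves rather than the article's exact figures.

```python
import matplotlib.pyplot as plt

n = list(range(0, 21))
constant  = [7 for _ in n]                # T(n) = 7
linear    = [3 * x + 2 for x in n]        # T(n) = 3n + 2
quadratic = [5 * x**2 - x + 1 for x in n] # T(n) = 5n² - n + 1

plt.plot(n, constant, label="T(n) = 7")
plt.plot(n, linear, label="T(n) = 3n + 2")
plt.plot(n, quadratic, label="T(n) = 5n² - n + 1")
plt.xlabel("n (input size)")
plt.ylabel("instructions executed")
plt.legend()
plt.show()
```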

Asymptotic behavior of algorithm that reads an age: T(n)=7

This is a constant-complexity algorithm. If we look at the graph of the function illustrated below, we notice that its behavior is constant: the number of instructions executed (vertical axis) does not increase as the value of n (horizontal axis) grows; it always remains equal to 7.
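For reference, here is an assumed Python version of such a constant-time algorithm (not the article's original listing): it contains no loop over n, so the instruction count stays fixed regardless of input size.

```python
# Reads a user-entered age and reports it: a fixed handful of
# instructions, independent of any input size n.
age = int(input("Enter your age: "))
if age >= 18:
    print("You are an adult.")
else:
    print("You are a minor.")
```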

Asymptotic behavior of algorithm that prints an array: T(n)=3n+2

This is a linear-complexity algorithm. If we look at the graph of the function illustrated below, we see that the number of instructions executed by the algorithm (vertical axis) grows linearly with the value of n (horizontal axis).
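A minimal sketch of the array-printing algorithm in Python (assumed, not the article's original listing): the single loop visits each element exactly once, which is where the n term in T(n) comes from.

```python
def print_all(arr):
    for item in arr:    # runs once per element: roughly 3n + 2 instructions,
        print(item)     # depending on how each operation is counted

print_all([4, 8, 15, 16, 23, 42])
```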

Asymptotic behavior of Bubble Sort algorithm: T(n) = 5n² - n + 1

The Bubble Sort algorithm has quadratic complexity (in the worst case). If we look at the graph of the function illustrated below, we see that the number of instructions executed by the algorithm (vertical axis) grows quadratically with the value of n (horizontal axis).

Comparison between algorithms

When we overlay the graphs of the complexity functions of the three algorithms (image below), we can see the growth tendency of each function. Notice how the quadratic function (red line) requires more and more instructions for each new problem size n! This means that the quadratic algorithm is less efficient than the other two, whose growth tendencies are smaller.
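A quick way to see this divergence numerically is to evaluate the three formulas for growing values of n (plain arithmetic on the formulas above):

```python
for n in (1, 10, 100, 1000):
    print(n, 7, 3 * n + 2, 5 * n**2 - n + 1)
# n=1:    7     5       5
# n=10:   7    32     491
# n=100:  7   302   49901
# n=1000: 7  3002 4999001
```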

Therefore, what really matters in an algorithm's T(n) function is understanding its growth pattern as the value of n grows. We call this growth pattern asymptotic behavior, and it is the key aspect of algorithm analysis.

A misleading asymptotic behavior

Let’s take a very illustrative example to show the importance of the asymptotic behavior of algorithms. An analysis was performed on two algorithms (A and B), resulting in the following complexity functions:

Algorithm A: T(n) = n²/2

Algorithm B: T(n) = 100n + 1000

Question: Which of the two algorithms is more efficient?

Below is a table showing the number of instructions executed by each algorithm for the first few values of n:

Apparently (looking at the table) we are tempted to say that algorithm A is more efficient, because it executes far fewer instructions for the same values of n. But note what happens when we analyze the number of instructions for larger values of n:

Note that up to n = 209 algorithm A is more efficient than B, but from n = 210 onwards the situation reverses and algorithm B becomes more efficient than A.
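The crossover is easy to reproduce with a short sketch that evaluates the two functions above for a few values of n:

```python
def t_a(n):                 # algorithm A: quadratic
    return n**2 / 2

def t_b(n):                 # algorithm B: linear, with large constants
    return 100 * n + 1000

for n in (10, 100, 209, 210, 300, 1000):
    winner = "A" if t_a(n) < t_b(n) else "B"
    print(f"n={n}: A={t_a(n):.1f}  B={t_b(n)}  fewer instructions: {winner}")
# up to n = 209 algorithm A executes fewer instructions;
# from n = 210 onwards algorithm B takes the lead
```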

The multiplicative and additive constants in the function of algorithm B (100 and 1000, respectively) give the false impression that it is less efficient, but they only postpone the inevitable.

See below the plotted graph of both functions. Note that from n = 210 onwards, the number of instructions executed by algorithm A exceeds that of algorithm B:

What we just did in this example was analyze the asymptotic behavior of the two algorithms: we examined the growth tendency of each algorithm's instruction count when n is very large, that is, when n tends to infinity (n→∞).

The fact that algorithm A is quadratic already tells us that, for very large values of n (n→∞), its instruction count will exceed that of algorithm B, which is linear. For this reason, the additive and multiplicative constants of an algorithm's complexity function have no impact on its asymptotic behavior.

Thus, neither the divisor 2 (in algorithm A's function) nor the constants 100 and 1000 (in algorithm B's function) can prevent algorithm B from being more efficient than A for large values of n.
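The same point can be checked numerically: the ratio T_A(n) / T_B(n) keeps growing as n increases, so no constant factor in either formula changes the outcome for large n (a small sketch using the same functions as above):

```python
for n in (10**2, 10**3, 10**4, 10**5):
    print(n, (n**2 / 2) / (100 * n + 1000))
# the ratio grows without bound: the quadratic algorithm
# inevitably overtakes the linear one
```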

To dig deeper into algorithm analysis and learn about the asymptotic classes, click here to access the article where we explain them in detail.
