Programming languages: what are they?

Learn the meaning of programming languages. Understand what they look like and how they are used to create programs.

Definition of programming languages

At its simplest, a programming language can be defined as a standardized way of communicating instructions for a computer to execute. That is, it is a set of symbols and rules that specifies how source code (the code that defines a computer program) is written.

If you are not yet familiar with source code, see our dedicated article, where we explain it in detail.

To understand the meaning of programming languages, imagine the following scenario: a love poem written in three languages:

Doubt the light of the stars,
That the sun has heat,
Doubt even the truth,
But trust my love.

Love poem written in English.

شك في ضوء النجوم ،
أن الشمس حارة ،
شك حتى الحقيقة ،
لكن ثق بحبي.

Love poem written in Arabic.


Love poem written in Japanese.

Although the three poems communicate the same thing, you can probably only understand the poem written in English. This is because English provides symbols and words that your mind recognizes and can interpret.

With programming languages it is the same thing. They are a kind of language: sets of symbols and words that programmers use to tell the computer what to do. Below is an example of a simple computer program written in two different programming languages:


#include &lt;iostream&gt;
using namespace std;

unsigned int factorial(unsigned int n) {
    if (n == 0)
        return 1;
    return n * factorial(n - 1);
}

int main() {
    int num = 5;
    cout << "Factorial is " << factorial(num);
    return 0;
}

Program written in the C++ language.

def recur_factorial(n):
    if n == 0:
        return 1
    else:
        return n * recur_factorial(n - 1)

num = 5
fact = recur_factorial(num)
print("The factorial is", fact)

Program written in the Python language.

Although the two programs are written in different languages and in different styles, they compute the same result when executed. Thus, we can understand that programming languages are languages used by programmers to write source code that a computer will execute.

Characteristics and use of Programming Languages

Programming languages were designed, and continue to be designed, to give the programmer a high-level model of communication. They use humanly understandable terms that often relate to common human actions.

Notice in the code below how the language's terms map to humanly understandable actions: “setColor” assigns a color, and “drawImage” draws an image on the screen.

protected void paintComponent(Graphics g) {
    super.paintComponent(g);
    setBackground(Color.LIGHT_GRAY);
    g.setColor(Color.RED);
    g.fillOval(ball.posX, ball.posY, ball.raio*2, ball.raio*2);
    g.drawImage(ball.stoped, ball.posX, ball.posY, null);
}

In terms of their usefulness, programming languages are used to represent real-world knowledge: to let computers express human theories and concepts from various fields of science.

To see how this happens, consider an example from mathematics. Below, the Pythagorean theorem is expressed in two ways: first in the mathematical notation we learn in basic school education, and then in the Java programming language.

hypotenuse = √(side1² + side2²)

Pythagorean theorem expressed in mathematical notation.

double side1 = 3;
double side2 = 4;
double ds1 = Math.pow(side1, 2);
double ds2 = Math.pow(side2, 2);
double hypotenuse = Math.sqrt(ds1 + ds2);

Pythagorean theorem expressed in the Java language.
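For comparison, the same computation can be sketched in Python, as a direct translation of the Java snippet above (using the classic 3-4-5 right triangle):

```python
import math

side1 = 3
side2 = 4
# hypotenuse of a right triangle with legs 3 and 4
hypotenuse = math.sqrt(side1**2 + side2**2)
print(hypotenuse)  # prints 5.0
```

Whatever the language, the structure is the same: square each side, add, and take the square root.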

Looking at the two representations of the Pythagorean theorem above, it is clear that the complexity of programming languages sits somewhere between what a human being can understand and what a machine can understand. For humans, a natural language built from logical rules that define how words and phrases are constructed is perfectly understandable. For a machine, it is not.

Programming Languages ​​versus Binary Code

In fact, a computer only understands binary instructions (zeros and ones), and programming languages are simplifications of these instructions in a higher-level, more humanly understandable form. Thus, all source code written in a programming language is converted to binary code before the computer executes it.

If you are not yet familiar with binary code, see our dedicated article, where we explain it in detail.

To better understand this difference, see below the same program written first in a programming language and then in binary.

#include &lt;stdio.h&gt;

int main() {
    printf("Hello World");
    return 0;
}

The program is created in a more humanly understandable language.

01101001 01101110 01110100 00100000
01101101 01100001 01101001 01101110
00101000 00101001 01111011 00001101
00001010 00100000 00100000 00100000
00100000 01110000 01110010 01101001
01101110 01110100 01100110 00101000
00100010 01001000 01100101 01101100
01101100 01101111 00100000 01010111
01101111 01110010 01101100 01100100
00100010 00101001 00111011 00001101
00001010 00100000 00100000 00100000
00100000 01110010 01100101 01110100
01110101 01110010 01101110 00100000
00110000 00111011 00001101 00001010


To run, the program is converted to binary code, which is the representation that the computer understands.
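Each 8-bit group in the listing above is, in fact, the binary encoding of one character of the program text. As a minimal sketch of this mapping, the following Python snippet reproduces the first row of the listing ("int " is the first four characters of the program):

```python
# Sketch: each character of the source text maps to an 8-bit binary code.
source = "int "  # first four characters of the program text
for ch in source:
    print(format(ord(ch), "08b"), end=" ")
# prints: 01101001 01101110 01110100 00100000
```

Reading binary this way quickly shows why humans prefer to work at the level of the programming language instead.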

Even though it may sound a bit complicated, it is clearly much easier for a human being to understand source code written in a programming language than in binary code. The computer, in turn, can translate code written in a programming language into the binary representation it executes.

Low-level languages

In the 1950s, in the early days of computer science, computer programs were not commercial products. They were tools restricted to university research centers and national defense agencies. For this reason, there was no large market demand for software construction.

Programs at that time were developed by scientists for their own use using a low-level programming language called Assembly.

Assembly language was hard to read: it was tied closely to the machine's own architecture and electronic components. For this reason, it was termed a low-level language.

Another negative feature was the fact that each computer architecture had its own Assembly language. A program written for one machine did not work on another. The source code had to be completely rewritten in Assembly for each machine on which it would run. This was not very productive.

High-level languages

In response to this inefficiency, high-level programming languages emerged: languages more understandable and closer to the way humans communicate. For that reason, they were easier to learn.

The main feature of high-level languages is that the same program can be written once and run on many types of machines. The rework of rewriting an entire program for each different computer was eliminated.

The program's source code was written in a high-level language and then converted to binary code by another program called a compiler, which generated an executable for that particular machine.
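As an analogy for this translation step, Python (an interpreted language whose compiler produces bytecode rather than native binary) lets us peek at the lower-level instructions generated from high-level source. This sketch uses the standard-library `dis` module:

```python
import dis

def add(a, b):
    return a + b

# Show the lower-level instructions the Python compiler
# generates for the function above (exact opcodes vary by version).
dis.dis(add)
```

The printed instructions (loads, an addition, a return) are closer to what the machine works with than the `a + b` the programmer wrote, which is exactly the gap a compiler bridges.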

The importance of High Level Languages

Once high-level programming languages emerged, it became possible not only to write a single program that worked on many machines, but also to spread the use of computer software to sectors of society beyond universities and research centers.

Just look around to see how society is flooded with computer programs. Pick up your cell phone and you will find hundreds of them; they are also in your digital TV, your fridge, the elevator, smart traffic lights, and almost everything we use every day without even realizing it.

In this sense, high-level programming languages ​​played a very important role, as they allowed a considerable reduction in the learning curve of programming activity and drastically simplified the software development process for all sectors of the world economy.

Was this article helpful to you?

So support us and share it with others who are interested in this subject!

