# Algorithmics

Contents

## Algorithmics

Algorithmics has been around forever: it is a thought process described in a logical, step-by-step manner. Although this system seems essential to us today for many sciences, the systematization of algorithmic writing has evolved over time and with technology. It began in the time of the Babylonians and was concretely established by René Descartes in the general method proposed in the Discourse on the Method in 1637. Modern algorithms, expressed in a so-called "program" language, made their debut shortly afterwards, in 1677. The term we use today, "algorithm", appeared two centuries later.

The algorithms of the 20th and 21st centuries are often based on formalisms such as Turing machines, which provide a scientific framework for studying the properties of algorithms.

With the rise of computer science, algorithms developed considerably in the second half of the 20th century, and Donald Knuth, author of The Art of Computer Programming, laid a rigorous mathematical foundation for their analysis.

Many formal or theoretical tools have been developed to describe algorithms, study them, express their qualities, and be able to compare them:

• algorithmic structures: control structures and data structures.
• the notions of correctness, completeness and termination.
• complexity theory.

## Control structures and data structures

A control structure is a command that determines the order in which the various instructions of an algorithm or a computer program are executed. The processor is responsible for remembering the address of the next instruction to be executed, whether inside blocks (loops, conditionals, etc.) or in subroutines.

This sequence of instructions is also called the execution flow of a program and can be represented using a control flow graph or an automaton. Programming paradigms that call on several processors then face the problem of assigning tasks to the processors.

A data structure is a logical structure intended to contain data, giving it an organization that simplifies its processing. There are many types of data structures depending on the operations to be performed on them; we will not dwell further on this subject.
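As one minimal sketch of such an organization, here is a stack (last in, first out), built on a Python list; the LIFO discipline is exactly the kind of organization that simplifies processing when the most recent item is needed first:

```python
class Stack:
    """A minimal LIFO data structure backed by a Python list."""

    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)   # add on top of the stack

    def pop(self):
        return self._items.pop()   # remove from the top (LIFO order)

    def __len__(self):
        return len(self._items)

s = Stack()
s.push(1)
s.push(2)
s.push(3)
print(s.pop(), s.pop())  # 3 2
```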

Each computer language has its own control structures and data structures. The writing of an algorithm is very often based on widely used languages such as C, Java, etc.

## Notions of termination, correctness, completeness

Termination is the assurance that the algorithm will finish in a finite time. Proofs of termination usually involve a positive integer quantity that strictly decreases at each step/iteration of the algorithm.

Given the guarantee that an algorithm will finish, the proof of correctness must provide the assurance that if the algorithm ends by giving a proposed solution, then this solution is correct (in the sense that it answers the problem posed).

The proof of completeness guarantees that, for a given problem space, the algorithm, if it finishes, will give a solution for each of the inputs of the algorithm.

In summary:

• Termination: the algorithm finishes in a finite time.
• Correctness: any answer computed by the algorithm is a solution of the problem.
• Completeness: the algorithm can compute an answer for any instance of the problem.
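These three notions can be illustrated on binary search over a sorted list, a sketch not tied to any particular library: the search interval strictly shrinks (termination), every returned index really points at the target (correctness), and every input gets an answer, with None standing for "absent" (completeness).

```python
def binary_search(sorted_items, target):
    """Return an index i with sorted_items[i] == target, or None."""
    lo, hi = 0, len(sorted_items)
    while lo < hi:                    # variant: hi - lo strictly decreases
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid                # correctness: the item at mid is the target
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return None                       # completeness: an answer even when absent

data = [2, 3, 5, 7, 11]
print(binary_search(data, 7))   # 3
print(binary_search(data, 4))   # None
```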

## Algorithmic complexity

The complexity of algorithms is an evaluation of the cost of running an algorithm in terms of time (temporal complexity) or memory space (spatial complexity). The efficiency of an algorithm therefore depends on these two criteria, which we will try to minimize.

Algorithmic complexity makes it possible to assess the intrinsic efficiency of an algorithm: its "natural" performance, independent of the environment in which it will be implemented.

There are several types of analysis of the performance of an algorithm:

• Optimistic analysis (best-case analysis).
• Average-case analysis, which often depends on the programming paradigm used.
• Pessimistic analysis (worst-case analysis).
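The difference between these analyses is easy to see by counting comparisons in a linear search, a small sketch for illustration:

```python
def linear_search_count(items, target):
    """Return (index_or_None, number_of_comparisons_performed)."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return None, comparisons

data = list(range(10))
print(linear_search_count(data, 0))   # best case: (0, 1)
print(linear_search_count(data, 9))   # target at the end: (9, 10)
print(linear_search_count(data, 42))  # worst case, absent: (None, 10)
```

On an input of size n, the best case costs 1 comparison, the worst case n; the average case depends on the distribution of the inputs.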

Complexity classes are sets of problems that all have the same complexity according to a certain criterion. The best-known classes are those defined on Turing machines (deterministic or non-deterministic; see the course on automata) with time and space constraints. There is also a relationship between algorithmic complexity and the energy cost of computation, known as Landauer's principle.

Here is a list of the most commonly encountered complexities:

• O(1): constant.
• O(log n): logarithmic.
• O(n): linear.
• O(n log n): quasi-linear.
• O(n²): quadratic.
• O(2ⁿ): exponential.
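To make these orders of magnitude concrete, here is a small sketch tabulating the common complexity functions at a given input size (the function name is illustrative):

```python
import math

def growth_table(n):
    """Return the values of common complexity functions at size n."""
    return {
        "O(1)": 1,
        "O(log n)": math.log2(n),
        "O(n)": n,
        "O(n log n)": n * math.log2(n),
        "O(n^2)": n ** 2,
        "O(2^n)": 2 ** n,
    }

for name, value in growth_table(20).items():
    print(f"{name:12s} {value:12.1f}")
```

Even at n = 20, the exponential term already dwarfs the polynomial ones, which is why worst-case analysis cares so much about these classes.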