- ADSA Project: Among Us
- Tutorial on Huffman coding
- Corrected exercises on control structures and data structures
- Corrected exercises on time complexity
- Corrected exercises on recursive, tail, multiple and mutual recursion algorithms
- Corrected Bachelor-level exercises on the divide-and-conquer paradigm
- Corrected exercises on divide-and-conquer and dynamic programming algorithms
- Step-by-step corrected exercise on Branch and Bound for integer linear programming
- Corrected exercises on constraint logic programming
- Corrected exercises on sorting algorithms

## Algorithmics

Algorithmics has been around for a very long time: an algorithm is a thought process described in a logical, step-by-step manner. Although this way of working seems essential today to many sciences, the systematization of algorithmic writing evolved over time and with technology. It began in the time of the Babylonians and was concretely established by René Descartes in the general method proposed in the *Discourse on the Method* in 1637.

Modern algorithms, including a so-called "program" language, made their debut shortly afterwards, in 1677. The term we use today, "algorithm", appeared two centuries later.

The algorithmics of the 20th and 21st centuries is often based on formalisms such as the Turing machine, which provides a scientific framework for studying the properties of algorithms.

With the rise of computer science, algorithmics developed considerably in the second half of the 20th century, and Donald Knuth, author of *The Art of Computer Programming*, laid a rigorous mathematical foundation for the analysis of algorithms.

Many formal or theoretical tools have been developed to describe algorithms, study them, express their qualities, and compare them:

- algorithmic structures: control structures and data structures;
- the notions of correctness, completeness and termination;
- complexity theory.

## Control structures and data structures

An algorithm is a sequence of instructions; this sequence is also called the execution flow of a program and can be represented using a control flow graph or an automaton. Programming paradigms that call on several processors are then exposed to the problem of assigning tasks to the processors.

Each programming language has its own control structures and data structures. Algorithms are very often written in terms of widely used languages such as C, Java, etc.
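As a minimal illustration (in Python, chosen here for readability), the sketch below combines the three classic control structures (sequence, selection, iteration) operating on a simple data structure (a list); the function name is ours, not from the source:

```python
def count_evens(values):
    """Count the even integers in `values`.

    Sequence: statements executed in order.
    Selection: the `if` test.
    Iteration: the `for` loop over a list (data structure).
    """
    count = 0                  # sequence: initialisation
    for v in values:           # iteration over the list
        if v % 2 == 0:         # selection
            count += 1
    return count

print(count_evens([1, 2, 3, 4, 5, 6]))  # 3
```

The same execution flow could equally be drawn as a control flow graph: one node per statement, with the `if` and the loop back-edge giving the branching structure.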

## Notions of termination, correctness and completeness

Termination is the assurance that the algorithm will terminate in finite time. Proofs of termination usually exhibit a non-negative integer function (a variant) that strictly decreases at each step or iteration of the algorithm.

Given the guarantee that an algorithm will finish, the proof of correctness must provide the assurance that if the algorithm ends by giving a proposed solution, then this solution is correct (in the sense that it answers the problem posed).

The proof of completeness guarantees that, for a given problem space, the algorithm, if it finishes, will give a solution for each of the inputs of the algorithm.

**Termination**: the algorithm terminates in finite time.

**Correctness**: any answer computed by the algorithm is a solution of the problem.

**Completeness**: the algorithm computes an answer for every instance of the problem.
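The three properties can be checked concretely on a small algorithm. The sketch below (our own example, not from the source) uses binary search on a sorted list: the gap `hi - lo` strictly decreases (termination), a returned index always points at the target (correctness), and every input receives an answer, either an index or `None` (completeness):

```python
def binary_search(arr, target):
    """Search a sorted list `arr` for `target`.

    Termination: `hi - lo` is a non-negative integer that
    strictly decreases at each iteration.
    Correctness: if an index i is returned, arr[i] == target.
    Completeness: every input yields an answer (index or None).
    """
    lo, hi = 0, len(arr)
    while lo < hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1      # mid + 1 > lo, so the gap shrinks
        else:
            hi = mid          # mid < hi, so the gap shrinks
    return None

data = [2, 3, 5, 7, 11, 13]
assert data[binary_search(data, 7)] == 7   # correctness
assert binary_search(data, 4) is None      # completeness: an answer even when absent
```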

## Algorithmic complexity

Algorithmic complexity measures the "natural" performance of an algorithm, independently of the environment in which it will be implemented.

There are several types of performance analysis of an algorithm:

- optimistic analysis (best-case analysis);
- average-case analysis, which often depends on the programming paradigm used;
- pessimistic analysis (worst-case analysis).
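The difference between these analyses is easy to observe on linear search, sketched below by counting comparisons (the instrumentation is ours): the best case touches one element, the worst case touches them all:

```python
def linear_search(arr, target):
    """Return (index or None, number of comparisons made)."""
    comparisons = 0
    for i, v in enumerate(arr):
        comparisons += 1
        if v == target:
            return i, comparisons
    return None, comparisons

data = list(range(1, 101))       # 100 elements: 1..100
print(linear_search(data, 1))    # best case:  (0, 1)
print(linear_search(data, 100))  # worst case, present:  (99, 100)
print(linear_search(data, 0))    # worst case, absent:   (None, 100)
```

On a uniformly random present target, the average case examines about half the list, which is why average-case analysis needs an assumption about the input distribution.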

Complexity classes are sets of problems that all have the same complexity according to a certain criterion. The best-known classes are defined on Turing machines (deterministic or non-deterministic; see the course on automata) with time and space constraints. The relationship between a computation and the minimum energy cost of carrying it out is captured by Landauer's principle.

Here is a list of the most commonly encountered complexities:

| Notation | Complexity type |
|---|---|
| O(1) | constant complexity: an assignment, an elementary calculation, etc. |
| O(log n) | logarithmic complexity: dichotomy, trees |
| O(n) | linear complexity: a loop |
| O(n log n) | quasi-linear complexity: divide and conquer, trees |
| O(n²) | quadratic complexity: nesting two loops |
| O(n³) | cubic complexity: nesting three loops |
| O(nᵏ) | polynomial complexity: nested loops, without knowledge of states |
| n^O(log n) | quasi-polynomial complexity: divide and conquer |
| O(2ⁿ) | exponential complexity: trees, brute force |
| O(n!) | factorial complexity: naive approach to combinatorial enumeration |
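To get a feel for how these orders of growth differ, the sketch below (our own illustration) counts the elementary operations performed by a halving loop (logarithmic), a single loop (linear), and two nested loops (quadratic):

```python
def ops_logarithmic(n):
    """Halving loop: the dichotomy pattern behind O(log n)."""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

def ops_linear(n):
    """A single loop: O(n) operations."""
    count = 0
    for _ in range(n):
        count += 1
    return count

def ops_quadratic(n):
    """Two nested loops: O(n^2) operations."""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

for n in (8, 64, 512):
    print(n, ops_logarithmic(n), ops_linear(n), ops_quadratic(n))
# n = 512 gives 9, 512 and 262144 operations respectively:
# doubling n adds one halving step, doubles the loop count,
# and quadruples the nested-loop count.
```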