(One)-minute geek news

Group talks, Laboratoire de Chimie, ENS de Lyon

Computational complexity


Before exploring Python data structures, we need to understand computational complexity, the canonical measure of an algorithm's performance.

Time versus space complexity

An algorithm's performance can be rated on two main features:

  • The amount of time required for its execution (related to time complexity or cost).
  • The amount of space required for its execution (related to space complexity). It is basically the maximum RAM consumption during execution.
In the following we will mainly focus on time complexity. Note, however, that time and space can often be traded against each other, so it is common to make time-space compromises: sacrificing space complexity for better time complexity (and vice versa), as illustrated by the sketch below.
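
As a small illustration (not from the original talk, just a sketch): caching intermediate results trades extra memory for a much shorter running time. A memoized Fibonacci function is the classic example:

```python
import time
from functools import lru_cache

def fib_naive(n):
    """Exponential time, but only the call stack as extra space."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n):
    """Linear number of calls, but O(n) extra space for the cache:
    a typical time-space compromise."""
    if n < 2:
        return n
    return fib_cached(n - 1) + fib_cached(n - 2)

if __name__ == "__main__":
    for f in (fib_naive, fib_cached):
        start = time.perf_counter()
        f(30)
        print(f"{f.__name__}: {time.perf_counter() - start:.4f} s")
```

The cached version stores up to n intermediate results, so it consumes more memory than the naive one, but its running time grows roughly linearly with n instead of exponentially.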

To estimate the execution time of an algorithm/program on a given instance (or input), we need a model of computation.

The RAM model of computation

A computational model defines a list of primitive operations (or unit-cost operations) that are each assigned a unit cost. There exist many computational models, depending on which kind of computer is considered (Turing machines, parallel programming models, quantum programming models, ...).

The most common one is the RAM model of computation, which describes random-access machines, the usual kind of computer. In this model, the unit-cost operations are:

  • Basic floating point operations (+, -, *, /)
  • Conditional jumps (if, goto, ...)
  • Variable assignments
In the real world, we know that a division takes far more CPU clock ticks than an addition, and that the effective cost of a variable assignment depends on many factors (cache misses, ...). So why not count CPU clock ticks instead? Simply because that measure would be highly machine- and architecture-dependent: it would only yield estimates tied to a particular reference computer. To what extent, then, is this model relevant? Time complexities are good estimators of asymptotic execution time, up to a multiplicative constant.
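
To make this concrete, here is a hedged sketch (the charging of operations below is a simplified, illustrative choice): it tallies the RAM-model unit-cost operations of a plain summation loop and shows that the count grows linearly with the input size, which is exactly what the time complexity captures up to a multiplicative constant.

```python
def sum_with_op_count(values):
    """Sum an iterable while tallying RAM-model unit-cost operations.

    Charging scheme (illustrative): one unit per assignment, one per
    addition, and one conditional jump per loop iteration.
    """
    ops = 1              # initial assignment of `total`
    total = 0
    for x in values:
        total += x       # one addition + one assignment
        ops += 2
        ops += 1         # conditional jump to continue/exit the loop
    return total, ops

if __name__ == "__main__":
    for n in (10, 100, 1000):
        _, ops = sum_with_op_count(range(n))
        print(f"n = {n:4d} -> {ops} unit-cost operations")
```

Here the count is 3n + 1; a different charging scheme would change the constants but not the linear growth, which is why time complexities are only meaningful up to a multiplicative constant.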

Big O notation

Worst case and amortized complexity

TODO