Algorithms are present in all aspects of our daily lives. These are their main historical milestones, from the 19th century to the present day
In the first half of the 19th century, British mathematician Ada Lovelace wrote what is now considered to be the first computer algorithm in history. Although she remained practically forgotten for more than a century, this pioneer started a long journey that continues to this day with the effervescence of artificial intelligence.
The history of the algorithm is the long story of a concept that for decades was not even known by that name. We usually understand it as the computer instructions that make a machine work, and it is one of the themes around which Code and Algorithms. Sense in a Calculated World revolves: the exhibition on show at our Space until 16 April 2023, where you can unravel the role algorithms play in many aspects of our lives.
But algorithms have not always been there. The word derives from the name of the Persian mathematician Muhammad ibn Musa al-Khwarizmi, who lived in the 9th century, but it was not associated with computer science until the 20th century. It was then that the role played in this history by Ada Lovelace, the 207th anniversary of whose birth falls this December 2022, gradually began to be recognised.
With her and her creation began a history that continues to produce milestones today, one difficult to compress into a single timeline. Algorithms are multiplying across every area of human development, in a complex and branching technological evolution. Even so, we have made the effort to summarise it in this chronology:
- 1843: Ada Lovelace and the first algorithm
- 1847: Boolean algebra
- 1912: An algorithm for playing chess
- 1936: The term ‘algorithm’ takes hold
- 1938: The first programmable computer
- 1950: Algorithms for weather prediction
- 1954: FORTRAN is born
- 1956: The term artificial intelligence is coined
- 1963: Algorithms learn to play games
- 1976: The father of fractals
- 1980: Algorithms able to see
- 1983: Free software
- 1985: Babbling machines
- 1991: Linux is born
- 1996: Kasparov loses to Deep Blue
- 1999: PageRank, Google’s algorithm
- 2009: EdgeRank, social instructions
- 2020: GPT-3, artificial intelligence that writes
1843: Ada Lovelace
Considered “the first programmer in history”, Ada King, Countess of Lovelace, is known for having written the first algorithm. Closely associated with great intellectual figures of her time, she became fascinated by a highly complex calculating machine designed by the mathematician Charles Babbage. Although the Analytical Engine (as it was christened) was never completed, Ada translated from Italian a publication describing the device. To this documentation she added what we would now consider a computer program, intended to perform a very specific calculation: the Bernoulli numbers.
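Her notes described, step by step, how the machine could produce these numbers. As a rough modern illustration of that calculation (not Lovelace's actual procedure; the function name and recurrence-based approach are our own choices), a few lines of Python can generate them:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n Bernoulli numbers B_0..B_{n-1} as exact fractions, using the
    standard recurrence: sum over j of C(m+1, j) * B_j = 0, for m >= 1."""
    B = [Fraction(1)]                 # B_0 = 1
    for m in range(1, n):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))        # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0']
```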
1847: ones and zeros
Just a few years later, the mathematician George Boole developed a tool that is still fundamental to computer programming today. Boolean algebra, which sought a mathematical description of logical thought, laid the foundations for the 1s and 0s (signal/no signal) with which we encode any digital information today.
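A minimal sketch of the idea, using Python's integer operators as modern stand-ins for Boole's operations (the function names are ours):

```python
# Boole's basic operations, expressed over 1s and 0s.
def AND(a, b): return a & b   # 1 only when both signals are present
def OR(a, b):  return a | b   # 1 when at least one signal is present
def NOT(a):    return 1 - a   # inverts the signal

# Any logical statement reduces to combinations of these,
# e.g. XOR: "exactly one of the two signals is present".
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', XOR(a, b))
```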
1912: an algorithm to play with
Decades before Deep Blue's famous 1996 victory over Garry Kasparov, the Spaniard Leonardo Torres Quevedo develops an algorithm that allows a machine of his invention to make certain chess moves. The instructions ensure that his automaton can win a game against a human, given certain starting conditions.
1936: the christening arrives
Alan Turing is one of the fathers of modern computing. To him we owe some of the principles that make today’s computers work, but also the formalisation of the notion of ‘algorithm’ as a precisely defined sequence of instructions that a machine can carry out.
1938: computer war
Just before the Second World War, the German Konrad Zuse completes his Z1, today regarded as the first programmable computer. But it was the Mark I, built by IBM in the USA, that gained historic fame. The Z1, whose limitations Zuse himself acknowledged, was destroyed in Allied bombing raids.
1950: Algorithms for weather prediction
John von Neumann and his wife Klára Dán von Neumann (one of the pioneers of programming) develop the first successful computerised weather forecast, on the ENIAC computer. The forecast for the following day took 24 hours to compute.
1954: FORTRAN is born
Developed by John Backus for IBM, FORTRAN was the first programming language to become a standard for the computer world. Its popularity was key to digitisation in many areas and it is still used by programmers today.
1956: The term artificial intelligence is coined
During the Dartmouth conference convened by John McCarthy, the term “artificial intelligence” takes hold. Under this umbrella falls the set of algorithms that try to replicate the processes we consider characteristic of human thought.
1963: Algorithms for playing tic-tac-toe
Donald Michie designs a program called MENACE, capable of learning to play (and win) the classic game of tic-tac-toe. The design was actually implemented with a set of hundreds of matchboxes, whose contents were adjusted to reward or penalise each move after every game; its mechanics are considered a key step in the advancement of machine learning.
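The mechanism can be imitated in a few lines. Below is a toy, hypothetical simplification of Michie's design in Python; the bead counts, reward values and random opponent are all our own illustrative choices:

```python
import random

# One "matchbox" per board state, holding beads for each legal move.
# Winning moves gain beads, losing moves lose them.
boxes = {}  # board state -> {move index: bead count}

def winner(b):
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for i, j, k in lines:
        if b[i] != ' ' and b[i] == b[j] == b[k]:
            return b[i]
    return None

def menace_move(board):
    state = ''.join(board)
    if state not in boxes:  # new matchbox: 3 beads per legal move
        boxes[state] = {i: 3 for i, c in enumerate(board) if c == ' '}
    moves, weights = zip(*boxes[state].items())
    return state, random.choices(moves, weights=weights)[0]

def play_one_game():
    board, history = [' '] * 9, []
    for turn in range(9):
        if turn % 2 == 0:                       # MENACE plays 'X'
            state, move = menace_move(board)
            history.append((state, move))
            board[move] = 'X'
        else:                                   # random opponent plays 'O'
            empty = [i for i, c in enumerate(board) if c == ' ']
            board[random.choice(empty)] = 'O'
        w = winner(board)
        if w:
            delta = 3 if w == 'X' else -1       # reward wins, punish losses
            for state, move in history:
                boxes[state][move] = max(1, boxes[state][move] + delta)
            return w
    return 'draw'

random.seed(0)
results = [play_one_game() for _ in range(5000)]
print('X wins in last 1000 games:', results[-1000:].count('X'))
```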
1976: the father of fractals
Benoit Mandelbrot develops fractal geometry, which describes many of the behaviours of physics and nature. Its application through algorithms in fields as varied as medicine, 3D graphics, finance and astrophysics represented a giant step forward in computer science.
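The most famous example is the set that bears his name, generated by iterating a very short rule. A minimal illustrative sketch (the resolution, bounds and iteration limit here are arbitrary choices):

```python
# The iteration behind the Mandelbrot set: z -> z**2 + c,
# with c taken from a grid of complex numbers.
def escapes(c, max_iter=30):
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:          # once |z| > 2 the orbit diverges
            return True
    return False

# Render a coarse ASCII view of the set.
for y in range(21):
    row = ''
    for x in range(60):
        c = complex(-2.2 + x * 0.05, -1.2 + y * 0.12)
        row += ' ' if escapes(c) else '#'
    print(row)
```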
1980: algorithms capable of seeing
Kunihiko Fukushima develops the ‘Neocognitron’ model, a layered network that is a precursor of the neural networks that today make advances in artificial intelligence possible. Together with the algorithms of neuroscientist David Marr, capable of detecting and recognising patterns in the shapes of objects, machine vision begins to take shape.
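At the heart of such systems is the idea of sliding a small pattern detector across an image. A minimal sketch of that convolution idea (the image and filter below are invented for illustration, not taken from Fukushima's or Marr's work):

```python
# Slide a small filter over an image to detect a local pattern,
# here a vertical edge between a dark region and a bright one.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
kernel = [[-1, 1], [-1, 1]]   # responds strongly where dark meets bright

def convolve(img, ker):
    kh, kw = len(ker), len(ker[0])
    out = []
    for y in range(len(img) - kh + 1):
        row = []
        for x in range(len(img[0]) - kw + 1):
            row.append(sum(img[y + dy][x + dx] * ker[dy][dx]
                           for dy in range(kh) for dx in range(kw)))
        out.append(row)
    return out

for row in convolve(image, kernel):
    print(row)   # the largest responses mark the vertical edge
```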
1983: free software
Richard Stallman, concerned about the limits that intellectual property rights place on programming, launches the free software movement, whose principles continue to be debated today.
1985: Babbling machines
Terry Sejnowski creates NetTalk, an algorithm capable of learning to pronounce written text much as a baby learns to speak. It is a breakthrough in machine learning and in speech synthesis, a technology in ever wider use today.
1991: Linux is born
The Linux operating system is created by Linus Torvalds. Developed under the principles of free software, its kernel later became the basis of Android, which a large part of the population carries on their mobile devices. More than 10,000 programmers are estimated to have contributed to its code.
1996: Kasparov loses to Deep Blue
World chess champion Garry Kasparov loses a game to Deep Blue, a chess computer developed by IBM. The machine’s victory is considered one of the great milestones of computing, for its ability to match the best of human capabilities.
1999: PageRank, the Google algorithm
Google implements the first version of PageRank, the algorithm behind what would become the most popular search engine in the Western world. With it, Google was able to rank the pages most relevant to a given search in an ordered list. This innovation changed the way internet users accessed information.
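The core intuition is that a page is important if other important pages link to it. A minimal power-iteration sketch over a hypothetical four-page web (the link graph and damping factor are invented for illustration; the production algorithm has many refinements):

```python
links = {                      # tiny made-up web: page -> pages it links to
    'A': ['B', 'C'],
    'B': ['C'],
    'C': ['A'],
    'D': ['C'],
}
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}
damping = 0.85                 # probability of following a link vs jumping anywhere

for _ in range(50):            # iterate until the ranks stabilise
    new = {}
    for p in pages:
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # 'C' ranks highest
```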
2009: EdgeRank, social instructions
Facebook implements the algorithm that controls which posts users see on the social network. These computer instructions take users’ social interactions into account to estimate their interests, a substantial change in the way internet users consume information.
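As publicly described at the time, EdgeRank combined the affinity between users, the weight of each interaction type and a time decay. A toy scoring sketch under those assumptions (all weights and numbers below are invented for illustration; the real system was far more complex and has since been replaced):

```python
# Score in the spirit of EdgeRank: affinity x edge weight x time decay.
EDGE_WEIGHTS = {'like': 1.0, 'comment': 4.0, 'share': 6.0}

def edge_score(affinity, edge_type, age_hours, half_life=24.0):
    decay = 0.5 ** (age_hours / half_life)   # older interactions count less
    return affinity * EDGE_WEIGHTS[edge_type] * decay

def post_score(edges):
    """Score a post by summing the contributions of its interaction edges."""
    return sum(edge_score(*edge) for edge in edges)

# A fresh post from a close friend (high affinity, recent interactions)
# versus an older, busier post from a distant acquaintance.
close_friend = [(0.9, 'comment', 2.0), (0.9, 'like', 1.0)]
acquaintance = [(0.2, 'share', 30.0), (0.2, 'comment', 28.0), (0.2, 'like', 26.0)]
print(post_score(close_friend), post_score(acquaintance))  # friend's post wins
```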
2020: artificial intelligence that writes
The company OpenAI publishes GPT-3, an artificial intelligence model able to write in a human-like way. The program represents a radical change in the way we relate to machines, and in December 2022 the company takes another giant step forward by making this technology available through a chatbot, surprising for its reasoning abilities and the breadth of its knowledge.
Condensing the history of the algorithm into a few milestones does not do justice to the immense adventure of the millions of programmers and mathematicians behind it. Starting with Ada Lovelace who, incidentally, when reflecting on Babbage’s Analytical Engine, expressed her doubts that such a machine could ever produce genuinely original analysis. Today she would be stunned by the advances made by the algorithms that followed hers.