
The computer’s history

A tale that began a long time ago and is still shaping our era.

by Sabato Angeri
05 February 2021
11 min read

If we were to ask ourselves what really changed the world, computers would immediately come to mind. Since their invention, they have spread into virtually every sphere of our lives. But if a child asked you out of the blue how the first computer was born, would you have a prompt and confident reply? It's not that straightforward, after all.

Computer ancestors

You could perhaps start by saying that computers are not as recent an invention as is generally believed. The term itself had already appeared in the seventeenth century, in the writings of the English philosopher Sir Thomas Browne, who used "computers" to mean people who performed calculations. If we then ask what the original function of the machine was, many would answer "to perform mathematical operations", and the abacus, devised by the Assyrian-Babylonians, could be identified as the first ancestor of the computer. Let's not forget, moreover, that the same region was also the cradle of mathematics, and that it was the Arabs who, millennia later, brought the West the concept of zero, one of the two digits of the binary system.

The history of the mechanical devices used to carry out calculations begins in the seventeenth century, with the most disparate inventions. Building on Napier's theories on logarithms, many scientists constructed the first slide rules, such as William Oughtred's of 1632, which made it possible to perform complex mathematical operations. Others, such as the famous French philosopher and mathematician Blaise Pascal, went further, devising new tools such as the Pascaline (1642), a mechanical calculator that used gears to execute calculations on numbers up to 999,999,999.

Vintage illustration of a Pascaline.

Between failure and progress

However, it was only in the nineteenth century that the history of computers took two fundamental steps forward, albeit theoretical ones. The research of Charles Babbage and George Boole, both British philosophers and mathematicians, laid the foundations for what, less than a century later, would become the first computer prototype as we understand it today. Boole associated logic with mathematics, and is therefore considered the founder of the branch of algebraic logic that underpins the theoretical design of computers, while Babbage was the first to conceive and develop the idea of a programmable computer. Babbage also designed a steam-driven calculating machine that could compute tables of numbers. The project, funded by the British government and called the Difference Engine, was a failure, but Babbage did not give up and continued his research.

A few decades earlier, the French inventor Joseph-Marie Jacquard had devised the so-called Jacquard loom, a machine that used punched cards to weave patterns onto fabric automatically. Applying a similar principle to mathematical operations, Babbage designed the Analytical Engine. The device was equipped with a punched-card input system, as in Jacquard's design, as well as an arithmetic processor to compute numbers, a control unit to ensure the correct task was being performed, an output mechanism, and a memory where numbers waited to be processed. Even readers with no background in information technology will recognise that the Analytical Engine contained the essential features of a modern computer, and indeed many consider it the first one. Unfortunately, after the failure of his first project, Babbage was unable to find sufficient funding to build the new machine.
However, he managed to gain the interest of an English noblewoman, Lady Ada Lovelace, daughter of the great poet Lord Byron, who became so passionate about Babbage's research that she turned into one of its main promoters. Lady Lovelace wrote several programs for the Analytical Engine, and among her notes was found what historians of science consider the first algorithm expressly created to be processed by a machine, which makes her the first computer programmer in history.
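Boole's insight, that logic and arithmetic are two faces of the same coin, is easy to illustrate today. The sketch below (plain modern Python, not tied to any historical machine) builds binary addition out of nothing but the logical operations AND, OR and NOT, exactly the kind of algebra Boole formalised:

```python
# Binary addition built purely from Boolean logic (AND, OR, NOT),
# illustrating Boole's link between logic and arithmetic.

def xor(a, b):
    """'Exclusive or' expressed with AND, OR and NOT."""
    return (a or b) and not (a and b)

def half_adder(a, b):
    """Add two bits: returns (sum bit, carry bit)."""
    return xor(a, b), a and b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 or c2

def add_binary(x_bits, y_bits):
    """Add two equal-length bit lists, least significant bit first."""
    result, carry = [], False
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)  # the final carry becomes the top bit
    return result

# 3 (binary 11) + 1 (binary 01) = 4 (binary 100), bits listed LSB first
bits = add_binary([True, True], [True, False])
print([int(b) for b in bits])  # → [0, 0, 1], i.e. binary 100 = 4
```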

An aid for institutions and private companies

A few years later, in 1890, a calculating machine was first put to work by a government to streamline administrative procedures: the Hollerith Machine, named after the American engineer Herman Hollerith, who developed it using a code of his own devising. The device was used to tabulate national census data. Hollerith's idea was to record all the census data on punched cards (like those used by Jacquard and Babbage) and have them processed by the machine he had created. Transcribing the data by hand had previously taken up to seven years; Hollerith needed only three, saving the US government, among other things, a whopping 5 million dollars (a huge amount in those days). In the wake of this success, in 1896 the engineer founded his own company, the Tabulating Machine Company, which in 1924 became the International Business Machines Corporation, better known by the acronym IBM.

In the twentieth century, the history of computers accelerated dramatically. At first, the use of electricity and the invention of the valve allowed significant advances, but computers remained colossal, bulky machines that weighed tons and took up tens of square metres of space. It is indeed no coincidence that the British named Colossus the programmable electronic computer they employed from 1943 to decipher secret Nazi messages encrypted with the Lorenz machine, a more sophisticated cousin of the famous Enigma cipher machine. The year 1936 was a milestone, a true watershed in the evolution of information technology: that year, the English physicist and mathematician Alan Turing presented his universal machine, later renamed the Turing Machine, capable in principle of computing anything that is computable. Turing is considered the father of modern computer science and one of the fathers of Artificial Intelligence: he famously argued that one day even machines would be able to think.

The Turing Machine.
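Turing's universal machine is an abstraction, but its mechanics fit in a few lines of code: read a symbol from a tape, consult a table of rules, write, move, repeat. The sketch below is an illustrative Python toy (not Turing's own formulation) running a machine whose rule table adds 1 to a binary number written on the tape:

```python
# A minimal Turing machine: rules map (state, symbol) -> (write, move, next state).
# This particular rule table increments a binary number on the tape.

def run_turing_machine(tape, rules, state="right", head=0):
    cells = dict(enumerate(tape))       # unbounded tape as a sparse dict
    while state != "halt":
        symbol = cells.get(head, "_")   # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

# Rules to add 1 to a binary number (most significant bit first):
# first walk right to the end of the number, then carry leftwards.
increment = {
    ("right", "0"): ("0", "R", "right"),
    ("right", "1"): ("1", "R", "right"),
    ("right", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),  # 1 + carry = 0, keep carrying
    ("carry", "0"): ("1", "L", "halt"),   # absorb the carry and stop
    ("carry", "_"): ("1", "L", "halt"),   # overflow: new leading 1
}

print(run_turing_machine("1011", increment))  # → 1100 (11 + 1 = 12)
```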

The first large-scale computer production

If one side of the Atlantic had Colossus, the other had ENIAC, considered by many to be the first prototype of the modern computer. The Electronic Numerical Integrator and Computer was completed in 1946 by J. Presper Eckert and John Mauchly. The US government had commissioned the machine to perform ballistic calculations for artillery fire. To give just one figure, ENIAC needed 18,000 valves to work and, because of the enormous heat it generated, valves burned out constantly. Its successor, the EDVAC (Electronic Discrete Variable Automatic Computer), marked a further crucial evolutionary step, as it was the first to incorporate the theories of the Hungarian-American scientist John von Neumann, who completely overturned the internal architecture of the machines used until then by placing data and instructions together in the machine's memory. This revolution gave rise to machines with an internal stored program, like every computer built since.

This first era in the history of computing was brought to a close by IBM, which in 1954 launched the 704, one of the first computers marketed on a large scale for non-governmental purposes, though still very expensive and bulky. The invention of the transistor, which replaced the fragile vacuum valve, allowed computing power to increase exponentially. The IBM 7090, from 1959, was already six times more powerful than its predecessor and was used by NASA for the Mercury and Gemini missions. A few years later, the PDP-1, built by the American company DEC, came onto the market and went down in history as the first computer on which a video game (Spacewar!, designed by Steve Russell) was played. Technological acceleration gained new impetus in 1964, when IBM introduced the System/360, among the first computers to run on silicon-based integrated circuitry in place of discrete transistors.
The 360 introduced the concepts of modularity and of a uniform language across devices, paving the way for the standardization of languages that took place in the following decades. The history of computers then took a radical turn with the Xerox Star, which inspired the first experiments by Apple and Microsoft. The Star was the first commercial computer equipped with a graphical user interface (GUI), allowing users to interact with the machine by manipulating familiar graphic objects on screen through icons and a mouse. Moreover, as their size shrank dramatically, computers became ever more accessible to the public.
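Von Neumann's stored-program idea, instructions and data living side by side in the same memory, is the one feature every machine from the EDVAC onwards shares with today's computers. A minimal sketch (illustrative Python with a made-up four-instruction set, not any real machine's architecture) makes it concrete:

```python
# A toy stored-program machine: instructions and data share one memory,
# which is the essence of the von Neumann architecture.

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]            # fetch the instruction *from memory*
        pc += 1
        if op == "LOAD":
            acc = memory[arg]           # read a data cell into the accumulator
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc           # write the accumulator to a data cell
        elif op == "HALT":
            return memory

# Program (cells 0-3) and data (cells 4-6) live in the SAME memory.
memory = {
    0: ("LOAD", 4),
    1: ("ADD", 5),
    2: ("STORE", 6),
    3: ("HALT", None),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])  # → 5
```

Because the program is just more memory contents, a program can in principle read or rewrite other programs, the property that made general-purpose software possible.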

One of the most iconic Apple Macintosh personal computer models.

The rise of personal computers

Nevertheless, to understand how the enormous spread of computers and their transformation into personal computers came about, it's necessary to digress a little, by way of Italy. Silicon-gate technology, developed in 1968 by the Italian physicist Federico Faggin at Fairchild Semiconductor, opened the door to Large Scale Integration (LSI), making it possible to put a complete CPU onto a single chip and significantly reducing production costs. That same year, Robert Noyce and Gordon Moore left Fairchild Semiconductor and founded the Integrated Electronics Corporation, soon shortened to Intel Corporation. Faggin, too, later left Fairchild and joined Intel, where he led the design of the first microprocessor, the Intel 4004, launched in 1971.

This brings us to the 1970s, the era of garage companies founded by college students and of great innovators suddenly appearing on the horizon. One of the most famous, Steve Jobs, founded Apple with Steve Wozniak and from 1976 began selling the Apple I. Many units of this first experiment were housed in wooden chassis: there were no prefabricated cases at the time, so users simply built their own. The Apple II, launched in 1977, was the first real personal computer: the first successful model produced on an industrial scale and marketed for home use (it cost 'only' $1,195). All the electronic components were hidden away in a plastic case, giving rise to the shape still familiar today. Meanwhile, Bill Gates and Paul Allen had founded the giant Microsoft. The final act before the Internet revolution took place, according to many, on 12 August 1981, when IBM answered Apple by launching the IBM 5150, better known as the IBM PC, with an operating system supplied by Bill Gates' Microsoft. It was a global success: in just one month the American company sold 50,000 PCs, and more than 200,000 units over the year.
Never before had so many computers been sold, and the IBM 5150 became the standard on which later machines were built, thanks to its open, and therefore replicable, architecture. The history of computers is a long and constantly evolving one, and perhaps one day, not too far off, that child's question will be answered by a computer itself, narrating its own story.