The history of software

How computer programs changed our lives.

by Andrea Signorelli
23 March 2020
8 min read

We’re so used to relying on software in every waking moment that we tend to forget the extraordinary importance it has in our lives. We use software to write, browse the internet, take and edit photos, and send e-mails and WhatsApp messages. And that’s not all. Software increasingly runs our cars, our washing machines and the thermostats in our fridges – the list is endless.

Most of our daily tasks are done with a computer program, prompting the Silicon Valley guru Marc Andreessen to write that “software is eating the world”. This strong statement implies no criticism. It is a simple observation of the fact that the digital tools we use in our daily lives have assumed incredible influence over all of us. This is true not only of programs in popular use, but also of the software underlying the biggest innovations of the last few years. Artificial intelligence is software, and it can process huge amounts of data to spot skin tumours more accurately than a doctor or translate ever more reliably from one language to another, to give just a couple of examples. Blockchains have revolutionised the world of finance and are becoming part of the innovation process at businesses and government authorities. The same goes for the programs supercomputers are using to sequence the human genome and make increasingly accurate weather forecasts.

Some of the most important software is crucial in industry, like Echelon, the fruit of a partnership between Eni and the American company Stone Ridge Technology. Technically, it’s an “advanced dynamic reservoir simulator, for optimisation of field monitoring, development and production”. In layman’s terms, it’s a program that recreates the behaviour of oil fields. Eni uses it to understand how its fields behave and to inform business decisions. Echelon runs more quickly than other products on the market but sacrifices nothing when it comes to precision. Together with HPC5 – the supercomputer announced in October 2015 and unveiled in February 2020, which has brought Eni’s computing capacity up to 52 petaFLOPS (52 million billion operations a second) – it marks an important new milestone on the road to digital transformation.

More and more of the focus is on computing power, and less and less on the human ingenuity that actually gives these machines their calculating abilities. In the words of the famous advertising slogan, “Power is nothing without control”. Simple but powerful words that reflect the balance we need to strike.


The Echelon software harnesses HPC computing capacity

The origins of software

They say software will reach human levels of intelligence one day. If or when that happens, it will be the last piece in a jigsaw started long ago. It all began with programming in binary logic, the endless alternation of 0s and 1s, more than a century before the first computer prototypes. The year was 1725 and the extraordinary idea belonged to one Basile Bouchon, a textile worker. He came up with a perforated paper band that automatically instructed the loom to weave the patterns he wanted into the cloth. It was worlds away from today’s software, and it may seem silly to compare them, but the alternation between hole and no hole works on the same logic as binary code. The foundation stone of programming was laid almost three centuries ago.

But now we must jump forward a hundred years, to the “first programmer in history”. Ada Lovelace was born in London in 1815, to the poet Lord Byron and the mathematician Anne Isabella Milbanke. From early childhood she displayed a remarkable gift for mathematics, and she was just a teenager when she first met Charles Babbage, a Cambridge professor of mathematics. He was then designing the first of two computing machines, the difference engine, which could perform calculations automatically and without error. It would later be followed by the analytical engine, which could in principle carry out any calculation it was set. Their exorbitant costs meant that neither machine was built in Babbage’s era, but that didn’t stop him giving lectures in which he explained the theory behind his inventions. One, held in Turin in 1840, was written up in French by the Italian mathematician Luigi Menabrea and published in 1842.

Lovelace, who had become a countess in the meantime, took up the task of translating the manuscript into English. By the time she finished a year later, her own notes had tripled the length of the original text. Besides showing the world, including a stunned Babbage, her perfect understanding of how the two machines worked, Lovelace took the opportunity to describe what would go down in history as the first computer program. The program she set out (which was never run, since the analytical engine was never built) was designed to calculate Bernoulli numbers. For the first time ever, here was a set of instructions that told a machine to carry out a finite number of operations, following a finite number of rules, to produce an answer. In brief, Lovelace had created a program based on an algorithm – a word that now, in 2020, is one of the most frequent in discussions about artificial intelligence and other digital innovations.
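Lovelace’s notes were written for the analytical engine, of course, but the underlying idea translates naturally into modern code. Purely as an illustrative sketch – not a reconstruction of her original program – here is how a short Python routine might compute Bernoulli numbers using a classic recurrence and exact fractions:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for every m >= 1."""
    B = [Fraction(1)]                      # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))             # solve the recurrence for B_m
    return B

for i, b in enumerate(bernoulli(8)):
    print(f"B_{i} = {b}")                  # e.g. B_2 = 1/6, B_4 = -1/30
```

Each step applies a fixed rule a finite number of times and then stops with an answer – exactly the property of an algorithm that Lovelace had identified.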

Ada Lovelace was a visionary. Back in Victorian Britain, she managed to see how numbers could represent much more than just simple quantities. She imagined a world of tomorrow in which computers could compose music, draw and serve the most important fields of science. Her vision went on to become a reality, even if it did take more than a century.


From Turing’s imitation to brain replication

Lovelace would later inspire Alan Turing, the British mathematician who in the Second World War cracked Enigma, the machine the Germans used to encrypt their communications. Turing quoted Lovelace in his seminal text “Computing Machinery and Intelligence”, in which he examined whether machines might one day learn to think for themselves.

We’ve now arrived at the dawn of the first genuine software. Drawing on some of Turing’s ideas (in particular those behind his universal Turing machine), the Hungarian-born mathematician John von Neumann set out the design of EDVAC (Electronic Discrete Variable Automatic Computer) in 1945. It was the first published design for a digital machine whose program was stored in the same memory as its data, an approach that became known as the von Neumann architecture.
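To make the stored-program idea concrete, here is a deliberately tiny sketch – an invented toy machine, not EDVAC’s actual instruction set – in which instructions and data share one memory and a single fetch-decode-execute loop drives everything:

```python
def run(memory):
    """A toy stored-program machine: fetch an instruction from memory,
    decode it, execute it, repeat until HALT."""
    pc = 0          # program counter
    acc = 0         # accumulator
    while True:
        op, arg = memory[pc]          # fetch
        pc += 1
        if op == "LOAD":              # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share one memory: cells 0-3 hold instructions,
# cells 4-6 hold data. The program computes memory[6] = memory[4] + memory[5].
memory = {
    0: ("LOAD", 4),
    1: ("ADD", 5),
    2: ("STORE", 6),
    3: ("HALT", None),
    4: 2,
    5: 3,
    6: 0,
}
print(run(memory)[6])  # -> 5
```

Because instructions and data live in the same memory, a machine like this could in principle rewrite its own program – the defining trait of the stored-program, von Neumann approach.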

Thanks to these pioneers, the foundations of software were laid. The first revolutionary applications gradually appeared in research laboratories, and there was an equally gradual shift from the world of science to the world of mass consumption. In 1961, work began at MIT on Spacewar!, one of the first video games in history. Four years later, MIT came out with software that let users of a shared computer send short text messages to one another – the genesis of e-mail. In 1975 Telenet, one of the forerunners of the internet, was launched. Microsoft Word arrived in 1983, consigning typewriters to the scrapheap. Our relationship with photographs was changed forever when the JPEG format came along in 1992.

We could go on all day with examples. In the second half of the 20th century, software spread and changed both the world of research and our daily lives. The combination of ever more sophisticated programs and ever more powerful hardware is the bedrock of some of the most important scientific tools of our time. Software running on the SpiNNaker supercomputer was the first to (partially) reproduce the functioning of the human brain in real time. Other current software is letting us develop new medicines more quickly and simulate the Big Bang; one day, perhaps, it will help us predict earthquakes. Since its beginnings as the brainchild of a textile worker, software has revolutionised every field you can think of. Rather than “eating the world”, as Andreessen puts it, software has conquered it.