From the early stored-program machines based on John von Neumann's design to hip languages like Golang, Swift and Hack today, programming has evolved in interesting ways. The infographic in this article shows a brief history of programming, from the 1950s to the 2000s:
One of the most interesting developments, in my opinion, is how the audience capable of working with a programming language has grown. The first computers had to be programmed entirely by hand, with the zeroes and ones entered by means of punched cards. This demanded a high degree of specialized knowledge, so only highly trained mathematicians could program computers.
With the introduction of programming languages, the process was simplified. Programmers could now simply type their program, after which another program called a compiler would translate it into the zeroes and ones that the computer needs. This revolution meant that programming was no longer limited to mathematicians: more people were now able to learn the conceptual ideas behind a programming language.
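To make that division of labor concrete, here is a minimal sketch in C (one of the languages discussed below). The programmer writes only the human-readable source; the compiler, not the programmer, produces the binary machine code. The file name and commands in the comments are just illustrative.

```c
#include <stdio.h>

/* The programmer types this readable source text... */
int main(void) {
    printf("Hello, world!\n");
    return 0;
}

/* ...and a compiler translates it into raw machine code:
 *   cc hello.c -o hello    (emits a binary of zeroes and ones)
 *   ./hello                (the machine runs that binary directly)
 * No human ever has to write the zeroes and ones by hand.
 */
```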
When computers became more popular in the 1980s, so did programming. With the C programming language, programmers could write portable software that ran on many different machines. Operating systems such as Unix and Windows were becoming more powerful and allowed people who couldn't program themselves to use computers too.
Today almost everyone can use a computer, even toddlers. The same shift in the target audience of programming continues: ever simpler and more powerful languages and platforms are allowing more people to program a computer.
I believe that what the world needs is to make it easier to tell a computer what to do. In fact, enabling more people to make software is one of the biggest productivity improvements possible. We can achieve this not by teaching more people the intricate knowledge required to write code in a programming language, but by making the process of telling the computer what to do ever easier.