A Journey through time – Programming Part 1
Let’s take a deep dive in time and look at how the world of computers and programming languages began. Follow us as we go through the ages, looking at the timeline and at the major personalities and events that paved the road to our current Technological and Digital Era.
Let’s take a journey through the timeline of programming
We talk a great deal about software development on our blog. Heck, that’s what we do after all. But did you ever wonder where it all began? What was the first step humanity took towards the Digital World we know and love today?
Our team prides itself on outstanding results in our own work and pushing development to the next level. And to improve even more, we must understand how it all started.
“If you want to understand today, you have to search yesterday.” – Pearl Buck
It’s so fortunate that I recently finished a time machine built out of papier-mâché, a legacy Python branch from 1999, and a quantum field stabilizer.
The first thing I thought was: “What’s the best way to test it?”. The answer was obvious: we’ll use it to take a glimpse at our biggest passion, Software Development! And more precisely, the world of computers and programming!
So, strap in, hold on tight and join us on our journey through time!
First stop – the 1800’s!
Part 1: Jacquard and loom programmers
More specifically, the year is 1804. It all began with one man, Joseph Marie Jacquard, and his programmable loom. The loom immensely simplified the manufacturing of textiles with complex patterns. You programmed the machine with a set of punched cards laced together into sequences. And so, loom programmers were born.
What does this have to do with programming and computer development? Well, look at the IBM punched cards of the mid-20th century.
We’ll be coming back to punch cards later, but Mr Jacquard deserves mention as an important footnote: IBM engineers took inspiration from his cards for their own punched-card machines.
Let’s move about four decades ahead.
Part 2: Ada Lovelace and the Analytical Engine
We’re in 1842, and we want to take a look at Augusta Ada King, Countess of Lovelace, and her work on the Analytical Engine with none other than Charles Babbage, also known as the “father of the computer”. And here’s what it looked like:
The story of the “first” computer is somewhat disappointing: you see, it was never actually constructed in the 19th century. A few components were built by Mr. Babbage, but he never saw it completed and died in 1871. This was mostly due to monetary problems, and possibly the technological inability to construct it back in the day. The recreation you see is the work of the 20th century.
Here’s the kicker, though: even though the machine was never completed, Ms Lovelace was able to create a theoretical program for it while translating and annotating a paper about the machine by the Italian mathematician Luigi Menabrea.
Because she devised a complex method (or code) for using the machine to calculate Bernoulli numbers, scholars widely recognize Ms Lovelace as the first programmer.
There’s a lot of discourse about who should be credited with the “invention” of programming languages. The primary argument is that without Babbage’s engines, which based their operation on simple mathematical functions/language, Ada would not have been able to do her magic.
At this point, programming is only in the “mechanical” stage of its development, and it will still be some time before humanity gets to play around with code.
Part 3: The Mathematical Analysis of Logic
We’re still in the 1800’s, 1847 to be precise, because there’s one important person we want to see, and an even more important event to witness – the birth of binary code.
Now, it’s important to note that the “binary” approach is credited to Gottfried Leibniz and his work “Explication de l’Arithmétique Binaire”, published in 1703.
And Leibniz’s work was in turn inspired by the much older Chinese “Book of Changes” (I Ching). Mr. Leibniz wanted to find a way to translate verbal logic statements into purely mathematical ones. Thus, he created a system of rows of zeroes and ones.
A system we all know and love today:
01001000 01100101 01101100 01101100 01101111 00100000 01000101 01000010 01010011 00100000 01001001 01101110 01110100 01100101 01100111 01110010 01100001 01110100 01101111 01110010
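Today, decoding a string like that is a one-liner. Here’s a small Python sketch showing the idea, applied to the first few bytes of the message above (each 8-bit group is one ASCII character):

```python
# Decode a string of space-separated 8-bit binary values into ASCII text.
bits = "01001000 01100101 01101100 01101100 01101111"
message = "".join(chr(int(byte, 2)) for byte in bits.split())
print(message)  # -> Hello
```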
However, right now we’re more interested in one George Boole, and his paper “The Mathematical Analysis of Logic”. It describes an algebraic system of logic we all now know as “Boolean Algebra”.
It is based on the binary approach and consists of three basic operations: AND, OR and NOT.
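In Python terms, those three operations are the `and`, `or` and `not` keywords, and every more elaborate logical operation can be composed from them. A quick sketch:

```python
# Boole's three basic operations over truth values.
a, b = True, False
print(a and b)  # False
print(a or b)   # True
print(not a)    # False

# Composite operations are built from the basic three;
# XOR, for example, is (a or b) and not (a and b).
def xor(a, b):
    return (a or b) and not (a and b)

print(xor(True, False))  # True
```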
Later his work would be used by an MIT graduate student, Claude Shannon, whose 1937 thesis “A Symbolic Analysis of Relay and Switching Circuits” jumpstarted the use of binary code in computers, and specifically the creation of levels of abstraction in programming and machine code as we know it.
We’ll be discussing levels of programming and generations of languages (GLs) later. Everything you see on your screen, this article, every piece of digital software you have ever used, ultimately runs on opcode/machine code, which we categorize as “1GL” (1st Generation Language).
Anyway, let’s move a decade forward.
Part 4: Konrad Zuse and a High-Level Programming Language
The year is 1943. The Second World War is still raging, and Konrad Zuse is moving to a remote Alpine village after extensive Allied bombardment destroyed substantial progress on his work with relay computers.
Without proper equipment (and with the war going on in the background) he mostly dedicated himself to theoretical studies during that year. And thus Plankalkül was born:
a high-level programming language for computers, full of ideas and concepts that wouldn’t see practical use for a long time yet.
Our hermit Zuse wrote a number of complex programs, and one notable example I particularly like is the one that could play chess.
Zuse’s language held ideas that would only be implemented into “modern” languages in late 50s/early 60s. Using our modern terms here’s a small list of some features it had:
- Variables are local to functions (programs)
- Programs are reusable, and functions are not recursive
- Fundamental data types are arrays and tuples of arrays, but there are also floating-point and fixed-point numbers, complex numbers, records, hierarchical data structures and lists of pairs.
- The type of the variables does not need to be declared in a special header.
- Conditional statements (e.g. V1 = V2 => R1): compare the variables V1 and V2; if they are identical, assign the value true to R1, otherwise assign the value false. This can be applied to complicated structures as well, and can also handle arithmetic exceptions.
- Defining sub-programs and repetition of statements (loops), WHILE construct for iteration.
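Plankalkül itself used a two-dimensional notation that is hard to reproduce here, but as a rough modern sketch (in Python, purely for illustration), the conditional assignment and WHILE loop described above behave like this:

```python
# Illustrative only: Plankalkül's V1 = V2 => R1 compared two values and
# stored the truth of the comparison in a result variable.
def conditional_assign(v1, v2):
    r1 = (v1 == v2)  # True if identical, otherwise False
    return r1

# Its WHILE construct repeated statements, much like a modern loop.
total, i = 0, 1
while i <= 5:
    total += i
    i += 1

print(conditional_assign(3, 3), total)  # True 15
```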
Unfortunately, this language never saw practical use, due both to the technological limitations of the time (the hardware was simply not powerful enough) and to the relative obscurity of Zuse himself; other language designers were simply not aware of his work.
Part 5: Assembly Language and EDSAC
1949 saw the development of two massive contributions to our topic, Assembly Language and Short Code: a low-level and a high-level programming language, respectively.
Assembly Language (ASM) is a low-level programming language whose most notable early use was in the Electronic Delay Storage Automatic Calculator (aka EDSAC), constructed by Maurice Wilkes and his team at the University of Cambridge.
The solution of a gene-frequency differential equation and the discovery of a 79-digit prime number (the largest known at the time) are among the biggest contributions of this device.
Anyway, we can consider ASM as the next step of evolution and categorize it as “2GL” or 2nd Generation Language.
In reality, ASM merely simplifies the machine code it was made to interact with, and is considered a “Symbolic Machine Code” or “Mnemonic Code”.
An assembler reads the assembly instructions and transforms them into computer-readable binary.
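As a sketch of that idea, here is a toy assembler in Python. The opcode table and instruction format are made up for illustration (real EDSAC orders looked quite different):

```python
# A toy assembler: look up each mnemonic in an opcode table and emit
# an 8-bit word (4-bit opcode + 4-bit operand address).
# The opcode table is hypothetical, not real EDSAC orders.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011", "HALT": "1111"}

def assemble(lines):
    machine_code = []
    for line in lines:
        mnemonic, *operand = line.split()
        addr = format(int(operand[0]), "04b") if operand else "0000"
        machine_code.append(OPCODES[mnemonic] + addr)
    return machine_code

program = ["LOAD 1", "ADD 2", "STORE 3", "HALT"]
print(assemble(program))  # -> ['00010001', '00100010', '00110011', '11110000']
```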
Part 6: Abstractions – Short Code and Opcode Tables
The same year, we see John Mauchly, an American physicist, propose Short Code; a year later it was actually written. Short Code was the first high-level programming language, one understandable to both humans and computers. Although intended as an improvement, it still occupies the category of “2GL”.
It was meant to revolutionize human/computer interaction. And it looks like this:
a = (b+c)/b*c                  // the operation we want performed
X3 = ( X1 + Y1 ) / X1 * Y1     // substitute variables
X3 03 09 X1 07 Y1 02 04 X1 Y1  // substitute operators and parentheses
0000X30309X1 07Y10204X1Y1      // group into 12-byte words
You had a set of predefined variables (X1, X2, X3, Y1, Y2, Y3, and so on), so you replaced the variables in your statement with the actual variables the computer gave you.
Then you grouped everything into 12-byte words and, congratulations, you had compiled your code. Oh, but in the current year of 1950, “compiler” is a job title: the work had to be done manually.
And that’s Short Code; not that short, to be honest. It could do basic arithmetic, branching and calls to function libraries. Oh, and it ran about 50 times slower than machine code. Truly, progress could go no further…
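Mapping the codes from the example above (03 for “=”, 09 for “(”, 07 for “+”, 02 for “)”, 04 for “/”; multiplication was implied by adjacency and simply disappears), the manual “compilation” a Short Code programmer performed can be sketched in Python:

```python
# Sketch of manual Short Code "compilation": substitute operator codes,
# drop the implied multiplication, and pack into 12-byte words.
# Code values are taken from the worked example above.
CODES = {"=": "03", "(": "09", "+": "07", ")": "02", "/": "04"}

def to_shortcode(statement):
    tokens = [CODES.get(t, t) for t in statement.split() if t != "*"]
    body = "".join(tokens)
    width = -(-len(body) // 12) * 12   # round up to a multiple of 12
    body = body.rjust(width, "0")      # zero-pad the first word on the left
    return [body[i:i + 12] for i in range(0, len(body), 12)]

print(to_shortcode("X3 = ( X1 + Y1 ) / X1 * Y1"))
# -> ['0000X30309X1', '07Y10204X1Y1']
```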
Computers can understand only binary, and for a time new programs were written in exactly that: binary. To introduce a new line of code or create a new program, you had to translate it by hand,
painstakingly writing everything out on paper and using an opcode table for reference.
Some people finally got fed up and wrote a program that assembled binary code from a more readable version; and it did it all automatically!
Part 7: A-0 and Autocode – Early Open-Source Software
Around 1951–1952 two such programs appeared: Grace Hopper’s A-0 system and Alick Glennie’s Autocode for the Manchester Mark 1. Because GitHub was yet to be invented, there’s no clear answer as to which came first. The two did not collaborate in any shape or form, so both declared themselves first; however, it’s generally agreed that both arrived at about the same time.
Sadly, there is no surviving code from the A-0 system, so we can’t show it. However,
A-0 was followed by the A-1, A-2, A-3 and later B-0 systems. What’s notable is that the A-2 system, developed at UNIVAC in 1953, was released to customers alongside its source code, with an invitation for users to send their improvements back to UNIVAC.
Essentially, this adopted the early philosophy that we now know as “free open-source software”.
Autocode, meanwhile, was not a single programming language but rather a term describing a family of Manchester Mark 1 autocoding systems; it later came to refer to any “high-level programming language” that uses a compiler of some sort.
We do, however, have snippets of the code, and this is how Glennie’s original Autocode would process an equation:
c@VA t@IC x@½C y@RC z@NC
INTEGERS +5 →c
→t
+t TESTA Z
-t
ENTRY Z
SUBROUTINE 6 →z
+tt →y →x
+tx →y →x
+z+cx CLOSE WRITE 1
[storage declarations garbled in the surviving copy]
INTEGERS +20 →b +10 →c +400 →d +999 →e +1 →f
LOOP 10n
n →x
+b-x →x
x →q
SUBROUTINE 5 →aq
REPEAT n
+c →i
LOOP 10n
+an SUBROUTINE 1 →y
+d-y TESTA Z
+i SUBROUTINE 3
+e SUBROUTINE 4
CONTROL X
ENTRY Z
+i SUBROUTINE 3
+y SUBROUTINE 4
ENTRY X
+i-f →i
REPEAT n
ENTRY A CONTROL A WRITE 2 START 2
Look at that code and remember: this was considered a simplification of machine language.
It wasn’t until 1954 that our next protagonist would gather a team to develop a high-level programming language with a whopping 32 statements, including IF, DO, GO TO, READ, WRITE, PUNCH, STOP and CONTINUE among others.
The language would become available to customers in 1957 and is, in fact, still in use today.
That, however, is a story for another day. Join me next time, when we’ll visit the first programming language that looks more like code and less like an overly complicated string of symbols.
Want to guess which step comes next? Write it down in the comments and see if you know which programming language was the next step that pushed GLs forward!
I would love to hear your opinion on the history of programming.
Stay savvy tech and business nerds!