Computer code is what makes all our devices work. The trouble with defining it lies in trying to make it less technical and more understandable. Sure, we could give examples, but examples sometimes introduce new terminology that muddies the water. To keep this simple, we will look at how the idea of “code” has evolved through history.
The word first shows up in relation to computers in the Association for Computing Machinery (ACM) databases in 1954, but it was referenced as early as 1946 by Professor Howard H. Aiken of Harvard University (Hopper, 1952).
We can look back even further, to 1801 and the invention of the mechanical, automated loom (the Jacquard loom). It worked using programmable punch cards that controlled its operation. The loom did not use electricity; human interaction provided the power. The difference was that the pattern of the output was controlled by what the loom read from the cards as they passed through the overhead mechanism. This eliminated many errors in the construction of the textile.
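To get a feel for what “programmable” meant even then, here is a tiny, purely illustrative sketch in Python (my own invention, not a description of the real loom’s mechanics). Each “card” is just a row of holes, and the pattern of the output follows whatever the cards say:

    # Each "card" is a row of holes (1) and blanks (0).
    # Changing the cards changes the pattern; the machine itself stays the same.
    cards = [
        [1, 0, 1, 0, 1, 0],
        [0, 1, 0, 1, 0, 1],
        [1, 1, 0, 0, 1, 1],
    ]

    # "Weave" one row of output per card by reading it hole by hole.
    for card in cards:
        row = "".join("X" if hole else "." for hole in card)
        print(row)

Swap in a different stack of cards and the same machine produces a different pattern; that is the whole idea of programmability.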
As time went on, programming and coding were left to mathematicians. Most machines were not programmable in a digital sense until World War II. During the war, Colossus became the first programmable, electronic digital computer; it was designed to break and read encrypted German messages.
From that point, machines kept getting bigger, more powerful, and more complicated. The era of code was upon us, but only for highly skilled mathematicians. It wasn’t until the 1980s that coding skills made their way into education through BASIC programming.
Today the definition is expansive, but my favorite comes from urbandictionary.com (don’t share that site with students…although they already know about it; you could get fired), which explains it as, “Basically, telling your computer to do stuff using a programming language. As long as the computer understands the code, it will execute the command blindly.”
In all seriousness, the words code and coding are used interchangeably with programming. Merriam-Webster defines code as a set of instructions for a computer. The definition is simple and understated, but it is accurate. All you are writing is a set of instructions, phrased in a way the computer can execute. Much of it boils down to a series of “If…Then…” statements.
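As a concrete illustration, here is a minimal sketch in Python; the passing-grade example is my own, not something from Merriam-Webster. It is nothing more than a short list of instructions wrapped around one “If…Then…” decision:

    # A tiny set of instructions the computer executes in order.
    score = 72  # an example input value

    # The "If...Then..." decision: the computer checks the condition
    # and blindly executes whichever branch matches.
    if score >= 60:
        print("Pass")
    else:
        print("Fail")

The computer does exactly what the instructions say and nothing more; change the score or the threshold and the output changes accordingly.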
Another interesting find: use of the word in the ACM databases surged from roughly 1990 through 2010. Since then, the number of articles being published seems to have dropped off. Is it because the term code is falling out of favor? Or is the coding phenomenon on a pendulum swing? Of course not; we are only halfway through the current decade, and it will surpass the previous decade in articles published.
The word really refers to the instructions and mathematical functions that tell the computer how to perform a specific set of tasks. As the field evolves, the details of “code” become more complex. Not all code is the same: some computers need something “coded” differently to understand it. Programming languages do vary, but an understanding of the basic ideas applies throughout.
Grace Hopper’s team is often credited with finding the first literal computer bug: a moth trapped in a relay of the Harvard Mark II. It was saved and taped into the logbook, popularizing the term “bug” when referring to errors in code.