Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Get in touch with technology with tech Stuff from how stuff
works dot com. Hey there, and welcome to tech Stuff.
I'm your host, Jonathan Strickland, a senior writer with how
stuff works dot com. I hope you guys are doing
well today. I'm going to start tackling a subject I
(00:25):
have alluded to in previous episodes, the history of programming languages.
There's a lot of it, so this will be a
two parter at the very least, and part one we're
going to be laying the groundwork for programming languages. Long
time listeners know this is kind of my M.O.
I like to really make sure that we have a
(00:47):
foundation before I go into a topic because I feel
like context is really important. And despite our love of
stories that have a beginning, middle, and end, typically history
is not so neat and tidy. We tend to have
lots of stuff bleed into other things and so it
gets a little complicated. So before we talk about the
(01:09):
history of programming languages itself, let's talk about why we
need programming languages in the first place. Well, when you
get down to it, computers process machine language, or a
type of machine language. Machine language in itself is a
descriptor: it is the language that machines quote unquote understand
(01:30):
and what they are using in order to execute various operations. Uh,
the machine language of today is the binary code zeros
and ones. All those instructions that a computer carries out
essentially boil down to chains of zeros and ones. So
(01:51):
in order to really understand programming languages, at least most
of modern programming languages, we need to understand about binary
and as it turns out, the concept of using binary
arithmetic dates back quite a ways, well before the dawn
of the computer. The earliest scholarly work I could find
(02:14):
regarding binary arithmetic dates to seventeen oh three, in the eighteenth century.
It was written by Gottfried Wilhelm Leibniz. Leibniz was
a German mathematician and philosopher and one of the two
people to invent differential and integral calculus. Anyone know who
the other person was? Bueller, Bueller, I'm sure a lot
(02:39):
of you are shouting it out right now. It's actually
sir Isaac Newton. Newton and Leibniz were two co inventors
of calculus. They did it independently of one another. In
other words, neither of them were aware of the other
person's work, which is kind of cool when you think
about it. It's this this uh, an interesting moment in
(03:02):
history where you have two different brilliant people coming up
with the same idea simultaneously. And this is not the
only time this has happened. There have been quite a
few times in human history where people in different parts
of the world have come to the same sort of
amazing realization at around the same time without ever being
aware of the other person. As it turns out, they
(03:25):
both tried to lay claim to being the father of calculus. Uh. Really,
their followers were more rabid about it than they were.
And there's in fact, an entire fascinating story about the
battle for who should be given credit for inventing calculus.
But that's a story for another podcast. It's really not
(03:46):
a tech stuff story. You might be a stuff you
missed in history class story if you really want to
make some history folks go crazy about talking about, you know,
lots of maths. But let's go back to binary arithmetic.
Leibniz wrote about binary arithmetic in his memoir in the Mémoires de
l'Académie Royale des Sciences, and
(04:08):
I know my French is terrible, so I
apologize. His work, though, was the first
to explain that arithmetic typically relies upon base ten. Makes sense.
You know, your typical person has five fingers on
each hand, five toes on each foot. Base ten makes sense.
(04:31):
You count one to ten with your fingers. So he
would say that base ten really ranges from the number
zero up to the number nine, and then you repeat
that sequence again, only you put a one in the
second column, you know, really the tens column, and
then you start back at zero and work your way
back up to nine, and then you would put a
(04:52):
two there, and so on and so forth. So you
work up to the hundreds column and the
thousands column. But he said he found the simplest
progression of all to be more useful in the science
of numbers. That simplest of progressions is between just two
numbers zero and one before it repeats itself. So the
(05:15):
number zero is zero, the number one is one. If
you wanted to represent the number two, you would write
one zero. So again it's kind of like going zero
to nine, and then you would go to ten. In
this case, you go zero, then one, then you
would go to what is effectively ten, and that represents the
number two. Three would be one one, or eleven if
(05:37):
we were reading it in the decimal system, and a four is
one zero zero, or one hundred, and so on and so forth.
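Here is a minimal Python sketch, added purely as an illustration and not part of the original episode, that converts a few decimal numbers into their binary digits the way Leibniz's scheme works, and checks that adding the values for four, two, and one gives the binary form of seven.

```python
# Minimal illustration: build the binary digits of a number by repeatedly
# dividing by two and keeping the remainders, as in Leibniz's scheme.
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

for value in [0, 1, 2, 3, 4, 7, 30, 32]:
    print(value, "->", to_binary(value))
# 2 -> 10, 3 -> 11, 4 -> 100, 7 -> 111, 30 -> 11110, 32 -> 100000

# Adding four, two, and one gives seven, whose binary digits are one one one.
print(to_binary(4 + 2 + 1))  # 111
```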
Now by the time you get to thirty, you're looking
at one one one one zero, and a thirty two
is one followed by five zeros, so it would be
one hundred thousand in the base ten system. Leibniz said
(06:00):
that this scheme allowed for geometric progression. So if you
were to take the binary digits for four, which is
one zero zero, and then you took the binary digits
for two, which is one zero, as in
the number one followed by the number zero, and then
you took the binary digit that represents the number one,
which in this case is one, and you were to
(06:22):
add all of those together, you'd end up with one
one one, and that is the binary representation of seven, which
is also what you get when you add four and
two and one together in base ten. Leibniz said this
approach allows for lots of practical applications, such as weighing
a lot of different masses with just a few different
(06:43):
types of weights or in coinage to allow for many
different values with just a few coins, So he was
thinking of practical applications for binary arithmetic. He also said
that expressing numbers this way allowed for easy mathematical operations
such as subtraction, multiplication, and division. Leibniz also said that
using binary just made sense. You didn't have to memorize
(07:05):
facts by rote, as everything was evident through what he
called ordinary reckoning. So in other words, you wouldn't have
to memorize things like seven plus eight is fifteen, right,
You wouldn't have to memorize these sort of ideas, or
that four times six, you know, what is that? What
is four times six? I'm asking you know it's twenty four.
(07:28):
So you wouldn't have to memorize these and have
it all by rote. But with the binary approach,
because you're only working with ones and zeros, there's none
of that memorization. It's all very intuitive. Now. Granted you
then have to work out what those ones and zeros
are representing in the base ten system. That's a little
more complicated, but it works. Leibniz's work predates computers
(07:53):
by centuries, and there's some evidence to suggest that binary
counting systems were actually being used by other cultures
well before Leibniz came along. They weren't written in scholarly journals,
but they existed. Some researchers from a university in Norway
noted that on the island of Mangareva, and
I could be completely butchering the pronunciation of that, so
(08:15):
I apologize. It's an island over in the South Pacific.
Islanders had been using a binary system to count between
the numbers twenty through eighty. So they used base ten
for all numbers leading up to twenty, and then between
twenty and eighty they used binary digits. And they were
doing this before the fifteenth century, so before the fourteen hundreds,
(08:37):
the islanders didn't have a written language, which meant that
whenever they had to do math, which they started having
to do as they began to trade with other islanders and
other cultures well before the fourteen hundreds, it meant that
they needed to be able to do math in their
heads easily, and binary allowed for that as opposed to
something in the base ten decimal system. But who considered
(09:02):
using binary as the basis for machine language? Where did
that come from, well, that would be much much later,
And in fact, there's some other elements of programming that
pre date the decision to go with binary as the
basic language for computers. So let's look at some of
those developments because that's kind of again what led to
(09:25):
the rise in programming itself. So to find the thread
of this story, we have to go back to eighteen
o one, so a century after Leibniz was writing about
binary arithmetic, and that's when Joseph Marie Jacquard introduced the
programmable loom. We've talked about this in previous episodes of
(09:47):
Tech Stuff. The programmable loom was a real innovation back
in the nineteenth century. The programs consisted of wooden punch cards,
So you had these large pieces of wood with holes
punched through in a certain pattern, and what you
would do is pass threads through the holes in the
cards when you were threading up your loom, and those
(10:10):
holes would essentially dictate what the pattern was going to
be when you finished. So if you wanted a specific pattern,
you just put the appropriate punch card, load that up
for your loom, pass the thread through, and then weave
on the loom so that you would get the pattern
you wanted. If you needed to change it up, you just
switched out the cards that you were using. Uh, so
(10:34):
different patterns would use different configurations of holes. You know, obviously,
if the hole is there, then a thread can pass through.
If there's no hole in that position, then a thread
cannot pass through. It's pretty, pretty intuitive. Now, if you
think about this in an abstract way, the punch cards
are like a series of two-position switches. The holes
(10:56):
are the on switch because they allow a thread to
pass through, and the areas where a hole could be
but there isn't a hole are off switches, because
you cannot pass a thread through a solid piece of wood.
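A small Python sketch of that abstraction, again just an illustration with a made-up card pattern rather than anything from an actual loom: a row of punch card positions treated as two-position switches, where a hole is on and solid wood is off.

```python
# Illustration only: a punch card row as a series of two-position switches.
# "O" marks a hole (on, thread passes), "." marks solid wood (off).
card_row = "O.OO.O.."  # a made-up pattern

switches = [position == "O" for position in card_row]
print(switches)  # [True, False, True, True, False, True, False, False]

# The same row read as binary digits: holes and blanks can stand for
# abstract ones and zeros, not just for physical thread.
bits = "".join("1" if s else "0" for s in switches)
print(bits, "=", int(bits, 2))  # 10110100 = 180
```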
So Jacquard's loom sped things up and really got things
moving in the weaving business. It also ended up putting
(11:18):
a lot of weavers out of work in the process,
and they got very upset. But it would take a
couple of decades before someone made the mental leap that
a punch card could be something that you could use
with abstract ideas, not just physical material like thread. That
someone was Charles Babbage who first proposed a mechanical device
(11:40):
called the difference engine. That was a device that was
meant to compute tables of numbers, and it was a
really complicated device with lots of gears and shafts that
could rotate in different directions, and it would have used
a lot of moving parts. But Babbage was never able
to actually finish it. It took longer than what he
had predicted, and ultimately the funding dried up as various
(12:06):
patrons got fed up with waiting around for Babbage to
finish the thing, and they ended up stopping. They stopped
paying him. But he began to work on a new
device that he called the analytic engine. And here's where
Jacquard's work came in. Babbage realized that those punch cards
that were physically either allowing or preventing thread to pass
(12:29):
through could act not just as gateways for physical material
like thread, but also for abstract notions like a problem
statement or information needed to work out a problem solution.
So by changing up what could and couldn't pass through,
you could run a problem through a mechanical calculator. Essentially.
(12:50):
Now with Babbage's design, we're not talking about electricity. Is
still gears, mechanical parts that have to connect with one another,
and all of it is based on physical motion. So
if you've ever worked with a loud computer that had
a bad fan or something, you have a hint of
what it must have sounded like to work on this thing.
Only the analytic engine involved more clanking, or at least
(13:13):
I hope it involved more clanking than your computer did.
If your computer is clanking, you probably should take that
in to get a bit of an adjustment, or if
you're really handy, take a look in there, your fan
is probably out of alignment. While Babbage was working on
that first analytic engine, he was also helped by the
world's first computer programmer. Her name was Ada Lovelace, the
(13:39):
Enchantress of Numbers, and I've done a full episode about
Lovelace before. I think the Stuff You Missed in History
Class podcast has done an episode on Lovelace before. She
was remarkable, an incredible person, uh, someone who I think
needs more recognition for her contributions to computer science.
(14:02):
She envisioned a world in which not only could one
create a punch card program for a calculator to run
through and give you the solution to a problem, she
was able to make another mental leap on the same
level that Charles Babbage had made. Babbage's leap was, hey,
this card that could allow or prevent thread to pass
(14:25):
through could also be used in a more abstract way.
Lovelace's leap was this device that is intended to solve
mathematical equations and answer those sorts of problems, could also
be used to do all sorts of other things, things
that computers today can do. But no one at the
(14:45):
time was even imagining. She was such a forward thinker
that she was able to envision a world where you
could encode all sorts of information and use a device
like the Analytic Engine to process it information like music
or images. So the remarkable thing is, years before, decades
(15:09):
before anything like that would be possible, Lovelace was imagining
that actually coming to pass. And I really wonder what
she would think of the world today from a technological standpoint.
If she was able to look around and see the
sort of things that computers could do, she might feel
very much vindicated by this vision she had back in
(15:32):
the nineteenth century, where electronics were not a thing yet.
No one had really harnessed electricity in a meaningful way
at the time that they were working on the analytic engine.
So she's a really phenomenal thinker. I cannot imagine being
in a world where I'm able to project ahead that
(15:55):
far and think of something so abstract and to be
so right on the money, really. Well, the next step
along the path to programming happened in eighteen ninety, so
we're getting to the end of the nineteenth century, and
that was a census year in the United States. So
in the US we hold a census every decade, and
(16:17):
the purpose of the census is to determine the representation
of states in the House of Representatives, that is a
group in Congress in the legislative branch that is dependent
upon state populations. So we have the Senate. Every state
gets two senators, but then we have the House of Representatives,
and the number that we have is dependent upon the
population of the various states, which means we occasionally have
(16:39):
to check and see what the populations are. If they
changed dramatically, then the number of representatives will again change
to reflect that. But the U.S. had hit a
problem by the end of the nineteenth century. There were
just too many dang people to count. It was just
taking really long to count them all by hand. So
in seventeen ninety, a century earlier, which was also the very
(17:02):
first year that we held a census, it took nine
months to count up all the different responses for the
United States Census. By eighteen eighty, the previous census,
the one that just happened before they were trying to
figure out a new method, it took them seven and
a half years to tally all the results. Seven and
(17:26):
a half years to count up the results of the
previous census. That meant that you were two and a
half years away from having to do it all over again.
And so the Census Bureau held a contest, and they
called for inventors to come up with some way of
tallying the census responses much more quickly so that they're
not wasting so much time and money just counting how
(17:50):
many people are in each state. Enter Herman Hollerith, who
invented a solution in response to the US Census Bureau's
call, and he won the prize that the bureau was offering. Hollerith
created what he called a card reader. So again
we get to punch cards, very similar to what was
being used by Babbage and then previously by Jacquard. This
(18:14):
read cards by sensing holes that were punched into the cards,
and it also had a gear mechanism to help keep
count of all the different things that it needed to count,
all the demographic information, and it had a panel of dials
that would track the various counts so that you could
just look at the different dials and you would get
a summary of all the different responses. Each card had
(18:39):
positions on it that would indicate different data about the
citizen it represented, and his invention meant the eighteen ninety
census could be tabulated in three years, so less than
half the time of the previous census, the one that
had happened in eighteen eighty. It's still a good long
while to have to count that up, but much much
faster and more efficient than the previous census was.
(19:02):
Hollerith would go on to found a company called
the Tabulating Machine Company. That company, by the way, is
still around today, but its name has changed because the
Tabulating Machine Company would evolve into International Business Machines, which
today we know as IBM. So IBM had in its history,
(19:25):
the original purpose of it was to, uh, tabulate punch cards,
and those punch cards were originally used for the eighteen
ninety census in the United States. Kind of cool little
bit of trivia. So if you're ever at pub trivia
and they're asking where IBM came from, now you know.
And you also know that the person who's running the
trivia is a total geek who may also listen to
(19:48):
the show, So hey, shout out to you, Mr or
Mrs Trivia Master. Those punch cards would become an important
part of programming later on now. Arguably the first person
to build a general purpose computer was a man named
Konrad Zuse, who designed and built the Z1 computer
(20:08):
in the late nineteen thirties. His device combined electronic and
mechanical parts, so it wasn't a purely electronic computer. It
still had some moving parts to it, and he had
decided to go with binary processing as the basis for
his computer. It makes it simple because you only need
a switch with two positions, on or off, for each
(20:29):
of your little processors, or logic gates if you prefer,
and he thought that made more sense than decimal processing.
And he used discarded film as his medium to send
commands to the computer. He wasn't using card stock, he
wasn't using paper tape, he didn't have access to it,
(20:49):
but he did use discarded film from various film houses
in Germany, and he would just punch holes in that
to be the instructions for his machine. His Z3 machine,
which he built in nineteen forty one, might have been
the first general purpose programmable digital computer, but it was
destroyed during a bombing raid in World War Two. Only
(21:11):
his Z four device was able to make it through
the war, and for years Zuse's work remained in obscurity.
No one knew that he had made these things, and
he had done it again independently of anyone else. So
you would see the rise of computer science in other
countries disconnected from his work, and in turn, his work
(21:33):
was disconnected from theirs. Again, another example of two
different places coming up with the same ideas independently. He
arrived at his designs independently of all the other pioneers
in computer science. We'll talk a little bit about his
programming language in our next episode, because while he was
creating a programming language earlier than almost anyone else, it
(21:58):
didn't become known to most computer scientists until the nineteen seventies. Well,
I've got a lot more to talk about in the
history of programming languages, but before I jump into the
next section. Let's take a quick break to thank our sponsor.
(22:19):
All right, during World War Two we get the term computers.
But here's the interesting thing: computers were not machines, for
the most part. In World War Two, the word
computers referred to people. It applied to human beings, and
their job was to compute various equations, specifically relating to
(22:44):
artillery and gunnery when it first started. By World War two,
our war machines had reached incredible amounts of power and
you could fire upon positions that were well out of view.
But it meant that you needed to understand exactly what
a shell was going to do when you fired it.
In other words, based on the power of the gun,
(23:05):
the weight of the shell, wind, other factors. If you
fire at a certain elevation, at a certain angle from
the ground, where is that shell going to go? It's
really important if you want to hit, say, an enemy
encampment versus just countryside or a town. Well, it meant
(23:29):
that they needed people who were really good at maths.
So the army started to hire math majors. But that
meant that they were looking primarily at women, because the
men were already drafted into the armed forces, and they
were serving as soldiers and other frontline personnel. So women
(23:52):
were predominantly the computers of World War two. These were
women who were studying mathematics, and were breaking ground in
the areas of math. So they would work out all
of these different equations to figure out how things like
muzzle velocity or wind effects or atmospheric drag and other
factors affected shells, and they created what were called firing
(24:15):
tables for various types of weaponry. But there was more
work than they had manpower or maybe I should say
woman power to do. So they needed some way to
do this work more quickly and efficiently, especially where you're
just taking small changes in a variable in order to
(24:36):
create another table, because otherwise you just have to run
all those different equations again. If you could do something
where you could just make a small change and run
that same problem and get the answer quickly, it would
save a lot of time. So there was a need
for computational engines that could do this work faster. Now
over at Harvard, engineers built a machine called the Mark one.
(25:00):
This was the first programmable digital computer made in the
United States, but it was not a purely electronic machine.
It had some mechanical parts to it, including a large
central shaft that had to be turned by an
actual motor of about, uh, you know, five horsepower, and
it also had clutches and relays. It weighed five tons
(25:22):
and had five hundred miles of wiring inside of it.
The computer itself was eight feet tall and fifty one
ft long, and it read instructions on a reel of
paper tape with holes punched into it. So this was
like one really long punch card this roll of paper tape.
And it solved the problem that some other early computer
(25:43):
programmers ran into, which is, if you have a stack
of punch cards, and especially if you failed to number
your stack of punch cards and you dropped that stack,
then your program was completely out of order and it
was useless. In fact, it might be easier for you
to go back and reprogram using a fresh new
stack and a hole puncher than it would be to
(26:04):
try and figure out what order the cards had been in. Uh.
The moral of that story was always number your cards.
Although very few people have to worry about working with
punch cards these days, but if you do, always number
your cards. That way, if you do drop them, then
you just have to get them in the right sequence again. Uh,
and it's not as big of a headache. One of
(26:26):
the programmers of the Mark one was a woman named
Grace Hopper, who is also credited with creating the term debugging. Now,
the word bug had been in use for a while
to designate the idea of a design error or flaw
that needs to be corrected, whether it's in calculating machine
(26:47):
or something else. So bugs as a term for something
that's not right have been around for a while, but
Grace Hopper got the credit for debugging. She was involved
in a literal de-bugging in which programmers pulled a
moth out of relay number seventy in the Mark two system.
(27:08):
We'll talk more about Grace Hopper in a little bit. Now,
around the same time as the Mark one was being
put together over in the UK, scientists were building another
machine called the Colossus. This machine was a more specific
computational device. It wasn't a general purpose computer. It was
(27:28):
made specifically to break the cryptographic codes sent by German
officers during World War Two, including codes that were created
using the Enigma Machine, a fiendishly clever cryptographic device. Now
I have done a full episode about the Enigma, so
I'm not going to dwell on it very much here.
(27:50):
It is a fascinating gadget, and again, Colossus was a
specific purpose machine. It could not be reprogrammed to do
really anything else. So while it was fascinating, it also
had limitations by its very design. But we see here
that each individual piece of information can come in various forms, right,
(28:12):
and so especially with punch cards, it's essentially on or off. Now,
the mechanical computer didn't take over the world, and digital
computers wouldn't really emerge until the nineteen forties,
and truly digital electronic computers wouldn't be there till the
nineteen forties and fifties. But many of the earliest computers
would use physical switches that you would set before you
(28:34):
would run any sort of program. So you didn't have
like a processor. You had all these these banks and
banks of switches, so you'd have to make sure that
each switch is in the correct position and all the
wires are plugged into the correct ports before you would
run a problem through the calculation machine. Now, if the
(28:55):
switches had two positions, that's essentially programming in binary zeros
and ones, or off and on, so that only,
uh, counts if the switches have just two positions. If
you put multiple positions for the switches, then you're not
using binary, you're using a different system. Uh. But a
(29:16):
lot of those early machines were using two position switches,
so they were binary machines. And it makes sense in
the digital world where on and off are the only
discrete values, and so in the beginning, programmers built code
in binary. For the most part, not every machine is
like this, but a lot of them were. In this world,
(29:38):
a single unit of information is the bit, that is,
either a zero or a one. And I'm sure you've
heard that eight bits are a byte. Now, that was
not arbitrarily chosen. There was a reason for eight bits
to make a byte, and it wasn't the only strategy
for making bytes. There were other versions of the byte
(29:59):
that were six bits long or twelve bits long. So
how did eight bits to a byte become the standard? Well, honestly,
that part of our story happens a little bit later.
So in part two of Programming Languages, we are going
to revisit bits and bytes to explain why eight bits
make a byte. But I'm going to talk a lot
(30:20):
about binary in this particular episode, so that's why I
had to bring it up here. The next computer I'm
going to chat about, however, was not dependent upon binary.
It was dependent upon the decimal system, that base
ten system. And that computer was one that I mentioned
recently on a different episode of Tech Stuff. It
(30:43):
was in the Weather Models episode. That would be the
Electronic Numerical Integrator and Computer, or ENIAC. It's a very
early computer, one that I talked about in that Weather
Models episode just just previously, and it was the brainchild
of John Mauchly, and several other people worked on it
as well, obviously, but Mauchly was the one who proposed
(31:05):
an all electronic calculating machine back in nineteen forty two.
It ended up taking a few years to build. The United
States government, specifically the military, was very interested in this
because they wanted to have a more uh efficient machine
to do things like figure out those those firing tables.
But the ENIAC was not completed until after the conclusion
(31:27):
of World War Two in nineteen forty five. It was
the first fully electronic computer with no mechanical parts. The
electronic nature meant it could complete a calculation in a
fraction of the time it took the Mark one. So,
for example, a calculation with the mark one might take
six seconds for it to complete because it actually involves
(31:48):
moving physical pieces around in order to get to that calculation. However,
the ENIAC could do the same calculation in two point
eight thousandths of a second. Then again, it could only
store twenty numbers at a time, so there were limitations,
and for the first ten years of its existence, the
(32:08):
ENIAC ran more calculations than the totality of all calculations
performed by human beings before the ENIAC was built. So
from the dawn of time up to nineteen forty five, humans did
a certain number of calculations that were dwarfed by the
(32:28):
first ten years of ENIAC's operations. However, then it got
struck by lightning, which reminds us we probably shouldn't get
too cocky about this sort of thing. We might anger
Thor. So, ENIAC didn't have a CPU like a modern computer.
To run a program through ENIAC to have it solve
a problem, you would have to set physical switches that
(32:51):
had ten positions, and you had to plug wires into
specific ports before running your calculation. So ENIAC was a big,
big machine, so big it was a hundred fifty feet
wide that's forty six meters wide, and it weighed thirty tons,
and it kind of looked like one of those old
timey phone operator banks where you'd pull a plug out
(33:12):
of one socket and place it into another. And some
of you probably still have no clue what I'm talking about,
but just imagine a big panel of sockets into which
you can insert cable plugs, and on one problem, you
would have a sample configuration of plugs in some of
those sockets. But let's say you want to run another problem.
(33:34):
That would mean having to unplug those first cables, plug
them into different sockets. Each problem would look totally different,
so you would have to change those physical connections as
well as the physical orientations of those ten position switches
that were there. ENIAC also could not store programs.
(33:56):
It didn't have a program storage capability. It was a
collection of electronic adding machines and other arithmetic units which
were originally controlled by a web of large electrical cables.
According to David Alan Grier in the Annals of the
History of Computing, the original programmers of ENIAC were Jean
(34:17):
Jennings Bartik, Frances Betty Snyder Holberton, Kathleen McNulty Mauchly Antonelli,
Marlyn Wescoff Meltzer, Ruth Lichterman Teitelbaum, and Frances
Bilas Spence. In other words, the first programmers for ENIAC
(34:38):
were six women, the first programmers of the first fully
electronic computer. And I think it's really important to drive
that home because the very history of computer science is
reliant upon the contributions of women, and frequently, far too frequently,
(34:59):
in the world of computers, women are disregarded or
thought of as being incapable of, uh, any real meaningful contribution,
when in fact, the very foundation of computation is on
the backs of women. Um, as for these six,
for years the Army failed to acknowledge their contributions. For
(35:25):
one thing, ENIAC was, at the time,
classified information, and so their contributions weren't acknowledged as a
matter of national security for quite some time. But since
then they have been acknowledged, and I'm glad their story
is now available for people to hear. It's a pretty
incredible one and maybe someday I'll do a more specific
(35:48):
episode about ENIAC and the men and women who were
instrumental in getting it going. But clearly their programming didn't
involve much of a programming language, at least not in
the way we think of programming languages today. They were
all looking at the very complicated decimal systems. They were
(36:08):
setting things up for running a single problem. If you
wanted to change the problem, you had to change that
entire set up, that physical setup of the computer. Now,
the ENIAC team invited another person, John von Neumann, over
to take a look at it, and I talked about
von Neumann in the Weather Models episode as well, and
together the ENIAC team and von Neumann were able
(36:31):
to kind of come up with the premise for
the successor to ENIAC. This would become the EDVAC,
which was the first stored program computer. EDVAC stands for
Electronic Discrete Variable Automatic Computer, and the stored program aspect
meant you didn't have to change the physical wiring of
(36:52):
the device itself. The program would be stored inside the computer,
not in the arrangement of all its components, and unlike ENIAC,
EDVAC would be based on the binary system instead
of the decimal system. The binary system simplified things because
you only needed a switch with two positions. And again,
the decimal based machine meant that you needed switches with
(37:14):
ten positions. Since you can represent numbers using binary,
you know, Leibniz showed us that centuries earlier, it made
sense to go with that system. Now, one person who
realized this and who really nailed it was Claude Shannon,
another mathematician and someone that I will probably do a
(37:36):
full episode on again in the future. I actually do
have a Claude Shannon episode in the archives as well.
Shannon wrote The Mathematical Theory of Communication in nineteen forty eight to
set the foundation for the theoretical limits of communication between
humans and machines. It identified the bit as the fundamental
unit of information and thus the basic unit of computation.
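To make the bit concrete, here is a short Python sketch, added only as an illustration, showing a single bit as a zero or one and eight of them combining into one byte, which is enough to hold a small number or a character.

```python
# Illustration only: a bit is a single zero or one; eight bits make one byte.
bits = [0, 1, 0, 0, 0, 0, 0, 1]  # a made-up byte, most significant bit first

value = 0
for bit in bits:
    value = value * 2 + bit  # shift left by one place and add the next bit
print(value)                 # 65

print(format(value, "08b"))  # 01000001, the same byte written back out
print(chr(value))            # A, since 65 is the character code for "A"
```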
(37:59):
For this, Shannon is often cited as one of the
fathers of computer science. Since the computers could be reprogrammed
without making physical changes to the machine, it meant that
you had to do that reprogramming through code, and so
we finally get into some of the programming languages. Now.
One of the earliest was called Short Code, originally known as
(38:19):
Brief Code, and proposed by Mauchly, who, you know, was
one of the guys behind ENIAC. So how is Short Code,
a programming language different from machine code, the language computers
actually quote unquote understand. Well, we'll find out after we
take a quick break to thank our sponsor. Alright, so
(38:46):
how was Short Code different from machine code? Well, Short Code allowed
programmers to work out problems using mathematical expressions instead of
machine instructions, so you didn't have to just have everything
out in zeros and ones in binary. You could take
a couple of shortcuts. It was developed for the UNIVAC computer,
the first mass produced computer, and also the work of
(39:09):
Eckert and Mauchly of ENIAC fame. This removed programmers from
the same level of language as the one that the
machines were using. So you have machine level language that
refers to the specific language the machine itself quote unquote
understands the language it uses to process the problems that
you submit to it. So you can program machines at
(39:34):
the machine level. But it's hard to do. It tends
to be tedious and difficult, and it's really easy to
introduce an error into the programs because it's not very intuitive,
particularly if you're talking about programming in binary. If you're
just looking at a bunch of zeros and ones, very quickly,
you can lose track of what it actually represents. And
(39:55):
if you put in a zero or a one where
it should have been the other, everything turns topsy
turvy and you've got bugs in your program. You have
to go back and figure out where the mistake is,
and then you have to fix it, and that one
fix may not be enough. You may have to end
up going down the line and fixing everything that comes
after it. Beyond machine level languages are things like Short Code.
(40:18):
That's that next step up. It's a very very small
step up. It's actually a low level programming language, which
means the level of the language is not that far
off from the machine code itself. It's easier for humans
to use than pure machine code, but it's not that
far removed from actual machine code. Another low level programming
(40:42):
language would be an assembly language, and there are many
assembly languages. These use simple mnemonic instructions. Assemblers translate this
into machine code, and because different machines rely on different architectures,
there are many assembly languages. Essentially, each architecture needs
(41:02):
its own assembly language. The program that turns that
assembly language into machine
code is, of course, called the assembler. So in the old days, we
didn't have a common architecture across computer systems. Computer systems
were very specifically designed by different manufacturers and often for
(41:24):
very specific purposes, so there was no compatibility across different machines,
and your assemblers had to be peculiar to whatever the
architecture was of that machine that you were working on.
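To give a feel for what an assembler does, here is a tiny Python sketch. The mnemonics, opcodes, and the three-instruction machine are all made up for illustration; a real assembly language is tied to a real processor architecture, as described above.

```python
# Illustration only: a toy assembler for a made-up machine.
# Each mnemonic maps to a hypothetical four-bit opcode; the operand
# becomes an eight-bit value alongside it.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(lines):
    """Translate mnemonic instructions into binary machine code strings."""
    machine_code = []
    for line in lines:
        mnemonic, operand = line.split()
        opcode = OPCODES[mnemonic]
        machine_code.append(f"{opcode:04b} {int(operand):08b}")
    return machine_code

program = ["LOAD 7", "ADD 3", "STORE 12"]
for word in assemble(program):
    print(word)
# 0001 00000111
# 0010 00000011
# 0011 00001100
```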
In nineteen fifty two, Grace Hopper, you remember her, well, she
built the first compiler, and it was called the
(41:46):
A-0 compiler. A compiler's job is to take code
that's written in a programming language and convert it into
machine code. It's sort of like a translator. It takes
the programming language something that humans can work with, and
turns it into machine code, something that machines can work with.
Hopper's compiler was pretty simple and was an effective way
(42:07):
for Hopper to, as she once said, be lazy. She
wanted to stop being a programmer and return to being
a mathematician, so she wanted to find ways that she
could generate sort of shortcuts to doing very basic tasks.
And that's when she built this basic compiler. Hopper took
all these subroutines that she had been using over the years,
(42:29):
and she put them on a reel of paper tape,
labeling each subroutine with a call number. Then when she
needed to run a specific subroutine, she could refer to
it by the appropriate call number, and the machine could
find the corresponding section on the paper tape and run
the subroutine. Now, according to Hopper, people didn't realize the
significance of her contribution right away. Many people were still
(42:52):
thinking of computers as glorified calculators, so these were machines
that could do math, but not much else. Hopper said
she foresaw more complicated programs, but they would need
things like compilers to offload some of that laborious work
to make those programs practical. So Hopper was saying, what
I'm doing here is opening up new opportunities for us
(43:13):
to use computers. And she said she ran into a
lot of resistance where people just didn't see it that way.
They had been used to programming these things by hand,
working either at machine code level or just barely above
machine code level, and so it was too complicated to
consider using it for anything other than just crunching some
(43:33):
numbers and running some equations. Well, she and her colleagues
worked on improving this approach, and two generations of compilers later,
with the A-2 compiler, it became one of the
first compilers to get extensive use in the computer world,
and that is what laid the foundation for higher level
programming languages that fell along that same pathway. At this stage,
(43:56):
computer languages were mostly reliant on symbols, not on words,
and Hopper saw a need to change that as well. She
wanted to create a system in which programs could be written
in English. Letters, she reasoned, were really just symbols, so
they should be something that could be interpreted by a
compiler and then converted into machine code. So she made
(44:17):
further adjustments to the compiler she had created, and she
built something called the FLOW-MATIC compiler, F L O
W dash M A T I C, the FLOW-MATIC. Well,
this set the stage for an important programming language called COBOL.
But before COBOL, we have to talk about FORTRAN.
(44:37):
So over at IBM, there was a programmer named John
Backus who was leading a group that was designing a
programming language, and their group had a lofty goal. They
wanted to create a high level programming language. So in
other words, they wanted a language that's much closer to
how we humans tend to communicate with each other, rather
than the types of languages that machines rely on. Assembly
(45:01):
language is a low level language, and it's still pretty
close to machine code, and it's best used for low
level applications. IBM wanted something that would be much
easier to use from a programming perspective, something that's much
more robust. FORTRAN was the solution, and it stands
for formula translation. Some folks call it a scientific language.
(45:23):
It's largely used in the scientific field. FORTRAN was
machine independent as well, which meant that it could run
on different computers and not just a single type. So
it wasn't that they had developed this for a particular
type of IBM computer and that's the only thing it
would run on. It was meant to run on different
types of machines, and it was also meant to be
(45:43):
easy to learn and suitable for a wide variety of uses.
The tradeoff for high level languages is that it takes
longer to compile them into machine code, so it
creates a delay in executing the code. You end up
making it easier to build the code for humans, but
(46:05):
it's harder for the machine itself to read the code.
The compiler has to translate more and more, so if
the language is easier to program in, you can make
up the gap of that decrease in efficiency for executing
the code, and that was the case with Fortran. The
high level nature increased the execution time by about twenty percent,
(46:28):
so it added more time for it to execute the
code using the compiler, but it increased the efficiency of
actual programmers by five hundred percent, so programmers were writing programs
faster than they were before. They were making up way
(46:48):
more time because of the ease, the relative ease, of
programming in FORTRAN as opposed to programming in, say,
assembly language, so it took longer to execute the code,
but that was more than balanced out by how quickly
programmers could actually write code. So while Backus's group
developed FORTRAN in nineteen fifty four, it wasn't until
(47:08):
nineteen fifty seven that it got a commercial release. FORTRAN
would become the programming language for the scientific community, which
is why it's still used in weather modeling applications, as
I mentioned in our previous episode. Alright, so back to COBOL.
The development of COBOL took place right around the time
the third generation of FORTRAN was being developed. COBOL
(47:29):
would become important because it helped unify efforts in programming.
Up to that point, all those computers were pretty much
using their own specific methods of programming. So you couldn't
build a program meant for one system and port it
directly over to another. You'd have to go about rebuilding
your entire program following the architecture and constraints of your
(47:49):
new machine's architecture, um, or design, I should say. And
so in the late nineteen fifties, some computer scientists set
out to create a programming language that could run on
different computers, and that group was the Conference on Data
Systems Languages, or CODASYL, C O D A S Y L.
Members of the group included government employees, computer science experts
(48:13):
from various universities, and members of various industries. So by
the end of nineteen fifty nine they had settled on
the specifications of their programming language, and they called it
the Common Business Oriented Language, or COBOL for short. It
would officially be published in January nineteen sixty after receiving
approval from an executive committee. Now, the Department of Defense
(48:36):
was really behind the effort. They were really pushing for
it because they were one of the few agencies in
the world that actually possessed computer systems from different manufacturers,
So they were running into this problem of having all
these different computer systems that were incompatible with each other.
Most other places didn't have that problem because they didn't
have the money to own that many different systems. They
(48:57):
were all in with a single system, but a unifying
language would mean they could run programs on multiple machines
to handle lots of boring stuff like data management, tracking
information for inventory, payroll administration, and other tasks. The US
government would eventually require that all computers sold or leased
(49:19):
to government agencies had to be able to run COBOL software,
which ensured the language's adoption. It's very sad that it was
COBOL and not kobolds, the little critters from Dungeons and
Dragons that end up being low level monsters, but such
is life. Now, the custodian of COBOL is the American
(49:40):
National Standards Institute, which develops and publishes new COBOL standards.
COBOL saw the most use in government and business applications,
whereas FORTRAN was being used in scientific applications. Now,
with COBOL and FORTRAN, we are in the earliest
days of computer programming, and I think this is a
good place for us to conclude today's episode. In our
(50:02):
next episode, we're gonna look at how those programming languages
and a couple of others arose over the years and
gave birth to different programming languages. We're gonna look at
how these different languages are different, why are they different,
what is it that they are able to do because
of those differences, and kind of explain the various families
(50:23):
of programming languages and why they exist. So make sure
you tune into the next episode to learn more. And
then we're gonna turn our glance to some other form
of technology, something that doesn't involve supercomputers and programming languages,
at least not on the technical end, because eventually I
gotta stop talking about math or my brains will leak
(50:44):
out of my ears. If you guys have suggestions for
future episodes of tech Stuff, you should let me know
by sending me an email. You could scream out the window,
but I often can't hear you. The email address for
our show is tech stuff at how stuff works dot com,
um, or you can drop us a line on Twitter
or Facebook. The handle at both of those is tech
(51:04):
Stuff hs W. Remember you can go to twitch dot
tv slash tech stuff and watch me record these shows live.
Every Wednesday and Friday, I go live and I record
episodes of tech Stuff. You get to talk with me,
I chat with the chat room. We have a grand
old time, and you also get to see all the
outtakes that don't make it into the final version. So
(51:25):
go to twitch dot tv slash tech stuff to check
out the schedule, and I hope to see you in
there and I'll talk to you again really soon. For
more on this and thousands of other topics, visit how
stuff works dot com