Interesting Timelines of Computer History

In summary, the Computer History Museum has timelines for different topics in AI and robotics. Some of the significant work was accomplished by teams, while other efforts were led by a single individual. The timelines also include fictional stories alongside factual information, popular toy examples are sometimes chosen over industrial robots, and the term 'robot' was first used in a 1920 play. The Analog Museum has a timeline of analog computers, which were widely used in naval gun fire control systems. Finally, members share old memories of programming with a stack of 100 or more Hollerith cards and waiting 6 hours to get the results.
  • #1
I found these timelines on the Computer History Museum site, which are quite interesting to check out:

https://www.computerhistory.org/timeline/ai-robotics/

For me, it brought back memories of machines long past. They even mentioned the construction of my robot persona for the Forbidden Planet movie. Those were the days!
 
  • Like
  • Informative
Likes Wrichik Basu, QuantumQuest, DrClaude and 4 others
  • #2
That was interesting. They have split it into different categories.
 
  • Like
Likes jedishrfu
  • #3
jedishrfu said:
I found these timelines on the Computer History Museum site, which are quite interesting to check out:

https://www.computerhistory.org/timeline/ai-robotics/

A well-thought-out timeline with intriguing data points. The authors have overcome or bypassed many of the problems inherent in these essays, including:

  • Most significant work was accomplished by teams, but we recognize a lead figure.
  • Mixing fiction such as Robby the Robot and Asimov's Three Laws of Robotics with fact.
  • Choosing one innovation or invention over another ('author's choice').
  • Choosing popular toy examples over common industrial robots.
  • Choosing 'humanoid' robot examples over dedicated machines such as welders.
We all have favorites, but did the authors omit the brilliant early SF writer Čapek, who originated the term robot? Isaac Asimov translates robot as slave in his essays.
From Wikipedia:
"The term comes from a Czech word, robota, meaning "forced labor";[6] the word 'robot' was first used to denote a fictional humanoid in a 1920 play R.U.R. (Rossumovi Univerzální Roboti - Rossum's Universal Robots) by the Czech writer, Karel Čapek"
 
  • Like
Likes jedishrfu
  • #4
That was fun. Thanks for sharing, Jedi.

I was pleased to see the TI Speak and Spell. That one really blew my mind when it came out.
I bought one for a relative. It might still be working today except that she loved it to death. :cool:

I was disappointed to see no analog computers. My boss used the mechanical differential analyzer at M.I.T. shown below. It was very capable and much more reliable than the vacuum-tube electronic ones that followed. The Norden bombsight also deserves a mention.

[Image: the MIT mechanical differential analyzer]
 
  • Like
Likes jedishrfu, scottdave and Klystron
  • #5
Analog computers were widely used in naval gun fire control systems.
 
  • Like
Likes Charles Link, Klystron and jedishrfu
  • #6
We used one in undergrad physics. Plug in modules to describe the differential equation and turn it on. Bingo: done, and displayed on the scope.
 
  • Like
Likes scottdave
  • #8
Interesting to see neural networks being studied as far back as 1943. SHRDLU, ELIZA and DENDRAL also bring back memories of my Comp. Sci. degree in the late 80's.
 
  • #9
I remember using an analog computer in undergrad physics. While I don't remember the name, I do remember it was color-coded and had various pluggable modules, and I suspect we were using a Dornier analog computer.

This article shows the Dornier and its color-coded red, yellow and green markings:

http://www.analogmuseum.org/english/collection/dornier/do80/

You can read more about analog computers at the Analog Museum:

http://www.analogmuseum.org/english/

and Bill Schweber's article:

https://www.analogictips.com/analog-computation-part-1-what-and-why/

One last memory: I built a potentiometer-based analog computer using a voltage meter and three potentiometers encased in a metal box. It worked, but it was pretty ugly; as I remember, my biggest difficulty was bending the galvanized zinc sheet metal into a nice box shape in my junior high school shop class.

In retrospect, I should have used aluminum, but I'm not sure that was available to us at the time (cheap school board). Its layout was similar to this one:

https://hackaday.com/2011/02/04/analog-computer-does-math/
 
Last edited:
  • Like
  • Informative
Likes Charles Link, Klystron, Asymptotic and 1 other person
  • #10
Ah yes, I loved those analog computers. They were so effective at teaching dynamics. Modern simulation software is very capable, but it does not teach the feel the way analog computers did.

By analogy (no pun intended), slide rules taught us numerical estimation skills that calculators and software lack.

We should mention that chart recorders using ink or thermal paper were essential to the use of analog computers. I'll tell a funny story. One day when I was using my analog computer (in building 273 @jedishrfu), a fault in the power supply put 200 kHz on top of my sub-hertz range simulation. The pens on the ink recorder aliased those signals and went crazy, resulting in everyone and everything in that room being painted blue. :oldtongue:
 
  • Like
  • Wow
Likes jedishrfu, Klystron, Asymptotic and 1 other person
  • #11
How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on the Hollerith cards?
 
  • Like
  • Informative
Likes anorlunda, jedishrfu, Klystron and 1 other person
  • #12
Charles Link said:
How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on the Hollerith cards?
As a freshman in 1974, during what was, if I recall correctly, the first computer class sponsored by our high school. Only one card punch machine in the building, and only seniors and office personnel had access to it. The rest of us marked up our cards with #2 pencil and ran them through an optical scanner.

With special permission one could gain access to our single ASR-33 teletype, dial up the mainframe time share over a 300 baud handset modem, write and debug a program, and save it to paper tape.
 
  • Like
  • Informative
Likes jedishrfu, Klystron and Charles Link
  • #13
Charles Link said:
How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on the Hollerith cards?
At college in the 1980's, just before student programs were commonly stored on the computers themselves, math, science and engineering students had to manage large 'decks' of punch cards. If memory serves, the top of the deck contained job control language (depending on the mainframe operating system), followed by the instruction set that comprised your program and subroutines (depending on the language), followed by your data set. While programs might require many cards, data cards could be very numerous depending on the problem to be solved.

Students became adept at managing tall card decks in precise order and coordinating work schedules. Another reason to adopt the principle of simplicity in program and set designs.

Computer textbooks and innovative languages from this period such as LISP emphasize the merging of code and data as lists of objects. Could maintaining Hollerith cards have influenced these ideas even as punch cards were replaced by removable disk drives?
 
  • Like
Likes jedishrfu, Charles Link and Asymptotic
  • #14
Charles Link said:
How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on the Hollerith cards?

In Secondary School in the U.K. (11-16 years old - High School?) for our 'O' Level Computer Studies, we'd write our code onto coding sheets and they would get sent to our local Polytechnic (where I did my CS degree) to be punched onto card and run on a Prime mainframe. We'd get the music paper output the next week, correct errors, if any, and repeat.
 
  • Like
Likes pinball1970, Asymptotic and Charles Link
  • #15
cobalt124 said:
In Secondary School in the U.K. (11-16 years old - High School?) for our 'O' Level Computer Studies, we'd write our code onto coding sheets and they would get sent to our local Polytechnic (where I did my CS degree) to be punched onto card and run on a Prime mainframe. We'd get the music paper output the next week, correct errors, if any, and repeat.
It was so painstaking. If you even missed one comma or other required character in a hundred lines of code (100 cards), the program would crash.o_O
 
  • Informative
  • Like
Likes Asymptotic and Klystron
  • #16
Charles Link said:
How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on the Hollerith cards?
You're asking about us from the Flintstone era. :cool:

I actually did it without punched cards. Entering the program in binary using switches and reading the results in binary from the lights. We built and debugged a training simulator that way.

In the punched card era on the IBM 1620, some of our programs needed 8 hours to run, but the MTBF of the computer was only 4 hours. So we had to save the results after each iteration on punch cards so we could restart a partial solution after a computer crash. That increased run time to 24 hours. It also meant that all of the cards punched to save intermediate results were discarded, most without ever being read. It also meant that a person had to supervise the whole 24 hours to remove and discard those punch cards, or to restart the computer after a crash. The final solution was also punched on cards that we could send to another facility that could read the cards and print the results on paper.

It made for a more active lifestyle, and we probably had better upper body strength than today's programmers.
 
  • Like
  • Informative
Likes Asymptotic, Charles Link and Klystron
  • #17
Charles Link said:
It was so painstaking. If you even missed one comma or other required character in a hundred lines of code (100 cards), the program would crash.o_O

Or if you confused the BASIC syntax of your ZX81 at home with the syntax of the BASIC on the mainframe. :redface:
 
  • Like
Likes Charles Link and Asymptotic
  • #18
Charles Link said:
If you even missed one comma or other required character in a hundred lines of code (100 cards), the program would crash
You're making this so much fun, Charles.

One of my main complaints about early FORTRAN was that spaces didn't count.

So the following incorrect code, missing a comma:
Code:
      DO 10 I=1 10
         X=X*X+2
10    CONTINUE
the first line, with spaces ignored, was interpreted as an assignment to a brand-new variable named DO10I rather than as a DO statement:
Code:
DO10I=110
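For contrast, here is a minimal sketch (my own illustration, assuming the old fixed-column layout) of the corrected loop; with the comma restored, the compiler recognizes a DO statement again:
Code:
C     WITH THE COMMA PRESENT THE COMPILER SEES A DO LOOP, NOT AN ASSIGNMENT
      DO 10 I=1,10
         X=X*X+2
10    CONTINUE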
 
  • Like
Likes Charles Link
  • #19
Actually, LISP came from the 1950's, and during that time it became fashionable to combine data with program code.

Some of the earliest computers stored data using one scheme and program instructions via a totally different scheme. Slowly, folks began to see that the two could be stored together; then the worry became programs with commingled data and code, and so the separation of the two in memory began.

My boss, who was an exceptional macro assembly-level programmer, would often reuse his initialization section as a data buffer once initialization was complete. This meant that when reading a program dump, you lost vital startup info since it had been overwritten by data.

Another trick he would use was to overwrite a statement like a NOP with a TRA (goto) statement, essentially turning off a block of code after it had been executed and was no longer needed. Today we would use a flag and an IF, but the TRA saved a few CPU cycles.

These problems eventually led to the notion of the separate stack, heap and code segments you see in today's programming model, with the stack and heap sharing the same block of memory and a crash occurring when one overruns the other, rather than program code being wiped out.
 
  • Like
  • Informative
Likes Klystron and anorlunda
  • #20
With respect to missing a comma, COBOL would fall off the world with hundreds of errors because you missed a terminating period early on in the IDENTIFICATION DIVISION of the program.

Other funny errors occurred in TSS BASIC, such as allowing

for i=1to x step 10

but failing on

for i = 1 to t step 10

Why? The interpreter saw the keyword TST, which was a function in BASIC. That one threw me for a loop. :-)

Lastly, free-form FORTRAN had a confusing issue: it still allowed comments to begin with a C in column 1, so what happened when you defined your COMPLEX or CHARACTER variables and COMMON blocks starting in column one as well?

You got a lot of strange errors, and eventually you figured out that your variables were falling back to implicit data typing (variables starting with the letters I through N are implicitly INTEGER, the rest floats) because the compiler never saw your definitions.
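Roughly, the trap looked like this (a hypothetical minimal sketch of my own, assuming a compiler that accepts statements in column 1 but still treats a C in column 1 as a comment):
Code:
COMPLEX Z
C     The declaration above begins with C in column 1, so the compiler
C     silently reads it as a comment.  Z then falls back to implicit
C     typing (REAL, since Z is outside I-N), and the assignment below
C     no longer behaves as a COMPLEX assignment -- typically it just
C     keeps the real part, or the compiler emits a confusing message.
      Z = (3.0, 4.0)
      PRINT *, Z
      END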
 
  • Like
Likes Klystron and Charles Link
  • #21
jedishrfu said:
Some of the earliest computers stored data using one scheme and programming via a totally different scheme
Just think of how differently computer security could have evolved if this hardware separation of code and data had been continued.
 
Last edited:
  • Like
Likes Charles Link
  • #22
How about an XDS Sigma 8 computer (considered a pretty hot real-time machine once upon a time) that, when presented with two specific floating point numbers A and B, would yield the correct answer C 30,000 times in a row but yield an incorrect answer the next time? The hardware engineers could look at the bit patterns in A and B and see how close they were to the worst case for the multiplier circuit.

Before ECC codes or even parity bits on registers, errors like that were pretty common. The error rates were even time of day sensitive unless the AC did a really good job holding the temperature constant.

Another favorite: the Interdata Model 1 mini had a potentiometer knob for CPU speed (i.e. clock rate). The idea was to allow you to turn the speed down enough that you could watch your program execute in binary, maybe 1 or 2 instructions per second. That was a method of debugging. The problem was the psychology of programmers. 100% of the programmers could not believe that it was their program that was so slow; they were sure the computer was not running at full speed, so they would twist the knob harder. Every Model 1 ever delivered had the speed knob twisted off and broken.
 
  • Like
Likes cobalt124 and Charles Link
  • #23
My father brought home a Friden calculating machine from work when I was 11, and I had many hours of fun punching in numbers and letting it clunk away to work out the answer. It would shake my desk as it calculated, and the noise was terrific.

The machine was made obsolete by electronic calculators, so it was mine to keep, and when I was a bit older I wondered how it worked, so I took it apart. Being a thoughtless teenager, my approach was haphazard and lacked any organization, so once a small pile of screws, springs, and cogs had been created, there was no way I was ever going to put it back together. So I kept going, dismantling what really was a piece of mechanical art.

When my dad found out, he was furious, which I did not understand until way later when he told me that he'd been using the machine for almost fifteen years, so had considered it along the lines of a workplace heirloom. If only he'd told me at the time, but that's not how things were done, back in the day.
 
  • Like
Likes Klystron
  • #24
I remember when my wife and I had an Altair computer connected to an ASR-33 Teletype terminal in our living room. The computer was one of the low-serial-number units that had a problem with the dynamic RAM. I wrote a memory test to run overnight and log errors to the Teletype. We had a young cat at the time that would always keep a wary, watchful distance from that TTY. She never knew when it would decide to BARK at her!

Historical Note: The dynamic memory problem was a confluence of 'well, almost right.' The refresh cycles were generated by a network of single-shots (74LS123 monostable multivibrators, which were notorious for unreliability) on the memory boards, and the timing was a little borderline and jittery. At the same time, the memory chip manufacturer changed ownership and seemed to have forgotten how to make memory chips that worked. Sometimes the chips could not be counted on to remember between successive refresh cycles.

A case of "Murphy Strikes Again."
(A mostly irrelevant recollection)
Later, we, along with a couple of relatives, opened a service bureau doing structural engineering for the local architects. We had a Grand Opening with catered food, etc. in the spring of the following year. With a decent turnout, the computer would NOT read the magnetic tape used for operating system and program storage. Rather embarrassing. Well, it was the first hot day of the season and apparently everyone had their air conditioners on. This was enough to drop the line voltage a bit, and the tape interface board had an unregulated 12V supply for the analog stages. In an ordinary design this would not be a problem... However, the coupling between the 12V analog stages and the 5V logic used a Zener diode as a coupling device, as opposed to a level shifter or a capacitor. With the reduced line voltage, the 12V sagged and wasn't high enough to overcome that Zener diode, therefore no signal from the tape!

Henceforth, the system was on a ferro-resonant constant-voltage transformer.

Nice thread here, brings back memories.

Cheers,
Tom
 
  • Informative
  • Like
Likes Klystron and Asymptotic
  • #25
[Photo: a COSMAC VIP setup]
A particularly impactful computer experience I had was my first computer, a COSMAC VIP, vintage 1978.

That's not actually me in the photo (though it could be my mate-at-the-time, Gary; it's a scary likeness). It's courtesy of 'darelfii' by airship, via Flickr, but it very nicely conveys the vibe of VIP ownership:

- A kit computer on a board; the white hex keypad was a very slick data entry method, especially as the keys had a nice tactile clunk when you pressed them.

- Connection to a TV via an RF modulator to display the 64 * 128 pixel graphics, using the CDP1861 video display chip that was actually pretty advanced for the time and impressed everyone who saw it.

- The stupefied look that I at least maintained as I struggled to learn CHIP-8, an interpretive programming language that made Basic look complex.

This photo doesn't show a connected tape recorder, but you could save and load programs to cassette tape, and that was fantastically cool. My mate Jim also purchased a VIP, and we used to write programs and swap tapes at school. I'm sure everyone thought we were merely exchanging songs taped from the radio, because that was what you did in the days before iPods and Spotify!

I don't recall exactly what it cost me, but I was running three paper rounds a day to get the cash together, so it must have been a hefty sum. Certainly, I recall forgoing many purchases of my usual weekly sci-fi paperback, and that was a huge sacrifice.

The CPU itself was interesting, as RCA engineer Joseph Weisbecker developed it as a new 8-bit architecture, by himself, at home. It was a CMOS design with no minimum clock frequency, so you could literally freeze operations by stopping the clock. The instruction set was orthogonal, and when I started Computer Science at uni I was horrified by the messy architecture of the 8086 chips we were forced to program against.

The VIP was low cost to the max, so was a boring 'bulk silicon' chip, but a radiation hardened version of the 1802 was fabricated in Silicon on Sapphire (SOS) by Sandia National Laboratories. That would have been NASA-project expensive, but it was well-suited for space applications, and I think that version was available for decades. Weisbecker certainly knew his stuff!

Indeed, Weisbecker also developed CHIP-8, and I was entirely in awe of his prowess in those heady days. The manual described memory use, with programs starting at location 0x200. Locations below this were used by the OS and 0xF00-0xFFF were reserved for that awesome display, while 0xEA0-0xEFF were scratch memory for registers and other things. CHIP-8 was quite sophisticated, all things considered, though I found it took a while to get your head around its 2-byte hexadecimal instruction set.

The VIP came with at least a dozen CHIP-8 games that I recall, and I read later that Weisbecker's daughter wrote some of them.

In terms of computing power, the VIP was no slouch. It was clocked at 1.76 MHz (yes, that's megahertz; it was the late 70's, after all) and packed a whopping 2048 bytes of RAM. I doubled mine, which necessitated installation of a black anodized heat sink, and drilling the hole for the screw to lock it into place was a painstaking exercise because the risk of cracking the PCB was high. Such a surfeit of compute was enough to drive what were state-of-the-art video games, and I played them all pretty much until my eyes bled.

It also set the scene for my wonderment when I see that Acrobat Reader on my Android phone needs 300MB to operate. Such a profligate consumption of resources!

Anyway, the VIP cemented my burning desire to be a programmer, and that's what I did. So thanks to Joseph Weisbecker and RCA, because without his creativity and their corporate largess, who knows what I'd be doing now. Likely nothing good, that's for sure :wink:
 
  • Like
Likes Asymptotic
  • #26
cobalt124 said:
In Secondary School in the U.K. (11-16 years old - High School?) for our 'O' Level Computer Studies, we'd write our code onto coding sheets and they would get sent to our local Polytechnic (where I did my CS degree) to be punched onto card and run on a Prime mainframe. We'd get the music paper output the next week, correct errors, if any, and repeat.
I didn't do computer science because of the trip to the nearby college required for access to computers. It would have eaten into my playtime/football.
I first used a computer in anger at sixth form in 1984. I think they had more than one, but I never saw any others.
 
  • #27
This thread has me pondering all the purpose-built, low-production-run computers used in industrial control systems. My only involvement with this Accuray Betamike controller was assisting a factory tech when he came to visit, and then mostly replacing blown shutter solenoid drive transistors, or assisting in the replacement of a massive 8", 250K hard disk drive which was exquisitely prone to head crashes.

It had an ASR-33 Teletype and a library of tape spools with test programs and for reloading the OS, and was similar to the one pictured, although it was probably the next generation.
[Image: Accuray computer.jpg]
 
  • #28
What a fun thread. Thanks all.

I started in the early 1960s, so naturally the computers were much bigger and power hungry.

One of my favorite memories was from a stunt that could not be repeated with today's computers. I was working with an XDS Sigma 8. It comprised about 12 racks, each about the size of a jumbo refrigerator. One day I got really mad at the machine and kicked it. The kick was so hard that it left a recognizable footprint on the door to one of the racks. Thereafter, every time I walked past and saw that footprint, it made me smile.
 
  • Like
Likes Klystron, Asymptotic and pinball1970

