Interesting Timelines of Computer History

  • Thread starter jedishrfu
I found these timelines on the Computer History Museum site, which are quite interesting to check out:

https://www.computerhistory.org/timeline/ai-robotics/

For me, it brought back memories of machines long past. They even mentioned the construction of my robot persona for the Forbidden Planet movie. Those were the days!
 

scottdave

That was interesting. They have split it into different categories.
 

Klystron

I found these timelines on the Computer History Museum site, which are quite interesting to check out:

https://www.computerhistory.org/timeline/ai-robotics/
Well-thought-out timeline with intriguing data points. The authors have overcome or bypassed many of the problems inherent in these essays, including:

• Most significant work was accomplished by teams, but we recognize a lead figure.
• Mixing fiction, such as Robby the Robot and Asimov's Three Laws of Robotics, with fact.
• Choosing one innovation or invention over another ('author's choice').
• Choosing popular toy examples over common industrial robots.
• Choosing 'humanoid' robot examples over dedicated machines such as welders.
We all have favorites, but did the authors omit the brilliant early SF writer Čapek for the origin of the term robot? Isaac Asimov translates robot as slave in his essays.
From Wikipedia:
"The term comes from a Czech word, robota, meaning "forced labor"; the word 'robot' was first used to denote a fictional humanoid in a 1920 play R.U.R. (Rossumovi Univerzální Roboti - Rossum's Universal Robots) by the Czech writer, Karel Čapek"
 

anorlunda

That was fun. Thanks for sharing, Jedi.

I was pleased to see the TI Speak and Spell. That one really blew my mind when it came out.
I bought one for a relative. It might still be working today except that she loved it to death. :cool:

I was disappointed to see no analog computers. My boss used the mechanical differential analyzer at M.I.T., shown below. It was very capable and much more reliable than the vacuum-tube electronic ones that followed. The Norden bombsight also deserves a mention.

[photo: the M.I.T. mechanical differential analyzer]
 
jedishrfu
We used one in undergrad physics. Plug in the modules to describe the diff. eq., turn it on, and bingo: done, with the solution displayed on a scope.
 
jedishrfu
There's a timeline book on computers called The Computer Book, available at Amazon.

 

cobalt124

Interesting to see neural networks being studied as far back as 1943. SHRDLU, ELIZA, and DENDRAL also bring back memories of my Comp. Sci. degree in the late '80s.
 
jedishrfu
I remember using an analog computer in undergrad physics. While I don't remember the name, I do remember it was color-coded and had various pluggable modules, and I suspect we were using a Dornier analog computer.

This article shows the Dornier and its color-coded red, yellow and green markings:

http://www.analogmuseum.org/english/collection/dornier/do80/

You can read more about analog computers at the Analog Museum:

http://www.analogmuseum.org/english/

and Bill Schweber's article:

https://www.analogictips.com/analog-computation-part-1-what-and-why/

One last memory: I built a potentiometer-based analog computer using a voltage meter and three potentiometers encased in a metal box. It worked, but it was pretty ugly; as I remember, my biggest difficulty was bending the galvanized zinc sheet into a nice box shape in my junior high school shop class.

In retrospect, I should have used aluminum, but I'm not sure that was available to us at the time (cheap school board). Its layout was similar to this one:

https://hackaday.com/2011/02/04/analog-computer-does-math/
 

anorlunda

Ah yes, I loved those analog computers. They were so effective at teaching dynamics. Modern simulation software is very capable, but it does not teach the feel the way analog computers did.

By analogy (no pun intended), slide rules taught us numerical estimation skills that calculators and software lack.

We should mention that chart recorders using ink or thermal paper were essential to the use of analog computers. I'll tell a funny story. One day when I was using my analog computer (in building 273, @jedishrfu), a fault in the power supply put 200 kHz on top of my sub-hertz-range simulation. The pens on the ink recorder aliased those signals and went crazy, resulting in everyone and everything in that room being painted blue.:oldtongue:
 

Charles Link

How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on Hollerith cards?
 
How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on Hollerith cards?
As a freshman in 1974, during what was, if I recall correctly, the first computer class sponsored by our high school. Only one card-punch machine in the building, and only seniors and office personnel had access to it. The rest of us marked up our cards with a #2 pencil and ran them through an optical scanner.

With special permission one could gain access to our single ASR-33 teletype, dial up the mainframe time share over a 300 baud handset modem, write and debug a program, and save it to paper tape.
 

Klystron

How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on Hollerith cards?
At college in the 1980s, just before student programs were commonly stored on computers, math, science, and engineering students had to manage large 'decks' of punch cards. If memory serves, the top of the deck held job control language, depending on the mainframe operating system, followed by the instruction set that comprised your program and subroutines, depending on the language, followed by your data set. While programs might require many cards, data cards could be very numerous depending on the problem to be solved.

Students became adept at managing tall card decks in precise order and coordinating work schedules. Another reason to adopt the principle of simplicity in program and set designs.

Computer textbooks and innovative languages from this period such as LISP emphasize the merging of code and data as lists of objects. Could maintaining Hollerith cards have influenced these ideas even as punch cards were replaced by removable disk drives?
 

cobalt124

How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on Hollerith cards?
In Secondary School in the U.K. (11-16 years old - High School?) for our 'O' Level Computer Studies, we'd write our code onto coding sheets and they would get sent to our local Polytechnic (where I did my CS degree) to be punched onto card and run on a Prime mainframe. We'd get the music paper output the next week, correct errors, if any, and repeat.
 

Charles Link

In Secondary School in the U.K. (11-16 years old - High School?) for our 'O' Level Computer Studies, we'd write our code onto coding sheets and they would get sent to our local Polytechnic (where I did my CS degree) to be punched onto card and run on a Prime mainframe. We'd get the music paper output the next week, correct errors, if any, and repeat.
It was so painstaking. If you even missed one comma or other required character in a hundred lines of code (100 cards), the program would crash.o_O
 

anorlunda

How many people on here are old enough to have used a stack of 100 or more Hollerith cards for a computer program? LOL Or had to wait 6 hours or longer to get the results of their program on Hollerith cards?
You're asking about us from the Flintstone era. :cool:

I actually did it without punched cards. Entering the program in binary using switches and reading the results in binary from the lights. We built and debugged a training simulator that way.

In the punched-card era on the IBM 1620, some of our programs needed 8 hours to run, but the MTBF of the computer was only 4 hours. So we had to save the results after each iteration on punch cards so we could restart from a partial solution after a computer crash. That increased run time to 24 hours. It also meant that all of the cards punched to save intermediate results were discarded, most without ever being read, and that a person had to supervise the whole 24 hours to remove and discard those cards, or to restart the computer after a crash. The final solution was also punched on cards that we could send to another facility that could read the cards and print the results on paper.

It made for a more active lifestyle, and we probably had better upper-body strength than today's programmers.
 

cobalt124

It was so painstaking. If you even missed one comma or other required character in a hundred lines of code (100 cards), the program would crash.o_O
Or if you confused the BASIC syntax of your ZX81 at home with the syntax of the BASIC on the mainframe. :redface:
 

anorlunda

If you even missed one comma or other required character in a hundred lines of code (100 cards), the program would crash
You're making this so much fun, Charles.

One of my main complaints about early FORTRAN was that spaces didn't count.

So the following incorrect code, missing a comma:
Code:
      DO 10 I=1 10
         X=X*X+2
10    CONTINUE
the first line was interpreted as
Code:
DO10I=110
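The blank-stripping can be mimicked in a couple of lines of Python (a toy sketch of the scanning behavior described above, not a real FORTRAN lexer):

```python
# Toy sketch: early FORTRAN scanners discarded all blanks before parsing,
# so a DO statement with a missing comma collapses into an assignment
# to a brand-new variable named DO10I.
def strip_blanks(card: str) -> str:
    """Mimic the blank-insensitive scan of a FORTRAN statement field."""
    return card.replace(" ", "")

print(strip_blanks("DO 10 I=1 10"))   # buggy card: becomes DO10I=110
print(strip_blanks("DO 10 I=1, 10"))  # intended card: the comma survives
```

With the comma present, the stripped text still contains a comma, which is what let the compiler recognize a DO statement rather than an assignment.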
 
jedishrfu
Actually, LISP came from the 1950s, and during that time it became fashionable to combine data with program.

Some of the earliest computers stored data using one scheme and programs via a totally different scheme. Slowly folks began to see that the two could be stored together; then the worry became programs with commingled data and code, and so the separation of the two in memory started again.

My boss, who was an exceptional macro assembly-level programmer, would often reuse his initialization section as a data buffer once initialization was complete. This meant that when reading a program dump, you lost vital startup info, since it had been overwritten by data.

Another trick he used was to overwrite a statement such as a NOP with a TRA (goto), essentially turning off a block of code after it had been executed and was no longer needed. Today we would use a flag and an IF, but the TRA saved a few CPU cycles.

These problems eventually led to the notion of the stack, heap, and code segments that you see in today's programming model, with the stack and heap sharing the same block of memory and the program crashing when one overruns the other, rather than wiping out program code.
 
jedishrfu
With respect to missing a comma, COBOL would fall off the edge of the world with hundreds of errors because you missed a terminating period early on in the IDENTIFICATION DIVISION of the program.

Other funny errors occurred in TSS BASIC, such as allowing

for i=1to x step 10

but failing on

for i = 1 to t step 10

Why? The interpreter saw the keyword TST, which was a function in BASIC. That one threw me for a loop :-)

Lastly, fixed-column FORTRAN had a confusing issue: comments began with a C in column 1, so strange things happened when you defined your COMPLEX or CHARACTER variables, or your COMMON blocks, starting in column one as well.

You got a lot of strange errors, and eventually you figured out that your variables were using implicit data typing (variables starting with the letters I through N are implicitly INTEGER versus being floats) because the compiler didn't see your definitions.
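FORTRAN's implicit-typing rule is simple enough to sketch in a few lines of Python (an illustration of the I-through-N convention described above, not any particular compiler's behavior):

```python
# Toy sketch of FORTRAN implicit typing: an undeclared variable whose
# name starts with I, J, K, L, M, or N defaults to INTEGER; anything
# else defaults to REAL (floating point).
def implicit_type(name: str) -> str:
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

print(implicit_type("index"))   # INTEGER
print(implicit_type("x"))       # REAL
```

So a declaration accidentally turned into a comment meant the compiler silently fell back to these defaults, and your supposedly COMPLEX or CHARACTER variable quietly became an INTEGER or a REAL.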
 

anorlunda

Some of the earliest computers stored data using one scheme and programming via a totally different scheme
Just think of how differently computer security could have evolved if this hardware separation of code and data had been continued.
 

anorlunda

How about an XDS Sigma 8 computer (considered a pretty hot real-time machine once upon a time) that, when presented with two specific floating-point numbers A and B, would yield the correct answer C 30,000 times in a row, but yield an incorrect answer the next time? The hardware engineers could look at the bit patterns in A and B and see how close they were to the worst case for the multiplier circuit.

Before ECC codes, or even parity bits on registers, errors like that were pretty common. The error rates were even time-of-day sensitive unless the AC did a really good job of holding the temperature constant.

Another favorite: the Interdata Model 1 mini had a potentiometer knob for CPU speed (i.e., clock rate). The idea was to let you turn down the speed enough that you could watch your program execute in binary, maybe 1 or 2 instructions per second; that was a method of debugging. The problem was the psychology of programmers: 100% of them could not believe that it was their own program that was so slow, rather than the computer not running at full speed, so they would twist the knob harder. Every Model 1 ever delivered had the speed knob twisted off and broken.
 
[photo: a Friden calculating machine]

My father brought home a Friden calculating machine from work when I was 11, and I had many hours of fun punching in numbers and letting it clunk away to work out the answer. It would shake my desk as it calculated, and the noise was terrific.

The machine was made obsolete by electronic calculators, so it was mine to keep, and when I was a bit older I wondered how it worked, so I took it apart. Being a thoughtless teenager, my approach was haphazard and lacked any organization, so once a small pile of screws, springs, and cogs had been created, there was no way I was ever going to put it back together. So I kept going, dismantling what really was a piece of mechanical art.

When my dad found out, he was furious, which I did not understand until much later, when he told me that he'd been using the machine for almost fifteen years and so had considered it something of a workplace heirloom. If only he'd told me at the time, but that's not how things were done back in the day.
 

Tom.G

I remember when my wife and I had an Altair computer connected to an ASR-33 Teletype terminal in our living room. The computer was one of the low-serial No. units that had a problem with the dynamic RAM. I wrote a memory test to run overnight and log errors to the Teletype. We had a young cat at the time that would always keep a wary, watchful distance from that TTY. She never knew when it would decide to BARK at her!

Historical Note: The dynamic memory problem was a confluence of 'well, almost right.' The refresh cycles were generated by a network of single-shots (74LS123 monostable multivibrators, which were notorious for unreliability) on the memory boards, and the timing was a little borderline, and jittery. At the same time, the memory-chip manufacturer changed ownership and seemed to have forgotten how to make memory chips that worked. Sometimes the chips could not be counted on to remember between successive refresh cycles.

A case of "Murphy Strikes Again."
(A mostly irrelevant recollection)
Later, we, along with a couple of relatives, opened a service bureau doing structural engineering for the local architects. We had a Grand Opening with catered food, etc., in the spring of the following year. With a decent turnout on hand, the computer would NOT read the magnetic tape used for operating-system and program storage. Rather embarrassing. Well, it was the first hot day of the season, and apparently everyone had their air conditioners on. This was enough to drop the line voltage a bit, and the tape interface board had an unregulated 12 V supply for the analog stages. In an ordinary design this would not be a problem... however, the coupling between the 12 V analog stages and the 5 V logic used a Zener diode as a coupling device, as opposed to a level shifter or a capacitor. With the reduced line voltage, the 12 V sagged and wasn't high enough to overcome that Zener diode, therefore no signal from the tape!

Henceforth, the system was on a ferro-resonant constant-voltage transformer.

Nice thread here, brings back memories.

Cheers,
Tom
 
[photo: a COSMAC VIP owner at the keyboard]
A particularly impactful computer experience I had was my first computer, a COSMAC VIP, vintage 1978.

That's not actually me in the photo (though it could be my mate-at-the-time Gary, it's a scary likeness); it's courtesy of 'darelfii' by airship, via Flickr, but it very nicely conveys the vibe of VIP ownership:

- A kit computer on a board, the white hex keypad was a very slick data entry method, especially as the keys had a nice tactile clunk when you pressed them.

- Connection to a TV via an RF modulator to display the 64 × 128 pixel graphics, using the CDP1861 video display chip, which was actually pretty advanced for the time and impressed everyone who saw it.

- The stupefied look that I at least maintained as I struggled to learn CHIP-8, an interpretive programming language that made BASIC look complex.

This photo doesn't show a connected tape recorder, but you could save and load programs to cassette tape, and that was fantastically cool. My mate Jim also purchased a VIP, and we used to write programs and swap tapes at school. I'm sure everyone thought we were merely exchanging songs taped from the radio, because that was what you did in the days before iPods and Spotify!

I don't recall exactly what it cost me, but I was running three paper rounds a day to get the cash together, so it must have been a hefty sum. Certainly, I recall forgoing many purchases of my usual weekly sci-fi paperback, and that was a huge sacrifice.

The CPU itself was interesting: RCA engineer Joseph Weisbecker developed it as a new 8-bit architecture, by himself, at home. It was a CMOS design with no minimum clock frequency, so you could literally freeze operations by stopping the clock. The instruction set was orthogonal, and when I started Computer Science at uni I was horrified by the messy architecture of the 8086 chips we were forced to program against.

The VIP was low-cost to the max, so it used a boring 'bulk silicon' chip, but a radiation-hardened version of the 1802 was fabricated in Silicon on Sapphire (SOS) by Sandia National Laboratories. That would have been NASA-project expensive, but it was well suited for space applications, and I think that version was available for decades. Weisbecker certainly knew his stuff!

Indeed, Weisbecker also developed CHIP-8, and I was entirely in awe of his prowess in those heady days. The manual described memory use, with programs starting at location 0x200. Locations below this were used by the OS, 0xF00-0xFFF were reserved for that awesome display, and 0xEA0-0xEFF were scratch memory for registers and other things. CHIP-8 was quite sophisticated, all things considered, though I found it took a while to get your head around its 2-byte hexadecimal instruction set.

The VIP came with at least a dozen CHIP-8 games that I recall, and I read later that Weisbecker's daughter wrote some of them.

In terms of computing power, the VIP was no slouch. It was clocked at 1.76 MHz (yes, that's megahertz; it was the late '70s, after all) and packed a whopping 2048 bytes of RAM. I doubled mine, which necessitated installing a black anodized heat sink, and drilling the hole for the screw to lock it into place was a painstaking exercise because the risk of cracking the PCB was high. Such a surfeit of compute was enough to drive what were state-of-the-art video games, and I played them all pretty much until my eyes bled.

It also set the scene for my wonderment when I see that Acrobat Reader on my Android phone needs 300MB to operate. Such a profligate consumption of resources!

Anyway, the VIP cemented my burning desire to be a programmer, and that's what I did. So thanks to Joseph Weisbecker and RCA, because without his creativity and their corporate largess, who knows what I'd be doing now. Likely nothing good, that's for sure :wink:
 
