The Most Hated PC CPU

  • #1
Vanadium 50
This was triggered by the thread on the collapse of Adobe's PrintGear.

There have been a lot of CPUs that never got much love: the 80286, the AMD Bulldozers, the Celeron D. But the one that got absolutely creamed in public opinion was the IDT WinChip. Which is ironic, as it was a technological success.

In the late 1990s, the most common CPU socket was the so-called Socket 7. Unlike today, these motherboards would accept CPUs from several vendors: Intel, AMD, IBM/Cyrix, and others. You could take a machine with an Intel CPU, pop it out, put in one from AMD, and go on your way. This, of course, put a lot of pressure on CPU makers to build better and better chips.

At the time, Intel was selling their Pentium MMX line in the $400-500 range. The competitors were selling similarly priced chips with similar performance (AMD) or slightly slower chips for a little less money.

IDT came along and asked "is this the optimal thing to do?" So they profiled a lot of desktop applications and discovered:
  • The CPU spent most of its time doing loads and stores (I don't know why this is ever a surprise).
  • Fancy features like out-of-order execution take a lot of silicon but only buy a little speed.
  • Floating point is rarely used. You want something there, as fixed-point emulation was up to 1000x slower, but whether the hardware was 700x or 1500x faster than emulation made little difference.
This let them use a much smaller die, which made the chip much cheaper to produce, at the cost of being a little slower than its competition. It sold for $90.
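
To make the third bullet concrete: "fixed-point emulation" means doing all the arithmetic with integer instructions. A minimal sketch, assuming a Q16.16 format of my own choosing (not anything IDT specified); every "float" operation turns into a handful of integer instructions like this, and a full software float library is far worse:

Code:
/* Minimal sketch of software fixed-point math in a Q16.16 format.
   The format and the code are illustrative only. */
#include <stdint.h>
#include <stdio.h>

typedef int32_t q16_16;               /* 16 integer bits, 16 fraction bits */
#define Q_ONE (1 << 16)

static q16_16 q_from_double(double x) { return (q16_16)(x * Q_ONE); }
static double q_to_double(q16_16 x)   { return (double)x / Q_ONE; }

/* Multiply: widen to 64 bits, then shift back down. */
static q16_16 q_mul(q16_16 a, q16_16 b) {
    return (q16_16)(((int64_t)a * b) >> 16);
}

int main(void) {
    q16_16 a = q_from_double(3.25), b = q_from_double(1.5);
    printf("%f\n", q_to_double(q_mul(a, b)));   /* prints 4.875000 */
    return 0;
}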

How did they beef up performance? They used eight times as much cache as the competition.
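
Why does more cache make up for a simpler core? The usual back-of-the-envelope model is average memory access time (the numbers below are illustrative, not WinChip measurements):

$$\text{AMAT} = t_{\text{hit}} + m \cdot t_{\text{miss}}$$

If a miss costs ##t_{\text{miss}} = 50## cycles, cutting the miss rate ##m## from 4% to 2% saves a full cycle on every memory access. And per the profiling above, loads and stores are most of what desktop code does.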

So, why was it hated?

(1) If you already owned a Socket 7 computer, there was no reason to spend $90 on a less performant CPU. If you didn't, you could save some money, sure, but it's not a factor of 4 or 5; it's more like 30%.

(2) Benchmarks of the day were more CPU-intensive than typical application code, so this chip underperformed.

(3) The idea of a "gaming PC" was just starting to evolve, and gaming workloads differ from the "business workloads" that the chip was optimized for.

The irony is that the idea was a success, even if the product was not. What are today's Intel E-Cores? A simpler CPU connected to a boatload of cache.

It's impossible to tell, but had this come out in 2004 instead, appropriately scaled, it could have been a fierce competitor to the new dual-core Pentiums: a quad-core that cost less and used less power. But the market zigged when they thought it would zag.
 
  • Like
Likes davenn
  • #2
This happens in software, too. My first experience was with Lattice C. It worked great and did the job. Microsoft rebranded it and sold it until they developed their in-house C compiler, which shared code with their other language compilers.

Lattice C was effectively dead on PC-DOS and MS-DOS.

Another was VisiCalc, which pioneered the spreadsheet, but Lotus 1-2-3 destroyed it.

https://en.wikipedia.org/wiki/VisiCalc

and the list goes on...
 
  • Like
Likes davenn
  • #3
When I joined a software engineering team at NASA Ames, most system programmers worked on the DEC PDP series, distrusting and maligning the newer VAX CPUs. Management assigned me to develop software and system protocols for VAX-11 VMS 'mainframes', loved by application programmers yet loathed by the PDP diehards.

VMS functioned well with a few tweaks, while VAX internals such as STARnet provided excellent hardware interfaces and near-real-time performance. I worked happily on various VAX platforms under several NASA projects until we replaced them with Sun Microsystems servers running Solaris, a not-bad version of UNIX.
 
  • Like
Likes davenn
  • #4
I worked on Honeywell 6000 mainframes at GE but really wanted to work with VAXen after I saw one at a local university. They just looked so modern and so cool.
 
  • Like
Likes Klystron
  • #5
jedishrfu said:
This happens in software, too
Can you clarify what "this" is? There are several possibilities.

jedishrfu said:
Lotus 1-2-3
Lotus 1-2-3 had two huge advantages over its competition, one of which led to its demise.

(1) It would properly determine which cells depend on which other cells and evaluate them in order. Previous spreadsheets worked "row-wise" or "column-wise".

(2) It had a macro language. This was truly out of control. Macros would literally run for days, on PCs dedicated to running a single spreadsheet. On the plus side, it moved effort from senior accountants to junior bookkeepers, but on the minus side, the spreadsheet logic was an unintelligible mess. Auditors refused to sign off on this, management got spooked, and a lot of this turned into procedural code run as scheduled jobs.
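
Point (1) is essentially a topological sort of the cell-dependency graph. A minimal sketch of the idea (hard-wired formulas, nothing like Lotus's actual engine; a real engine also has to detect circular references):

Code:
/* Dependency-ordered ("natural order") recalculation, sketched.
   Cells: 0=A1, 1=B1, 2=C1, 3=D1. */
#include <stdio.h>

#define N 4

/* dep[i][j] != 0 means cell i's formula reads cell j */
static const int dep[N][N] = {
    {0,0,0,0},   /* A1 = 2        */
    {1,0,0,0},   /* B1 = A1 + 1   */
    {1,1,0,0},   /* C1 = A1 * B1  */
    {0,0,1,0},   /* D1 = C1 - 4   */
};

static double value[N];
static int done[N];

static void recalc(int i) {
    if (done[i]) return;
    for (int j = 0; j < N; j++)       /* evaluate inputs first */
        if (dep[i][j]) recalc(j);
    switch (i) {                      /* then this cell's formula */
        case 0: value[0] = 2.0;                 break;
        case 1: value[1] = value[0] + 1.0;      break;
        case 2: value[2] = value[0] * value[1]; break;
        case 3: value[3] = value[2] - 4.0;      break;
    }
    done[i] = 1;
}

int main(void) {
    for (int i = 0; i < N; i++) recalc(i);
    printf("A1=%g B1=%g C1=%g D1=%g\n",
           value[0], value[1], value[2], value[3]);  /* 2 3 6 2 */
    return 0;
}

A row-wise or column-wise engine gets this wrong (or needs multiple passes) whenever a formula refers to a cell that comes later in scan order.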

The Lotus people were always nice to me. Partly because I found a bug and went to them before going to the press.
 
  • #6
I would not call a VAX a "hated CPU", except by people who were deeply invested in others (370, PDP). People loved it. People were happy to pay the price/performance penalty to have a VAXStation over a unix workstation from Sun or Apollo.
 
  • #7
To add to the VAX story: this was one of those rare cases where a company made a vital and strategic realization about its business... and then forgot it.

When the Alpha/AXP was coming out, the issues of VAX compatibility came up. There were no good options - emulation was too slow, a partial RISC implementation was too slow, and a CIS coprocessor was too slow. The last two were also too expensive.

It was then realized that people don't run VAX. They run VMS. Most code was either supplied by DEC (VMS, DECWindows, DECMail, LSE, which I still miss) or was user code compiled, in most cases, with a DEC compiler. So if DEC released all its VMS software for AXP, people would gobble it up. And they did.

If DEC had said "we are a software company, not a computer company", they might still be around today. They might have kept their peripherals business, or at least some of it, instead of selling it to keep their computer line afloat.

Had they realized this, they could have changed CPUs (or licensed others to do so) when the AXP started to face competition.
 
  • Like
Likes davenn
  • #8
Thanks guys for the thread.

Always enjoyable reading personal insights and experiences of the older tech/software.

It passes the time whilst in my hospital bed

Dave
 
  • Care
Likes pinball1970 and Klystron
  • #9
Sorry to hear that you are hospitalized. I tried it a couple times. Didn't much care for it.

There were unquestionably good ideas that came about too early or too late.

I am still grumpy about the hatred shown to the WinChip. I owned a few. It was very good for what it was, but the reviewers at the time had the opinion "If I am not interested, nobody should be interested".

It would have been interesting to see the move to multicore a decade earlier, but at the time Intel was telling us the future was 10 GHz single-core chips. Oops.
 
  • Like
Likes davenn and Klystron
  • #11
Sadly, the people they fooled first and best were themselves. One cannot always predict the future from a line between the past and the present.
 
  • #12
Vanadium 50 said:
I would not call a VAX a "hated CPU", except by people who were deeply invested in others (370, PDP). People loved it. ...
So true. Actually, I missed "PC" in the thread title (apologies) but thought about the odd problem Digital Equipment Corporation created for itself introducing the excellent VAX technology following its successful PDP line of minicomputers. No doubt the 'operating system wars' that dominated discussion during that era influenced professional opinions. I never had a problem with a platform/architecture-specific OS as long as the investment is worth the improvement. Certain keys and bit flags accessible at the system level provided exquisite synchronization among processes and programs.

Ames management had already migrated computer accounts to VAX. My group took that responsibility from the director, but only as an ancillary duty. Operating, configuring internals and I/O, and programming the latest dedicated VAX boxes occupied my time.
 
  • #13
I remember the times when a ten-year-old computer had a decent scrap content. For me, it all came to an end with the VAX. Gone are the halcyon days of metal recycling.
 
  • #15
Klystron said:
as long as the investment is worth the improvement.
This was the birth of VAX and the death of VAX.

PDP-11 was hitting its architectural limits. Further, it had a bunch of operating systems for sale, which a) fragmented the user base, and b) had some of the user base gain experience with OS migration.

With the VAX, everyone ran VMS. Yes, they could run Ultrix, but that's what it could do, not what it did do. And that's what let them transition to Alpha. The mistake they made was saying "We successfully moved from PDP to VAX. Then we successfully transitioned to AXP. Good thing we're done," and not "Let's get ready for the next one."

x86 has evolved to survive by not being an instruction set. Hasn't been for many years. It's effectively pseudocode. The CPU translates it into its own "real" instruction set, which few humans ever see.
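
Purely as a toy illustration (the micro-op names below are invented; real decoders are far more elaborate), a single read-modify-write x86 instruction cracks into simpler internal operations something like this:

Code:
/* Toy sketch of CISC-to-micro-op cracking for:  add [rbx], eax */
#include <stdio.h>

int main(void) {
    const char *uops[] = {
        "load   tmp <- [rbx]",      /* read the memory operand */
        "add    tmp <- tmp + eax",  /* do the arithmetic       */
        "store  [rbx] <- tmp",      /* write the result back   */
    };
    for (int i = 0; i < 3; i++)
        printf("uop %d: %s\n", i + 1, uops[i]);
    return 0;
}

The scheduler and execution units only ever see the three micro-ops, never the original instruction.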
 
  • Like
Likes davenn and Klystron
  • #16
Unless I misremember, the Celeron line of cheaper Intel processors was just Intel processor chips where the on-chip cache (and maybe some of the cores) failed testing but everything else checked out. This blurb from Lenovo definitely brought out my snark: "Celeron® processors are designed for basic tasks and have lower clock speeds, fewer cores, and smaller cache sizes." That's marketing baloney right there.

The cache subsystem of a computer is an important aspect of its design that is little understood and typically not even described in specifications any longer (whereas in past years, you'd at least see Level 1 (usually on-chip) and Level 2 (often on the motherboard) cache sizes mentioned, though not always).
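
If anyone wants to see the effect for themselves, here is a quick-and-dirty demo of my own (absolute timings are machine-dependent; only the ratio matters): the same 16M reads, once in cache-friendly sequential order and once in a cache-hostile strided order.

Code:
/* Same reads, sequential vs. strided; the strided sweep defeats
   the cache and should be noticeably slower. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1u << 24)        /* 16M ints (~64 MB), bigger than any cache */

static long long sweep(const int *a, size_t step, double *secs) {
    clock_t t0 = clock();
    long long sum = 0;
    size_t idx = 0;
    for (size_t i = 0; i < N; i++) {
        sum += a[idx];
        idx = (idx + step) & (N - 1);    /* N is a power of two */
    }
    *secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
    return sum;
}

int main(void) {
    int *a = malloc(N * sizeof *a);
    if (!a) return 1;
    for (size_t i = 0; i < N; i++) a[i] = (int)i;

    double t_seq, t_str;
    long long s1 = sweep(a, 1, &t_seq);     /* cache-friendly */
    long long s2 = sweep(a, 4099, &t_str);  /* cache-hostile  */
    printf("sequential: %.2fs  strided: %.2fs  (checksums %lld %lld)\n",
           t_seq, t_str, s1, s2);
    free(a);
    return 0;
}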

Now when buying a computer, all the available CPUs are fast enough for me. I focus more on getting 32 GB of RAM (or more) and the fastest, largest solid-state drive, because Windows likes to use a lot of the disk as swap-file space and software grows ever bigger.

I tend to avoid Acer brand (if my budget allows) because I found that the main way they keep cost down is by not using much external cache on their motherboards.
 
  • #17
Vanadium 50 said:
Can you clarify what "this" is? There are several possibilities.


Lotus 1-2-3 had two huge advantages over its competition, one of which led to its demise.

(1) It would properly determine which cells depend on which other cells and evaluate them in order. Previous spreadsheets worked "row-wise" or "column-wise".

(2) It had a macro language. This was truly out of control. Macros would literally run for days, on PCs dedicated to running a single spreadsheet. On the plus side, it moved effort from senior accountants to junior bookkeepers, but on the minus side, the spreadsheet logic was an unintelligible mess. Auditors refused to sign off on this, management got spooked, and a lot of this turned into procedural code run as scheduled jobs.

The Lotus people were always nice to me. Partly because I found a bug and went to them before going to the press.

Spreadsheet macros have always, IMHO, been nightmares. I've been tasked with "debugging" what accountants believed to be "programs". Thinking back, it can still bring tears and involuntary tics to my face.
 
  • #18
@harborsparrow I fail to see the problem. Some Intel customers want high performance, and some want low cost. Intel has a product line for each. If they can make them (and thus sell them) for less by recycling product from other lines, everybody wins.
 
  • #19
@sbrothy I don't think macros are themselves bad. It's the abuse that's bad. Using a macro to do a non-trivial validation of data (you can't deduct more dependents than you are allowed to) is perfectly sensible.

The problem is when these become hundreds or thousands of lines of spaghetti, often with "magic numbers" hard-coded in and not a whoy of documentation.

They also spent a lot of time doing unnecessary work. If I need to know the 5 top stores, I don't need to sort a list of thousands.
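
To make the "top 5" point concrete, here is a sketch of a running top-K (the sales figures are made up; assumes nonnegative values). Each record is compared against a five-slot list, so the work is O(5n) rather than a full O(n log n) sort:

Code:
/* Keep a running top-K instead of sorting all n records. */
#include <stdio.h>

#define K 5

static void top_k(const double *sales, int n, double top[K]) {
    for (int i = 0; i < K; i++) top[i] = -1.0;  /* below any valid sale */
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < K; j++) {
            if (sales[i] > top[j]) {            /* insert, shifting down */
                for (int k = K - 1; k > j; k--) top[k] = top[k - 1];
                top[j] = sales[i];
                break;
            }
        }
    }
}

int main(void) {
    double sales[] = { 12.5, 99.0, 3.0, 57.2, 8.8, 75.1, 44.4, 61.0 };
    double top[K];
    top_k(sales, 8, top);
    for (int i = 0; i < K; i++) printf("%.1f ", top[i]);
    printf("\n");    /* 99.0 75.1 61.0 57.2 44.4 */
    return 0;
}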

This was a good idea gone out of control. Lotus knew the abuse was a bad idea, but they made a lot of money from it. To their credit, they actually warned people against doing this in the manual. Which was predictably ignored.
 
  • Like
Likes sbrothy
  • #20
I'm deciding if I should go on to the silver medalist, the 80286.
 
  • #21
Vanadium 50 said:
@sbrothy I don't think macros are themselves bad. It's the abuse that's bad. Using a macro to do a non-trivial validation of data (you can't deduct more dependents than you are allowed to) is perfectly sensible.

The problem is when these become hundreds or thousands of lines of spaghetti, often with "magic numbers" hard-coded in and not a whoy of documentation.

They also spent a lot of time doing unnecessary work. If I need to know the 5 top stores, I don't need to sort a list of thousands.

This was a good idea gone out of control. Lotus knew the abuse was a bad idea, but they made a lot of money from it. To their credit, they actually warned people against doing this in the manual. Which was predictably ignored.
I agree completely. What triggered this response was the wistful thought of having access to any documentation. Any at all, whoy(?) or otherwise, I wish! Awful times!

Hah! And expecting users to read the manual?! Now that's science fiction!

o0)

EDIT:

Reminds me of a silly little anecdote. One of my colleagues once went straight "over my head" and directly to my boss (who was sitting in the same room as me, only 4 meters away!), complaining that I hadn't documented my work (a simple C++ DLL implementing a simple word replace for use in PowerPoint, which was much too slow for the task). They came to my desk, the idiot rat with a smug smile on his face. It turned out that I, as always in a hurry, had put the "documentation" (really? documentation for a DLL containing one simple word-replace function?!) in the source file instead of in the header. So, too lazy to look in all the files of a project comprising 5 files?! I never saw that person again. :-p
 
  • #23
Vanadium 50 said:
I'm deciding if I should go onto the silver medalist, the 80286.

The 80286 was my first PC system. I purchased all the parts... case, MB, PSU, CPU, etc., and built it up.
I moved up from an Atari 1040ST (1 MB of RAM).

Had to buy the math co-processor as a separate chip and plug it in.
From memory, only 1 MB of RAM.
40 MB HDD (such a crazy amount of storage hahaha).

I can't remember which version of DOS it was, V5? Back in 1991-'92.
 