Up to Date on The Second Coming

  • Thread starter: rudinreader
Summary:
The discussion revolves around reflections on past predictions about computing and the internet, highlighting a nostalgia for earlier computing eras when systems were more understandable. Participants express frustration with modern complexities in software and hardware, suggesting that the depth of knowledge required has increased while accessibility has decreased. There is a call for simpler, more transparent computing technologies, particularly advocating for Linux to cater to hobbyists and scientists rather than competing with mainstream operating systems. The conversation also touches on the evolution of computer science, arguing that understanding the hardware-software interface is essential yet increasingly obscured by high-level abstractions. Ultimately, there is a desire for a return to more comprehensible computing systems that allow for deeper engagement with technology.
  • #31
rudinreader said:
it's hard to comprehend, don't bother.

Not everyone feels this way about calculus. Not everyone feels this way about all of the intricate workings of a modern PC.

There are plenty of people, many of them contributors to this forum, who find all manner of challenging subjects both interesting and enjoyable. I suspect you are the sort of person who has a limited taste for challenge. The people driving all of this "complex" technology most surely do not have such a limitation; and it's not likely to ever get easier, from your perspective. But it's likely to get more fun from mine.
 
  • #32
rudinreader said:
In the same light, there is nothing stopping electrical engineering students from reading master's- to PhD-level math.

Except all the terms, theorems, and notation you would need to become familiar with. There's a lot of math in CS/E, but personally I wouldn't expect to be able to read and fully understand most advanced math papers without first having to go through a lot of research to get acquainted with some of the principles.
 
  • #33
I'm not saying that we should de-evolve... Instead, I'm simply questioning the prevailing view of the term "obsolete". I'm not saying "we should do (fill in the blank)". And yes, I have a naive view of operating systems and technology; the number of people who can comprehend technology as a whole is probably around the number of people who can comprehend mathematics as a whole: "vanishingly small".

But take, for example, that you can buy a Dell PC with "FreeDOS": http://www.dell.com/content/topics/segtopic.aspx/e510_nseries?c=us&cs=19&l=en&s=dhs

And go with it. I'm not saying you should buy it. But if simple technologies (once considered obsolete) exist, then it's not hard or expensive for manufacturers to add them to their product lines... And I think there is going to be more of this in the future. The conjecture is that when text-surfing and the high-school chemistry example of reading input sensors are all someone wants, they will be able to buy a device for it, and it won't have to be a cell phone (again, I don't have a cell phone and probably never will)... but instead, who knows, it could be like one of those $10 Atari joysticks that plug into your TV...

But don't get me wrong, I'm not trying to "impose my view". Admittedly, I regret that I actually know so little about computers - and clearly everyone who has responded knows more about them than I do. Anyways...
 
  • #34
rudinreader said:
I'm not saying that we should de-evolve... Instead, I'm simply questioning the prevailing view of the term "obsolete".

What exactly is the "prevailing view of obsolete"? Just because a new product comes out every 6 months doesn't mean all previous products are rendered obsolete by it...

rudinreader said:
But take, for example, that you can buy a Dell PC with "FreeDOS": http://www.dell.com/content/topics/segtopic.aspx/e510_nseries?c=us&cs=19&l=en&s=dhs

And go with it. I'm not saying you should buy it. But if simple technologies (once considered obsolete) exist, then it's not hard or expensive for manufacturers to add them to their product lines...

So are you saying it's too hard to buy older technology? Check eBay; I bought my fiancée a laptop for $150 that surfs the internet, plays DVDs, and does everything she wants to do.

rudinreader said:
And I think there is going to be more of this in the future. The conjecture is that when text-surfing and the high-school chemistry example of reading input sensors are all someone wants, they will be able to buy a device for it, and it won't have to be a cell phone (again, I don't have a cell phone and probably never will)... but instead, who knows, it could be like one of those $10 Atari joysticks that plug into your TV...

Text-surfing would be a terribly inefficient way to surf the internet, IMO. The internet is all about MULTImedia! Pictures, videos, and sound files, all seamlessly integrated into web pages. The internet would be a terribly boring place if it were just a big text-based encyclopedia.

The goals you want to accomplish with basic sensors could be done with a TI calculator and some programming, or a BASIC chipset and some soldering, or on an oscilloscope; there's nothing new or inherently exciting about it. Simple tasks like the ones you are describing can be accomplished by even the most rudimentary "computers" in our society, and the best computers in our society can accomplish things that are light-years ahead.

So let's follow your idea to its logical conclusion: we go back to basics like you are suggesting. You would like to have DOS on a computer - a simple operating system that makes you type in commands. Of course, if you were REALLY starting at the basics you would have to write this DOS program as well, but we'll assume we aren't COMPLETELY reinventing the wheel here.

We'll want some kind of file structure to store stuff, and we need to be able to find where we stored things. So we'll make a directory structure in the computer's storage.
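
As a toy C sketch of that step (the entry layout and names here are invented purely for illustration, not any real filesystem's on-disk format):

Code:
/* Toy sketch of the "directory structure" step: each entry is either a
 * file or a directory holding child entries. */
#include <stdio.h>

#define MAX_CHILDREN 8

struct entry {
    const char   *name;
    int           is_dir;
    struct entry *children[MAX_CHILDREN];
    int           child_count;
};

static void list_tree(const struct entry *e, int depth)
{
    printf("%*s%s%s\n", depth * 2, "", e->name, e->is_dir ? "/" : "");
    for (int i = 0; i < e->child_count; i++)
        list_tree(e->children[i], depth + 1);
}

int main(void)
{
    struct entry readme = { "README.TXT", 0, {0}, 0 };
    struct entry notes  = { "NOTES.TXT",  0, {0}, 0 };
    struct entry docs   = { "DOCS", 1, { &notes }, 1 };
    struct entry root   = { "C:",   1, { &readme, &docs }, 2 };

    list_tree(&root, 0);   /* walk the tree the way "dir" would */
    return 0;
}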

To support the hardware of the computer, you'll need to load the drivers for each component; there could be a couple dozen of them, so the logical thing to do would be to load the drivers from a text file that lists all of the drivers you need for the static hardware in your system... and we'll go ahead and do the same thing with the drivers themselves, too.
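
A minimal sketch of that driver-list idea, assuming a hypothetical drivers.txt with one driver name per line; the printf stands in for the real loading work:

Code:
/* Sketch of booting from a driver list in a plain text file,
 * one driver name per line (e.g. "keyboard.sys"). */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *list = fopen("drivers.txt", "r");   /* hypothetical config file */
    if (!list) {
        perror("drivers.txt");
        return 1;
    }

    char line[256];
    while (fgets(line, sizeof line, list)) {
        line[strcspn(line, "\r\n")] = '\0';   /* strip trailing newline */
        if (line[0] == '\0' || line[0] == '#')
            continue;                         /* skip blanks and comments */
        printf("loading driver: %s\n", line); /* a real loader would map it into memory */
    }

    fclose(list);
    return 0;
}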

Next? Well, we want an interface that allows us to browse through lots of files quickly and efficiently (dir /w just won't cut it for thousands of files...), so we make up a mouse-and-keyboard-based GUI. It will of course need a driver for the display, and some basic machinery for placing representations of files (icons) and other things like file parameters and such. Since we already have a simple system for sorting and browsing files (DOS), let's lay our GUI over the top of it.

We'll want to be able to run programs (what good is a computer without at least that?), so we'll make a standard programming language that the GUI/DOS will run off of. This way we can run programs that control the computer from the quick comfort of our GUI. So we do this and write a few programs for things like surfing the internet, writing notes, performing tasks on the computer, etc.

So, what have we just described? Could it be we have "invented" something that perhaps already exists? Hmm?

...how about an operating system like Windows or Linux! That's really all they are: a program made to increase convenience and efficiency! We can perform tasks using the computer's hardware without having to type the code in; instead we click a button and a pre-determined set of code is executed! Granted, Windows is incredibly bloated these days, but that's due to the HUGE amount of third-party hardware and software it has to support, not some inherent malfunction in the way modern computers operate. If the software the computer ran were specifically tailored to the computer's hardware (which can be done in an open-source OS like Linux), it would be a very streamlined and smooth way to accomplish complicated tasks on the computer.

Whoa, I went way off on tangents there...
 
  • #35
If you want to get back to basics in computing, why not check this out:

http://www.parallax.com/

Description of a BASIC Stamp chip

Parallax.com said:
A BASIC Stamp microcontroller is a single-board computer that runs the Parallax PBASIC language interpreter in its microcontroller. The developer's code is stored in an EEPROM, which can also be used for data storage. The PBASIC language has easy-to-use commands for basic I/O, like turning devices on or off, interfacing with sensors, etc. More advanced commands let the BASIC Stamp module interface with other integrated circuits, communicate with each other, and operate in networks. The BASIC Stamp microcontroller has prospered in hobby, lower-volume engineering projects and education due to ease of use and a wide support base of free application resources.
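
To give a feel for the kind of program such a chip runs, here is a rough C analogue of the usual "toggle a pin, read a sensor" loop. The pin_write(), adc_read(), and delay_ms() routines are hypothetical placeholders for whatever I/O the real toolchain provides; on an actual BASIC Stamp you would write this in PBASIC.

Code:
/* Rough C analogue of a typical BASIC Stamp program: toggle an output
 * pin and read a sensor in a loop. The I/O routines are stubs. */
#include <stdio.h>

static void pin_write(int pin, int level) { printf("pin %d -> %d\n", pin, level); }
static int  adc_read(int channel)         { (void)channel; return 512; /* fake reading */ }
static void delay_ms(int ms)              { (void)ms; /* no-op in this sketch */ }

int main(void)
{
    const int LED_PIN = 0;
    const int SENSOR_CHANNEL = 1;

    for (int i = 0; i < 5; i++) {
        pin_write(LED_PIN, 1);          /* device on  */
        delay_ms(500);
        pin_write(LED_PIN, 0);          /* device off */
        delay_ms(500);
        printf("sensor: %d\n", adc_read(SENSOR_CHANNEL));
    }
    return 0;
}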
 
  • #36
Thanks for the link, mech_engineer... They have a lot of intriguing products at good prices... One project that I am kind of considering (a couple of years down the road) is to read through Knuth's Art of Computer Programming, and preferably do the exercises on the lowest-level device possible (I'm not sure if this would make sense as of yet)... Maybe I could use a Parallax device...

As for a reason why anyone would want to do that, here's a clip from a conversation with Bill Gates at http://www.microsoft.com/presspass/exec/billg/speeches/2005/07-18FacultySummit.aspx:

MARIA KLAWE:...And I wondered if you had just ideas that would help us or whether Microsoft has ideas that might help computer science departments stay on top of the most recent technological developments.

BILL GATES: Well, certainly it's the goal of our University Relations Group to make sure that we're talking about what we think the state of the art problems are, finding out from the universities and a lot of dialogue back and forth about that. In a certain sense, yeah, the curriculum has changed, but say somebody came for an interview and they said, "Hey, I read the 'Art of Computer Programming', that's all I ever read, I did all the problems, I would hire them right then."

MARIA KLAWE: You'd hire them right then.

BILL GATES: Yeah, that's right.

MARIA KLAWE: So would I.

I'm not necessarily saying that I want to work for Microsoft (as opposed to becoming a math professor), but I found it interesting how his ideal new hire would have studied just The Art of Computer Programming...

Another quote I found interesting:
BILL GATES: ... it's about time you guys figured out how to take n processors and get n capability out of these things, and now just go do that.

Well, that turns out to be one of the great unsolved problems, and yet there's no way around it. I mean, literally Intel is going to have 32-processor systems running at 5 gigahertz, and that's what a modern PC will be, say, within four or five years. So we certainly need brilliant people thinking about that problem, what's going to go on with it.
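
To make the "n processors, n capability" problem concrete, here is a toy C sketch (using POSIX threads) in which the work does split almost perfectly across threads; the hard, unsolved part Gates is pointing at is that most real programs are nowhere near this cleanly divisible:

Code:
/* Toy illustration of "n processors -> n times the capability":
 * summing an array divides across threads almost perfectly,
 * but most real workloads do not decompose this cleanly. */
#include <pthread.h>
#include <stdio.h>

#define N_THREADS 4
#define N_ITEMS   1000000

static int data[N_ITEMS];

struct chunk {
    int  start, end;      /* half-open range [start, end) */
    long partial_sum;
};

static void *sum_chunk(void *arg)
{
    struct chunk *c = arg;
    c->partial_sum = 0;
    for (int i = c->start; i < c->end; i++)
        c->partial_sum += data[i];
    return NULL;
}

int main(void)
{
    for (int i = 0; i < N_ITEMS; i++)
        data[i] = 1;

    pthread_t    threads[N_THREADS];
    struct chunk chunks[N_THREADS];
    int per_thread = N_ITEMS / N_THREADS;

    for (int t = 0; t < N_THREADS; t++) {
        chunks[t].start = t * per_thread;
        chunks[t].end   = (t == N_THREADS - 1) ? N_ITEMS : (t + 1) * per_thread;
        pthread_create(&threads[t], NULL, sum_chunk, &chunks[t]);
    }

    long total = 0;
    for (int t = 0; t < N_THREADS; t++) {
        pthread_join(threads[t], NULL);
        total += chunks[t].partial_sum;   /* combine the per-thread results */
    }

    printf("total = %ld\n", total);       /* expect 1000000 */
    return 0;
}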
 
  • #37
The Art of Computer Programming is literally the bible of computer science. If you know the stuff between its covers, you know 90% of what's really important about programming a computer.

Being a computer scientist is not about knowing lots of programming languages, or being able to use Visual Studio to make graphical applications, or being able to download and use a math library.

Being a computer scientist means understanding the deep, important subjects that underlie all computer programming -- data structures, algorithms, and trade-offs. Those three topics literally define computer science. All the rest that goes on top of it (PHP, Visual Basic, etc.) is fluff. The fluff is easy to learn, and changes yearly -- that's why Microsoft would rather hire programmers who have ignored the fluff and studied only the deep magic.
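
As a small illustration of the kind of trade-off meant here (an invented example, not one taken from Knuth): keeping data sorted costs something up front, but it lets you replace an O(n) linear scan with an O(log n) binary search.

Code:
/* Trade-off sketch: a linear scan works on any array, while binary
 * search requires sorted input but finds the key in O(log n). */
#include <stdio.h>

static int linear_search(const int *a, int n, int key)
{
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}

static int binary_search(const int *a, int n, int key)
{
    int lo = 0, hi = n - 1;          /* requires a to be sorted */
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key)      return mid;
        else if (a[mid] < key)  lo = mid + 1;
        else                    hi = mid - 1;
    }
    return -1;
}

int main(void)
{
    int sorted[] = { 2, 3, 5, 7, 11, 13, 17, 19 };
    int n = sizeof sorted / sizeof sorted[0];

    printf("linear: index %d\n", linear_search(sorted, n, 13));
    printf("binary: index %d\n", binary_search(sorted, n, 13));
    return 0;
}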

This is why I think this entire thread is misguided, rudinreader. You seem to be mistaking pretty graphical interfaces, network applications, layers of abstraction on top of the hardware, rapid application development languages, etc. as computer science, but they aren't.

In the same vein, interior design is not architecture.

- Warren
 
