
Up to Date on The Second Coming

  1. Dec 14, 2007 #1
    Up to Date on "The Second Coming"

    Just reading some old (1990, 2000) articles concerning prophecies about the future of computing, the internet, etc.

    The Second Coming Manifesto:
    http://www.flatrock.org.nz/topics/info_and_tech/second_coming.htm

    The Computer for the 21st Century:
    http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html

    It's actually interesting to look at Weiser's home page, because the picture reads: "I have no idea what the future will be". I gather this was his way of admitting he was pretty wrong with his ideas.

    My personal take is pretty much the opposite of both of these guys, but still I could be dead wrong. I am personally pretty nostalgic (or annoyed, depending on how you look at it) about the early days of the IBM PC, up until Windows 3.1. In those days it was possible to understand how the entire operating system and hardware worked (if they would only give you enough time to study it without changing the technology in the next release!!), and you could also understand (via C/C++ etc.) exactly how the program you wrote would be executed.

    Certainly there were lots of problems and inefficiencies back then. If your program crashed (especially in DOS), then you would have to restart your computer. Another big inefficiency was the lack of rapid application development. Unfortunately, when RAD came to be, we simultaneously lost the ability to know what the programs looked like at an assembly level, etc.

    All in all, it's clear to me that the original computer actually had something in it that would be of interest to computer science. The modern computer only simulates computer science, in that there is no way in hell you can understand how your code is actually processed in the machine.

    To make a long story short, I hope that technologies like this one will play a bigger role: http://www.grifo.com/SOFT/uk_bas52.htm. I would also be happy if Linux played a role in making interesting computer science more accessible to nonspecialists. I have "glanced" through the "Linux Device Drivers" book, and the fact that that is the starting point seems to me to be kind of a problem. Embedded Linux seems to me to have more hope than Linux on Intel CPUs. Ideally, in the future you would not have to be a specialist to study Linux on CPUs simple enough to understand.

    Anyone know of other recent prophecies on the future of computing, or have any ideas of your own? Please post.
     
    Last edited: Dec 14, 2007
  3. Dec 14, 2007 #2

    mgb_phys

    Science Advisor
    Homework Helper

    It's called the "law of leaky abstractions" - in Java/C#/Ruby you don't have to know what the bytes are doing, but if you do something wrong you have no hope of figuring out what is really happening.
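
    Two toy Python examples of what I mean (my own sketches, not tied to any particular runtime): the number and string abstractions both look airtight until you ask what the bytes are doing.

    Code:
    # The "decimal number" abstraction leaks: IEEE-754 binary floats
    # peek through an innocent-looking sum.
    print(0.1 + 0.2 == 0.3)   # False

    # The "string" abstraction leaks too: repeated concatenation reads
    # as cheap, but it can copy the whole buffer on every step, so this
    # loop is roughly O(n^2) unless the runtime special-cases it.
    s = ""
    for _ in range(100000):
        s += "x"

    # The idiomatic fix only makes sense once you know what is
    # happening underneath.
    s = "".join("x" for _ in range(100000))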

    One prediction that is almost becoming true is:
    "In ten years, computers will just be bumps in cables." (Gordon Bell, 1990)

    We don't think of a phone/iPod/etc. as a computer, yet we no longer go to a big beige box on a desk to play music or send a message.
     
  4. Dec 14, 2007 #3

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    Computer science is primarily the study of data structures and algorithms, not assembly language code. In fact, most of what you describe (deep knowledge of the OS and hardware) is actually computer engineering, not computer science.

    In fact, I could just as well argue that very high-level languages like Python permit users to work with "pure" CS concepts in a way never before possible in syntactically challenging, feature-poor languages like C++.
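
    For example (a quick, untested sketch of my own), a textbook graph search in Python is practically a transcription of its pseudocode:

    Code:
    from collections import deque

    def bfs(graph, start):
        """Breadth-first search over an adjacency-list graph.
        Returns the set of vertices reachable from start."""
        seen = {start}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for neighbor in graph.get(node, []):
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append(neighbor)
        return seen

    # A graph is just a dict: vertex -> list of neighbors.
    print(bfs({'a': ['b'], 'b': ['c'], 'c': [], 'd': ['a']}, 'a'))
    # reachable from 'a': {'a', 'b', 'c'} (display order may vary)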

    - Warren
     
  5. Dec 14, 2007 #4
    It's actually very possible to understand everything that a modern computer and OS does. It just takes some time and commitment, and less than you would think. But it won't come without some effort, just like anything worthwhile.
     
  6. Dec 14, 2007 #5
    I totally agree that C++ sucks..

    What you claim about computer science, that it's just algorithms and data structures, is certainly a simplification. Operating systems and compilers are computer science subjects (more so than computer engineering) that require you to understand assembly language deeply, even though one of the working assumptions is that the assembly instructions do what they are supposed to do. By its nature, that requires an understanding of computer architecture. Nonetheless, it's still computer science.
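
    Just to make the compiler point concrete, here's a rough Python sketch of the kind of hardware-software bridge I mean: "compiling" an expression tree down to an invented stack machine and then executing it. The PUSH/ADD/MUL instruction set is made up purely for illustration.

    Code:
    def compile_expr(expr):
        """expr is either a number or a tuple (op, left, right)."""
        if isinstance(expr, (int, float)):
            return [("PUSH", expr)]
        op, left, right = expr
        return compile_expr(left) + compile_expr(right) + [(op, None)]

    def run(program):
        """Execute the generated instructions on a toy stack machine."""
        stack = []
        for op, arg in program:
            if op == "PUSH":
                stack.append(arg)
            elif op == "ADD":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "MUL":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack.pop()

    # 2 + 3 * 4, written as an expression tree
    program = compile_expr(("ADD", 2, ("MUL", 3, 4)))
    print(run(program))  # 14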

    On the other hand, the claim I made above that modern computers "simulate real computer science" was also a simplification. But part of the reason for my annoyance with modern PCs is that if you are interested in the hardware-software bridge, then you are best off working with Lego Mindstorms or the BASIC-interpreter board I mentioned above; no modern PC is sufficient for that.
     
  7. Dec 14, 2007 #6
    Sure it's possible, if you work for Microsoft, or if you do what I'm probably going to have to do - read Linux Device Drivers and write a better book (OK, that was unfair; it's actually a good book. But still, what I'm saying is that more work needs to be done on Linux to make it accessible). But I totally disagree with the sentiment. (And of course, I am referring to a very small subset of what computers do, because on the whole I don't think anyone is interested in everything computers do.)
     
  8. Dec 14, 2007 #7

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    Do you also find it frustrating that your doctor knows more about medicine than you do after reading Wikipedia articles? Does it bother you that Boeing consistently builds better airplanes than you can in your backyard?

    - Warren
     
  9. Dec 14, 2007 #8
    Ha ha, funny.
     
  10. Dec 14, 2007 #9

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    It's not meant to be funny -- it's an observation. Why does the complexity of modern computer software bother you less than the complexity of modern airplanes?

    - Warren
     
  11. Dec 14, 2007 #10

    dst


    Actually, it's a real peeve for me. Not so much the complexity, but the idea of so many things working in sequence without failing. That all goes away when I see how it's all made - huge numbers of people who've more or less dedicated their lives to it and still spend years on it at a time. But I don't want to know what one of their CAD models (Boeing's or otherwise) looks like, that's for sure. I took one look at that IBM Cell CPU layout and choked.
     
  12. Dec 14, 2007 #11
    chroot, I don't disagree with your point of view that computer science is data structures and algorithms, as long as you don't force everyone else to share that point of view - in particular me.

    The complexity bothers me because it is unnecessary. Cheap computers with simple architecture and simple operating systems should be made available for $50, instead of ever more features being added, when I'm happy with much less than what I already have.

    I began this thread neutral, but in reality I must say I am most interested in Linux. It is not clear to me what the goals of that project are at the moment, but I am hoping that they don't try to compete with Microsoft by matching every feature. Instead, I would like to see Linux applied to a simple, cheap CPU, to which I could (in the future) attach all kinds of scientific measurement equipment at a low price.

    chroot, you are basically saying that Microsoft, Apple, and Linux are like automobile manufacturers, and you shouldn't try to build a car in your back yard. What I am saying is that I think there's a market for machines that can be understood inside and out, but that are at the same time more advanced than Lego Mindstorms.

    Does that answer your question?
     
  13. Dec 14, 2007 #12

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    There's nothing stopping you from using old hardware and old software until the day you die. Embedded linux is already extremely popular. You can buy 486-based single-board PCs for far less than $50. If you like that route, go for it.

    The infrastructure you don't like is actually quite necessary if you want good-looking graphical programs with lots of interoperability and features to be designed in less than a human lifetime. New languages, APIs, RAD, etc. are all "power tools" which software engineers create to allow them to do more in less time. In the same way, Boeing has long since abandoned trying to build airplanes using nothing more than hammers and screwdrivers.

    I'm not saying you can't or shouldn't try to build your own computer or software in your backyard -- in fact, I think you should. I just don't think you should lament the fact that the field has developed to the point where the hobbyists can't compete with the professionals. After all, every other major industry has been this way for tens or hundreds of years, and I like that my doctor knows more than Wikipedia.

    - Warren
     
  14. Dec 14, 2007 #13

    turbo

    Gold Member

    Showing my age here, but I started out with CP/M and migrated to DOS, writing application programs that would run under Ashton-Tate dBase. I could carry an entire point-of-sale/inventory-control/bookkeeping suite around on a few floppies. Things were wild and wooly in the early DOS years when the open architecture allowed software to directly address hardware, and "cowboy" developers were not above stealing system resources.
     
  15. Dec 14, 2007 #14
    Again, I started this topic kind of neutral, but really my prediction leans towards Linux. I really hope that in the future, if you study the basics of computer science in compilers, architecture, operating systems, and networking, you will be able to understand the actual implementation of Linux from a deep point of view with a minimum of effort...

    This would be geared more towards "serious hobbyists", as a general hobbyist could perhaps be satisfied playing with Mindstorms, etc.

    I hope the brains behind Linux will realize it is a more important goal to market towards scientists than towards the general desktop market. Then perhaps old CPUs like the 486 will find larger demand - though hopefully with better, simpler CPUs than the 486, in my opinion.

    But that's just my theory with regard to the second coming: Linux will improve in this respect, and will thus not go away, regardless of whatever Microsoft and others are up to.
     
  16. Dec 14, 2007 #15

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    There aren't enough scientists to comprise an entire market for an operating system. Furthermore, there is no incentive for anyone who is not a hobbyist to make anything hobbyist-friendly. Computer hardware and software are some of the largest and fastest emerging technologies the world has ever seen. Untold billions of dollars change hands over them.

    Your little fantasyland of the world being rebuilt with hobbyist-friendly 286s and toy operating systems is just that: a fantasyland. Furthermore, I can guarantee you that Linux will not evolve in any such way. I'm sorry to say, but you're basically delusional.

    - Warren
     
  17. Dec 14, 2007 #16
    Just underlines the point of why fewer people study computer science. I guess I should get back to Rudin..
     
  18. Dec 14, 2007 #17

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    What you're saying is basically the equivalent of lamenting how complex and indecipherable EKGs and MRIs are, and how you sure wish we could just go back to the days of drilling holes in each others' heads and sticking leeches on wounds. You know, things the average Joe can understand, because that's what is important.

    - Warren
     
  19. Dec 14, 2007 #18
    That's not at all what I'm saying - are you a lawyer or something?? For instance, I'm not saying the general theory of relativity is wrong because I (and the average Joe) don't understand it.

    What I'm saying is that this "fastest emerging technology" you mention is really not changing that much, as far as I can see. It changed a lot in the '80s and '90s, and in the meantime we went through hundreds (exaggeration) of hardware upgrades. But since Windows 2000, the desktop environment hasn't changed much.

    The fact that it has actually stood still is a singularity in comparison to the '80s and '90s. My prediction is actually simple and seems very plausible to me (yet delusional to you). We have PCs now, and we have the internet now. As long as that stays fixed, and it probably will, it's just a matter of time before simplified operating systems on simplified CPUs show up. I'll just leave it at that for now, I guess.
     
  20. Dec 14, 2007 #19

    dst


    The desktop isn't constant at all. Considering Microsoft decides what everyone sees these days, you might want to take a look at what they have up their sleeves.
     
  21. Dec 14, 2007 #20
    No comment.
     
  22. Dec 16, 2007 #21
    chroot, I reread this discussion and also some of your quotes:

    It's not meant to be funny -- it's an observation. Why does the complexity of modern computer software bother you less than the complexity of modern airplanes?
    I'm sorry to say, but you're basically delusional.


    After a second read, it seems to me you are kind of coming out of left field with your replies (although you are also implying that I am coming out of left field with my theories..). Although it's true that I began this discussion with how "nostalgic" I am for the good ole days of DOS, that wasn't really the point. And I totally don't take it personally that you called me delusional, because "delusional" is an understatement for Weiser if you read his article below (and he was actually an "influential computer scientist").

    The Computer for the 21st Century:
    http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html

    He actually claimed that disposable, scrap-paper-like computers would pervade society, and people at the time took him seriously.

    In contrast, what I say is pretty down to earth. The kind of prediction I am making is not really a prediction at all; it's also just an "observation". The modern PC behemoths, both Windows and Linux, were made in haste, in terms of very rapid development of humongous technologies, to support what amounts to what a tiny CPU can accomplish: communicate text and multimedia over a phone line, display a graphical user interface with a mouse, and perform I/O on various devices of varying (sometimes enormous) complexity.

    Now don't get me wrong, I am not proposing a design for such a system with a single 8-bit data register in a single CPU. I'm not an engineer, and really don't know a thing about that. But to rebut the claim you made that "I guarantee Linux won't go in that direction": it's not actually up to the brains behind Linux; it seems like it's going to be more a matter of what the manufacturers do. The software is free, it runs on hundreds of CPUs, and a sufficient subset of Linux gets you to the internet and gives you a GUI.

    As long as the current desktop computing situation doesn't change much, and it won't (actually that is the central point of what I'm saying), then more and more people will buy a $50 Linux machine when their Toshiba laptop burns out after the warranty expires, so they can save up to buy a new laptop - and ultimately they will put off buying the laptop altogether for economic reasons, since the machine they have gives them all they need.

    And the end result is that some of these systems will come with complete schematics of how the motherboard is laid out, with instruction manuals about how to do I/O using Linux on the parallel port or whatnot, and a lot of other simple things. Unlike DOS, these computers won't disappear. More and more high school students will stumble upon this as a hobby, and as such more interest will grow in computer science, or as you say computer engineering. I'll stop this train of thought here, because this aspect - the ability to understand the machine itself - is a personal interest but is not the point of "the prediction".
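
    Just to give a flavour of the parallel port bit: driving its data pins from Linux can be this short (a rough sketch only: it assumes root privileges, legacy x86 hardware with the port at the conventional 0x378 address, and Linux's /dev/port interface; I obviously haven't tried it on the hypothetical $50 machine):

    Code:
    # Rough sketch: toggle the 8 data lines of a legacy parallel port
    # by writing bytes to I/O port 0x378 through Linux's /dev/port.
    # Requires root, and assumes a port at the usual base address that
    # nothing else has claimed.
    import time

    LPT_DATA = 0x378  # conventional base address of LPT1's data register

    # /dev/port maps x86 I/O ports to file offsets: seek to the port
    # address and write a byte to drive pins 2-9 of the connector.
    with open("/dev/port", "wb", buffering=0) as port:
        for value in (0xFF, 0x00, 0xAA, 0x55):
            port.seek(LPT_DATA)
            port.write(bytes([value]))
            time.sleep(0.5)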

    I personally don't understand why the "ubiquitous computing" people could imagine that serious computer users will ever depart from a workstation. Are these conclusions delusional? Perhaps, but they still seem much more down to earth than what some well-respected computer scientists have come up with.
     
  23. Dec 16, 2007 #22

    -Job-

    Science Advisor

    You don't actually need any hardware or software to study Computer Science. All of today's machines are equivalent to a simple theoretical model called a "Turing machine". The hardware implementation of today's computers isn't that interesting if you just want to play around or study Computer Science, because their increased complexity exists for the purpose of optimization, not of adding new capabilities. The same goes for modern operating systems.
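
    To be concrete (a quick sketch of my own), the whole model fits in a few lines of Python; the sample rule table below just flips every bit on the tape and halts at the right end:

    Code:
    # Quick sketch: a tiny Turing machine simulator.  For simplicity it
    # does not handle moving left past cell 0, which the sample machine
    # below never does.
    def run_tm(rules, tape, state="start", head=0, blank="_"):
        tape = list(tape)
        while state != "halt":
            symbol = tape[head] if head < len(tape) else blank
            state, write, move = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape)

    # (state, symbol) -> (next_state, symbol_to_write, head_move)
    flip = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }

    print(run_tm(flip, "10110"))  # 01001_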

    In my opinion, if anything, today's technology enables students to do much more than they were able to do before. There are such high-level languages and libraries that you just need to get acquainted with basic programming principles to be able to do almost anything. And you are still able to understand how the hardware and OS components your program runs on function, but when you do, you'll see that such implementation-level details aren't that relevant.

    You are also able to get software emulators for older hardware, so that you don't have to go back to the stone age just to study something that's more understandable.
     
  24. Dec 16, 2007 #23

    -Job-

    Science Advisor

    From a Computer Science perspective, operating systems and compilers are only interesting in that they implement algorithms and data structures. Actual implementation details, such as what assembly language the compiler generates, or which scheduling algorithm Windows implements, are only relevant in software/hardware engineering.

    Computer Science studies what can be done, in how much space and how much time - the implementation details of that theory are relevant mostly to engineers.
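
    For example (a small sketch), the same function computed two ways has exponential versus linear running time, and that distinction exists no matter what hardware runs the code:

    Code:
    # Small sketch: one mathematical function, two very different time
    # complexities, independent of whatever chip executes the code.
    def fib_exponential(n):
        # Naive recursion: the number of calls grows exponentially in n.
        if n < 2:
            return n
        return fib_exponential(n - 1) + fib_exponential(n - 2)

    def fib_linear(n):
        # Simple iteration: O(n) additions, O(1) extra space.
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    print(fib_exponential(20), fib_linear(20))  # 6765 6765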
     
    Last edited: Dec 16, 2007
  25. Dec 16, 2007 #24
    Just to let you all know my background with regard to computer science: I'm not an expert in any computer science subfield (..yet), but not a complete novice either - computer science was my undergraduate minor. (I'm planning to start a PhD math program next year.) I took Data Structures (hard, in C), Assembly Language, Computer Architecture, Operating Systems, and Theoretical Computer Science (so I know what a Turing machine is..), and I think that's it. I started to take Compilers, but the teacher was so hard it was beyond description (there were certainly no parser generators allowed in his course!), so I dropped it.

    I get the impression (understatement) that there must have been a different atmosphere back then, for whatever reason, such that a law student at Harvard (hint, hint..) would be interested in basic compilers, assembly code, and operating systems.. But nowadays, if you are a PhD-track math student (like myself), you get advised to avoid digital logic in much the same way that an engineer might be advised not to study calculus: it's hard to comprehend, don't bother.

    Is that a contradiction of all I have said up to now? You decide; I don't think so. The point is that part of my prediction is that embedded systems are going to make it possible for digital logic hobbyists to turn the hobby into a profession if they get good at it - perhaps a very lucrative one, if you can figure out how to sell Linux on $40 LCD computers that can connect to DSL.

    About your quote, "In my opinion, if anything today's technology enables students to do much more than they were able to do before": of course it is true. But one cannot help but sense that "modern computing" must be about as immature as thermodynamics before Clausius, not to mention Boltzmann, etc.

    In particular, "students can do more than ever before", but the problem is that you have to be a bit more self-motivated than in any other scientific discipline. For instance, it's very straightforward to get a sophisticated mathematics education: read Rudin. The closest equivalent book in computer science is Knuth's Art of Computer Programming (which I'm planning to read through eventually), and yet there's still so much more to read...

    Anyways, just thoughts..
     
  26. Dec 16, 2007 #25
    Actually if there is anywhere that I am wrong or delusional, I think it would be in my above quote. I just realized that the main thing all the crazy computer science prophets had in common was the shared belief that modern computers suck. One realizes that it is entirely possible that it may never actually get any better than it is today (relatively speaking).

    **Editing the last thing I said here because it was truly a bit over the top: that "high school students would be able to read computer schematics, etc. in the future". I was lacking in sleep when I initially wrote that. But perhaps what can be said is that anyone who studies embedded systems will be able to do the task described above, for relatively simple logic setups.
     
    Last edited: Dec 16, 2007