
Up to Date on The Second Coming

  1. Dec 14, 2007 #1

    Just reading some old (1990, 2000) articles with prophecies about the future of computing, the internet, etc.

    The Second Coming Manifesto:
    http://www.flatrock.org.nz/topics/info_and_tech/second_coming.htm

    The Computer for the 21st Century:
    http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html

    It's actually interesting to look at Weiser's home page, because the picture caption reads: "I have no idea what the future will be". I gather this was his way of admitting his ideas turned out pretty wrong.

    My personal take is pretty much the opposite of both of these guys, but still I could be dead wrong. I'm pretty nostalgic for (or annoyed about losing) the early days of the IBM PC, up until Windows 3.1. In those days it was possible to understand how the entire operating system and hardware worked (if only they gave you enough time to study it before changing the technology in the next release!), and you could also understand (via C/C++, etc.) exactly how the program you wrote would be executed.

    Certainly there were lots of problems and inefficiencies back then. If your program crashed (especially in DOS), you would have to restart your computer. Another big inefficiency was the lack of rapid application development. Unfortunately, when RAD arrived, we simultaneously lost the ability to know what our programs looked like at the assembly level.

    All in all, it's clear to me that the original computer actually had something in it that would be of interest to computer science. The modern computer only simulates computer science, in that there is no way in hell you can understand how the code is processed in the machine.

    To make a long story short, I hope that technologies like this one will play a bigger role: http://www.grifo.com/SOFT/uk_bas52.htm. Or I would also be happy if Linux played a role in making interesting computer science more accessible to nonspecialists. I have "glanced" through the "Linux Device Drivers" book, and the fact that that is the starting point seems to me to be kind of a problem. Embedded Linux seems to me to have more hope than Linux on Intel CPUs. Ideally, in the future you would not have to be a specialist to study Linux on CPUs simple enough to understand.

    Anyone know of other recent prophecies on the future of computing, or have any ideas of your own? Please post.
     
    Last edited: Dec 14, 2007
  3. Dec 14, 2007 #2

    mgb_phys

    Science Advisor
    Homework Helper

    It's called the "law of leaky abstractions": in Java/C#/Ruby you don't have to know what the bytes are doing - but if you do something wrong, you have no hope of figuring out what is really happening.
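    A tiny Python illustration of the kind of leak he means (Python isn't named in this post; it's just a convenient example). Floating-point numbers present themselves as ordinary decimals, but the binary representation underneath leaks out in plain arithmetic:

```python
# The "decimal number" abstraction leaks: floats are binary under the
# hood, so 0.1 has no exact representation and the rounding error
# surfaces in ordinary arithmetic.
total = 0.1 + 0.2
print(total == 0.3)   # False
print(total)          # 0.30000000000000004
```

    As long as the abstraction holds you never think about bits; the moment a comparison misbehaves, you're forced to.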

    One prediction that is almost coming true:
    "In ten years, computers will just be bumps in cables." (Gordon Bell, 1990)

    We don't think of a phone/iPod/etc. as a computer, yet we no longer go to a big beige box on a desk to play music or send a message.
     
  4. Dec 14, 2007 #3

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    Computer science is primarily the study of data structures and algorithms, not assembly language code. In fact, most of what you describe (deep knowledge of the OS and hardware) is actually computer engineering, not computer science.

    In fact, I could just as well argue that very high-level languages like Python permit users to work with "pure" CS concepts in a way never before possible in syntactically challenging, feature-poor languages like C++.
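    As a sketch of what Warren means here (my example, not his): a textbook algorithm like quicksort can be written in Python almost exactly as it is defined, with no memory management or type bookkeeping in the way.

```python
def quicksort(xs):
    """Textbook quicksort: the code mirrors the recursive definition."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

    Whether that counts as "pure" CS or as hiding the machine is, of course, exactly what this thread is arguing about.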

    - Warren
     
  5. Dec 14, 2007 #4
    It's actually very possible to understand everything that a modern computer and OS do. It just takes some time and commitment - less than you would think. But it won't come without some effort, just like anything worthwhile.
     
  6. Dec 14, 2007 #5
    I totally agree that C++ sucks.

    Your claim about computer science - that it's just algorithms and data structures - is certainly a simplification. Operating systems and compilers are computer science subjects (more so than computer engineering) that require you to understand assembly language deeply, even though one working hypothesis is that the assembly instructions do what they are supposed to do. By its nature, that requires an understanding of computer architecture. Nonetheless, it's still computer science.
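    To illustrate that hypothesis with a toy example (mine, not from the thread): a compiler emits instructions for some machine, and the whole arrangement works only if each instruction honors its contract. A few lines of Python can model such a machine:

```python
# A toy stack machine. A compiler targets something like this, and the
# "instructions do what they claim" assumption is the contract between
# computer science (the compiler) and computer architecture (the machine).
def run(program):
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (2 + 3) * 4, as a compiler might emit it:
print(run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]))  # 20
```

    Studying the compiler side is computer science even though the machine side belongs to architecture.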

    On the other hand, the claim I made above that modern computers "simulate real computer science" was also a simplification. But part of the reason for my annoyance with modern PCs is that if you are interested in the hardware-software bridge, you are best off working with Lego Mindstorms or the BASIC interpreter I mentioned above; no modern PC is suited for that.
     
  7. Dec 14, 2007 #6
    Sure it's possible, if you work for Microsoft, or if you do what I'm probably going to have to do: read "Linux Device Drivers" and write a better book. (OK, that's unfair - it's actually a good book. My point is that more work needs to be done on Linux to make it accessible.) But I totally disagree with the sentiment. (And of course, I am referring to a very small subset of what computers do, because on the whole I don't think anyone is interested in everything computers do.)
     
  8. Dec 14, 2007 #7

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    Do you also find it frustrating that your doctor knows more about medicine than you do after reading Wikipedia articles? Does it bother you that Boeing consistently builds better airplanes than you can in your backyard?

    - Warren
     
  9. Dec 14, 2007 #8
    Ha ha, funny.
     
  10. Dec 14, 2007 #9

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    It's not meant to be funny -- it's an observation. Why does the complexity of modern computer software bother you less than the complexity of modern airplanes?

    - Warren
     
  11. Dec 14, 2007 #10

    dst


    Actually, it's a real peeve for me. Not so much the complexity as the idea of so many things working in sequence without failing. That awe goes away when I see how it's all made: huge numbers of people who've more or less dedicated their lives to it and still spend years on it at a time. But I don't want to know what one of their CAD models (Boeing or otherwise) looks like, that's for sure. I took one look at that IBM Cell CPU layout and choked.
     
  12. Dec 14, 2007 #11
    chroot, I don't disagree with your point of view - that computer science is data structures and algorithms - as long as you don't force everyone else to share it, in particular me.

    The complexity bothers me because it is unnecessary. Cheap computers with a simple architecture and a simple operating system should be made available for $50; instead, manufacturers keep adding features, when I'd be happy with much less than what I already have.

    I began this thread neutral, but in reality I must say I am most interested in Linux. It is not clear to me what the goals of that project are at the moment, but I am hoping they don't try to compete with Microsoft by matching every feature. Instead, I would like to see Linux applied to a simple, cheap CPU, onto which I could (in the future) attach all kinds of scientific measurement equipment at a low price.

    chroot, you are basically saying that Microsoft, Apple, and Linux are like automobile manufacturers, and you shouldn't try to build a car in your backyard. What I am saying is that I think there's a market for machines that can be understood inside and out, but that are at the same time more advanced than Lego Mindstorms.

    Does that answer your question?
     
  13. Dec 14, 2007 #12

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    There's nothing stopping you from using old hardware and old software until the day you die. Embedded Linux is already extremely popular. You can buy 486-based single-board PCs for far less than $50. If you like that route, go for it.

    The infrastructure you don't like is actually quite necessary if you want good-looking graphical programs with lots of interoperability and features to be designed in less than a human lifetime. New languages, APIs, RAD, etc. are all "power tools" which software engineers create to allow them to do more in less time. In the same way, Boeing has long since abandoned trying to build airplanes using nothing more than hammers and screwdrivers.

    I'm not saying you can't or shouldn't try to build your own computer or software in your backyard -- in fact, I think you should. I just don't think you should lament the fact that the field has developed to the point where hobbyists can't compete with the professionals. After all, every other major industry has been this way for tens or hundreds of years, and I like that my doctor knows more than Wikipedia.

    - Warren
     
  14. Dec 14, 2007 #13

    turbo

    Gold Member

    Showing my age here, but I started out with CP/M and migrated to DOS, writing application programs that would run under Ashton-Tate dBase. I could carry an entire point-of-sale/inventory-control/bookkeeping suite around on a few floppies. Things were wild and woolly in the early DOS years, when the open architecture allowed software to address hardware directly, and "cowboy" developers were not above stealing system resources.
     
  15. Dec 14, 2007 #14
    Again, I started this topic kind of neutral, but really my prediction centers on Linux. I really hope that in the future, if you study the basics of computer science - compilers, architecture, operating systems, networking - you will be able to understand the actual implementation of Linux from a deep point of view with a minimum of effort.

    This would be geared more towards "serious hobbyists"; a general hobbyist could perhaps be satisfied playing with Mindstorms, etc.

    I hope the brains behind Linux will realize it's more important to market towards scientists than towards the general desktop market. Then perhaps old CPUs like the 486 will find larger demand - though hopefully better, simpler CPUs than the 486, in my opinion.

    But that's just my theory with regard to the second coming: Linux will improve in this respect, and thus will not go away, regardless of whatever Microsoft and the others are up to.
     
  16. Dec 14, 2007 #15

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    There aren't enough scientists to comprise an entire market for an operating system. Furthermore, there is no incentive for anyone who is not a hobbyist to make anything hobbyist-friendly. Computer hardware and software are some of the largest and fastest-emerging technologies the world has ever seen. Untold billions of dollars change hands over them.

    Your little fantasyland of the world being rebuilt with hobbyist-friendly 286s and toy operating systems is just that: a fantasyland. Furthermore, I can guarantee you that Linux will not evolve in any such way. I'm sorry to say it, but you're basically delusional.

    - Warren
     
  17. Dec 14, 2007 #16
    That just underlines why fewer people study computer science. I guess I should get back to Rudin.
     
  18. Dec 14, 2007 #17

    chroot

    Staff Emeritus
    Science Advisor
    Gold Member

    What you're saying is basically the equivalent of lamenting how complex and indecipherable EKGs and MRIs are, and wishing we could just go back to the days of drilling holes in each other's heads and sticking leeches on wounds. You know, things the average Joe can understand, because that's what's important.

    - Warren
     
  19. Dec 14, 2007 #18
    That's not at all what I'm saying - are you a lawyer or something? For instance, I'm not saying the general theory of relativity is wrong because I (and the average Joe) don't understand it.

    What I'm saying is that this "fastest-emerging technology" you mention is really not changing that much, as far as I can see. It changed a lot in the '80s and '90s, and in the meantime we went through hundreds (exaggeration) of hardware upgrades. But since Windows 2000, the desktop environment hasn't changed much.

    The fact that it has actually stood still is a singularity compared to the '80s and '90s. My prediction is actually simple and seems very plausible to me (yet delusional to you). We have PCs now, and we have the internet now. As long as that stays fixed - and it probably will - it's just a matter of time before simplified operating systems on simplified CPUs show up. I'll just leave it at that for now, I guess.
     
  20. Dec 14, 2007 #19

    dst


    The desktop isn't constant at all. Considering Microsoft decides what everyone sees these days, you might want to take a look at what they have up their sleeves.
     
  21. Dec 14, 2007 #20
    No comment.
     