
Why do programming books teach everything in this way?

  1. Mar 12, 2014 #1
    Here's the way that every programming book teaches a language:

    1. Teach the reader how to print "Hello, world!" to the console/browser/etc.
    2. Tell the reader, "Don't worry what all this stuff means. It'll make sense when we get to the end of the book."
    3. Teach the reader how to make more advanced programs.

    Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as int x = 5;, then build off that understanding?

I find that it's extremely difficult to help people who are learning programming for the first time in college intro classes, because I can't explain to them why they're getting an error or why one procedure is better than another. (And I'm not an expert by any means.)
  3. Mar 12, 2014 #2


    Staff Emeritus
    Science Advisor
    Homework Helper

    The books would be twice as long, even with the crash course. People want to jump right into things and skip all the preparation.
  4. Mar 12, 2014 #3
Really? I feel like the fundamentals could be condensed into a couple of pages or so.
  5. Mar 12, 2014 #4


    Homework Helper

I think the idea here is to show how to print anything to the console in order to see the output; that can then be used to output variables when first learning programming. With a source-level debugger that displays variables (or registers, if assembly), printing to the console wouldn't be needed, but the student would then have to learn how to use the source-level debugger.

As far as an intro to programming goes, something interactive similar to a calculator, with some ability to create functions, would probably be better for the first week or so, before moving on to one of the more common languages (like a simple version of Basic).
  6. Mar 12, 2014 #5


    Staff Emeritus
    Science Advisor
    Homework Helper

    Well, here's yer chance.
  7. Mar 12, 2014 #6


    Science Advisor
    Homework Helper

There are (or should be) two different types of book: those that teach you how to program in your first language, assuming you know nothing, and those that teach you a new language, assuming you already know how to program.

    I've lost count of the number of different languages I've learned, over a few decades (and I've forgotten the details of most of them.) Probably at least 20. I don't need 20 copies of a crash course on how computers work!

Actually, you could make a good argument that "a crash course in computer hardware" is absolutely the WRONG way to learn your first language. Learning to program in a high-level language should be about abstraction and generalization, not the specifics of a particular system, like the number of bits in an integer or the way characters are encoded.
  8. Mar 12, 2014 #7

    jim mcnamara


    Staff: Mentor

I think a lot of courses of study follow that paradigm. IMO, two of the main goals of middleware and higher-level development environments are:
    Code (Text):

    'prevent junior programmers from directly accessing the OS'  
    'provide abstract data models and methods for junior programmers'  
    However, I have needed to explain basic OS concepts to Senior P/A types. I find that a bad state of affairs. Maybe some change away from that paradigm is needed.
  9. Mar 12, 2014 #8

    D H

    Staff Emeritus
    Science Advisor

Introductory books on a specific programming language are written for two classes of people:
    - Those who have never programmed before.
    - Those who have programmed a lot, but not in that specific language.
    The second group has come to expect that the "Hello world" program will be one of the very first examples in the text.

I would argue the other way around. An introduction to computer programming class should use a language that is agnostic about how things are represented in computer hardware. Python, for example, or Lisp (which is what MIT used to use).

    Newbie programmer errors usually have nothing to do with the underlying machinery. They're errors in the logic of the code. Newbies don't think logically, they can't execute the code by hand, and they don't use debuggers.
  10. Mar 13, 2014 #9
    There is usually a course such as "Introduction to Computers" that is addressed to a broad audience, most of whom have no interest in programming the computer.

    Programming courses generally assume that the students have already used computers.

    The "Hello World" example obviously introduces them to one of the output devices.

I suppose that would be considered "old school". A lot of what is taught in college programming courses seems to avoid any discussion of how data is represented at the bit and byte level. A big part of the problem is that these courses need to address themselves to a wide audience. Most of the students in a programming course are not going on to become professional programmers. So it seems that the programming courses are designed for people with a range of enthusiasm for the course material. The objective of most programming students is to pass the course - not to master the course material.

    By the way, you could spend quite some time describing how "int x = 5;" is implemented. For example, if you're using an optimizing compiler, (compiling in "release" mode), there may be no memory reserved for x or perhaps the entire statement would be optimized away.

In almost all cases, knowing the implementation helps people understand the concept. For example, once you've worked through polymorphism, seeing how it is implemented at the machine level helps cement the concept and the syntax.
  11. Mar 13, 2014 #10
Which is going to really screw the students up when, for example, they can't figure out that the reason they're getting a compiler warning is that they're trying to compare an unsigned int to an int.
  12. Mar 13, 2014 #11
    As I said, knowing the internals does help cement knowledge.
    This doesn't distress me as much as it does you.
    First, even knowing about the data structures isn't going to protect people from misunderstanding error messages.
    Second, not everyone uses their brain in the same way while programming. It's amazing what can be done by a programmer who doesn't really know what's going on, but is bizarrely persistent in trying to get it done. Many programmers have no use for the "internal" information.
    Third, getting good at programming is a trial by fire. You will never learn everything you need to know by just reading. What is best remembered are those pieces of information that took you hours to discover on your own - through experimentation or research through the reference material.

All that being said, it is good to minimize a student's frustrations early in the process. They should get a few good experiences under their belts before having to deal with the really fun stuff.
  13. Mar 13, 2014 #12
    Do you think that someone can be considered a real programmer if they don't know, and have no use for, internal information?
  14. Mar 13, 2014 #13



    Staff: Mentor

    After the introductory programming sequence, a computer science major normally takes a course called "computer organization" or something like that, which introduces assembly language and the internal workings of a computer.
  15. Mar 13, 2014 #14


    Science Advisor
    Homework Helper
    Gold Member

    I think it's reasonable to say that many people who program in high level languages such as Matlab don't know or care about internal information, but certainly they are real programmers if they are producing useful software that works.

    Even people who program in low-level languages such as C may have no idea how their particular system implements the standard I/O package, and it may be completely different from one platform to another. They don't need to know, because the interface is the same regardless of the implementation.

    Another consideration is portability: I may know very well how my compiler and platform do things internally, but if I want to write code that works on other compilers and platforms, I cannot write it in such a way that it depends on internal details unless they are guaranteed to be consistent by the language's specification.
  16. Mar 14, 2014 #15
    If you, as a programmer, "real" or otherwise, are completely unaware of what is going on at the register level, then you have set a rugged path for yourself. If programming is your career path, then sooner or later you are going to run into a situation where understanding what the compiler and computer are doing with your code will be critical - and not knowing it will turn what should be a 10 minute project into a 1+ month project - or a project avoided or abandoned.
    Of course, if you are interfacing with the hardware - writing device drivers and such - then understanding the machine, the mechanics of context switching, etc will be fundamental and completely unavoidable. As the market has developed, people who are comfortable writing in that environment are at a premium.
  17. Mar 27, 2014 #16


    Gold Member

Higher-level, object-oriented languages such as Java and C# run on a virtual machine, so knowing about the processor hardware is not going to help you be a better programmer when using such languages. That said, you do need a good sense of how the programming language is using the memory available to it. People who have a hard time understanding object-oriented code often didn't learn about memory and storage issues when they learned to code, and sooner or later that becomes very limiting.

The good news is that these modern, object-oriented languages simplify the use of memory considerably compared to languages such as C or C++. For many tasks, there is no need to use a more dangerous, older language (C, C++, Fortran, etc.). By using the more modern languages, you are protected in certain ways from errors that even experienced programmers would make sooner or later, the kinds of things that result in memory leaks that bring a machine to its knees. Modern languages just generally don't let you go there. They reclaim unused memory automatically. However, this doesn't exempt you completely. You still need to understand when a statement results in memory being allocated (or made reclaimable).

Modern computers have dozens of levels of abstraction between what you are doing as a coder and the actual hardware. For example, the machine-level instructions may be optimized without your knowledge so that some parts execute simultaneously in parallel pipelines. Semantically, most programs are sequential: one line of code executes completely before the next begins, unless you have deliberately used threading or some other parallel mechanism. In reality, though, the underlying operating system and hardware are doing all kinds of complex optimizations. It's pretty cool.
  18. Mar 27, 2014 #17
Is C++ really considered an "older language"? I've watched recent lectures by Bjarne Stroustrup, and he talks like it's the language of the future.
  19. Mar 27, 2014 #18

    D H

    Staff Emeritus
    Science Advisor

    Please! We don't need yet another thread that devolves into programming religious wars.
  20. Mar 27, 2014 #19


    Science Advisor
    Homework Helper

    That's like observing that Elon Musk talks like Tesla is the car maker of the future.

(Debating whether Musk or Stroustrup is the more credible prophet is missing the point of the comparison.)
  21. Mar 28, 2014 #20

Many subjects are easier to learn top down than bottom up. You don't start learning physics with quantum mechanics and build up to the algebraic generalizations of basic physics.

For a true beginner, seeing progress right away gives more inspiration than covering a bunch of topics they aren't really going to understand anyway, which is why books are written to start with "Hello world". Most intro-to-programming books are going to start by getting the beginner familiar with the tools they are using: here is how you edit a source file, here is how you compile that program, here is how you run that program. The most useful example to start with is one that gives the user an indication that something was done, hence "Hello world!".

You could give a crash course on computer hardware, machine representation of numbers, compilers, etc., but after all of that, chapter 10 of your book is still going to be "here is how to write 'Hello world'" and chapter 11 is still going to be "here is how you add two numbers together", even if you already know all about floating point, fixed point, various-sized integers, and one's and two's complement.
