Why do programming books teach everything in this way?

In summary: programming books typically teach a new language by first showing how to print "Hello, world!" and then moving on to more advanced programs. Some posters suggest that books should instead begin with a crash course on computer hardware and related concepts, while others argue that this would not help beginners. Introductory books are generally aimed either at readers who have never programmed or at experienced programmers picking up a new language, and beginners' errors are usually errors of logic rather than the result of not understanding the hardware. Ultimately, the best approach depends on the audience and the goals of the book.
  • #1
Jamin2112
Here's the way that every programming book teaches a language:

  1. Teach the reader how to print "Hello, world!" to the console/browser/etc.
  2. Tell the reader, "Don't worry what all this stuff means. It'll make sense when we get to the end of the book."
  3. Teach the reader how to make more advanced programs.

Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as int x = 5;, then build off that understanding?

I find that it's extremely difficult to help people who are learning programming for the first time in college intro classes, because I can't explain to them why they're getting an error or why one procedure is better than another. (And I'm not an expert by any means.)
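(For reference, step 1 above typically amounts to something like this minimal C++ sketch:)
Code:
#include <iostream>

int main() {
    std::cout << "Hello, world!\n";   // step 1: prove the toolchain works end to end
}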
 
  • #2
Jamin2112 said:
Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as int x = 5;, then build off that understanding?

The books would be twice as long, even with the crash course. People want to jump right into things and skip all the preparation.
 
  • #3
SteamKing said:
The books would be twice as long, even with the crash course.

Really? I feel like the fundamentals could be condensed into a couple pages or so.
 
  • #4
I think the idea here is to show how to print anything to the console, in order to see the output; that can then be used to print variables when first learning programming. If you use a source-level debugger that displays variables (or registers, for assembly), printing to the console wouldn't be needed, but the student would then have to learn how to use the source-level debugger.

As far as an intro to programming goes, something interactive similar to a calculator, with some ability to create functions, would probably be better for the first week or so, before moving on to one of the more common languages (like a simple version of Basic).
 
  • #5
Jamin2112 said:
Really? I feel like the fundamentals could be condensed into a couple pages or so.

Well, here's yer chance.
 
  • #6
Jamin2112 said:
Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as int x = 5;, then build off that understanding?

There are (or should be) two different types of book: those that teach you how to program in your first language, assuming you know nothing, and those that teach you a new language, assuming you already know how to program.

I've lost count of the number of different languages I've learned, over a few decades (and I've forgotten the details of most of them.) Probably at least 20. I don't need 20 copies of a crash course on how computers work!

Actually, you could make a good argument that "a crash course in computer hardware" is absolutely the WRONG way to learn your first language. Learning to program in a high-level language should be about abstraction and generalization, not the specifics of a particular system, like the number of bits in an integer or the way characters are encoded.
 
  • #7
Actually, you could make a good argument that "a crash course in computer hardware" is absolutely the WRONG way to learn your first language.

I think a lot of courses of study follow that paradigm. IMO, two of the main goals of middleware and higher-level development environments are:
Code:
'prevent junior programmers from directly accessing the OS'  
'provide abstract data models and methods for junior programmers'

However, I have needed to explain basic OS concepts to Senior P/A types. I find that a bad state of affairs. Maybe some change away from that paradigm is needed.
 
  • #8
Jamin2112 said:
Here's the way that every programming book teaches a language:

  1. Teach the reader how to print "Hello, world!" to the console/browser/etc.
  2. Tell the reader, "Don't worry what all this stuff means. It'll make sense when we get to the end of the book."
  3. Teach the reader how to make more advanced programs.
Introductory books on a specific programming language are written for two classes of people:
- Those who have never programmed before.
- Those who have programmed a lot, but not in that specific language.
The second group has come to expect that the "Hello world" program will be one of the very first examples in the text.


Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as int x = 5;, then build off that understanding?
I would argue the other way around. An introduction to computer programming class should use a language that is agnostic about how things are represented in computer hardware. Python, for example, or Lisp (which is what MIT used to use).

I find that it's extremely difficult to help people who are learning programming for the first time in college intro classes, because I can't explain to them why they're getting an error or why one procedure is better than another. (And I'm not an expert by any means.)
Newbie programmer errors usually have nothing to do with the underlying machinery. They're errors in the logic of the code. Newbies don't think logically, they can't execute the code by hand, and they don't use debuggers.
 
  • #9
There is usually a course such as "Introduction to Computers" that is addressed to a broad audience, most of whom have no interest in programming the computer.

Programming courses generally assume that the students have already used computers.

The "Hello World" example obviously introduces them to one of the output devices.

Jamin2112 said:
Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as "int x = 5;", then build off that understanding?
I suppose that would be considered "old school". A lot of what is taught in college programming courses seems to avoid any discussion of how data is represented at the bit and byte level. A big part of the problem is that these courses need to address a wide audience. Most of the students in a programming course are not going on to become professional programmers, so the courses are designed for people with a range of enthusiasm for the material. The objective of most programming students is to pass the course - not to master the course material.

By the way, you could spend quite some time describing how "int x = 5;" is implemented. For example, if you're using an optimizing compiler (compiling in "release" mode), there may be no memory reserved for x at all, or the entire statement may be optimized away.
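(A minimal sketch of that point, assuming a typical optimizing compiler such as GCC or Clang at -O2:)
Code:
int compute() {
    int x = 5;       // with optimization the constant is typically folded in,
    return x * 2;    // so the compiler emits the equivalent of "return 10;"
}                    // and never reserves any memory for x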

In almost all cases, knowing the implementation will help most people understand the concept. For example, once you've worked through polymorphism, seeing how it is implemented at the machine level will help cement the concept and syntax.
 
  • #10
.Scott said:
A lot of what is taught in college programming courses seems to avoid any discussion of how data is represented at the bit and byte level.

Which is going to really screw the students up when, for example, they can't figure out that they're getting a compiler error because they're trying to compare an unsigned int to an int.
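(A minimal example of the kind of thing meant here - strictly speaking, most compilers issue a signed/unsigned comparison warning, e.g. under -Wall, which becomes a hard error under flags like -Werror:)
Code:
#include <iostream>

int main() {
    unsigned int count = 10;
    for (int i = 0; i < count; ++i)   // most compilers warn here: "comparison of
        std::cout << i << '\n';       // integer expressions of different signedness"
}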
 
  • #11
Jamin2112 said:
Which is going to really screw the students up when, for example, they can't figure out that they're getting a compiler error because they're trying to compare an unsigned int to an int.
As I said, knowing the internals does help cement knowledge.
This doesn't distress me as much as it does you.
First, even knowing about the data structures isn't going to protect people from misunderstanding error messages.
Second, not everyone uses their brain in the same way while programming. It's amazing what can be done by a programmer who doesn't really know what's going on, but is bizarrely persistent in trying to get it done. Many programmers have no use for the "internal" information.
Third, getting good at programming is a trial by fire. You will never learn everything you need to know by just reading. What is best remembered are those pieces of information that took you hours to discover on your own - through experimentation or research through the reference material.

All that being said, it is good to minimize a student's frustrations early in the process. They should get a few good experiences under their belt before having to deal with the really fun stuff.
 
  • #12
.Scott said:
Many programmers have no use for the "internal" information.

Do you think that someone can be considered a real programmer if they don't know, and have no use for, internal information?
 
  • #13
After the introductory programming sequence, a computer science major normally takes a course called "computer organization" or something like that, which introduces assembly language and the internal workings of a computer.
 
  • #14
Jamin2112 said:
Do you think that someone can be considered a real programmer if they don't know, and have no use for, internal information?
I think it's reasonable to say that many people who program in high level languages such as Matlab don't know or care about internal information, but certainly they are real programmers if they are producing useful software that works.

Even people who program in low-level languages such as C may have no idea how their particular system implements the standard I/O package, and it may be completely different from one platform to another. They don't need to know, because the interface is the same regardless of the implementation.

Another consideration is portability: I may know very well how my compiler and platform do things internally, but if I want to write code that works on other compilers and platforms, I cannot write it in such a way that it depends on internal details unless they are guaranteed to be consistent by the language's specification.
 
  • #15
Jamin2112 said:
Do you think that someone can be considered a real programmer if they don't know, and have no use for, internal information?
If you, as a programmer, "real" or otherwise, are completely unaware of what is going on at the register level, then you have set a rugged path for yourself. If programming is your career path, then sooner or later you are going to run into a situation where understanding what the compiler and computer are doing with your code will be critical - and not knowing it will turn what should be a 10 minute project into a 1+ month project - or a project avoided or abandoned.
Of course, if you are interfacing with the hardware - writing device drivers and such - then understanding the machine, the mechanics of context switching, etc will be fundamental and completely unavoidable. As the market has developed, people who are comfortable writing in that environment are at a premium.
 
  • #16
Higher-level, object-oriented languages such as Java and C# run on a virtual machine, so knowing about the processor hardware is not going to help you be a better programmer when using such languages. That said, you do need to have a good sense of how the programming language is using memory in the storage that is available to it. People who have a hard time understanding object-oriented code often never learned about memory and storage issues when they learned to code, and sooner or later that becomes very limiting.

The good news is that these modern, object-oriented languages simplify the use of memory considerably compared with languages such as C or C++. For many tasks, there is no need to use a more dangerous, older language (C, C++, Fortran, etc.). By using the more modern languages, you are protected in certain ways from errors that even experienced programmers will make sooner or later - the kinds of things that result in memory leaks that crash a machine. Modern languages generally just don't let you go there: they reclaim unused memory automatically. However, this doesn't exempt you completely. You still need to understand when a statement results in memory being allocated (or made reclaimable).
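(A minimal sketch of that last point, expressed in C++ terms, where RAII and smart pointers - rather than a garbage collector - do the automatic reclaiming; which statements allocate is still something you have to know:)
Code:
#include <memory>
#include <string>
#include <vector>

void example() {
    int n = 5;                                  // no heap allocation: n lives on the stack
    std::vector<int> v(1000);                   // allocates storage for 1000 ints
    std::string s = "a fairly long string...";  // may allocate, depending on length
    auto p = std::make_shared<std::string>(s);  // allocates; reclaimed when the last owner goes away
}   // v, s, and p all release their memory automatically here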

Modern computers have literally dozens of levels of abstraction between what you are doing as a coder and the actual hardware. For example, the machine-level instructions may be optimized without your knowledge, so that some parts execute simultaneously in parallel pipelines. Semantically, most programs are sequential - one line of code executes completely before the next begins, unless you have deliberately used threading or some other parallel mechanism - but in reality, the underlying operating system and hardware are doing all kinds of complex optimizations. It's pretty cool.
 
  • #17
harborsparrow said:
For many tasks, there is no need to use a more dangerous, older language (C, C++, Fortran, etc).

Is C++ really considered an "older language"? I've watched recent lectures by Bjarne Stroustrup, and he talks like it's the language of the future.
 
  • #18
Please! We don't need yet another thread that devolves into programming religious wars.
 
  • #19
Jamin2112 said:
I've watched recent lectures by Bjarne Stroustrup, and he talks like it's the language of the future.

That's like observing that Elon Musk talks like Tesla is the car maker of the future.

(Debating whether Musk or Stroustrup is the more credible prophet is missing the point of the comparison.)
 
  • #20
Jamin2112 said:
Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as int x = 5;, then build off that understanding?


Many subjects are easier learned top down than bottom up. You don't start learning physics with quantum mechanics and build up to the algebraic generalizations of basic physics.

For a true beginner, seeing progress right away gives more inspiration than covering a bunch of topics they aren't really going to understand anyway, which is why books start with "Hello world". Most intro-to-programming books begin by getting the beginner familiar with the tools they are using: here is how you edit a source file, here is how you compile that program, here is how you run that program. The most useful first example is one that gives the user an indication that something was done - hence "Hello, world!".

You could give a crash course on computer hardware, machine representation of numbers, compilers, etc., but after all of that, chapter 10 of your book is still going to be "Here is how to write 'Hello world'" and chapter 11 is still going to be "Here is how you add two numbers together" - even if you already know all about floating point, fixed point, various sized integers, and one's and two's complement.
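(Chapter 11, in other words, would still be something like this minimal sketch:)
Code:
#include <iostream>

int main() {
    int a = 2;
    int b = 3;
    std::cout << a + b << '\n';   // prints 5
}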
 
  • #21
Jamin2112 said:
Here's the way that every programming book teaches a language:

  1. Teach the reader how to print "Hello, world!" to the console/browser/etc.
  2. Tell the reader, "Don't worry what all this stuff means. It'll make sense when we get to the end of the book."
  3. Teach the reader how to make more advanced programs.

Shouldn't programming books start out by giving readers a crash course on computer hardware, machine representations of numbers, compilers, etc., then explain how that relates to a single statement such as int x = 5;, then build off that understanding?

I find that it's extremely difficult to help people who are learning programming for the first time in college intro classes, because I can't explain to them why they're getting an error or why one procedure is better than another. (And I'm not an expert by any means.)

The problem you are having is that, being new to programming, you have trouble thinking algorithmically - that is, reducing a task to a list of mathematical statements.

This is perfectly normal, and is actually the hardest part of learning to code.

What you're trying to get your head around is "I know how I as a human would sort a list of numbers into ascending order. A computer can't do it that way. How would I design the algorithm for that?"

Once you have that down, the syntax of the language - producing the actual code - isn't hard (but the debugging is!)
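(For instance, once you've settled on "repeatedly pick the smallest remaining element and move it to the front", writing it down is mostly mechanical - a minimal C++ sketch:)
Code:
#include <cstddef>
#include <iostream>
#include <utility>
#include <vector>

// Selection sort: find the smallest remaining element and swap it
// into place, then repeat on the rest of the list.
void selectionSort(std::vector<int>& v) {
    for (std::size_t i = 0; i + 1 < v.size(); ++i) {
        std::size_t smallest = i;
        for (std::size_t j = i + 1; j < v.size(); ++j)
            if (v[j] < v[smallest])
                smallest = j;
        std::swap(v[i], v[smallest]);
    }
}

int main() {
    std::vector<int> v{5, 2, 9, 1};
    selectionSort(v);
    for (int n : v) std::cout << n << ' ';   // prints: 1 2 5 9
}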

What you're asking for - what you think you need - isn't really relevant when you start. You simply don't need to know low-level, platform-, architecture-, language-, and implementation-specific stuff - and in fact a lot of very experienced programmers don't either.

Jamin2112 said:
Is C++ really considered an "older language"? I've watched recent lectures by Bjarne Stroustrup, and he talks like it's the language of the future.

He is biased.

Pros:

Modern C++14 is very different from C or from older C++. A lot of people aren't aware of the differences and think you still need to use raw pointers in C++. You do not (and using them without good reason is a code smell).
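(A minimal sketch of what that looks like in practice - standard containers and smart pointers own everything, so no raw new/delete appears:)
Code:
#include <iostream>
#include <memory>
#include <vector>

struct Widget { int value = 0; };

int main() {
    std::vector<int> numbers{1, 2, 3};     // the container owns its storage; no new/delete
    auto w = std::make_unique<Widget>();   // unique ownership, freed automatically (C++14)
    w->value = 42;
    for (int n : numbers) std::cout << n << ' ';
    std::cout << w->value << '\n';
}   // numbers and w are both released here with no explicit cleanup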

Cons:

C++ is still (mostly) backwards compatible with C, which does cause some major headaches, especially with some aspects of its syntax. There are newer languages that follow a similar paradigm but with a more modern design, for example Rust or D.

Further, there are some issues they really can't fix, due to problems with curating the libraries and the wide range of platforms C++ runs on. For example, trying to do Unicode in C++ is simply very unpleasant and isn't going to improve soon.
 

