As I implied earlier, just knowing how to write a piece of code isn't the whole story. For example, it's important to know the performance implications of the different ways to do things, how a particular language or implementation handles garbage collection (or doesn't), when arithmetic operations can lose precision, when to use an integer versus a real (float, whatever), why semaphores/locks matter, whether variables are passed by value or by reference (or both) and the implications, and how to gracefully handle errors (which can account for up to 90% of the source code sometimes). Learning to use a debugger, particularly those on Unix/Linux, is a major effort in itself. (The newer generation of graphical debuggers makes this much easier, but I don't know of a free one for Unix/Linux.) Unless one is an avid reader of language "usage" manuals (not the language specification/reference manuals), these things are not easily learned.
The other major area where beginning programmers are deficient is the program lifecycle and tool usage, i.e., the whole process of turning requirements into architecture, architecture into design, design into code, then debugging and arranging for maintenance. For example, how many self-taught programmers use a source code control system, something no professional would ever do without (except, perhaps, for a very small personal project, e.g., <2500 LOC)?
But I do encourage anyone to learn programming. Programming is one of the few disciplines where "halfway" won't do the job; it teaches clear and detailed thinking, since those attributes are required to do anything beyond the most basic operations. Lastly, programming is FUN and very rewarding. Starting off with a general understanding of a problem to be solved, working with the user to fully understand the requirements, then working out an architecture that gives the best software and hardware utilization, performance, and reliability, then developing, debugging, and releasing the code, all of which gives one a real sense of accomplishment.
But just as knowing how to use a hammer and saw doesn't make one a carpenter, being able to write a few lines of code doesn't make one a programmer.
Now that I've done my ranting: a newbie might consider starting with Microsoft's free C# Express. C# is nice because it is a pseudo-compiled language (compiled to code for a hypothetical machine) and comes with an integrated development environment (IDE) with an editor, debugger, and the other necessary tools. And, because of that pseudo-compilation, virtually every run-time error immediately puts you into your source code editor with a (usually) meaningful error message, highlighting the actual source line that caused the problem and allowing it to be changed. With C#, a beginner stands a good chance of having something actually run if it compiles and links correctly, unlike with C or C++.
Java is also good, but to my knowledge there isn't a single free package that contains a complete IDE (compiler, linker, editor, debugger, and the other essential development tools) like Microsoft's C# package. Microsoft also has a C and a C++ Express, containing the same elements as the C# Express package, but a beginner will have much more difficulty learning those languages, since so many semantic errors that get past the compiler are only caught as some not-very-meaningful catastrophic run-time error. That said, the C language itself is clearly the simplest of the current batch of popular languages to learn. Unless you like LISP or APL or FORTH, that is (chuckle, chuckle).