Best Programming Language for a Smart Teenager

In summary: the consensus recommendation is Python, for which plenty of books, online tutorials, and videos are available. Good luck to you and your granddaughter on your programming journey!
  • #1
yungman
OK, since you guys are just talking amongst yourselves, I want to ask a question about learning C++. I really want to expose my granddaughter to programming. She's only 14. I was going to learn Python and work with her, as it's supposed to be easier. But I am thinking: what if I work with her to learn C++, at least the first few chapters, only on procedural programming, learning different variable types, doing some simple if-then-else, etc., to give her a feel? Would that be too much for my little girl? She's a straight-A student taking some advanced classes in grade 9. But I don't want to scare her off either.

The good thing is I am around for her, I have all my notes for her, I already loaded VS onto her computer, and I have a few gigabytes of programs from all the problems I have worked on already. I even have the best book on C++...Gaddis. She should have plenty of support.

Let me know what you guys think.

thanks
 
  • #2
@yungman I think that Python would be better for your granddaughter, at least until she's ready to decide whether she likes programming. As you know, sometimes coding can be tedious; a language that sooner allows meaningful results, without as much environmental overhead up front, may be less likely to alienate a neophyte coder.
 
  • #3
Thanks guys. After I read sysprog's reply, I took out the Python for Kids book and took a look: it already gets to drawing on the screen two-thirds of the way into a book that is only 1/2" thick! Here I am, almost finished with the Gaddis (870 pages in), and still working only in the cmd window! I might be interested in Python too!

Thanks
 
  • #4
While I know it is considered passé, I still think BASIC is the best place to start. I particularly like the final effort by Kemeny and Kurtz, called True BASIC. It is cheap, easy, and quite powerful. I use it routinely to this day for all manner of problems.
 
  • #5
Dr.D said:
While I know it is considered passé, I still think BASIC is the best place to start. I particularly like the final effort by Kemeny and Kurtz, called True BASIC. It is cheap, easy, and quite powerful. I use it routinely to this day for all manner of problems.
Visual Basic Express is available for free, as are Visual C++ and Visual Studio (with which to work with those languages and others), and Python is also available for free.

When John Kemeny et al. at Dartmouth wrote the original BASIC interpreter, they were addressing a concern that compiled languages such as FORTRAN required more learning than was really necessary for simple interactive programming.

By the mid '70s, BASIC had been enhanced to the extent that it could be used in small-scale business and scientific applications; however, compiled code continued to be more efficient than interpreted code, and that often mattered in large-scale business and scientific applications.

In my opinion, the compiled versus interpreted comparison still matters, e.g. vis-à-vis C++ and Python, and depending on the specific requirements, you could swap in other compiled or interpreted languages.

I think that Python is for most new programmers a better choice than BASIC, and I'm not alone in that opinion; see e.g. https://www.slant.co/versus/110/34559/~python_vs_basic
For web coding, PHP is a good choice, but Ruby on Rails may help with employability. The best choice of programming language is nearly always purpose-dependent.
 
  • #6
Thanks for all your opinions. I talked to her more; she has actually worked through 40 pages of Python for Kids, which for a thin book is like 1/6 of the book already. She doesn't have any problem with it, and she never really elaborated to me. I thought she had only read through a few pages, but she actually went to Python.org to download the IDE and is working with it already.

So Python it is, and I am a happy grandpa. My little girl! I might have her show me what to do; we'll see how much she has learned!
 
  • #7
First of all, thanks for raising this topic. I learned C++ during my university years, but switched to C# and Java after graduating. In the last 5 years, 70% of my programming time has been in Python (AI and data science). One day, my little girl was playing on my MacBook and became interested in the Python code I wrote, so I showed her how to code a 'Hello world'; after that she took every opportunity to ask me about programming.
So, I would like to add one vote for Python.
 
  • #8
I strongly agree that Python is a great choice for a first programming language. It's easy to learn, easy to read and write, has many natural and powerful built-in features (like list comprehension) that older languages lack, and has a pretty much unbeatable ecosystem of libraries/packages, development tools, etc.

Not only that, but it's also used by virtually every professional software developer, so by learning Python you're not wasting time on a pedagogical toy language or one with only niche applicability.

C++ would be an absolutely terrible first language to learn, and I say that as someone who has been a professional C++ developer for nearly 20 years. It is an extremely fiddly language with a very complicated standard library and many error-prone "features" as a result of its C legacy. It is an indispensable language for certain purposes (OS/system programming, embedded software, etc.) but IMO a poor choice for most other purposes.
 
  • #9
jbunniii said:
I strongly agree that Python is a great choice for a first programming language. It's easy to learn, easy to read and write, has many natural and powerful built-in features (like list comprehension) that older languages lack, and has a pretty much unbeatable ecosystem of libraries/packages, development tools, etc.

Not only that, but it's also used by virtually every professional software developer, so by learning Python you're not wasting time on a pedagogical toy language or one with only niche applicability.

C++ would be an absolutely terrible first language to learn, and I say that as someone who has been a professional C++ developer for nearly 20 years. It is an extremely fiddly language with a very complicated standard library and many error-prone "features" as a result of its C legacy. It is an indispensable language for certain purposes (OS/system programming, embedded software, etc.) but IMO a poor choice for most other purposes.
I cannot agree more. I have been studying C++ for close to 6 months and have almost finished the Gaddis brief-version book, and I am just starting to feel a little more comfortable. If I were to do it again, I would likely pick Python to learn. There are so many twists and turns in C++; that's why I stopped after chapter 14 and kept playing with the constructor, copy constructor, and assignment operator for over a week, just to get a little more comfortable with which one is needed in a particular case. Just like I kept verifying with you guys whether I got it right in the other post... Also, when and where to put const!
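For anyone following along, the pattern I kept drilling is roughly this (a minimal sketch from my notes; the names are made up, not from Gaddis):

Code:
#include <cstddef>
#include <cstring>

// A class that owns a raw buffer needs the "rule of three": destructor,
// copy constructor, and copy-assignment operator; otherwise you get
// shallow copies and dangling pointers.
class Buffer
{
public:
    explicit Buffer(std::size_t n) : size(n), data(new char[n]) {}
    ~Buffer() { delete[] data; }                       // destructor

    Buffer(const Buffer &other)                        // copy constructor
        : size(other.size), data(new char[other.size])
    {
        std::memcpy(data, other.data, size);           // deep copy, not shallow
    }

    Buffer &operator=(const Buffer &other)             // copy assignment
    {
        if (this != &other)                            // guard against self-assignment
        {
            char *fresh = new char[other.size];
            std::memcpy(fresh, other.data, other.size);
            delete[] data;                             // release the old storage
            data = fresh;
            size = other.size;
        }
        return *this;
    }

private:
    std::size_t size;
    char *data;
};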

What you explained in the other post is very helpful.

Thanks
 
  • #10
yungman said:
Also, when and where to put const!

See Gaddis pp.104-105, 336, 374, 392, 613, 714, 717 and 817-824.
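In brief, the common placements look like this (a minimal sketch; the book has the full treatment):

Code:
#include <string>

const double PI = 3.14159265358979;      // const variable: may never be reassigned

void print(const int arr[], int size);   // const array parameter: the function
                                         // promises not to modify the caller's array

void show(const std::string &s);         // const reference: no copy, read-only access

class Point
{
public:
    double getX() const { return x; }    // const member function: may not modify
                                         // the object it is called on
private:
    double x = 0.0;
};

const Point origin{};                    // const object: only const member
                                         // functions may be called on it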
 
  • #11
A possible area of contention in a software house in which everyone is coding in assembly language is that a programmer might want his own maclib (macroinstruction library), which might be incompatible at the parameter-block level with something written by another programmer.

It seems to me that sometimes management wants programmers to be co-interchangeable, while conversely, programmers may want themselves to be individually absolutely indispensable.

C++, with its typing that I would like to call 'strict but violable' (both the strictness and the laxity of enforcement can be helpful to programmers who want to interchange program objects), is an extremely powerful computer language that has allowed much collaboration.

The Python language, with its absence of static type declarations, allows programmers to write 'one-off' programs without worrying about too many rules.

Here's a link to an article that I think is rather good: https://www.fluentcpp.com/2017/03/31/how-typed-cpp-is-and-why-it-matters/
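A contrived sketch of what I mean by 'strict but violable' (illustrative only):

Code:
#include <iostream>

void takesInt(int n) { std::cout << n << '\n'; }

int main()
{
    // takesInt("hello");            // strict: a type error, rejected at compile time

    double d = 3.9;
    takesInt(d);                     // lax: implicit narrowing conversion, prints 3

    const int locked = 42;
    // locked = 7;                   // strict: assignment to const rejected
    int &loophole = const_cast<int &>(locked);  // violable: const can be cast away
    (void)loophole;                  // (actually writing through it would be undefined
                                     //  behavior, but the cast itself compiles)
    return 0;
}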
 
  • #13
sysprog said:
programmers may want themselves to be individually absolutely indispensable
Job security. :wink:
 
  • #14
I really like Python and find the formatting and syntax to be much easier than C or Java, and less verbose.
 
  • #15
How big is the piece of the pie for firmware programming? It seems like nobody here talks about it. In my line of work, in quite a few companies, C++ seems to have been the go-to language all these years. And that had a lot to do with my decision to learn C++. It's like that's supposed to be it.

We did have a lot of speed and timing concerns in firmware; assembly is not out of the question. In the '90s, I came up with an idea for continuously measuring time by sampling two pure sine waves 90° apart in phase, to get accuracy to about 10 ps (##10^{-11}## s). The company director personally wrote the firmware to run my hardware. Talk about time-critical. I think he did it with C++.
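Stripped of all the hardware detail, the math the firmware had to do is roughly this (a rough sketch of the idea only; the real code was far more involved):

Code:
#include <cmath>

// Given one sample from each of two pure sine waves 90 degrees apart in
// phase (in-phase i and quadrature q) at reference frequency f, the
// instantaneous phase locates the sample within one period.
double timeWithinPeriod(double i, double q, double f)
{
    const double pi = 3.14159265358979;
    double phase = std::atan2(q, i);   // radians, in (-pi, pi]
    return phase / (2.0 * pi * f);     // seconds; achievable resolution depends
                                       // on ADC quality and phase noise
}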

Funny, my director's background is a PhD in physics, but I trusted him to write any program more than anyone else. I am not even sure he knew about OOP, class overloading, and all that. I'll have to ask him one day. But he was the BEST programmer in the company, bar none. That project was so critical we ended up working together to complete it. He even tested the whole thing on the system to prove the performance.

I can imagine anyone who knows hardware (limiting it to digital) and programming will have no problem finding a job with high pay in Silicon Valley. It is so hard to find someone who can program, and then also look at the schematic and scope-probe the signals to troubleshoot the program.
 
  • #16
yungman said:
How big is the piece of the pie for firmware programming? It seems like nobody here talks about it.
I think that it's not impossible for a beginning programmer to learn firmware programming before learning things that may be less difficult, but I also think that firmware that is not hardwired may perhaps be best written in microcode, and that we should please recognize that doing or explaining things at such levels is not especially easy.
 
  • #18
sysprog said:
I think that it's not impossible for a beginning programmer to learn firmware programming before learning things that may be less difficult, but I also think that firmware that is not hardwired may perhaps be best written in microcode, and that we should please recognize that doing or explaining things at such levels is not especially easy.
Ha ha. At the time, I learned 8085 assembly programming for about half a year and then started writing test programs and fixing the prototype PCB. I think if you know a bit of both, people can be a lot more accepting of young and inexperienced candidates. It's hard to find people who know both. Of course, I was a lot younger then; just look at me now, half a year in and still struggling with C++.

Just a few years ago, after I retired, I was talking to someone in the gym while we were working out; he happened to be a very high-up manager at Lockheed. I described how I had to fix FPGA programs because the programmers didn't know hardware and had programmed the FPGA like software, as if it ran line by line; in an FPGA, all the signals run in parallel, not serially. I had to go in and fix them... He offered me a job right on the spot! No interview or anything! Of course I declined; I didn't want to deal with firmware, and I had better jobs if I wanted them. But that goes to show that people who know both are hard to find, and companies might be a lot more forgiving when hiring them.
 
  • #19
As a professional programmer with over 30 years' experience, mostly at the senior programmer/team leader level, I can tell you that as you get more and more experience you will find KISS (Keep It Simple, Stupid) almost mandatory. Since retiring I do not program much - I got sick and tired of it over the years. But when I do, here is my strategy. I program in Python and always recommend that beginner programmers use it. The only issue is speed. If speed is important, I isolate the parts of the program that need speeding up and write them in Lua running under LuaJIT:
https://brmmm3.github.io/posts/2019/07/28/python_and_lua/

Once you know Python, learning Lua is a snap. And if you want even faster speed, there is now a version of Lua that compiles to C:
https://nelua.io/

For gluing Nelua code, it is easy to write some C code that can be called from Python.
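Something like this is all it takes (a minimal sketch; the names and numbers are made up):

Code:
// glue.cpp - compile into a shared library, e.g.:
//   g++ -O2 -shared -fPIC glue.cpp -o libglue.so
// then call it from Python with ctypes:
//   import ctypes
//   lib = ctypes.CDLL("./libglue.so")
//   lib.sum_squares.restype = ctypes.c_double
//   print(lib.sum_squares(1000000))

extern "C" double sum_squares(int n)
{
    double total = 0.0;
    for (int i = 0; i < n; ++i)
        total += static_cast<double>(i) * i;   // the hot loop stays out of Python
    return total;
}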

Personally, I compile my Python using Nuitka:
http://nuitka.net/pages/overview.html

But the Lua and compiling stuff are what I would call more advanced tricks - best just to learn straight Python first. I like the Anaconda distribution:
https://www.anaconda.com/products/individual

It has other cool stuff as well that you can check out. Have fun.

Thanks
Bill
 
  • #20
bhobba said:
As a professional programmer with over 30 years' experience, mostly at the senior programmer/team leader level, I can tell you that as you get more and more experience you will find KISS (Keep It Simple, Stupid) almost mandatory.
......
Thanks
Bill
I cannot agree with you more. In the academic world, people stress elegance: how to do it in the most efficient way, going way out of the way to save a few lines of code... or, in my case in electronics, to save a few components. Then they glorify doing it in a way that shows how much they know. But in a real job, it's about getting the job done and being easy to maintain, doing it in the most simple, straightforward way possible.

There's a big disconnect between college and a real job. In college, people stress how "sophisticated" you can be at the job. In the real world, people stress how fast you can do the job and how reliable your program or circuit is. Who cares how elegant and sophisticated the theory or knowledge behind your design is! People will hire the one who designs things by rolling the dice but whose designs work 100% of the time, over the one with great theory and concepts but hit-and-miss results.

I just finished the chapter on operator overloading for classes. I put in a lot of effort learning it and I think I am getting quite good at it now. BUT, if I were to write a real program, I would avoid it if I can help it. I would rather go with simple, straightforward code. I might not have experience in programming, but I have 30 years designing electronics and systems, and it does NOT pay to be "sophisticated".

I was in the EE part of this forum in July this year talking about design, starting at post #11:
https://www.physicsforums.com/threads/what-approach-should-be-used-when-solving-a-circuit.991072/
It got quite heated. Funny: in post #25 I posted a test question similar to one I gave to engineers and technicians who came for interviews. It was pretty much straight out of a book used in a trade school, nothing advanced, yet few candidates could even get the right answer. Forget all the advanced theory taught in college. I used to help out in the EE forum over 10 years ago, but I am too busy and swamped now to even go there and look.

Keep it simple, keep it basic, and use a lot of common sense.
 
  • #21
sysprog said:
@yungman I think that you might find this article to be interesting: https://www.informatimago.com/articles/microcode/microcode.html
Yeah, the good old days. I knew the Z80 and 8085 by heart; I could write 40 lines of machine code without even going to assembly language. That was between 1979 and '83. I so wanted to get into bit-slice design; thankfully I did not, as it's nowhere to be found now. But there were still a lot of microcontrollers using the 8088 core before I retired; they just add muxed ADCs and DACs into one chip. I used to know those read/write/DMA/interrupt cycle timing diagrams by heart. They are just as valid today: just get the datasheet of the processor you need to design with and study it. It gets boring after designing a few.
 
  • #22
bhobba said:
As a professional programmer with over 30 years' experience, mostly at the senior programmer/team leader level, I can tell you that as you get more and more experience you will find KISS (Keep It Simple, Stupid) almost mandatory. Since retiring I do not program much - I got sick and tired of it over the years. But when I do, here is my strategy. I program in Python and always recommend that beginner programmers use it. The only issue is speed. If speed is important, I isolate the parts of the program that need speeding up and write them in Lua running under LuaJIT:
https://brmmm3.github.io/posts/2019/07/28/python_and_lua/

Once you know Python, learning Lua is a snap. And if you want even faster speed, there is now a version of Lua that compiles to C:
What about Julia? It's supposedly as simple and straightforward as Python, and roughly half the speed of C. It only takes a long time on the first run.
 
  • #23
fluidistic said:
What about Julia? It's supposedly as simple and straightforward as Python, and roughly half the speed of C. It only takes a long time on the first run.

Sure - but it is not as widely used as Python. There is lots of support, like Numba, which compiles a version of Python to LLVM as well. Another interesting language is MoonScript:
https://moonscript.org/

Thanks
Bill
 
  • #24
yungman said:
I just finished the chapter on operator overloading for classes. I put in a lot of effort learning it and I think I am getting quite good at it now. BUT, if I were to write a real program, I would avoid it if I can help it.

You do know you have been using an overloaded operator since July 10th or possibly even earlier, right? They might not be entirely as useless as you think.
 
  • #25
Back to the topic at hand:

I think embedded processor programming is a red herring. No teenager is going to start there. I expect this is non-controversial.

I also think speed is a very minor issue. I expect this to provoke outrage, but hear me out. For most problems, speed is irrelevant: either the computer is already so fast that the response is instantaneous, or the problem takes so long to solve that it isn't worth running in the first place, or it's in between. I maintain that while speed does benefit the "in-between" class of problems, that class is actually quite small, especially when considering the kinds of programs a beginner is likely to write.

People have suggested Python and (a modern) BASIC. Both have their strengths. For learning about computers, either would be a good choice. For learning programming, both languages lend themselves to writing spaghetti code - there sure is a lot of it around in both languages.

Today, for reasons expressed above and elsewhere, I'd recommend Python, even though I personally dislike it. But a lot of people use it, and that means a lot of people can potentially provide help. And you'll need it - Python almost goes out of its way to be different. (I would also argue that duck typing is a classic example of where a week of programming can save you an hour of thinking)

What I would like to recommend instead is Pascal. Understand procedural programming before leaping into OO. I would probably suggest Java after that. While I appreciate Ruby's "principle of least surprise", the larger number of Java programmers sways me. One strength (and weakness) of Java is there is usually only one "right" way to do something, and trying to do it some other way makes it quickly obvious that this is not the best way to go about the problem. From there, C and C++ will make a lot more sense.
 
  • #26
Vanadium 50 said:
(I would also argue that duck typing is a classic example of where a week of programming can save you an hour of thinking)
:headbang:
LOL!
 
  • #27
Vanadium 50 said:
You do know you have been using an overloaded operator since July 10th or possibly even earlier, right? They might not be entirely as useless as you think.
Whatever is provided by C++ and the compiler is perfectly OK: it must be well debugged and have all the documentation on how to use it, so there is less confusion. But with custom overloading, you can easily introduce bugs. I learned first hand how easy it is to make a mistake and create dangling pointers, shallow copies, and all that.

All the overloading does is make what's in main() look easier, while adding complication in the background, in the specification and implementation files. Particularly if they are all in separate files (that's the point of having them), you have to search up and down to look for bugs.

Instead of going through the trouble of making it easy to write A = B + C where they are of the same class, it's better to just write A.x = B.x + C.x; A.y = B.y + C.y; etc. There is no misunderstanding of this. So it is one more line in main(); how many lines do you have to write in the header file and .cpp file to make A = B + C happen?
 
  • #28
yungman said:
Instead of going through the trouble of making it easy to write A = B + C where they are of the same class, it's better to just write A.x = B.x + C.x; A.y = B.y + C.y; etc. There is no misunderstanding of this. So it is one more line in main(); how many lines do you have to write in the header file and .cpp file to make A = B + C happen?
This is not one more line in main() if you adhere to a programming style that most organizations would enforce for their developers, and that you said you would honor in the code that you present here at PF.
Furthermore, you would have to write a multiple-line version every time you wanted to add two of these objects, whereas if you had an overloaded operator+(), you would only need to have a few lines of code in one place. Having the code in one place makes it much easier to debug and test.

Your argument here is essentially why bother to write functions at all when you can do the calculations inline in a monolithic main()?
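Concretely, the "few lines in one place" amount to this (illustrative names):

Code:
struct Vec2
{
    double x;
    double y;
};

// The entire cost of supporting A = B + C: one short function,
// written once, tested once, reused at every call site.
Vec2 operator+(const Vec2 &a, const Vec2 &b)
{
    return {a.x + b.x, a.y + b.y};
}

// Usage:  Vec2 A = B + C;
// instead of repeating  A.x = B.x + C.x; A.y = B.y + C.y;  everywhere.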
 
  • #29
Mark44 said:
This is not one more line in main() if you adhere to a programming style that most organizations would enforce for their developers, and that you said you would honor in the code that you present here at PF.
Furthermore, you would have to write a multiple-line version every time you wanted to add two of these objects, whereas if you had an overloaded operator+(), you would only need to have a few lines of code in one place. Having the code in one place makes it much easier to debug and test.

Your argument here is essentially why bother to write functions at all when you can do the calculations inline in a monolithic main()?
No, I am all for splitting up a program into smaller pieces, writing classes to perform specific functions so one doesn't have to repeat over and over. BUT overloading is another story: doing all the extra things and risking all the bugs just to make it "easier" to read in main().
 
  • #30
yungman said:
No, I am all for splitting up a program into smaller pieces, writing classes to perform specific functions so one doesn't have to repeat over and over. BUT overloading is another story: doing all the extra things and risking all the bugs just to make it "easier" to read in main().
Overloading an operator is really nothing more than writing a function. All of the things that you are concerned about with the risk of bugs are things that can happen when you write a function that is buggy.
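Side by side (reusing the illustrative Vec2 from above):

Code:
Vec2 add(const Vec2 &a, const Vec2 &b)          // an ordinary named function
{
    return {a.x + b.x, a.y + b.y};
}

Vec2 operator+(const Vec2 &a, const Vec2 &b)    // the same body; only the
{                                               // spelling of the name differs
    return {a.x + b.x, a.y + b.y};
}

// Both are written, tested, and debugged exactly the same way:
//   Vec2 s1 = add(B, C);
//   Vec2 s2 = B + C;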
 
  • #31
It's really funny: we hardware people went through hell speeding up the hardware, while the programmers keep saying speed is NOT the problem and keep adding "sophisticated" crap, saying it's NOT a problem. I just checked: USB 3.0 is over 1 GB/s, and it's getting slower and slower writing to and reading from a flash drive.

No wonder the stuff is getting slower and slower as the hardware gets faster and faster.
 
  • #32
yungman said:
I just checked: USB 3.0 is over 1 GB/s, and it's getting slower and slower writing to and reading from a flash drive.

That is the same as saying "my tires are rated at 130 mph, but it still takes me 3 minutes to get to the grocery store and it's only a mile away!"
 
  • #33
yungman said:
Whatever is provided by C++ and the compiler is perfectly OK: it must be well debugged and have all the documentation on how to use it, so there is less confusion. But with custom overloading, you can easily introduce bugs.

Let's examine this. You seem to be saying it's OK for the compiler developers to use overloading, but not for you. I have no trouble with that - nobody needs to use every feature. For good and sound reasons, I myself have never used multiple inheritance in production code (but I have used it in testing).

Where I have trouble is where you make the jump from "I don't want to use it myself" to the idea that this is a bad feature of the language and thus nobody should use it. In fact, if it's a bad feature, it should be removed, yes?

Maybe even this wouldn't be so troubling, except you have used this feature in every single piece of code you have posted. How can you be so opposed to a feature you use so much?
 
  • #34
Vanadium 50 said:
Let's examine this. You seem to be saying it's OK for the compiler developers to use overloading, but not for you. I have no trouble with that - nobody needs to use every feature. For good and sound reasons, I myself have never used multiple inheritance in production code (but I have used it in testing).

Where I have trouble is where you make the jump from "I don't want to use it myself" to the idea that this is a bad feature of the language and thus nobody should use it. In fact, if it's a bad feature, it should be removed, yes?

Maybe even this wouldn't be so troubling, except you have used this feature in every single piece of code you have posted. How can you be so opposed to a feature you use so much?
Yes; at least, chances are they are supposed to be the specialists, and it comes from one source.

Vanadium 50 said:
That is the same as saying "my tires are rated at 130 mph, but it still takes me 3 minutes to get to the grocery store and it's only a mile away!"
No, it's like we give you a sports car and it takes you 30 minutes to get to the store instead! You are taking it for a joy ride instead of driving directly from point A to B, in the name of "sophistication" and "elegance". Case in point: I am copying my C++ files onto a USB flash drive; it's been running for over 2 hours already, is only at 61%, and still estimates another 2 hours. Look at the speed.

Also, how the hell can the programs from 4 chapters be 5 GB in size from VS? Those are all short example programs.
 
  • #35
There must be a lot of "baggage" in a Visual Studio project. Here are the various versions of my three-vector program:

Code:
% ls -l threeVec*
-rwxr-xr-x  1 jtbell  staff  28648 Dec  6 00:09 threeVector
-rw-r--r--  1 jtbell  staff   4601 Dec  6 00:08 threeVector.cpp
-rwxr-xr-x  1 jtbell  staff  28128 Dec  5 15:48 threeVector2
-rw-r--r--  1 jtbell  staff   2792 Dec  5 15:48 threeVector2.cpp
-rwxr-xr-x  1 jtbell  staff  28832 Dec 14 10:51 threeVector3
-rw-r--r--  1 jtbell  staff   5544 Dec 14 10:50 threeVector3.cpp
-rwxr-xr-x  1 jtbell  staff  28904 Dec 14 10:55 threeVector4
-rw-r--r--  1 jtbell  staff   5496 Dec 14 10:55 threeVector4.cpp
-rwxr-xr-x  1 jtbell  staff  28960 Dec 15 00:04 threeVector5
-rw-r--r--  1 jtbell  staff   5687 Dec 15 00:04 threeVector5.cpp

Files without an extension are the compiled executable programs. The file sizes are in the middle, before the dates. They're in bytes.
 
