For those who ask: "What programming language should I learn?"

  • Thread starter: pbuk
Summary:
Choosing a programming language to learn depends on your goals. Python is recommended for machine learning, web applications, and data analysis, while JavaScript is ideal for web development. C# is suited for game development and Windows applications, whereas C++ is best for performance-critical applications like games and operating systems. Beginners are advised against starting with C++ due to its complexity and potential for confusion, with suggestions leaning towards Python or Java for foundational learning. Ultimately, the choice of language should align with the specific applications you wish to pursue in programming.
  • #61
.Scott said:
2) since most operating system are written in C/C++/C# and assembler,
I wouldn't treat C# in one breath with C and C++. C# is a higher-level, more structured language based on and inspired by Java. Just like Java, it is compiled to byte-code which then runs on a virtual machine (which nowadays uses advanced JIT compiler techniques to speed up execution). It is not in the same category as C and C++.
 
  • #62
pbuk said:
TL;DR Summary: If you are new to programming, learn Python - it is relatively easy to learn and you can do a lot with it.
This sounds way too much like you are promoting a policy instead of offering direction.
How about, "If you are new to programming and looking for advice, start with Python - it ..." ?
pbuk said:
Learn ... | If you want to write code for ...

This "Learn X, if you want to develop Y" block has a context problem. By including it as "advice to novice" , you are advising someone who is "new to programming" to "Learn C if you want to code device drivers" - though you are reversing this in your first "tip".

So there are two issues: the second is phrasing the heading to match the audience; the first relates to qualifying the audience.

So the first: I would like to say that telling a novice to "Learn C if you want to code device drivers" is categorically wrong. But I know of several EE types who have done exactly that with serial drivers in Unix systems. That happens to be a relatively cookie-cutter procedure. Is someone who has already read through the data sheet for an Intel processor and wired up a new device to his computer "new to programming"? I think the "qualifier" is what I mentioned before: are they "looking for advice"?

Now the phrasing. How about "You'll eventually need", instead of "Learn"?

All novice coders that I have ever worked with are 1-language creatures. So, except in special situations, we don't want to tell them to tackle C until they say "how about C".

pbuk said:
  • Do not learn C++ (or any other dialect of C) because you believe that it is "fast" in some general sense - the contexts in which this may be relevant are limited to those mentioned above (games, embedded systems etc.)
I think your purpose with this tip is to discourage someone who is over-eager with C++ by souring the grapes. I wouldn't do that. All computer languages have friends (even RPG). C++ is great, so is Python, so is JavaScript, so is ... is ... Cobol (had a hard time getting that one out). Besides, you're describing C++ as a language of last resort - but many large projects have so many inherent complexities that the conciseness and power of C++ can be essential.

So I would say something like this:
C and C++ are powerful and concise languages, often well suited for large projects, OS code, and code that needs to make the most of limited processor resources; they are also cryptic, subtle, and vast.
Picking one of them as your first language could end your career before it ever gets started.
pbuk said:
  • Do not learn C++ because you believe that all interviewers expect you to use it to answer coding questions - in 2023 this is no longer true in many cases, and it is easier and quicker to solve coding problems correctly in e.g. Python. Of course if you are going for a job writing device drivers at NVIDIA this is not going to apply, but if you think you can get a job writing device drivers at NVIDIA you don't need advice on learning to code anyway.
By "coding questions", you mean explaining a software concept or algorithm. The phrase "coding question" certainly includes questions on semantics - which would, of course, require language-specific knowledge.

I would replace this with:
In a job interview or other situation where you want to describe software concepts or algorithms,
don't default to C/C++ unless both you and the interviewer are very familiar with it. Expect better success with Python, an improvised pseudo-code, a narrative procedure, a flow chart, or some combination.

pbuk said:
  • Do not learn how to implement algorithms and data structures like linked lists, hash tables, trees, indexing, sorting etc - these algorithms have already been implemented optimally in the languages that you will learn (although the comment about NVIDIA etc. above applies).
In a practical sense, what is the difference between "learning to implement" and "learning how it has been implemented"?
Anyone who sticks with programming long enough will pick up almost all of that stuff - and it is all valuable. If your database is slow, it might be because some automatic decisions have been made by your software tools on your behalf. Understanding the mechanisms under the hood helps direct you to the solution.
I would drop this item completely.
 
Likes DrClaude and Vanadium 50
  • #63
Vanadium 50 said:
With strong typing, the language never needs to guess.
Python is strongly typed. What differentiates it from, for example, Java, is that it is dynamically typed.

I suppose you could say that Python never "needs" to guess, but I would put it that it refuses to guess.
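
A minimal sketch of the distinction (just illustrative names; the behavior shown is current CPython):

Python:
x = 2          # the name x is bound to an int object
x = "two"      # dynamic typing: the same name can later be rebound to a str

# Strong typing: Python refuses to guess how to combine unrelated types.
# 2 + "2"            # TypeError: unsupported operand type(s) for +: 'int' and 'str'
print(2 + int("2"))  # 4  - any conversion must be explicit
print(str(2) + "2")  # 22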

The other key difference between Python and most other languages is the meaning of a variable in Python. In, say, Java, declaring a variable means setting aside a piece of memory to store a particular kind of data. The "type" of the variable is what kind of data will be stored in that piece of memory.

In Python, however, declaring a variable means declaring a namespace binding. In other words, a "variable" is an assignment of a name in the current namespace to an object. The namespace is just a Python dict, so the name is a dictionary key and the object is the dictionary value corresponding to that key. The object itself is a pointer to a data structure, and the "variable type" is the type of the object. Python type annotations are declarations that a particular name in a namespace will only be bound to objects of a particular type; but the interpreter currently doesn't enforce them (i.e., it doesn't check actual assignments to see if they are consistent with the type annotations).
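
A small sketch of both points - the module namespace really is a dict, and annotations are recorded but not enforced (current CPython behavior):

Python:
x: int = 2              # annotation recorded, binding created
print(globals()["x"])   # 2   - module-level names live in a plain dict
print(__annotations__)  # {'x': <class 'int'>}

x = "not an int"        # the interpreter does not check this against the annotation
print(type(x))          # <class 'str'>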
 
  • #65
.Scott said:
But I know of several EE types that have done exactly that with serial drivers in Unix systems.
One of the very first pieces of code I wrote in physics - and I think it might have been the first that other people would use after I moved on - was similar.

The task was to read out our data, arranged by crate, card and channel, and decode it - e.g. Byte 3 in Card 7 in Crate 2 is temperature. I've said that people were preoccupied by speed, but the code I was replacing limited how fast we could take data.

I used a union of structs.

So not just C, but esoteric C. One side of the union was card/crate/channel and the other was what those variables meant.

It was blazingly fast. It was also very fragile, and the comments exceeded the length of the code, and expressed the idea "if you touch this code, it will break."
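
(For anyone who hasn't met the trick, here is a rough Python analogue of that union idea using ctypes; the field names and layout below are invented for illustration, not the actual format.)

Python:
import ctypes

class Raw(ctypes.Structure):
    # one view of the word: where the byte came from
    _fields_ = [("crate", ctypes.c_uint8),
                ("card", ctypes.c_uint8),
                ("channel", ctypes.c_uint8),
                ("value", ctypes.c_uint8)]

class Meaning(ctypes.Structure):
    # the other view of the same bytes: what the value represents
    _fields_ = [("address", ctypes.c_uint8 * 3),
                ("temperature", ctypes.c_uint8)]

class DataWord(ctypes.Union):
    # both views overlay the same four bytes
    _fields_ = [("raw", Raw), ("decoded", Meaning)]

w = DataWord()
w.raw.crate, w.raw.card, w.raw.channel, w.raw.value = 2, 7, 3, 25
print(w.decoded.temperature)  # 25 - the same byte, read through the other member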

My first foray into 8086 assembler was a device driver. This was shortly after the IBM PC was released. I got a deal on a hard disk with one bad head, so I wrote a driver that ignored that head. I was not an expert in 8086 assembler (I'm still not) but it was easy enough to take an existing driver and adjust it. (Finding which 4's had to be turned into 3's was a bit of work, to be sure.)
 
Likes .Scott and PeterDonis
  • #66
All this talk about addition, concatenation and data types handled so badly in all of these languages, when you've got PHP, which handles it so well:

PHP:
$numberOfApples = 3;
$numberOfOranges = 2;

$numberOfApples  .= " apples";
$numberOfOranges .= " oranges";

echo "$numberOfApples + $numberOfOranges = ". ($numberOfApples + $numberOfOranges) . " fruits";

# Prints "3 apples + 2 oranges = 5 fruits"
# ... with 2 nice notices (not errors, not warnings)
# about "non well form numeric values encountered" on line 7.

Declaring variables ... Pfft! ... and data type ... Pfft! ... statically ... Pfft!
Using the same syntax for addition & string concatenation ... :doh:

It is not that difficult to understand what the user means when done correctly.

And if the user wants a specific type - when he needs it - he just has to specify it then:

PHP:
$numberOfApples = "3 apples";

echo $numberOfApples; # Prints "3 apples"
echo (int) $numberOfApples; # Prints "3"

I don't remember ever being surprised by the assumptions made by PHP. It works as expected, logically, all the time.

PHP takes the best out of every other language and incorporates it into its own. Once you understand PHP, every other language seems familiar somehow.

Do you want procedural or object-oriented? You have both. Do you need a list or an array? In PHP, there are no lists: a list is simply an array. On the plus side, because it is an array, you can even manipulate the indices of your "list"!

Because it was primarily designed for the web, you can easily use HTML for any user interface you need, and all that is needed is a server (yours or someone else's) to use your program. No need to compile it for different architectures or use some unique-use virtual machine. Perfect when you are just "learning" by doing fun stuff.

If that wasn't enough, you have the best documentation on the web - filled with examples to explain the different functions - and most likely the largest community to help solve the more complex problems. I think I never had a question that wasn't already asked & answered on the web.

If you need a better-suited language for some specialized task later on, already knowing PHP will probably be helpful for learning that new language (which is most likely more complicated/strict somehow).
 
  • #67
pbuk said:
Python:
# You can't mix types.
print(2 + "2")  # TypeError: unsupported operand type(s) for +: 'int' and 'str'

Vanadium 50 said:
I would have expected Python to promote the 2 to a "2" and make this a "22".

PeterDonis said:
One could just as easily argue that it's more likely that the programmer meant to do integer addition so the "2" should be interpreted as an integer and added to the 2 to make 4.
With a slight change in the code ("2" replaced by '2') and language, I wouldn't expect either "22" or 4. Case in point:
C:
printf("%d", 2 + '2');
The above prints 52, using well-documented conversion rules.
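
For comparison, a short Python sketch showing how each of the three possible answers has to be requested explicitly (ord gives the character code that C's promotion uses):

Python:
print(str(2) + "2")  # 22 - explicit string concatenation
print(2 + int("2"))  # 4  - explicit numeric addition
print(2 + ord("2"))  # 52 - ord("2") is 50, mirroring C's promotion of '2' to its character code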
 
  • #68
Mark44 said:
The above prints 52, using well-documented conversion rules.
Yes, and note that in this case, the string '2' is what is being "converted", not the integer 2 (though the "conversion" of the string is using its ASCII code, not its value if interpreted as a decimal digit).
 
  • #69
I have used both languages. Python is OK for small hacks but might be a nightmare for big jobs, the stuff that takes person-years.

I also could never figure out what advantage Python had over MacLISP or Scheme or some other interpretive language that already existed.
 
  • #70
PeterDonis said:
Yes, and note that in this case, the string '2' is what is being "converted", not the integer 2 (though the "conversion" of the string is using its ASCII code, not its value if interpreted as a decimal digit).
Technically, '2' is a character, very different from the string "2". Internally, '2' is stored, as you note, as its ASCII code. In the expression, '2' is promoted to an int value (50) and then added to 2.
 
  • #71
Mark44 said:
promoted
In the sense of "you don't get to do science anymore. You've been promoted to management."
 
  • #72
I'm curious, why has Pascal declined in its popularity?
 
Likes symbolipoint
  • #73
CGandC said:
I'm curious, why has Pascal declined in its popularity?
It boils down to productivity.
Here's the wiki introductory paragraph:
Pascal is an imperative and procedural programming language, designed by Niklaus Wirth as a small, efficient language intended to encourage good programming practices using structured programming and data structuring.
It encouraged the parochial view of "good programming" as the diligent pursuit of "tidiness" rules.
That kind of tidiness really is important, and I don't begrudge it being emphasized in schools. But in many cases, the path to tidiness is best left to the programmer.

For example, all of the early Pascals, and most of the rest of them, did not have any of the so-called "early out" options for loops - that is, the "break" and "continue" statements. The intent may be tidiness and predictable structure, but the results are anything but tidy. Without "continue", a loop that needs to check 20 "continue" conditions ends up going 20 levels deep in if statements (see the sketch below). There are ways to avoid that, but only at the cost of concise and direct code.
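
A small sketch of the shape of the problem, in Python for brevity (the records and conditions are invented):

Python:
records = ["", "apple", "apple", "banana", ""]
seen = set()

# With "continue": each rejection is stated once, at one indentation level.
for rec in records:
    if rec == "":          # skip empty records
        continue
    if rec in seen:        # skip duplicates
        continue
    seen.add(rec)
    print("processing", rec)

# Without an early out, every extra condition pushes the useful work one level deeper:
seen.clear()
for rec in records:
    if rec != "":
        if rec not in seen:
            seen.add(rec)
            print("processing", rec)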

Pascal did implement the "goto". The most intractable problem with the routine use of "goto" is that the person reading the code (which is most often the person who wrote it) needs to find the "goto" label to know where the landing point is. Whereas "break" or "continue" relate to the current loop - something that would already be known to the code reader.

Of course, "goto" is most notorious for creating "spaghetti code", the structured programming equivalent of obscure control flags.
 
Likes symbolipoint and CGandC
  • #74
You could always write Pascal-like code in C. You could not always write C-like code in Pascal.

Furthermore, Wirth had already moved on to Modula-2. If the language's creator isn't an advocate, that's not a good thing.
 
  • #75
Hornbein said:
I also could never figure what advantage Python had over MacLISP or Scheme or some other interpretive language that already existed.
One big difference is Python's standard library, which includes support for enough things that many Python programs can be written without having to depend on any third-party libraries. The Lisp dialects you mention (and indeed Lisp in general) have never had such broad library support built in.

Another big difference is syntax; many people find it very difficult to wrap their minds around S-expressions, whereas Python's syntax is simple and obvious and basically works like pseudocode, so it matches up well with how many people conceptualize their programs.
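
As a tiny illustration of the "batteries included" point, everything below uses only the standard library (the file name is invented):

Python:
import json, pathlib, statistics

scores = [78, 92, 85, 61]
print(statistics.fmean(scores))                 # 79.0
path = pathlib.Path("scores.json")              # invented file name
path.write_text(json.dumps({"scores": scores}))
print(json.loads(path.read_text())["scores"])   # [78, 92, 85, 61]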
 
Likes symbolipoint
  • #76
Mark44 said:
Technically, '2' is a character, very different from the string "2".
Yes, but as you note, it is still the value that gets type converted; the integer doesn't. Whereas in the example of implicit conversion with strings in other languages, the integer gets converted to a string.
 
  • #77
Vanadium 50 said:
In general, my experience is that people pay too much attention to speed too early in the process.

Getting the wrong answer more quickly is seldom helpful.
Working hard to speed up a piece of code that didn't take up much of the time to begin with likewise.
I saw a lot of that. Correctness is the real challenge. Get that right, then use a profiler.
 
  • #78
PeterDonis said:
Yes, but as you note, it is still the value that gets type converted; the integer doesn't.
In C, the char type is an integral type, similar to short int, int, long int, and several other integral types. What happens is that it gets promoted rather than converted. If you convert from a non-integral type to an integral type, the value may change, but that's not what happens when a char is promoted to an int.
 
  • #79
After reading this thread, I have come to some opinions - in some cases changed, and in others reinforced.

1. It is better to learn how to program than any particular language.
1A. While people focus on writing code, in real life reading code is at least as important. Writing something from scratch on your own is rare indeed.

2. If you have decided that you don't want to learn programming, and just want to cobble something together, your best choice is what most other people are using. Might be Python. Might be FORTRAN. Might be lots of things.

3. We lumped C and C++ together here, but they really shouldn't be. Their philosophies and styles are very different. I would view C as its own thing, and C++ as its own thing that just happens to support a subset that is C. But even "Hello world" looks different in idiomatic C and C++.

4. A lot of useful advice is language independent and never touched. RTFM. UTSL. Use pencil and paper before writing the code. Comment as needed - no more and certainly no less. Eat green leafy vegetables. Use a debugger. Print statements can serve as a debugger. And so on.
 
  • #80
Vanadium 50 said:
1. It is better to learn how to program than any particular language.

Vanadium 50 said:
1A. While people focus on writing code, in real life reading code is at least as important. Writing something from scratch on your own is rare indeed.
However, writing something from scratch is what people do who are learning to program.
Vanadium 50 said:
4. A lot of useful advice is language independent and never touched. RTFM. UTSL. Use pencil and paper before writing the code.
Re the last sentence: An instructor I had long ago said this: "The sooner you sit down to the keyboard, the longer it takes to write your program."
 
Likes Eclair_de_XII, Vanadium 50, phinds and 1 other person
  • #81
Mark44 said:
"The sooner you sit down to the keyboard, the longer it takes to write your program."
[Image: "what he said"]
 
  • #82
Mark44 said:
However, writing something from scratch is what people do who are learning to program.
Maybe it shouldn't be. Being able to write HelloWorld, FizzBuzz and maybe even TicTacToe leaves people unprepared for getting an inch-thick stack of green and white tractor-fed paper on their desk with the instruction "1% of our vendors aren't getting paid. Fix it."
Mark44 said:
The sooner you sit down to the keyboard, the longer it takes to write your program
A week of coding will save you an hour of thinking.
 
  • #83
Vanadium 50 said:
A week of coding will save you an hour of thinking.
I love it :oldlaugh:
 
  • #84
Vanadium 50 said:
Maybe it shouldn't be. Being able to write HelloWorld, FizzBuzz and maybe even TicTacToe leaves people unprepared for getting an inch-thick stack of green and white tractor-fed paper on their desk with the instruction "1% of our vendors aren't getting paid. Fix it."
Well, of course -- it's a long way from writing a toy program to being able to analyze why a big program isn't working for 1% of the vendors, but you have to crawl before you walk, and walk before you run.
 
  • #85
I agree you have to start somewhere. But if you go to college to learn to write English, you spend a great deal of time reading, and less time writing. If you go to school to learn to write code, it's the other way around. I am musing aloud as to whether the English Department may have it right.

There are two other problems, and maybe they should be attacked first. One is that there are graduates who still can't code. At all. They can't write FizzBuzz, much less a fully-compliant payroll application.

The other is that they think they can. For some reason, there is a shocking lack of self-awareness on the part of some programmers that I just don't see in physics or math.
 
  • #86
Rene Dekker said:
For systems programming on any Apple platform: Swift.
I haven't programmed since MS QuickBASIC in the '90s. I'm reading Swift.org, but it is over my head.

If I write a program in Swift on a Mac, can I save it as a Windows program and send it to a Windows user? Are there any programming languages that can do this? Or do you need a Windows machine to compile code that can run in Windows?
 
  • #87
Vanadium 50 said:
But if you go to college to learn to write English, you spend a great deal of time reading, and less time writing. If you go to school to learn to write code, it's the other way around. I am musing aloud as to whether the English Department may have it right.
I don't think it's actually the other way around with regard to teaching programming. Before a student is asked to write a program he or she will have been exposed to examples that use the same concepts as are asked for in the program to be written. I'm speaking as someone who has taught programming classes for about 35 years in at least six different languages.
Algr said:
If I write a program in Swift on a Mac, can I save it as a Windows program and send it to a Windows user?
AFAIK, no. A compiler generates code that can run on a particular operating system. If you write a program in Swift that runs on a Mac, the OS is macOS <some version>. A Windows user would need to have some kind of emulation software on the Windows machine, software that would emulate the Mac OS.

It's possible that you could send the Swift source code to the Windows user, who might then be able to port that source code to C, C++, C#, or whatever, and then compile and run that ported version.
 
  • #88
Algr said:
Are there any programming languages that can do this?
Java can do it because Java does not compile directly to machine code, but rather creates an intermediate form which can then be executed on any machine (including Macs and Windows) that has a Java Virtual Machine (an app that runs Java intermediate code).
 
Likes Vanadium 50
  • #89
Algr said:
Are there any programming languages that can do this?
Any interpreted language that has an interpreter on both platforms will do. Java is one, as @phinds mentioned. Another is Python. Another is C#.
 
  • #90
There are really four approaches to this:

(1) Use an interpreted language and have each platform provide its own interpreter. Python was mentioned as an example. BASIC was an older example. The issue here is that platform-specific versions may vary: running Applesoft BASIC code on an IBM PC might or might not work.

(2) Use a compiler, and port the compiler to different platforms. The GCC compilers are an example of that. The issue is that many features, especially the "cool" ones like sound and graphics, exist in extensions, so the portable programs are very "plain jane".

(3) Use a common environment like the Java Virtual Machine mentioned upthread. This solves both problems, but requires a strong core team - in this case provided by Sun - to write and test the virtual machine for each platform.

(4) Write an automatic converter from one dialect to another, one environment to another, or even one language to another. I would say that this works for small programs, but the larger and more complex the program gets, the more likely it is that some human intervention will be required.
 
