For those who ask: "What programming language should I learn?"

  • Thread starter: pbuk
AI Thread Summary
Choosing a programming language to learn depends on your goals. Python is recommended for machine learning, web applications, and data analysis, while JavaScript is ideal for web development. C# is suited for game development and Windows applications, whereas C++ is best for performance-critical applications like games and operating systems. Beginners are advised against starting with C++ due to its complexity and potential for confusion, with suggestions leaning towards Python or Java for foundational learning. Ultimately, the choice of language should align with the specific applications you wish to pursue in programming.
  • #51
pbuk said:
Although there are many opinions stated among posters here and elsewhere, I believe that there is now a consensus among the academic community, particularly in STEM, that Python is the best first language (some indicative evidence below). I think PF would benefit from a sticky post joining this consensus.
If such a sticky note is included, it should specify its advantages. In your "spoiler" section, only the availability of tutorials is mentioned as an advantage. If that is the only advantage, Python is not worth a sticky note. If there are more advantages, then they would need to be enumerated.
pbuk said:
Alternatively, there does at least appear to be a consensus on here for "don't learn C++ just because someone has told you it is fast", and I think a post along those lines that we could link in answers would be useful.
And don't learn assembler just because it is fast - or either C/C++ or assembler because they create binaries that are easy on the processor's other resources.

What might interest someone to learn C/C++/C#:
1) it has application in industry;
2) since most operating systems are written in C/C++/C# and assembler, you have more confidence that the restrictions on resources in your operating environment will be at a minimum;
3) Once you gain some mastery in it, it is quick to code and check.

Previous academic "darlings" have included Basic, Pascal, ADA, Lisp, Forth, C, and C++.
The difference between C, C++, and later C# and those others is that C/C++/C# have led to high productivity in the workplace. Coders got good at it and ran with it. It was a solid replacement for all of those Fortran-like languages that preceded it, and it took the lead as the favorite language for "heavy lifting".

I don't know where Python lands. I find it a handy "supplemental" language. In contrast, I would describe Pascal as 'lame'. I don't think Python falls into that category. Python has found a place in industry.
 
  • #52
Greg Bernhardt said:
Would you like to take a try at writing this post?
Yes, I started this thread as a first prototype :smile:
 
  • #53
shivajikobardan said:
I'm pretty sure there's not [an equivalent to Code::Blocks for] Python.
You must be joking - there are many IDEs for Python, including in no particular order:
  • IDLE which is the 'default' Python IDE, shipped with Python since forever;
  • Thonny, which is easy to install along with Python and good for beginners;
  • PyCharm, as for Thonny but also part of a robust professional family offering from JetBrains;
  • Visual Studio Code, which is not really an IDE but has plugins that mean it has pretty much replaced IDEs for a large part of the development community in most languages;
  • The online IDE from Programiz, which is probably the easiest and quickest way to start programming and can run on almost anything, e.g. a Chromebook.
 
  • #54
pbuk said:
unsupported operand type(s) for +: 'int' and 'str'
I admit I never tried that in Python, but I find that astonishing and not very Pythonic. I would have expected Python to promote the 2 to a "2" and make this a "22".
 
  • #55
Vanadium 50 said:
I would have expected Python to promote teh 2 to a "2" and make this a "22".
That would be guessing that the intent was string concatenation rather than integer addition. But why should that be the default? One could just as easily argue that it's more likely that the programmer meant to do integer addition so the "2" should be interpreted as an integer and added to the 2 to make 4.

Python would rather make the programmer be explicit and correct about stating what they intend, instead of guessing about incorrect constructs that could be interpreted multiple ways.
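To illustrate (a minimal sketch, not from the thread): being explicit removes the guesswork in either direction.
Python:
# Minimal sketch: state the intent explicitly and there is nothing to guess.
print(str(2) + "2")   # "22" - concatenation was intended
print(2 + int("2"))   # 4   - addition was intended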
 
  • #56
PeterDonis said:
Python would rather make the programmer be explicit and correct about stating what they intend, instead of guessing about incorrect constructs that could be interpreted multiple ways.
That was a good decision. Shame that whoever made it was out of office when the decisions about the relationships between True, False and 1, 0 were made (and why were they not fixed in v3?). Anyway this is OT.
Python:
# The power of truth.
print(10 ** True) # 10

# The power of double truth.
print(10 ** (True + True)) # 100

# Wait for it...
print(True == 1) # True
print((True + False) == 1) # True
one = 1
print((True + False) is one) # True
print(True is one) # False - gotcha!
 
  • #57
pbuk said:
the relationships between True, False and 1, 0
I agree that treating bool as a subtype of int doesn't make a lot of sense. Historically bool was tacked on as a separate type during the evolution of Python 2 but I don't think the consequences were ever fully considered.

pbuk said:
why were they not fixed in v3?
My ranting answer: they were too busy making bad decisions about how to "fix" the handling of Unicode to make good decisions about other stuff. (AFAIK making bool its own independent type has never been on the radar of the Python core developers anyway.)
 
  • #58
pbuk said:
I agree completely. Data types are encountered early in learning Python with
Python:
# Adding numbers.
print(2 + 2) # 4

# Adding strings (concatenation).
print("2" + "2") # 22

# You can't mix types.
print(2 + "2") # TypeError: unsupported operand type(s) for +: 'int' and 'str'
I don't want to be harsh on you, but this is exactly the kind of thing that we are trying to avoid. If "data types" are mentioned, and you can only think of numbers and strings, then you may have learned a programming language, but have not learned software design.

The most striking disadvantages of Python wrt. for example Java:
- limited focus on separated data structures with defined, encapsulated functionality.
- no static type checking, meaning that type faults come up at run-time rather than compile time.
- limited support for separation of concerns.

If you have already learned software design based on a more structured language, then it is easy to put something together in Python, and you can even write decent programs in it. But the lack of static type checking would still be a blocker for large-scale deployment.

So as a first language to learn software design, I would definitely say that there is a consensus NOT to use Python, or any other dynamically typed language.
There is a consensus to use a more structured language like Java.

I also agree with the others that C++ is not suited as a first language. It is overly complex, with loads of features and peculiarities that distract from the problem-solving focus. Multiple inheritance, as discussed in this thread, is almost never required or used.
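A minimal Python sketch of that run-time-only type fault (the function and values are hypothetical):
Python:
# Hypothetical sketch: the type fault is only discovered when the branch actually runs.
def describe(x, verbose):
    if verbose:
        return "value: " + x   # breaks if x is an int; a static checker would flag this earlier
    return x

print(describe(42, False))  # runs fine - nothing is checked ahead of time
print(describe(42, True))   # TypeError raised only now, at run time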
 
  • #59
Rene Dekker said:
limited support for separation of concerns
What exactly is the difference between Python and Java in this respect?
 
  • #60
PeterDonis said:
But why
I ask myself that every time I use Python. Why? Why oh why?

With strong typing, the language never needs to guess. With weak typing, it does, although often the choice is obvious. Python is kind of slushy. But the guesswork goes like this: 2 + "a" can only be "2a", so 2 + "2" should be "22".

I'm not complaining about what it's doing, just astonished. Like running into an exotic dancer in church on Sunday. You can be glad she's there but still be surprised to see her.
 
  • #61
.Scott said:
2) since most operating system are written in C/C++/C# and assembler,
I wouldn't treat C# in one breath with C and C++. C# is a higher-level structured language based on and inspired by Java. Just like Java, it is compiled to byte-code which then runs on a virtual machine (which nowadays uses advanced JIT compiler techniques to speed up the execution). It is not in the same category as C and C++.
 
  • #62
pbuk said:
TL;DR Summary: If you are new to programming, learn Python - it is relatively easy to learn and you can do a lot with it.
This sounds way too much like you are promoting a policy instead of offering direction.
How about, "If you are new to programming and looking for advice, start with Python - it ..." ?
pbuk said:
Learn ... if you want to write code for ...

This "Learn X, if you want to develop Y" block has a context problem. By including it as "advice to novice" , you are advising someone who is "new to programming" to "Learn C if you want to code device drivers" - though you are reversing this in your first "tip".

So there are two issues: the second is phrasing the heading to match the audience; the first relates to qualifying the audience.

So the first: I would like to say that telling a novice to "Learn C if you want to code device drivers" is categorically wrong. But I know of several EE types that have done exactly that with serial drivers in Unix systems. That happens to be a relatively cookie-cutter procedure. Is someone who has already read through the data sheet for an Intel processor and wired up a new device to his computer "new to programming"? I think the "qualifier" is what I mentioned before: are they "looking for advice"?

Now the phrasing. How about "You'll eventually need", instead of "Learn"?

All novice coders that I have ever worked with are 1-language creatures. So, except in special situations, we don't want to tell them to tackle C until they say "how about C".

pbuk said:
  • Do not learn C++ (or any other dialect of C) because you believe that it is "fast" in some general sense - the contexts in which this may be relevant are limited to those mentioned above (games, embedded systems etc.)
I think your purpose with this tip is to discourage someone who is over-eager with C++ by souring the grapes. I wouldn't do that. All computer languages have friends (even RPG). C++ is great, so is Python, so is JavaScript, so is ... is ... Cobol (had a hard time getting that one out). Besides, you're describing C++ as a language of last resort - but for many large projects there are so many inherent complexities that the conciseness and power of C++ can be essential.

So I would say something like this:
C and C++ are powerful and concise languages, often well suited for large projects, OS code, and code that needs to make the most of limited processor resources; they are also cryptic, subtle, and vast.
Picking this as your first language could end your career before it ever gets started.
pbuk said:
  • Do not learn C++ because you believe that all interviewers expect you to use it to answer coding questions - in 2023 this is no longer true in many cases, and it is easier and quicker to solve coding problems correctly in e.g. Python. Of course if you are going for a job writing device drivers at NVIDIA this is not going to apply, but if you think you can get a job writing device drivers at NVIDIA you don't need advice on learning to code anyway.
By "coding questions", you mean explaining a software concept or algorithm. The phrase "coding question" certainly includes questions on semantics - which would, of course, require language-specific knowledge.

I would replace this with:
In a job interview or other situation where you want to describe software concepts or algorithms,
don't default to C/C++ unless both you and the interviewer are very familiar with it. Expect to find better success with Python, an improvised pseudo-code, a narrative procedure, a flow chart, or some combination.

pbuk said:
  • Do not learn how to implement algorithms and data structures like linked lists, hash tables, trees, indexing, sorting etc - these algorithms have already been implemented optimally in the languages that you will learn (although the comment about NVIDIA etc. above applies).
In a practical sense, what is the difference between "learning to implement" and "learning how it has been implemented"?
Anyone who sticks with programming long enough will pick up almost all of that stuff - and it is all valuable. If your database is slow, it might be because some automatic decisions have been made by your software tools on your behalf. Understanding the mechanisms under the hood helps direct you to the solution.
I would drop this item completely.
 
  • Like
Likes DrClaude and Vanadium 50
  • #63
Vanadium 50 said:
With strong typing, the language never needs to guess.
Python is strongly typed. What differentiates it from, for example, Java, is that it is dynamically typed.

I suppose you could say that Python never "needs" to guess, but I would put it that it refuses to guess.

The other key difference between Python and most other languages is the meaning of a variable in Python. In, say, Java, declaring a variable means setting aside a piece of memory to store a particular kind of data. The "type" of the variable is what kind of data will be stored in that piece of memory.

In Python, however, declaring a variable means declaring a namespace binding. In other words, a "variable" is an assignment of a name in the current namespace to an object. The namespace is just a Python dict, so the name is a dictionary key and the object is the dictionary value corresponding to that key. The object itself is a pointer to a data structure, and the "variable type" is the type of the object. Python type annotations are declarations that a particular name in a namespace will only be bound to objects of a particular type, but the interpreter currently doesn't enforce them (i.e., it doesn't check actual variable assignments to see if they are consistent with the type annotations).
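A minimal sketch of both points (my own illustration, not quoted from anyone here): the name really is a dictionary key, and the annotation is not enforced.
Python:
# Minimal sketch: a module-level "variable" is a key in the globals() dict,
# and a type annotation does not stop the name being rebound to another type.
x: int = 2
print(globals()["x"])   # 2 - the namespace really is a dict
x = "two"               # rebinding to a str; the interpreter does not object
print(type(x))          # <class 'str'>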
 
  • #65
.Scott said:
But I know of several EE types that have done exactly that with serial drivers in Unix systems.
One of the very first pieces of code I wrote in physics - and I think it might have been the first that other people would use after I moved on - was similar.

The task was to read out our data, arranged by crate, card and channel, and decode it, e.g. Byte 3 in Card 7 in Crate 2 is temperature. I've said that people were preoccupied by speed, but the code I was replacing limited how fast we could take data.

I used a union of structs.

So not just C, but esoteric C. One side of the union was card/crate/channel and the other was what those variables meant.

It was blazingly fast. It was also very fragile, and the comments exceeded the length of the code, and expressed the idea "if you touch this code, it will break."

My first foray into 8086 assembler was a device driver. This was shortly after the IBM PC was released. I got a deal on a hard disk with one bad head, so I wrote a driver that ignored that head. I was not an expert in 8086 assembler (I'm still not), but it was easy enough to take an existing driver and adjust it. (Finding which 4's had to be turned into 3's was a bit of work, to be sure.)
 
  • Like
Likes .Scott and PeterDonis
  • #66
All this talk about addition, concatenation and data types handled so badly in all of these languages when you got PHP that handles it so well:

PHP:
$numberOfApples = 3;
$numberOfOranges = 2;

$numberOfApples  .= " apples";
$numberOfOranges .= " oranges";

echo "$numberOfApples + $numberOfOranges = ". ($numberOfApples + $numberOfOranges) . " fruits";

# Prints "3 apples + 2 oranges = 5 fruits"
# ... with 2 nice notices (not errors, not warnings)
# about "non well form numeric values encountered" on line 7.

Declaring variables ... Pfft! ... and data type ... Pfft! ... statically ... Pfft!
Using the same syntax for addition & string concatenation ... :doh:

It is not that difficult to understand what the user means when done correctly.

And if the user wants a specific type - when he needs it - he just has to specify it then:

PHP:
$numberOfApples = "3 apples";

echo $numberOfApples; # Prints "3 apples"
echo (int) $numberOfApples; # Prints "3"

I don't remember ever being surprised by the assumptions made by PHP. It works as expected, logically, all the time.

PHP takes the best out of every other language and incorporates it into its own. Once you understand PHP, every other language seems familiar somehow.

Do you want procedural or object-oriented? You have both. Do you need a list or an array? In PHP, there are no lists: a list is simply an array. On the plus side, because it is an array, you can even manipulate the indices of your "list"!

Because it was primarily designed for the web, you can easily use HTML for any user interface you need, and all that is needed is a server (yours or someone else's) to use your program. No need to compile it for different architectures or use some unique-use virtual machine. Perfect when you are just "learning" by doing fun stuff.

If that wasn't enough, you have the best documentation on the web - filled with examples to explain the different functions - and most likely the largest community to help solve the more complex problems. I think I never had a question that wasn't already asked & answered on the web.

If you need a better-suited language for some specialized task later on, already knowing PHP will probably be helpful for learning that new language (which is most likely more complicated/strict somehow).
 
  • #67
pbuk said:
# You can't mix types. print(2 + "2") # TypeError: unsupported operand type(s) for +: 'int' and 'str'

Vanadium 50 said:
I would have expected Python to promote teh 2 to a "2" and make this a "22".

PeterDonis said:
One could just as easily argue that it's more likely that the programmer meant to do integer addition so the "2" should be interpreted as an integer and added to the 2 to make 4.
With a slight change in the code ("2" replaced by '2') and language, I wouldn't expect either "22" or 4. Case in point:
C:
printf("%d", 2 + '2');
The above prints 52, using well-documented conversion rules.
 
  • #68
Mark44 said:
The above prints 52, using well-documented conversion rules.
Yes, and note that in this case, the string '2' is what is being "converted", not the integer 2 (though the "conversion" of the string is using its ASCII code, not its value if interpreted as a decimal digit).
 
  • #69
I have used both languages. Python is OK for small hacks but might be a nightmare for big jobs, the stuff that takes person-years.

I also could never figure out what advantage Python had over MacLISP or Scheme or some other interpretive language that already existed.
 
  • #70
PeterDonis said:
Yes, and note that in this case, the string '2' is what is being "converted", not the integer 2 (though the "conversion" of the string is using its ASCII code, not its value if interpreted as a decimal digit).
Technically, '2' is a character, very different from the string "2". Internally, '2' is stored, as you note, as its ASCII code. In the expression, '2' is promoted to an int value (50) and then added to 2.
 
  • #71
Mark44 said:
promoted
In the sense of "you don't get to do science anymore. You've been promoted to management."
 
  • #72
I'm curious, why has Pascal declined in its popularity?
 
  • Like
Likes symbolipoint
  • #73
CGandC said:
I'm curious, why has Pascal declined in its popularity?
It boils down to productivity.
Here's the wiki introductory paragraph:
Pascal is an imperative and procedural programming language, designed by Niklaus Wirth as a small, efficient language intended to encourage good programming practices using structured programming and data structuring.
It encouraged the parochial view of "good programming" as the diligent pursuit of "tidiness" rules.
That kind of tidiness really is important, and I don't begrudge it being emphasized in schools. But in many cases, the path to tidiness is best left to the programmer.

For example, all of the early Pascals, and most of the rest of them, did not have any of the so-called "early out" options for loops - that is, the "break" and "continue" statements. The intent may be tidiness and predictable structure, but the results are anything but tidy. Without "continue", a loop that needs to check 20 "continue" conditions ends up going 20 levels deep in if statements. There are ways to avoid that, but only at the cost of concise and direct code.
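To illustrate in Python rather than Pascal (hypothetical conditions, just to show the shape): each "continue" keeps the rejection tests flat, while the version without it nests one level per condition.
Python:
# Hypothetical sketch: the same filter written with and without early-out.
items = [0, 1, 2, 3, 4, 5]

# With "continue": each rejection is a single flat test.
for x in items:
    if x % 2:        # skip odd values
        continue
    if x == 0:       # skip zero
        continue
    print(x)         # only 2 and 4 get here

# Without early-out: every extra condition adds another level of nesting.
for x in items:
    if not x % 2:
        if x != 0:
            print(x)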

Pascal did implement the "goto". The most intractable problem with the routine use of "goto" is that the person reading the code (which is most often the person who wrote it) needs to find the "goto" label to know where the landing point is. Whereas "break" or "continue" relate to the current loop - something that would already be known to the code reader.

Of course, "goto" is most notorious for creating "spaghetti code", the structured programming equivalent of obscure control flags.
 
  • Informative
  • Like
Likes symbolipoint and CGandC
  • #74
You could always write Pascal-like code in C. You could not always write C-like code in Pascal.

Furthermore, Wirth had already moved on to Modula-2. If the language's creator isn't an advocate, that's not a good thing.
 
  • #75
Hornbein said:
I also could never figure what advantage Python had over MacLISP or Scheme or some other interpretive language that already existed.
One big difference is Python's standard library, which includes support for enough things that many Python programs can be written without having to depend on any third-party libraries. The Lisp dialects you mention (and indeed Lisp in general) have never had such broad library support built in.

Another big difference is syntax; many people find it very difficult to wrap their minds around S-expressions, whereas Python's syntax is simple and obvious and basically works like pseudocode, so it matches up well with how many people conceptualize their programs.
 
  • Informative
Likes symbolipoint
  • #76
Mark44 said:
Technically, '2' is a character, very different from the string "2".
Yes, but as you note, it is still the value that gets type converted; the integer doesn't. Whereas in the example of implicit conversion with strings in other languages, the integer gets converted to a string.
 
  • #77
Vanadium 50 said:
In general, my experience is that people pay too much attention to speed too early in the process.

Getting the wrong answer more quickly is seldom helpful.
Working hard to speed up a piece of code that didn't take up much of the time to begin with is likewise seldom helpful.
I saw a lot of that. Correctness is the real challenge. Get that then use a profiler.
 
  • #78
PeterDonis said:
Yes, but as you note, it is still the value that gets type converted; the integer doesn't.
In C, the char type is an integral type, similar to short int, int, long int, and several other integral types. What happens is that it gets promoted rather than converted. If you convert from a non-integral type to an integral type, the value will change, but that's not what happens when a char is promoted to an int.
 
  • #79
After reading this thread, I have come to some opinions - in some cases changed, and in others reinforced.

1. It is better to learn how to program than any particular language.
1A. While people focus on writing code, in real life reading code is at least as important. Writing something from scratch on your own is rare indeed.

2. If you have decided that you don't want to learn programming, and just want to cobble something together, your best choice is what most other people are using. Might be Python. Might be FORTRAN. Might be lots of things.

3. We lumped C and C++ together here, but they really shouldn't be. Their philosophies and styles are very different. I would view C as its own thing, and C++ as its own thing that just happens to support a subset that is C. But even "Hello world" looks different in idiomatic C and C++.

4. A lot of useful advice is language independent and never touched. RTFM. UTSL. Use pencil and paper before writing the code. Comment as needed - no more and certainly no less. Eat green leafy vegetables. Use a debugger. Print statements can serve as a debugger. And so on.
 
  • #80
Vanadium 50 said:
1. It is better to learn how to program than any particular language.

Vanadium 50 said:
1A. While people focus on writing code, in real life reading code is at least as important. Writing something from scratch on your own is rare indeed.
However, writing something from scratch is what people do who are learning to program.
Vanadium 50 said:
4. A lot of useful advice is language independent and never touched. RTFM. UTSL. Use pencil and paper before writing the code.
Re the last sentence: An instructor I had a long time ago said this: "The sooner you sit down to the keyboard, the longer it takes to write your program."
 
  • Like
Likes Eclair_de_XII, Vanadium 50, phinds and 1 other person
  • #81
Mark44 said:
"The sooner you sit down to the keyboard, the longer it takes to write your program."
[Image: "what he said"]
 
  • #82
Mark44 said:
However, writing something from scratch is what people do who are learning to program.
Maybe it shouldn't be. Being able to write HelloWorld, FizzBuzz and maybe even TicTacToe leaves people unprepared for getting an inch-thick stack of green and white tractor-fed paper on their desk with the instruction "1% of our vendors aren't getting paid. Fix it."
Mark44 said:
The sooner you sit down to the keyboard, the longer it takes to write your program
A week of coding will save you an hour of thinking.
 
  • #83
Vanadium 50 said:
A week of coding will save you an hour of thinking.
I love it :oldlaugh:
 
  • #84
Vanadium 50 said:
Maybe it shouldn't be. Being able to write HelloWorld, FizzBuzz and maybe even TicTacToe leaves people unprepared for geting an inch thick stack of green and white tractor-fed paper on their desk with the instruction "1% of our vendors aren't getting paid. Fix it."
Well, of course -- it's a long way from writing a toy program to being able to analyze why a big program isn't working for 1% of the vendors, but you have to crawl before you walk, and walk before you run.
 
  • #85
I agree you have to start somewhere. But if you go to college to learn to write English, you spend a great deal of time reading, and less time writing. If you go to school to learn to write code, it's the other way around. I am musing aloud as to whether the English Department may have it right.

There are two other problems, and maybe they should be attacked first. One is that there are graduates who still can't code. At all. They can't write FizzBuzz, much less a fully-compliant payroll application.

The other is that they think they can. For some reason, there is a shocking lack of self-awareness on the part of some programmers that I just don't see in physics or math.
 
  • #86
Rene Dekker said:
For systems programming on any Apple platform: Swift.
I haven't programmed since MS quickbasic in the '90s. I'm reading Swift.org, but it is over my head.

If I write a program in Swift on a Mac, can I save it as a Windows program and send it to a Windows user? Are there any programming languages that can do this? Or do you need a Windows machine to compile code that can run in Windows?
 
  • #87
Vanadium 50 said:
But if you go to college to learn to write English, you spend a greate deal of time reading, and less time writing. If you go to school to learn to write code, it's the other way around. I am musing aloud as to whether the English Department may have it right.
I don't think it's actually the other way around with regard to teaching programming. Before a student is asked to write a program he or she will have been exposed to examples that use the same concepts as are asked for in the program to be written. I'm speaking as someone who has taught programming classes for about 35 years in at least six different languages.
Algr said:
If I write a program in Swift on Mac, can I save it as a windows program and send it to a windows user?
AFAIK, no. A compiler generates code that can run on a particular operating system. If you write a program in Swift that runs on a Mac, the OS is macOS <some version>. A Windows user would need to have some kind of emulation software on the Windows machine, software that would emulate the Mac OS.

It's possible that you could send the Swift source code to the Windows user, who might then be able to port that source code to C, C++, C#, or whatever, and then compile and run that ported version.
 
  • #88
Algr said:
Are there any programming languages that can do this?
Java can do it because Java does not compile directly to machine code, but rather creates an intermediate form which can then be executed on any machine (including Macs and Windows machines) that has a Java Virtual Machine (an app to run Java intermediate code).
 
  • Like
Likes Vanadium 50
  • #89
Algr said:
Are there any programming languages that can do this?
Any interpreted language that has an interpreter on both platforms will do. Java is one, as @phinds mentioned. Another is Python. Another is C#.
 
  • #90
There are really four approaches to this:

(1) Use an interpreted language and have each platform provide its own interpreter. Python was mentioned as an example. BASIC was an older example. The issue here is that platform-specific versions may vary: running Applesoft BASIC code on an IBM PC might or might not work.

(2) Use a compiler, and port the compiler to different platforms. C and the GCC compiler are an example of that. The issue is that many features, especially the "cool" ones like sound and graphics, exist in extensions, so the programs are very "plain jane".

(3) Use a common environment like the Java Virtual Machine mentioned upthread. This solves both problems, but requires a strong core team - in this case provided by Sun - to write and test the virtual machines for each platform.

(4) Write an automatic converter from one dialect to another, one environment to another, or even one language to another. I would say that this works for small programs, but the larger and more complex the program gets, the more likely it is that some human intervention will be required.
 
  • #91
Vanadium 50 said:
Use a common environment like the Java Virtual Machine upthread
How is this different from 1? The JVM is just a bytecode engine that runs Java bytecode; it's no different from the Python interpreter running Python bytecode, or any other interpreted language. You still have to have a JVM separately compiled for each platform. Yes, Java has historically had a core team that devotes attention to the JVM for multiple platforms; but so has Python, and so have other interpreted languages, like Perl.
 
  • #92
PeterDonis said:
How is this different from 1?
There is more standardization.
 
  • #93
Vanadium 50 said:
There is more standardization.
In what way? Python code runs the same on every platform that has a Python interpreter.
 
  • #94
Thank you all!

It sounds like Python is the best choice for me, or maybe Java. I'm not really looking to make any money off of programming, just make fun little programs to send to friends.
 
  • #95
Algr said:
Thank you all!

It sounds like Python is the best choice for me, or maybe Java. I'm not really looking to make any money off of programming, just make fun little programs to send to friends.
I recommend JavaScript or Java.
 
  • #96
I'm still researching an improved answer to the question in the title. A couple of points that have been made in this thread, and are put well here, have come out as important, so I thought I would jot them down so they don't get lost:

Relevant considerations:
  • What should someone gain from their experience with a first language?
  • The first language learned is not the only language ever learned.
  • A first language should not force a particular programming paradigm (sorry Java: not everything is a noun).
Not relevant:
  • Is language X "better" than language Y? (Unanswerable without context.)
 
  • #97
I think your third point somewhat contradicts your second. I am not 100% sure what you mean by "paradigm" or even "force" (is making it easy to accomplish a task in one way 'forcing'?), but if the thought is that the student will eventually learn multiple languages, why is this a problem?

I've been thinking about what one expects students to be able to do. This is probably the upper limit:
  1. Read a text file of ~100 words. If there are more, read the first 100.
  2. Convert each word to a number by summing the ASCII codes for its letters.
  3. Drop the leading digit from each number, then square it.
  4. Sort this list.
  5. Write the sorted list to a file, and display the mean, median and mode.
If you think this is too easy, look at some of the questions we get. Look at people struggling with FizzBuzz.

This can be done in C, C++, FORTRAN, Pascal, Ada, Java, BASIC and lots more. (I might try it in SQL, which looks like a fun project.) The code - indeed, the approach - will be different in several of them, although it lends itself to block-structured procedural code with several independent subroutines. Is this a problem?
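For what it's worth, a rough Python sketch of that exercise (the file names and edge-case handling are assumptions, not part of the spec above):
Python:
# Rough sketch of the exercise; "words.txt" and "sorted.txt" are assumed names.
from statistics import mean, median, mode

with open("words.txt") as f:
    words = f.read().split()[:100]                 # at most the first 100 words

numbers = [sum(ord(c) for c in w) for w in words]  # word -> sum of character codes
numbers = [int(str(n)[1:] or "0") ** 2 for n in numbers]  # drop leading digit, then square
numbers.sort()

with open("sorted.txt", "w") as f:
    f.write("\n".join(str(n) for n in numbers))

print("mean:", mean(numbers), "median:", median(numbers), "mode:", mode(numbers))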
 
  • #98
pbuk said:
I'm still researching an improved answer to the question in the title. A couple of points that have been made in this thread, and are put well here, have come out as important, so I thought I would jot them down so they don't get lost:

Relevant considerations:
  • What should someone gain from their experience with a first language?
  • The first language learned is not the only language ever learned.
  • A first language should not force a particular programming paradigm (sorry Java: not everything is a noun).
Not relevant:
  • Is language X "better" than language Y? (Unanswerable without context.)
Funny, reading this post made me think about a similar question that could be asked:
  • When learning how to speak, what language should a child learn?
And then all of your points or questions would still be as relevant:
  • What should someone gain from their experience with a first language?
  • The first language learned is not the only language ever learned.
  • A first language should not force a particular grammar paradigm
And the answer to the original question would seem to be: The language your teacher best mastered.

This is how we learn how to speak: by watching how our parents (and other people in our surroundings) speak.

Do you move to another environment, with other people? You learn the local language by default. If it is vastly different, you may have some difficulties that will follow you for the rest of your life because of how you first learned how to speak. Maybe not.

You may even end up translating everything into your original language. It might not be as efficient - like trying to translate the variety of Inuktitut words for "snow" into Arabic - but it is always possible. One language was just not made for the reality described by another one. So is Inuktitut or Arabic the best first language? It could be either, or maybe even neither. It depends on where you live and with whom you hang out.
 
  • #99
I think we're back to the audience question. It seems that there are three kinds of people who ask this question:
  1. Those who want to learn multiple languages and are picking their first
  2. Those who want to learn exactly one language
  3. Those who want to learn exactly zero
You might not think there is anyone in #3, but there are - it may even be the biggest. In the last few weeks we had two people (at least) on PF who asked how to do something and the answer was one line* - one line! - and they complained it was too hard. Why do people still use FizzBuzz as a test of programmers? Because people still fail it. It provides real, actionable information. Fortunately, we can disregard these people from this question: if you can't program in any language, you can't program.

The people who want to learn exactly one usually do so because they inherited a pile of code that they need to maintain and modify. It might be Python, FORTRAN, C/C++, or others. They are largely uninterested in programming - they just want to change the axes on these plots. And here the answer is easy too - learn what you need to maintain.

Which leads us to Category #1. And does it matter? If you are filling your toolbox, does it matter if you buy a flathead and then a Phillips screwdriver or do it in the other order? I probably wouldn't start with C++, just as I wouldn't start my tool collection with a Swiss Army knife, but I don't think someone who did would be scarred for life. I would consider C, but probably wouldn't either, because it lacks a proper enum and its pointer handling sacrifices logic for convenience. (Example: if x[5] is 10, what is 5[x]? I'd argue it shouldn't be anything at all!) There's a lot to be said about Ada, and if it weren't designed by people who had ambitions to be hall monitors in grade school, people would use it today. We mentioned Pascal, and despite its age, there are lots worse choices. But my point is that for this target audience, it's less important.

* I'm not counting includes or declarations of input or output variables, just the line that did the work.
 
  • #100
Been busy elsewhere so catching up...
Vanadium 50 said:
I am not 100% sure what you mean by "paradigm"
https://en.wikipedia.org/wiki/Programming_paradigm
Vanadium 50 said:
"force" (is making it easy to accomplish a task in one way 'forcing'?)
No, forcing is saying you can ONLY accomplish a task by jumping through non-intuitive hoops. For instance in some functional programming languages you can only create a loop by writing a recursive function and having the compiler sort out the fact that recursion is a really bad way for a computer to execute code and convert it to iteration.
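For instance, sketched in Python just to show the shape of the two styles (Python itself does not convert the recursion to iteration):
Python:
# Sketch: the same countdown as recursion (what some functional languages force)
# and as plain iteration.
def countdown_recursive(n):
    if n == 0:
        return
    print(n)
    countdown_recursive(n - 1)   # one stack frame per step unless the compiler optimizes it away

def countdown_iterative(n):
    while n > 0:
        print(n)
        n -= 1

countdown_recursive(3)   # 3 2 1
countdown_iterative(3)   # 3 2 1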

Vanadium 50 said:
I think your third point somewhat contradicts your second ... if the thought is the student will eventually learn multiple languages, why is this a problem?
  • Because making something harder than it needs to be is detrimental to progress.
  • Because bad habits or misconceptions learned early need unlearning later.

Vanadium 50 said:
The code - indeed, the approach - will be different in several of them, although it lends itself to block-structured procedural code with several independent subroutines. Is this a problem?
It becomes a problem when the student has to spend more time struggling to implement the algorithm in the target language than they spend working out the algorithm.
 