For those who ask: "What programming language should I learn?"

  • Thread starter: pbuk

Summary:
Choosing a programming language to learn depends on your goals. Python is recommended for machine learning, web applications, and data analysis, while JavaScript is ideal for web development. C# is suited for game development and Windows applications, whereas C++ is best for performance-critical applications like games and operating systems. Beginners are advised against starting with C++ due to its complexity and potential for confusion, with suggestions leaning towards Python or Java for foundational learning. Ultimately, the choice of language should align with the specific applications you wish to pursue in programming.
  • #121
The good thing about Python is, it's easy to get started coding. In a few hours you can be hacking stuff that runs.

The *BAD* thing about Python is, it's easy to get started coding. In a few hours you can be hacking stuff that runs. So there's nothing to encourage you to look for good coding practices.

I met so many engineers who did not know how to code yet produced hundreds of lines of code per day. Consider Toyota and the global variables. Bad software can kill people.

https://users.ece.cmu.edu/~koopman/toyota/koopman-09-18-2014_toyota_slides.pdf
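As a hypothetical sketch (none of this is from the Toyota report), here is the failure mode that unmanaged globals invite, next to a version that owns its state:

```python
# Hypothetical sketch: why shared mutable globals make code fragile.
# Nothing here is from the Toyota case; it only shows the failure mode.

throttle = 0  # global state: any function can change it at any time

def set_cruise(target):
    global throttle
    throttle = target

def emergency_stop():
    global throttle
    throttle = 0

# With globals, call order silently decides the outcome:
set_cruise(80)
emergency_stop()
set_cruise(80)          # a late caller quietly undoes the stop
assert throttle == 80   # the "stopped" system is back at 80

# Safer: make the state explicit and enforce its invariant in one place.
class Throttle:
    def __init__(self):
        self._value = 0
        self._stopped = False

    def set(self, target):
        if not self._stopped:   # the invariant lives with the data
            self._value = target

    def emergency_stop(self):
        self._stopped = True
        self._value = 0

t = Throttle()
t.set(80)
t.emergency_stop()
t.set(80)
assert t._value == 0    # the stop can no longer be silently undone
```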

Your first coding language should not be your last. And when you finish your first little "Hello World!" program, you should open a book like the following. (And other books on the shelf nearby.)

https://www.amazon.com/Code-Complet...Construction/dp/0735619670/?tag=pfamazon01-20

You can produce bad software in any programming language.

Computer languages I learned:
- FORTRAN (from version IV on)
- C
- C++
- C#
- Word Perfect ver 5.1 macro language
- MS Office macro language (pre-VB implementation)
- Several variants of BASIC, starting on the TRS80 in the 1970s
- MS Visual Basic, both as a stand-alone and as part of MS Office
- Perl
- Python
- Unix shell scripts (a couple of variants)
- MS-DOS batch scripts
- PL/SQL for making Oracle apps
- machine code for three or four CPUs
- logic/language for several PLCs
- MATLAB
- LaTeX (The scripting language "under the hood" that lets you define document styles, as well as the codes that can be used in Physics Forums.)
- HTML, LISP, JAVA, and COBOL at an intro level
- several local variants and customs such as the interface for a computer controlled BBQ

Which one you learn first depends on which one you are going to work on a project in first.
 
Last edited by a moderator:
  • #122
DEvens said:
The good thing about Python is, it's easy to get started coding...

The *BAD* thing about Python is, it's easy to get started coding.
LOL. :smile: But yes, it's true, "easy to get started coding" can go either way.

DEvens said:
Computer languages I learned
Wow, you've subjected yourself to even more than I have. :biggrin:

Of the ones you list, I haven't learned C# (too strong an allergic reaction to anything that involves Windows), Perl (executable line noise isn't my thing), or any kind of PLC code, and I've only learned machine language for x86 (and that only several decades ago, so I'm way, way out of date as far as all the newer stuff that's in those CPUs now). My BASIC knowledge doesn't go as far back as yours either, only to the Commodore PET, Vic-20, and Commodore 64 and varieties of MS-DOS BASIC. My SQL knowledge is based on using MySQL and SQLite, not Oracle apps.

I do know one that you didn't list: Lisp (Scheme/Racket). I don't use it much because it's just not a good fit for the programming projects I've had, but I think learning it was helpful.
 
  • #123
PeterDonis said:
I do know one that you didn't list
Actually, I realized there's another one as well: Pascal (thanks to Borland, from Turbo Pascal to Delphi).
 
  • #124
From over 30 years of programming as a profession, I would start with Python because it is widely used and easy; evidently, even elementary-school students are being taught it.

Its weakness is efficiency: the standard CPython implementation compiles to bytecode that is then interpreted, rather than to native machine code, and machine code runs much faster.

To circumvent that, there are variations very close to Python that compile to machine code and are much faster, e.g. Codon. It is easy to transition to those.
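A small sketch of where that overhead shows up: the same reduction done in an interpreted Python loop versus a C-level built-in. Timings vary by machine, so no exact numbers are claimed here.

```python
# Sketch of the kind of tight numeric loop where CPython's bytecode
# interpreter is slow, and where a compiled variant pays off.
import timeit

def py_sum(n):
    total = 0
    for i in range(n):   # every iteration goes through the interpreter loop
        total += i
    return total

n = 100_000
t_loop = timeit.timeit(lambda: py_sum(n), number=10)
t_builtin = timeit.timeit(lambda: sum(range(n)), number=10)  # C-level loop
print(f"pure-Python loop: {t_loop:.4f}s, built-in sum: {t_builtin:.4f}s")
```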

Occasionally, however, you want even more speed or need to program at the system level. That's where C or C++ comes in, but those languages are showing their age. Zig is the new kid on the block:
https://ziglang.org/

A huge advantage is that Zig can output C, and nearly all environments have a C compiler. Also, Zig compiles both C and C++, meaning it can be your drop-in C and C++ compiler for legacy code that you can, if you wish, change to Zig. Why would you want to do that? See:


The Zig compiler is written in Zig but can output a variety of other languages, including C, the WASM intermediate language, and others. It is interesting how this feature can be used to get a Zig compiler for any machine (there are already many available for standard machines like Windows, Mac, etc, but just in case there isn't one for your machine):


There is even a C compiler written in Zig designed to integrate with Zig called ARO:
https://aro.vexu.eu/

Bottom line: start with Python, transition to something like Codon, then, when emboldened enough, learn Zig. That would be my suggestion.
 
Last edited:
  • #125
I'm always interested in new programming languages, and Zig looks pretty interesting.

I've traveled a similar path: from FORTRAN IV to later Fortrans, to C, then C++, then C++ with SOM for DOS and OS/2, and the Taligent C++ frameworks. Then I took a Java track for the past 25 years. My most recent forays into alternative languages were Groovy, a superset of Java that adds scripting capabilities; Scala, a better Java than Java; and Kotlin, another better Java than Java, with scripting support.

Recently, I've been learning about programming NVIDIA GPUs. Would you happen to know if Zig handles that?

The NVIDIA nvcc command handles .cu, .c, and .cpp files based on the file suffix. CUDA, NVIDIA's GPU language, is a superset of C/C++ that adds the <<<...>>> syntax for launching a kernel (a function that runs on the GPU). Device memory is allocated with cudaMalloc(), and data is moved between CPU and GPU with cudaMemcpy().

Here's an example:

C++:
// nvcc simple_cpp_cuda.cu -std=c++17 -o simple_cpp_cuda
#include <iostream>
#include <vector>

__global__ void addKernel(const int* a, const int* b, int* c, int n) {
    int i = threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int N = 5;

    // Use C++ vectors instead of raw arrays
    std::vector<int> a {1, 2, 3, 4, 5};
    std::vector<int> b {10, 20, 30, 40, 50};
    std::vector<int> c (N);

    int *d_a, *d_b, *d_c;

    cudaMalloc(&d_a, N * sizeof(int));
    cudaMalloc(&d_b, N * sizeof(int));
    cudaMalloc(&d_c, N * sizeof(int));

    cudaMemcpy(d_a, a.data(), N * sizeof(int), cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, b.data(), N * sizeof(int), cudaMemcpyHostToDevice);

    addKernel<<<1, N>>>(d_a, d_b, d_c, N);

    cudaMemcpy(c.data(), d_c, N * sizeof(int), cudaMemcpyDeviceToHost);

    cudaFree(d_a);
    cudaFree(d_b);
    cudaFree(d_c);

    // Print using modern C++ style
    for (int i = 0; i < N; ++i) {
        std::cout << a[i] << " + " << b[i] << " = " << c[i] << '\n';
    }

    return 0;
}

An introduction to CUDA:

https://iccs.lbl.gov/research/isaac/docs/IntroductiontoCUDAprogramming.pdf

and one of many CUDA talks:

 
  • #126
Professionally, slow execution is a problem often addressed by throwing more hardware at it.

Also, if you're using a compiled language, compilers and linkers almost always offer switches to optimize a program for speed or size (memory footprint). GCC has its -O options (-O2, -Os, and so on), if I recall correctly.

If it's a program executing a specific batch job (as opposed to an application with a GUI), running multiple instances of the program is often simpler than rewriting it for multithreading.
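A hedged sketch of that approach in Python (the job body here is a made-up stand-in): instead of multithreading inside one process, independent copies of the job run in separate worker processes.

```python
# Sketch: instead of rewriting a job for multithreading, run independent
# copies of it in separate worker processes. The job body is hypothetical.
from multiprocessing import Pool

def process_job(job_id):
    # stand-in for the real batch job (parse a file, crunch numbers, ...)
    return job_id, sum(i * i for i in range(10_000))

if __name__ == "__main__":
    with Pool(processes=4) as pool:          # four independent workers
        results = pool.map(process_job, range(8))
    print(results[:2])
```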

Just my 2 cents.

EDIT: I also remember using a program called perl2exe; I often found a problem easier to solve in Perl and would then convert the script to an executable. I bet there are more tools like that.
 
  • #127
DEvens said:
- machine code for three or four CPUs

PeterDonis said:
I've only learned machine language for x86 (and that only several decades ago,
I think you guys mean "assembly language," not "machine code/language." Assembly language is more-or-less readable by humans, while machine code is the raw numeric encoding that the CPU actually decodes.
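To make the distinction concrete with a standard x86 example (not tied to either poster's list): the mnemonic is the assembly; the bytes are the machine code.

```python
# "mov eax, 1" is assembly language: readable by humans.
# The assembler turns it into machine code: opcode B8 ("mov eax, imm32")
# followed by the immediate value 1 as a little-endian 32-bit integer.
machine_code = bytes([0xB8, 0x01, 0x00, 0x00, 0x00])
assert machine_code.hex() == "b801000000"
```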

Some of the languages I've written code in:
  • PL/C (a dialect of PL/I, I believe)
  • Fortran 77
  • BASIC - several dialects, including one that ran on IBM minicomputers of some sort
  • Pascal
  • Modula-2
  • C
  • C++
  • C#
  • CUDA C (a little bit)
  • Python
  • Lisp
  • F#
  • Erlang
  • x86 assembly starting with 8088 all the way up to AVX and AVX512
  • Motorola 68000 assembly
  • MIPS assembly
  • IL (intermediate language, the byte codes generated by C# and other MSFT languages)
 
Last edited:
  • #128
Many years ago, while programming under PC DOS, I wanted a Unix-style environment and found a company that made DOS-based Unix tools, Thompson Automation. They had a kit of the most valuable tools and another featuring AWK.

I got both, not knowing anything about AWK. For me, it was a game-changer. It came with the classic AWK book by Aho, Kernighan, and Weinberger, which includes many great examples. I was bored with coding in C, but AWK tripled my productivity and rekindled my interest in programming. Every time I needed a special tool, I opted for AWK.

Their version created fully functional DOS executables that I could deploy wherever I needed. There was no GUI, so I resorted to using ANSI screen codes and reading the keyboard one character at a time (text-based UI), giving me complete access to the arrow keys and screen coloring.

Sadly, they went out of business and never released their source code to the public. I had hoped to get it to port to Windows, OS/2, or wherever I worked, but alas, that didn't happen.

Remnants of that ANSI-code handling still reside in my AWK and Python scripts, although these days I mostly use Python.

AWK programs can get tricky when using their pattern-matching structure across multiple files or interactively. Sometimes I had to fall back to just coding in the BEGIN block.
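For readers who never used AWK: its core model is a BEGIN block, per-line pattern → action rules, and an END block. A rough Python rendering of that shape (the ERROR pattern is just an invented example):

```python
# Rough Python rendering of AWK's BEGIN / pattern-action / END structure.
import re

def awk_like(lines):
    # BEGIN block: runs once before any input is read
    count = 0
    # main loop: each rule is a (pattern, action) pair tried on every line
    for line in lines:
        if re.search(r"ERROR", line):   # pattern
            count += 1                  # action
    # END block: runs once after all input is consumed
    return count

sample = ["ok", "ERROR: disk", "ok", "ERROR: net"]
assert awk_like(sample) == 2
```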
 
Last edited:
  • #129
Impressive list there @Mark44 ! I've also been somewhat all over the place. C/C++ of course (No way around that professionally I'd say.) Including PL/SQL and various Windows C/C++ frameworks and APIs such as MFC and ODBC. Also, lots of web application stuff like JSP and PHP.

As for awk, @jedishrfu, it is indeed a powerful tool, though I admit I never quite got the real "hacker's feel" for it. I guess grep, sed, find, and regular expressions deserve mention too. *NIX is like that: a ton of little tools, each doing only one thing, doing it extremely well, and able to be piped together. The sky is the limit. It's a joy to work with.

Although the learning curve may be a little steep! :woot:

EDIT: Thankfully, one has apropos and man at one's fingertips.
 
  • #130
Mark44 said:
I think you guys mean "assembly language," not "machine code/language."
I did, yes.
 
  • #132
Yes, there are other Unix tools, and that's why I ditched them: they could all be replicated in AWK with minimal effort.

The Zig benchmark is just incredible. Dave didn't really get into why that was the case, though; I suspect it was the compile-time functions prepping the way.

For his example, the Zig folks could have used a compile-time function to move the sieve pointer to a higher starting number and continue the sieve from there, i.e. skipping 2, 3, 5, 7, 11..., but that's just speculation.

I did find that a few entries were disqualified for doing things at compile time and not being faithful to the original algorithm. The winning Zig entry squeezed out a few more percent in performance using bit-packing, an LLVM back-end, and zero GC overhead.
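For context, the benchmark races implementations of the classic sieve of Eratosthenes. A minimal Python version of the algorithm being raced (without the bit-packing tricks the winning entries use) looks like:

```python
# Minimal sieve of Eratosthenes: the algorithm the language drag race
# benchmarks. Real entries add tricks like bit-packing the flag array.
def sieve(limit):
    is_prime = bytearray([1]) * (limit + 1)  # one flag byte per number
    is_prime[0:2] = b"\x00\x00"              # 0 and 1 are not prime
    p = 2
    while p * p <= limit:
        if is_prime[p]:
            # cross off multiples of p, starting at p*p
            run = len(is_prime[p * p :: p])
            is_prime[p * p :: p] = b"\x00" * run
        p += 1
    return [i for i, flag in enumerate(is_prime) if flag]

assert sieve(30) == [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```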
 
  • #133
My first few experiments with Zig were impressive, but there is still API churn between releases, understandable since it is only at version 0.15.2. The churn caused some confusion and issues to debug; not as smooth as learning C.
 
  • #134
For ZeroMQ, having a Zig wrapper around the calls is okay, but for CUDA I'm not sure what you'd be missing using the Zig approach. Fun times looking at what can be done.
 
