C Programming: What's Next After Learning C?

• Thread starter: Drakkith
AI Thread Summary
The discussion revolves around the practical applications of C programming and the transition to learning C#. The original poster expresses frustration with their limited exposure to real-world programming scenarios, having only worked within Visual Studio and basic text-based interactions. Suggestions include deepening their understanding of C by applying it to personal interests, such as astrophotography or game development, and exploring graphics programming with OpenGL. Participants emphasize the importance of mastering C concepts like pointers, memory allocation, and data structures, as these skills are foundational for programming in any language. They also recommend using Integrated Development Environments (IDEs) like Eclipse or NetBeans for better programming experiences. The conversation touches on the value of practical projects, such as simulations or automation tasks, to enhance programming skills and understanding. Overall, the consensus is to focus on applying C effectively before moving on to higher-level languages like C#.
  • #51
True, I meant to say assembly language for whatever system it is.
Pure machine code is all 0s and 1s and might not even be typed on a keyboard; it might be input by physically toggling dozens of individual switches.
Does anyone anywhere actually do that these days?
 
Last edited:
  • #52
Drakkith said:
So I'm about to be done with my first class in programming where we learned C programming. Unfortunately, I don't actually know what you can do with it. . . . I don't really have any specific applications that I'm thinking of.

Nidum said:
Real world applications programming is a thousand times more challenging and more interesting than just shuffling fictional data sets around and doing pointless graphics.

I agree with Nidum above.

Beyond that, I'm speaking as someone who many years back did a lot of script-based programming to aid my work as a technical writer documenting software; who wrote many automation routines just for my own enjoyment at home; who used to interview a lot of programmers; who taught himself the rudiments of C for the fun of it; and who did other DIY projects, like writing an entire shared-editing website (back before shared editing became commonplace) for a team of writers on a book project, completely in Python objects - but who doesn't program at all these days. To me, the disconnect you are experiencing seems directly related to not having any applications in mind, nor perhaps any interest quite yet in programming as a culture (e.g. how language design relates to purpose, as Paul Graham once wrote about).

Which I admit surprises me. I would have thought that in a school setting, your profs would have laid out a general path or out-branching paths in terms of why and how to learn programming. And likewise I would have thought it a natural inclination to come up with some candidate applications that either interest you personally, or seem possibly relevant to your potential career path or paths; and then to take it from there. But perhaps this opening course in C was taught the way my freshman year English lit classes were taught long ago - by rote, very distanced, not much guidance?

Anyway, C is just one language - very far from a stopping point or even necessarily something you need to get good at right away; the point is that having learned a little bit about C, learning your next language ought to be easier. Play with C if you think you'd enjoy it; otherwise I'd suggest getting on with finding applications. You can do that by learning one or two more languages; this makes it more likely that at some point you will experience serendipity, i.e. learn about potential applications which are interesting and/or useful to you. Hardcore programmers used to be obsessed with maximizing efficiency through really frugal algorithms - really geeky, math-driven approaches - and I imagine there must still be some of that today in some areas; meanwhile, cheap fast hardware long ago opened up scripting. All in all, programming always seemed more like play than work; so you could think in terms of playing around, experimenting.
- - -

P.S. Back when I was still doing freelance editing, I edited a textbook on programming for biologists - Python for scripting, various tools available in the Linux shell, and Arduino for field data: Practical Computing for Biologists, Haddock and Dunn. It's nothing to do with physics, but a glance at the table of contents via https://www.amazon.com/dp/0878933913/?tag=pfamazon01-20 does demonstrate that a programming language by itself is just a small part of what computing in science is about. If you're a carpenter, you're going to want more than a single hammer to build something with.
 
Last edited by a moderator:
  • Like
Likes Drakkith
  • #53
rootone said:
True, I meant to say assembly language for whatever system it is.
Pure machine code is all 0s and 1s and might not even be typed on a keyboard; it might be input by physically toggling dozens of individual switches.
Does anyone anywhere actually do that these days?
I doubt anyone does now but I did it when I was starting out.
 
  • #54
phinds said:
I doubt anyone does now but I did it when I was starting out.
Me too, when I got my first micro, a MITS Altair 680 with the Motorola 6800. The computer had a bank of toggle switches for addressing and data, and it took forever to enter a program that didn't work. I tried to light up the LEDs with a pattern, but it just didn't do anything.

http://www.vintage-computer.com/mitsaltair680b.shtml

I was tempted to buy a TeleVideo terminal and use the onboard TTY monitor program via RS-232, but it was not cost effective at $700 for the TeleVideo. I eventually went with the TRS-80 from Radio Shack.

http://www.vintage-computer.com/trs80mod1.shtml

In hindsight, the MOS KIM-1 6500 system board was a better deal, but I was out of money by then. Programming for it was in hexadecimal.

http://www.vintage-computer.com/kim1.shtml
 
  • Like
Likes OmCheeto
  • #55
UsableThought said:
I would have thought that in a school setting, your profs would have laid out a general path or out-branching paths in terms of why and how to learn programming.

Lord, no. I only took the class because I needed "programming experience" before taking a Digital Logic class later on. There was absolutely no information given to us about branching out or anything else like that.
 
  • Like
Likes OmCheeto
  • #56
Drakkith said:
...There was absolutely no information given to us about branching out or anything else like that.

I can see now, that you are on the path, young Drakkith.

Acknowledging the landing of the flock of impending black swans, is a sure sign, that you are on the path, to... the dark side...

Code is your father...
 
  • Like
Likes Drakkith
  • #57
Drakkith said:
So I'm about to be done with my first class in programming where we learned C programming. Unfortunately, I don't actually know what you can do with it. All of our programs in my class have involved us programming and running everything using Visual Studio, not developing standalone executables or something. The only way we've had our programs interact with the 'outside world' is through the keyboard and screen or through text files.

To be honest I feel like I've been trained to use a lot of specialized tools but don't have any use for them. Perhaps like a carpenter with no wood and no work.

I've also considered learning something like C# and I'm just curious as to what the differences are in what you can develop with each one, if there are any differences of course. I know that C# is a much higher-level language than C, but the extent of my knowledge mostly ends there. If I'm not looking to develop blazing-fast programs for huge amounts of calculations, is C# a good choice? I don't really have any specific applications that I'm thinking of.

Thanks.

In C# you can easily create just about any GUI-based software application.
 
  • Like
Likes Drakkith
  • #58
jedishrfu said:
Me too, when I got my first micro, a MITS Altair 680 with the Motorola 6800. The computer had a bank of toggle switches for addressing and data, and it took forever to enter a program that didn't work.
Did it take more time or less time to enter a program that did work? :oldbiggrin:
 
  • Like
Likes jedishrfu and Drakkith
  • #59
Mark44 said:
Did it take more time or less time to enter a program that did work? :oldbiggrin:
Oh, entering the ones that actually WORKED took WAY longer :smile:
 
  • #60
phinds said:
Oh, entering the ones that actually WORKED took WAY longer :smile:
Isn't that the truth?

In your upcoming Insights article, you mentioned some of the early programming languages, one of them being PL/1. The first programming class I took was in 1972, with the language used being PL/C. The C in the name indicated that it was a compact subset of PL/1.

Although we didn't have to set switches on a console, writing code seemed just as arcane. We used a keypunch machine to make the IBM (or Hollerith) cards, added a few Job Control Language (JCL) cards at the front and back of our card decks, and submitted them. The computer operator would run the cards through a reader, which would transcribe the code onto a tape reel that was subsequently mounted on the actual computer. The results came back several hours later, or even the next day. Most of my early programs produced no recognizable output -- just many pages of what looked like gibberish to me (a core dump of the computer's memory). Ahh! The good old days!
 
  • Like
Likes jedishrfu, Borg and QuantumQuest
  • #61
Mark44 said:
Isn't that the truth?

In your upcoming Insights article, you mentioned some of the early programming languages, one of them being PL/1. The first programming class I took was in 1972, with the language used being PL/C. The C in the name indicated that it was a compact subset of PL/1.

Although we didn't have to set switches on a console, writing code seemed just as arcane. We used a keypunch machine to make the IBM (or Hollerith) cards, added a few Job Control Language (JCL) cards at the front and back of our card decks, and submitted them. The computer operator would run the cards through a reader, which would transcribe the code onto a tape reel that was subsequently mounted on the actual computer. The results came back several hours later, or even the next day. Most of my early programs produced no recognizable output -- just many pages of what looked like gibberish to me (a core dump of the computer's memory). Ahh! The good old days!
Yeah, I remember them well. In addition to the old punch-card-overnight-submission circus using Algol and later Fortran, I later ran minicomputers where you had to load an editor from punched paper tape, load your source code into the editor, do your edit, output a new source code tape, load the assembler (there WAS no compiler on the first one I worked on), load the source code tape into the assembler, produce an output object code tape, load the object code tape and then run the program. I think I may have left out a few steps. It took most of a day to do a single program turn-around. What fun.
 
  • Like
Likes QuantumQuest
  • #62
Mark44 said:
Did it take more time or less time to enter a program that did work? :oldbiggrin:
The working ones took forever and a day. And now, presenting Mary Hopkin singing Those Were the Days:

 
  • Like
Likes QuantumQuest and Nidum
  • #63
I recall working on a character-based plotting program which I got working, kind of, but it didn't work the way I expected: the line plot was there in the output, but everywhere I expected a space there was a zero. It didn't dawn on me until much later to initialize the array of characters with spaces.

I would get two or three chances a week, on Friday nights at our Explorer Post 635 meeting, to fix the program, hoping each time that it would work but to no avail. Finally, one of my mentors mentioned the initialization solution.
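For anyone following along in C, here is a minimal sketch of that kind of fix (the original program long predates C; this is just an illustration of the idea): fill the plot buffer with spaces before plotting anything, otherwise every cell you never touch prints as something other than a blank.
Code:
#include <stdio.h>
#include <string.h>

#define WIDTH  40
#define HEIGHT 10

int main(void)
{
    char plot[HEIGHT][WIDTH + 1];   /* +1 for the terminating '\0' on each row */

    /* The crucial step: initialize the whole buffer to spaces.
       Skip this and every cell the plotting code never writes shows up as garbage. */
    memset(plot, ' ', sizeof plot);
    for (int row = 0; row < HEIGHT; row++)
        plot[row][WIDTH] = '\0';

    /* Plot a simple diagonal line. */
    for (int x = 0; x < WIDTH; x++)
        plot[(x * HEIGHT) / WIDTH][x] = '*';

    for (int row = 0; row < HEIGHT; row++)
        puts(plot[row]);

    return 0;
}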

I never got back to programming until I graduated from college and was snapped up by the company that sponsored the post. I was the first to return (poster boy for the success of the sponsorship) and they thought I had the best potential to succeed. Boy, were they wrong, I mean right, I mean...
 
  • #64
Anyone remember the text-based 'dungeon master'?
The progenitor of nearly all games in existence today.
Unless you wanted impressive graphics, in which case 'pong' would have been the thing.
 
  • #66
I used assembly language and C for developing firmware for Intel microcontrollers years ago when I was designing controller boards. C# is not a bad language. It has direct file access capabilities, but I still prefer C for certain tasks.
 
  • #67
Vanadium 50 said:
There are a number of schools where a CS degree involves learning a bunch of languages, but not anything in data structures, algorithms, numerical methods, etc. I don't much care for these programs. Someone who knows one language well and knows how to program can figure out other languages if needed. If you can't code in one language, though, you probably can't code in any. Google "Fizzbuzz" for rants about this.

If you mean four-year university programs, then this is a rather shocking revelation. The essence of computer science, IMHO, is the study of data structures and algorithms. How can any four-year CS program get away with this? But if you mean shorter programs, designed to provide a credential in one or more programming languages, then perhaps they want to teach the core CS topics in the context of learning a programming language?
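For reference, since it shows up again in the Julia snippets later in this thread: FizzBuzz is a deliberately trivial screening exercise - print the numbers 1 to 100, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". A minimal C version:
Code:
#include <stdio.h>

int main(void)
{
    /* Classic FizzBuzz screening exercise. */
    for (int i = 1; i <= 100; i++) {
        if (i % 15 == 0)
            puts("FizzBuzz");
        else if (i % 3 == 0)
            puts("Fizz");
        else if (i % 5 == 0)
            puts("Buzz");
        else
            printf("%d\n", i);
    }
    return 0;
}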
 
  • Like
Likes jedishrfu
  • #68
I've had a love/hate relationship with several programming languages. After several years of professional programming, I have finally settled on only two: C and assembly language. The latter is of course only for special situations where it's truly needed. Although I must admit I will always admire the original Roller Coaster Tycoon, not only because it's an excellent game, but because it was written entirely in assembly. Keep in mind, I no longer have anything to do with web programming, so I don't need to think about that zone of madness.

Once I was very enthusiastic about C++, until I realized that after luring me into its web with certain promises, it turned out to be a baroque nightmare. As for other languages, let's just say that aside from Pascal they have all left me with bad memories. Perhaps in a perfect world, as designed by the finest Swiss craftsmanship, we would all be programming on Oberon systems. But that's not real life.

My most traumatic love/hate affair has been with LISP. But I must forget that in order to move on with my life.

I've found that by limiting myself to C, I don't need to spend any time on learning language features, or worrying about whether there is a "better" language out there somewhere.

Of course I still need to wrestle with the so-called "Windows." The nightmare never ends. Sometimes I long for an extended DOS.
 
  • Like
Likes jedishrfu
  • #69
Yeah, I've had similar experiences. I programmed the traditional four: verbose Cobol, fragile Fortran, majestically awesome Macro Assembler. After reading K&R, I started learning concise C, and when Cfront came out a few years later I started to learn complicated C++ and the STL, which was somewhat painful.

I was glad to jump to Java, with its single-inheritance model and its run-everywhere feature, but I disliked the seemingly infinite number of classes, methods, and overloads, which has since been tamed by IDE tools like NetBeans and Eclipse.

I continually search for the perfect language, like Diogenes of old, but have yet to find it. I do gravitate to scripting languages for prototyping ideas. My first go-to language was amazing Awk; when the idea outgrows Awk, I switch to perennial Python, then on to groovy Groovy to leverage some Java capability.

I've dabbled in superformal Scala and cute Clojure, but the learning curves were a bit much with little time to spend. Also, Scala tended to break compatibility with each new major release, even though it's very powerful and still developing. My coworker had sent me an article on folks at startups who abandoned Scala for Clojure. Clojure is basically Lisp on Java, with extras to work seamlessly with Java and run everywhere Java can.

My current ventures have been with Node.js and JavaScript, which has really surprised me with its expressiveness. I've used it to build web apps without the complexity and built-in restrictions of Tomcat or Jetty, mostly with restricted access to dynamically generated files.

I've been looking at Elm, with its promise of functional reactive programming and its potential as a successor to JavaScript. Elm actually compiles to JavaScript. Note its time-traveling debugger feature, where you can replay your sequence of interactions while tweaking the code live. I also like the functional programming aspect of Elm, as it makes code far easier to debug since you can trace where data came from and where it's going.

Julia is another interesting functional programming language, with method overloading but without the OO aspect. It's a potential open-source successor to MATLAB. The IJulia notebook is an interesting way to develop or teach coding: it's a web-page input editor and program output display that's great for setting up a lesson and walking through various snippets, running each as you go along.

My favorites are Basic and Fortran for their simplicity and because they were the first ones I wrote games in.

Next came TEX, a cool Honeywell 6000 scripting language built on top of the timesharing line editor. It was like Awk and Lisp merged, where code could write code and then execute what it wrote. There's a Wikipedia article that I once wrote on TEX, aka the Text Executive programming language, if you want to know more about it.

Next come C and Awk, with Awk being a kind of scripting version of C without structs and unions. Awk rejuvenated my interest in programming years ago, after I found a version that ran on DOS. I added ANSI codes to my programs for colorful character-based screens.
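For anyone who never played with them, here is a tiny C illustration of the kind of ANSI escape sequences involved (on DOS these needed ANSI.SYS loaded; modern terminal emulators understand them natively):
Code:
#include <stdio.h>

int main(void)
{
    /* "\033[<n>m" is an ANSI SGR escape sequence: 31 = red foreground,
       32 = green, 34 = blue, 0 = reset to the terminal's defaults. */
    printf("\033[31mred\033[0m ");
    printf("\033[32mgreen\033[0m ");
    printf("\033[34mblue\033[0m\n");
    return 0;
}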

Next come Rexx and Python, two fairly syntax-free languages that were quite expressive. Rexx had a feature where the default value of a variable was its own name, instead of the now more popular null. Rexx also had dot notation to give a variable attributes with values, too.

I also liked Prolog and Forth for their unique programming paradigms. Forth was like a reverse Polish notation Lisp without parentheses, and Prolog's goal-directed paradigm was a challenge to master. Sometimes it's good to break out of the procedural OO way of thinking.

And the search continues...
 
  • Like
Likes Aufbauwerk 2045 and QuantumQuest
  • #70
"If you think C++ is not overly complicated, just what is a protected abstract virtual base pure virtual
private destructor, and when was the last time you needed one?"

From van der Linden, Expert C Programming.

For those who have not read it, here is the famous anti-C++ rant by Linus Torvalds.

https://lwn.net/Articles/249460/

I would put it differently. C and assembly language is enough. Apply Occam's razor. The time I spent learning C++ and other languages could have been better spent developing tools in C. Now I'm a born-again C programmer and the universe is my oyster.
 
  • Like
Likes FactChecker and jedishrfu
  • #71
and what a universal C to frolic in...
 
  • Like
Likes Aufbauwerk 2045
  • #72
jedishrfu said:
I've been looking at Elm, with its promise of functional reactive programming and its potential as a successor to JavaScript. Elm actually compiles to JavaScript. Note its time-traveling debugger feature, where you can replay your sequence of interactions while tweaking the code live. I also like the functional programming aspect of Elm, as it makes code far easier to debug since you can trace where data came from and where it's going.

Julia is another interesting functional programming language, with method overloading but without the OO aspect. It's a potential open-source successor to MATLAB. The IJulia notebook is an interesting way to develop or teach coding: it's a web-page input editor and program output display that's great for setting up a lesson and walking through various snippets, running each as you go along.

I'm not totally comfortable calling Julia a functional language -- but I'm happy to see it mentioned in these forums. It is a very cool language that is immensely useful for technical computing. People need at least one other language under their belt first, in my opinion, as it is still an immature language and you'll have to figure some stuff out on your own. But again, it's a very cool language.

I haven't used Elm, but a friend went to their hack night last week and spoke quite highly of it.
 
  • #73
I hope someone is not REALLY writing code this way. Making code unnecessarily complicated leads to system reliability problems. The concept of abstraction is very powerful when one looks at a class as a specification or blueprint. Contractually, abstract methods in a base class require the inheritor to implement those methods. Virtual methods require a default implementation in the base class in the event the inheritor does not implement them; hence virtual methods are contractually optional in the derived class. If a virtual method is implemented in the derived class, it becomes useful polymorphically in code.
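Since this is a C thread: the same contract can be emulated in plain C with a table of function pointers. The sketch below is only an illustration (the Shape/Square names and layout are made up for the example); a NULL slot stands in for a "pure virtual" that the derived type must fill in, while a pre-filled slot acts as a virtual with a default implementation that may be overridden.
Code:
#include <assert.h>
#include <stdio.h>

/* "Base class": a struct whose function pointers play the role of virtual methods. */
typedef struct Shape {
    double (*area)(const struct Shape *self);     /* "pure virtual": derived type must set this */
    void   (*describe)(const struct Shape *self); /* "virtual": default provided, may be overridden */
} Shape;

static void shape_describe_default(const Shape *self)
{
    printf("a shape with area %.2f\n", self->area(self));
}

/* "Derived class": embeds the base as its first member and fills in the slots. */
typedef struct {
    Shape base;
    double side;
} Square;

static double square_area(const Shape *self)
{
    const Square *sq = (const Square *)self;  /* valid because base is the first member */
    return sq->side * sq->side;
}

static Square square_make(double side)
{
    Square sq = { { square_area, shape_describe_default }, side };
    return sq;
}

int main(void)
{
    Square sq = square_make(3.0);
    Shape *s = &sq.base;

    assert(s->area != NULL);  /* enforce the "pure virtual" part of the contract */
    s->describe(s);           /* polymorphic call through the base pointer */
    return 0;
}
It is clunkier than the C++ spelling, but it makes the machinery behind virtual dispatch explicit.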
 
  • #74
Here's a Reddit discussion on Julia and its functional programming features.

https://www.reddit.com/r/Julia/comments/5892nc/why_is_julia_called_a_functional_language/

It's not a pure functional programming language, but you can do functional programming in it, and here's a collection of methods that provide more support for lazy evaluation in Julia:

https://github.com/MikeInnes/Lazy.jl

I guess it boils down to what your definition of functional programming is, unless you prefer alternative facts.
 
Last edited:
  • #75
I don't know about being a functional language (since this depends on what you include in the definition and how strict you are) but Julia clearly draws a lot of inspiration from Lisp (up to and including a representation for code as trees of symbols and other objects, and Lisp-like macros).

One very functional-/Lisp-ish feature is that Julia doesn't seem to make much of a distinction between statements and expressions and some of the block/control-flow constructs can return useful values. So Julia will happily run code like
Code:
i = 7   # any integer; the original snippet assumes i is already defined
println(if i % 15 == 0
            "FizzBuzz"
        elseif i % 3 == 0
            "Fizz"
        elseif i % 5 == 0
            "Buzz"
        else
            i
        end)
and
Code:
# Compute and print 10th Fibonacci number
println(let
            a, b = 1, 0
 
            for n = 2:10
                a, b = a + b, a
            end

            a
        end)
(Though I don't know if you're likely to see much Julia code written like this in practice.)

Julia's version of object-oriented programming (basically structures/records and, separately, methods that you can specialise for different argument types) also looks a lot like a simplified version of CLOS.
 
Last edited:
  • #76
David Reeves said:
"If you think C++ is not overly complicated, just what is a protected abstract virtual base pure virtual
private destructor, and when was the last time you needed one?"

From van der Linden, Expert C Programming.

For those who have not read it, here is the famous anti-C++ rant by Linus Torvalds.

https://lwn.net/Articles/249460/

I would put it differently. C and assembly language is enough. Apply Occam's razor. The time I spent learning C++ and other languages could have been better spent developing tools in C. Now I'm a born-again C programmer and the universe is my oyster.

I won't disagree that C++ finally became an untamed beast, but there is a multitude of factors that explain this, not the least of which are ever-increasing complexity, fierce competition, and personal ambitions. In a non-identical but very similar way, JavaScript was put on steroids, and today you can't even think of writing your own JS code for any serious-sized site. Using a framework is the only way to go. Now, this is definitely not a good thing regarding the freedom that has been stolen from the hands of web programmers. But on the other hand, that is the course that any widely used programming language takes sooner or later.

Now, saying that C and assembly is enough is somewhat general and exaggerated, in my opinion. I am also a big fan of C, and yes, I agree that writing tools in C is time very productively spent. Assembly programming, as far as I can tell - as I am not a system-level programmer - has few uses anymore. I mostly program in Java, but I have done a fair amount of programming in C++ in my career, and I don't think in any way that we can throw it in the garbage bin just because of its high complexity. There are methods, strategies, and frameworks that can make your life a whole lot easier, albeit at the price of heavy dependence on others' code and constructs. But again, such things are intimately related to the evolution of software.
 
Last edited:
  • Like
Likes Aufbauwerk 2045
  • #77
The thing about programming languages though is that they often build on one another. Java is built on C and C is built on assembler and assembler on machine code at some point.

It's true that there are cross-compilers for various processor hardware, but it still boils down to a dependency on assembler and ultimately machine code somewhere back in time.

We stand on the shoulders of giants...

"If I have seen further, it is by standing on the shoulders of giants."[2]

-- Sir Isaac Newton in a letter to Sir Robert Hooke

https://en.wikipedia.org/wiki/Standing_on_the_shoulders_of_giants

and so it is with programming languages, each learning from and depending on the ones of the past...
 
  • #78
I would add that for beginners, or even for people who have studied some C but don't understand how to write simple games and so on, BASIC is a good choice. FreeBasic is OK.

It's amazing what you can do with BASIC if you know enough about programming. For example, Black Annex is a game that got some attention because it was written in BASIC.

http://www.pcworld.com/article/2033318/black-annex-is-the-best-qbasic-game-youve-ever-seen.html

One issue that seems to worry people about C is pointers. They think they lead too easily to bugs, but wonder how to implement certain data structures without them. You don't always need C-style pointers. There's a good book called Visual Basic Algorithms, by Stephens, which shows how to implement things like linked lists without C-style pointers. I think this may be useful instruction for the beginner.
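As a taste of that idea, here is a small sketch in C (my own illustration, not taken from the Stephens book) of a singly linked list built from array indices instead of pointers; the same approach carries over directly to BASIC or any other language without pointer types.
Code:
#include <stdio.h>

#define MAX_NODES 10
#define NIL (-1)            /* index value used as the "null pointer" */

/* Parallel arrays hold the node fields; a plain int index plays the role of a pointer. */
static int value[MAX_NODES];
static int next_node[MAX_NODES];
static int head = NIL;      /* index of the first node, or NIL if the list is empty */
static int free_slot = 0;   /* next unused slot in the arrays */

/* Insert a value at the front of the list; returns 0 on success, -1 if full. */
static int push_front(int v)
{
    if (free_slot >= MAX_NODES)
        return -1;
    value[free_slot] = v;
    next_node[free_slot] = head;   /* "point" the new node at the old head */
    head = free_slot++;
    return 0;
}

int main(void)
{
    for (int v = 1; v <= 5; v++)
        push_front(v * 10);

    /* Walk the list by following indices instead of pointers. */
    for (int i = head; i != NIL; i = next_node[i])
        printf("%d ", value[i]);
    printf("\n");   /* prints: 50 40 30 20 10 */

    return 0;
}
Deleting nodes would need a free list, which is also just an index; the point is only that the "pointers" here are ordinary integers.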

Of course I'm not saying BASIC is the language of choice for commercial applications, but I think it still has value for learners.

 
Last edited by a moderator:
  • Like
Likes Borg and Drakkith
  • #80
jedishrfu said:
The thing about programming languages though is that they often build on one another. Java is built on C and C is built on assembler and assembler on machine code at some point.
From a historical perspective, the guys that developed the first version of C were aware of a language called B, a simplified version of BCPL, but it's not clear how much of C was based on B.

http://en.wikipedia.org/wiki/C_(programming_language)#History

High-level programming languages like Fortran, Cobol, and APL predate C, and these languages don't have much of a tie to assembler, other than that the initial version of a compiler is written using some existing language, which could include assembly.
 
  • #81
rcgldr said:
but it's not clear how much of C was based on B.

Well an early tutorial is available online: https://www.bell-labs.com/usr/dmr/www/btut.html. It contains what is apparently the original "hello, world" example:
Code:
main( ) {
 extrn a, b, c;
 putchar(a); putchar(b); putchar(c); putchar('!*n');
}

a 'hell';
b 'o, w';
c 'orld';
I think the reason for the way the characters are grouped together is that B had essentially only one datatype, which was a machine word on whatever architecture B was implemented on, which happened to be large enough to pack four characters into.

AFAIK the curly brackets {} also come from B.
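For the curious, here is a rough C illustration of that packing idea (it makes no claim about how B actually laid bytes out on any particular machine): four 8-bit characters stuffed into one 32-bit word and pulled back out.
Code:
#include <stdint.h>
#include <stdio.h>

/* Pack four characters into one 32-bit word, first character in the high byte. */
static uint32_t pack4(const char *s)
{
    return ((uint32_t)(unsigned char)s[0] << 24) |
           ((uint32_t)(unsigned char)s[1] << 16) |
           ((uint32_t)(unsigned char)s[2] << 8)  |
            (uint32_t)(unsigned char)s[3];
}

int main(void)
{
    uint32_t words[] = { pack4("hell"), pack4("o, w"), pack4("orld") };

    /* Unpack and print each word, high byte first. */
    for (size_t w = 0; w < sizeof words / sizeof words[0]; w++)
        for (int shift = 24; shift >= 0; shift -= 8)
            putchar((int)((words[w] >> shift) & 0xFF));
    putchar('\n');   /* prints: hello, world */

    return 0;
}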
 
  • #82
David Reeves said:
After several years of professional programming, I have finally settled on only two: C and assembly language.
Those are probably my favorites, as well. My background is pretty mixed, as I taught lots of math classes and lots of programming classes at a community college before switching careers and working as a programming writer at the large software firm in Redmond, WA. I've since retired, but have continued to teach programming classes at one community college (C) and will teach C++ at a different college next quarter. In the fall quarter I'm on the schedule to teach a class in MIPS assembly, something I'm really looking forward to. If someone is learning assembly for the first time, MIPS is a really good place to start.

David Reeves said:
The latter is of course only for special situations where it's truly needed. Although I must admit I will always admire the original Roller Coaster Tycoon, not only because it's an excellent game, but because it was written entirely in assembly.
My first computer was an Apple //e. A favorite game of mine was a pinball simulation called Night Mission, which I believe was written entirely in 6502 assembly. It was very responsive, not easy to do on an 8-bit processor. Back in about that same period, there was a very popular word processor called WordStar that was coded entirely in 8088 assembly.
 
  • Like
Likes Aufbauwerk 2045
  • #83
Mark44 said:
If someone is learning assembly for the first time, MIPS is a really good place to start.
Assuming it's not the version of MIPS with the delayed branch that executes one instruction after the branch before actually branching. I've seen this used on embedded processors like the MicroBlaze, but those aren't intended to teach assembly.
Mark44 said:
My first computer was an Apple //e. A favorite game of mine was a pinball simulation called Night Mission, which I believe was written entirely in 6502 assembly. It was very responsive, not easy to do on an 8-bit processor.
Almost all of the games in those days were written in assembly, with a few exceptions, such as a turn-based artillery game that was written in Basic. Most games were only 8KB to 16KB (for the code, or in some cases the cartridge ROM). The Atari 400 / 800 / 65XE / 130 XE were also 6502-based, but they ran at ~1.78 MHz (for NTSC, or a bit less for PAL) instead of 1 MHz.

Mark44 said:
Back in about that same period, there was a very popular word processor called WordStar that was coded entirely in 8088 assembly.
WordStar was originally coded in 8080 assembly for use on CP/M systems. It was ported to the PC with the help of an 8080 to 8088 assembly translator. As a bit of trivia, part of the reason for the 8088 / 8086 LAHF and SAHF instructions was to make translating 8080 assembly code to 8088 / 8086 code simpler (to handle push psw, pop psw, ... ).
 
Last edited:
  • #84
Mark44 said:
If someone is learning assembly for the first time, MIPS is a really good place to start.

rcgldr said:
Assuming it's not the version of MIPS with the delayed branch that executes one instruction after the branch before actually branching. I've seen this used on embedded processors like the MicroBlaze, but those aren't intended to teach assembly.
What I'm working with at the moment is QtSPIM. Apparently there's a switch in the GUI where you can set things like delayed branches, delayed loads, and a few other settings. There's another simulator, MARS. I don't have it, but am considering downloading it.
 
  • #85
jedishrfu said:
Here's a Reddit discussion on Julia and its functional programming features.

https://www.reddit.com/r/Julia/comments/5892nc/why_is_julia_called_a_functional_language/

It's not a pure functional programming language, but you can do functional programming in it, and here's a collection of methods that provide more support for lazy evaluation in Julia:

https://github.com/MikeInnes/Lazy.jl

I guess it boils down to what your definition of functional programming is, unless you prefer alternative facts.

I'm not totally convinced that every language needs to fit in one of 2 or 3 boxes like functional or object-oriented -- though people seem to want to place things in said boxes. Some languages reek of functional, some less so. People tend to be a bit over the top about this kind of thing on the internet, but it's interesting enough that I'll ask around at JuliaCon this year.
 
  • #86
QuantumQuest said:
I won't disagree that C++ finally became an untamed beast, but there is a multitude of factors that explain this, not the least of which are ever-increasing complexity, fierce competition, and personal ambitions. In a non-identical but very similar way, JavaScript was put on steroids, and today you can't even think of writing your own JS code for any serious-sized site. Using a framework is the only way to go. Now, this is definitely not a good thing regarding the freedom that has been stolen from the hands of web programmers. But on the other hand, that is the course that any widely used programming language takes sooner or later.

Now, saying that C and assembly is enough is somewhat general and exaggerated, in my opinion. I am also a big fan of C, and yes, I agree that writing tools in C is time very productively spent. Assembly programming, as far as I can tell - as I am not a system-level programmer - has few uses anymore. I mostly program in Java, but I have done a fair amount of programming in C++ in my career, and I don't think in any way that we can throw it in the garbage bin just because of its high complexity. There are methods, strategies, and frameworks that can make your life a whole lot easier, albeit at the price of heavy dependence on others' code and constructs. But again, such things are intimately related to the evolution of software.

You make good points. In fact, I may need to do some more C++ programming myself. This is because I am working on a large simulation program, large in the sense that it has many types of entities, and is complicated, and I really think it makes sense to use C++ for this project. OOP was in fact developed for simulations. Those who are interested may enjoy reading about the history of Simula.

I used C++ when I was working on a game project with many kinds of actors, and it did fit very nicely into the OOP paradigm. But at the same time, I hate running into tasks that would be simple in C, but cause me to jump through hoops in C++. I suppose as you say there is a price to pay for the benefits of C++.

I think my main gripe against OOP in general is that I see so much OOP code where it is not necessary. What I do like about C++ is that you are not required to use OOP. I refuse to have anything to do with languages that force OOP on the programmer.
 
  • #87
I want to update my remarks about C++. I was enthusiastic about writing something in C++ again, but that lasted about five minutes. Then my natural skepticism activated, and I decided to see what's happening in the C++ world. Among other things, I looked for a well-known software application that uses C++. I wanted to see what issues came up. The inventor's website lists a number of impressive applications.

http://www.stroustrup.com/applications.html

For example, C++ is used in the F-35. This is very impressive. However, unfortunately the F-35 has been plagued by numerous software problems. I don't know how much of that software is in C++. I know that the DoD has used Ada in the past. I'm not sure why they are now using C++.

As usual, there are lots of posts on various websites by people who are just speculating about this topic. I read these speculations but I tend to regard them as useless. Probably those who know can't say. Meanwhile I am not blaming C++.

I did come across this C++ coding standard for the F-35 project. It seems rather complicated, at least to me.

http://www.stroustrup.com/JSF-AV-rules.pdf

In the old days, they had to use assembly language. Some would say it was primitive computing, but it did take us to the Moon. I saw a video recently by von Braun about the Apollo 11 Moon landing. After the landing party astronauts returned to the Command Module, one of them stated that it was easy. But if memory serves, there was a last minute glitch in the software that required Armstrong to do the landing manually.

https://qz.com/726338/the-code-that...-to-github-and-its-like-a-1960s-time-capsule/

I found these remarks about C++ in an interview with Niklaus Wirth. The interviewer asked him why no one has developed a safe version of C++.

"One may indeed wonder why nobody in the vast software industry has undertaken the task proposed by you: Defining a safe subset of C++. I can figure out two reasons: (1) The software world is eager for "more powerful" languages, but not for restrictive subsets. And (2), such attempts are doomed to fail just like attempts to strengthen the structure of a house built on sand. There are things that you simply cannot add as an afterthought."

http://www.eptacom.net/pubblicazioni/pub_eng/wirth.html
 
Last edited by a moderator: