Crosson said:
I wouldn't say that Perl fully supports functional programming, one reason being that functions in Perl are not treated as first class values the same way they are in LISP or Mathematica.
In what sense? What can I not do to a function in Perl? I checked Wikipedia, and it says first-class functions are one of Perl's strengths.
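For concreteness, the operations usually meant by "first class" are things like storing a function in a variable or data structure, passing it as an argument, and returning a new one from another function. Here is a minimal sketch in Python (Perl's anonymous subs and code references, `sub { ... }` and `\&name`, allow the same moves):

```python
def twice(f):
    """Return a NEW function that applies f two times."""
    return lambda x: f(f(x))

inc = lambda x: x + 1                            # a function stored in a variable
ops = {"inc": inc, "double": lambda x: 2 * x}    # functions stored in a data structure

print(twice(inc)(5))        # passed as an argument, returned as a value -> 7
print(ops["double"](21))    # looked up and called at runtime -> 42
```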
Maybe we are talking about the same thing, maybe not. I'd study any example you provided.
You can certainly build regular expressions, compile them, and use them all at runtime in Perl (and several other languages).
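To illustrate what I mean by building and compiling a pattern at runtime, here is a sketch in Python's `re` module (the word list is made up; Perl's `qr//` serves the same purpose):

```python
import re

# Build a pattern from data that only exists at runtime, then compile and use it.
words = ["foo", "bar"]
pattern = re.compile("|".join(re.escape(w) for w in words))

print(bool(pattern.search("a foo here")))   # True
print(bool(pattern.search("nothing")))      # False
```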
That is a judgement about what you prioritize. Instead of "don't matter" say "doesn't interest me", because it matters a lot to some people.
Well OK, point taken, but I'll go with "isn't useful to me". To me, interest and usefulness are not synonymous. That may have been our misunderstanding.
No, my issue with Matlab runs much deeper than aesthetics. The issue is RELIABILITY. All programmers make mistakes, and although it is a hassle to fix those caught at compile time, in this day and age it can literally be FATAL if the error is exposed at run time.
You don't have to talk down to me. I think it's pretty clear that chroot and I are both professional software developers with rigorous college educations. Believe it or not, I am familiar with the concept of reliability being important, but this is the first time you've mentioned that as a reason for preferring Mathematica.
In fact, the tradeoffs between reliability, writability, maintainability, functionality, etc. are fairly well known in the world of computer science. I can name other high-reliability languages. MATLAB (like Perl) was, quite explicitly, not intended to be a high-reliability language. Both are optimized for rapid development and should not be used in situations where catching and fixing bugs is hard. MATLAB would be the wrong tool for the job in those cases. If that's a strength of Mathematica, fine. It then joins a crowded field of languages that are optimized for safety.
Suppose you make a typo when typing the call to a function, but all that happens is that you instantiate an array of garbage.
Wait, can you write a code example? Not that I'm claiming that MATLAB is very safe (even MATLAB doesn't make that claim), but in this particular case I don't see what you mean. An array is instantiated when it appears on the LHS of an assignment statement, something you can't usually do with functions. And when you instantiate an array, you don't get garbage, at least in MATLAB.
My problem with the "bolt-on packages" is the lost opportunity for an orthogonal design that they cause, and that is a design issue, not aesthetics.
Orthogonal design means splitting up code in such a way that the pieces are not interdependent. It has little to do with syntax. While there certainly are issues with syntax design, they are engineering issues (tradeoffs between competing goals), not "right versus wrong" issues.
No, I'm not. I am referring to a compiler in the sense of my previous post, in this case an interpreted compiler.
Did you invent this term for the purposes of this conversation? In my giant Webster's dictionary, and in all 5 of the definitions on dictionary.com (one of which is from a specialized computing jargon dictionary), and on Wikipedia, and in every programming-related college course I've ever taken, a compiler is defined as a program that turns source code into some lower-level language, usually machine code. I think the term you are actually looking for is "lexer".
Our brains are very powerful; they interpret raw sensory data, which is a syntax more complex than any that has been constructively imagined.
Well how would you expect to imagine it if it's supposed to be beyond what a human can process? (Just kidding.)
No, it isn't. Sensory data is not a syntax. What's more, it is processed by specialized brain structures devoted to processing that specific data. Syntax of languages (including computer languages) is processed in a different region of the brain. And there is evidence (http://www.unreasonableman.net/2005/03/how_many_variab.html) that humans can only handle about 4 variables in a problem before they lose track.
Non-elegant programs typically overspecify tasks, and elegant programs read more similarly to natural language instructions.
Actually, you're confusing two different properties. I can write code that is not very readable, but does not "overspecify" much at all, and I can write code that "overspecifies" a lot in the interests of improving readability. And yes, sometimes it's nice to strive for both qualities, when I don't have to worry much about other considerations.
By the way, overspecifying is not something "generally agreed" to be a bad thing. It makes it less likely that one minor mistake will cause your program to do something entirely different from what you intended, without any errors. And, it can make code more readable.
I know (all too well) that mathematicians despise this "redundant information", but experienced programmers gradually learn to love it. Redundancy can be a really good thing, because it gets your point across firmly, both to the reader and to the compiler.
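A toy example of the kind of redundancy I mean, in Python (the function and its contract are made up for illustration): the type hints and the assert restate what the code already implies, and that restatement is exactly what helps the reader and catches a slip early.

```python
def average(values: list[float]) -> float:
    """Arithmetic mean of a NON-EMPTY list of numbers."""
    # Redundant with the docstring and the division below, but it turns a
    # confusing ZeroDivisionError into a clear statement of intent.
    assert len(values) > 0, "average of an empty list is undefined"
    return sum(values) / len(values)

print(average([1.0, 2.0, 3.0]))   # 2.0
```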
Think of using subprocedures to improve the understandability of a program; some ways are more elegant than others, e.g. a few subprocedures that work together well vs. many highly independent, special-case functions.
Actually, the latter is preferred in all cases I know of except in languages where function calls have high overhead. Highly independent, special case functions are "orthogonal design". We generally want that, unless we don't have time. It's a huge win for many reasons.
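Here is a toy sketch, in Python, of what I mean by highly independent functions (the names and the little "parsing" task are invented for illustration): each piece depends on nothing but its arguments, so each can be tested, replaced, or reused on its own, and the composite is just a composition of the pieces.

```python
def strip_comments(line: str) -> str:
    """Drop everything from the first '#' onward."""
    return line.split("#", 1)[0]

def tokenize(line: str) -> list[str]:
    """Split a line into whitespace-separated tokens."""
    return line.split()

def parse(line: str) -> list[str]:
    # Composed from independent pieces rather than one entangled routine.
    return tokenize(strip_comments(line))

print(parse("mov r1 r2  # copy"))   # ['mov', 'r1', 'r2']
```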