Mathematica 6.0 Changes Everything

  • Thread starter: Crosson
  • Tags: Mathematica

Summary
Wolfram Research has launched Mathematica 6.0, claiming it as the most significant update in two decades, aiming to integrate more technical computing features cohesively. While Mathematica 5.2 was already a leader in technical computing, the new version is expected to further replace various specialized software. Discussions highlight that Mathematica excels in symbolic computation and plotting, whereas MATLAB is preferred for numerical calculations and engineering tasks due to its simplicity and user-friendly syntax. Users recognize the distinct strengths of both programs, with Mathematica's steep learning curve being a notable drawback compared to MATLAB's ease of use. The ongoing conversation suggests that while Mathematica is improving, it still has a way to go before it can fully compete with MATLAB in engineering applications.
  • #31
Crosson said:
The mainstream position is that reliability, writability, etc. are trade-offs in the short term, and that they all improve in the long term. I refer you to Fortran - less writable, less readable, less reliable, less functional, etc. than languages created more recently.

I don't know Fortran, but what I've been told is that it was (and still is) uniquely suited to be compiled to very fast code (faster than compiled C code). I've discussed this with people who use it to this day, and they point out things that seem like flaws, and show how those things improve "compilability". I also think it's considered more reliable than the average modern language, though I'm not sure.

I acknowledge that old versions of Fortran (such as F77) have some limitations that are not really necessary. That is, they have some drawbacks that don't give you anything in return (for example, 6-character function names). But my understanding was that as it has matured, it has "kept up" with the latest generation of programming languages in these respects (the 6-character limit is gone, for example), and has remained mostly a language optimized for compilation to fast code.

I suppose it can be said that genuine improvements have been made since the 1970s, but that was a long time ago, when "computer" was still an odd word to most people. Computers today are important enough and widespread enough that I think most major programming languages have come close to the edge of what can be achieved without trading off something. An exception to this is that most languages today are text-based, and it is conceivable that more sophisticated representations could become popular.

It is evidence that, if we solved syntax, we would solve it all. You are acting as if NP has been shown to be not equal to P. The fact that syntax processing is NP-complete, and yet the human brain seems to do it in polynomial time, suggests to me that P = NP. I really hope P = NP.

That's a surprising assertion. The human brain processes syntax in a small amount of time, and the syntax we use is simple syntax. I don't see how you come to the conclusion that we do it in polynomial time. We could even be doing it in exponential time. No way to know, since we don't know the "constant factor" involved, and since the problems we solve are very simple. And people make syntax errors all the time (you've made a few in this thread, and I probably have too). The more complex the syntax, the more errors people make. At least that's always been true for me. Maybe you would argue that I haven't had enough time to "adapt".

I'm going to have to create an example syntax that you can't figure out. Give me some time to work on it.

Also, I (along with the majority of CS experts) seriously doubt P=NP (new thread, maybe?). The problem has been explored so much, and the NP class of problems includes so many things that really "feel" similar to each other, and the known P problems "feel" so different from NP problems, it just seems intuitively unlikely that they are the same category. Besides which, they've both been explored so thoroughly and aggressively that I think the counterexample or "bridge" between the two would have been found by now if it existed.

Axioms are not eligible for proof or disproof (I will excuse this, and not question that you don't know what an axiom is).

I was specifically pointing out that it would be nonsense to make such a claim.

The theorem you linked me to has very little to do with the topic at hand, besides analogy (show me wrong, describe exactly how the theorem applies to what we are talking about, or reference someone who has).

I didn't mean it as only an analogy, I meant it quite literally. A programming language is literally a coding (think about the ASCII codes of the characters, if you need a bit stream). Programmer goofs are literally noise in this coding (the bitstream represents something different than the sequence of instructions in the programmer's mind). Compilers catching errors are literally using redundant information in the coding to detect bit errors (usually multi-bit).

Therefore, the number of errors in the code that is sent through the "channel" (the human-computer interface) depends directly on how much redundancy your coding carries.
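As a toy illustration of the channel analogy (my own sketch, not from the original post), here is a minimal Python example: adding one parity bit of redundancy lets a checker detect a single-bit "goof" that a bare, fully efficient coding would silently accept.

```python
# Toy illustration of the noisy-channel analogy: a redundant coding
# (here, one even-parity bit per word) lets the receiver detect
# single-bit errors that a bare coding cannot.

def encode(bits):
    """Append an even-parity bit, adding redundancy to the word."""
    return bits + [sum(bits) % 2]

def is_valid(coded):
    """The 'compiler': reject any word whose parity check fails."""
    return sum(coded) % 2 == 0

word = [1, 0, 1, 1]
coded = encode(word)          # [1, 0, 1, 1, 1]
assert is_valid(coded)

# Flip one bit (a "programmer goof"): the redundancy exposes it.
corrupted = coded.copy()
corrupted[2] ^= 1
assert not is_valid(corrupted)
```

Without the parity bit, every 4-bit word is a legal message, so no single-bit error is detectable - the same reason a maximally terse language cannot flag a typo as an error.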

I find this 'theorem intimidation' to be a form of sophistry. The fact is that mathematics is inexact in its application to reality, and impossibility theorems are especially slippery to apply: usually someone circumvents them by a loophole in the hypothesis.

I don't mean to intimidate. Maybe there is a loophole, but I've thought this way for some time, so if I am wrong I would honestly like to know how. When I look at languages that overspecify, which require you to type unnecessary extra information to do things, I find that the compiler catches more errors, because I'm unlikely to make the same mistake everywhere.

Let me use a concrete example. Imagine a language that does not require you to declare variable types. If the type is clear from the operations you perform on it, this information is redundant. Declarations are unnecessary.

But languages which require declarations remain popular precisely because they require this redundant information. With that information, the compiler can detect when you have mistakenly typed a nonexistent variable (like your MATLAB example), or when you are abusing a variable (trying to do something that doesn't apply to that data type). In a less overspecified language, these errors would compile just fine, but would be bugs.
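A hypothetical Python sketch (my own, not from the thread) of exactly this kind of bug: in a language without declarations, a misspelled variable name is not an error - the assignment just creates a fresh variable, and the program runs with a silent bug that a declaration-checking compiler would have caught.

```python
# In a declaration-free language, a typo in an assignment target is
# legal: it silently creates a new variable instead of updating the
# intended one, so the bug compiles (and runs) just fine.

def running_total(values):
    total = 0
    for v in values:
        totl = total + v   # typo: meant 'total'; nothing flags it
    return total           # 'total' was never updated

print(running_total([1, 2, 3]))   # prints 0, not the intended 6
```

In a language requiring declarations, `totl` would be rejected as an undeclared name at compile time - the redundancy of the declaration is what makes the typo detectable.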
 
  • #32
The ability to recognize complex patterns (with our eyes) is an NP problem. You can write a polynomial algorithm to check if you have a match with a given memory. No one has an algorithm to search for a match to the pattern in less than an exponential number of steps in the size of the problem. I agree we don't know how the brain works. I think P = NP, and 15 years ago the majority thought it was true.

Because Mathematica wants to accommodate both fast scripting and reliable projects, much of the reliability machinery is optional. For example, you can define a function:

f[x_] := MyFavorite[SomeFunction[x]]

where the x_ will match any expression (dynamic typing). If SomeFunction is meant to act on integers, we could instead define:

f[x_Integer] := ...

More complex patterns can be allowed:

f[{x_Integer, p_List}, y__] := ...

The first argument is a pair: an integer followed by a list of arbitrary contents. The y__ then matches a sequence of one or more further arguments of indefinite length.

These features are not amazing, but they do show how the idea of type checking parameters has evolved since the 1980s.
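For readers who don't use Mathematica, the same idea - choosing among several definitions based on the shape of the argument, with an untyped fallback - can be roughly sketched in Python with `functools.singledispatch` (this is my analogy, not something from the post; the mapping to Mathematica patterns is loose):

```python
from functools import singledispatch

# Rough Python analogue of Mathematica's pattern-based definitions:
# singledispatch picks an implementation from the runtime type of the
# first argument, much as f[x_Integer] := ... restricts a definition
# to integer arguments while f[x_] := ... matches anything.

@singledispatch
def f(x):                      # fallback: matches anything, like f[x_]
    return ("generic", x)

@f.register
def _(x: int):                 # like f[x_Integer] := ...
    return ("integer", x * 2)

@f.register
def _(x: list):                # like f[x_List] := ...
    return ("list", len(x))

print(f(3))        # ('integer', 6)
print(f([1, 2]))   # ('list', 2)
print(f("hi"))     # ('generic', 'hi')
```

The point stands either way: the typed definitions are opt-in, so you choose per-function how much checking you want.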
 
  • #33
Crosson said:
Mathematica and Matlab are similar in performance. Mathematica has a few performance pitfalls, but so does Matlab. If programmed correctly, either one can be kept within a factor of 10 to 100 of Fortran's speed.

Can you substantiate that statement, for Matlab? (I neither know nor care about the performance of Mathematica).

It seems completely wrong, based on my experience with both languages (and I wrote my first Fortran program about 40 years ago). The computational kernels built into Matlab are usually more efficient than the average programmer's attempt to do the same thing, in ANY language. Writing a Matlab program that doesn't use anything except its "low-level" programming language capabilities is (to put it politely) completely dumb. That's not what it's designed for.
 
  • #34
Writing a Matlab program that doesn't use anything except its "low-level" programming language capabilities is (to put it politely) completely dumb. That's not what it's designed for.

I was going to give you an example, but now I don't have to. All I was saying is that it is possible to be "dumb" in either language. Part of the design of a programming language is that people will use it in ways that it was not intended to be used, and in Matlab this can affect performance badly (using for loops instead of vector operations, for example). This is what I call a "performance pitfall" because someone new to the language could fall into it.
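The loops-versus-vectorization pitfall mentioned here exists in any interpreted array language, so a hedged Python/NumPy sketch (my analogy, not the poster's MATLAB code) may make it concrete: both versions compute the same result, but the explicit loop runs element-by-element in the interpreter, while the vectorized form delegates the whole array to compiled kernels.

```python
import numpy as np

# The "performance pitfall": writing an element-by-element loop in an
# interpreted array language instead of using its vectorized kernels.

x = np.linspace(0.0, 1.0, 100_000)

# Pitfall: an explicit interpreted loop over the elements.
y_loop = np.empty_like(x)
for i in range(x.size):
    y_loop[i] = x[i] ** 2 + 1.0

# Idiomatic: one vectorized expression, executed by compiled code.
y_vec = x ** 2 + 1.0

assert np.allclose(y_loop, y_vec)
```

A newcomer who writes the first version gets correct answers and terrible performance - which is exactly what makes it a pitfall rather than an error.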

It seems completely wrong, based on my experience with both languages

Sounds like you had a pitfall in Mathematica, but learned Matlab more thoroughly. Or you used Mathematica exclusively before version 5.0, which was about 5 years ago.

The computational kernels built into Matlab are usually more efficient than the average programmer's attempt to do the same thing, in ANY language.

That's an interesting usage of 'usually' followed by 'ANY'. Since you don't know and don't care about Mathematica, which is just one of the many languages, you obviously don't know what you are talking about with the use of the universal quantifier.

Please recognize that you use Matlab because of its libraries, and not because of its efficiency in execution. Because Matlab code is interpreted, it is often 100 to 1000 times slower than compiled C++ code (on a basic For loop, for example). If you are relying on the strength of the mathematics in the Matlab libraries, I would like to see an example of what you are saying Matlab does faster than an average programmer could in ANY language.
 
  • #35
Maybe I'm just weird, but I'm surprised people are saying MatLab is easier to use off the bat than Mathematica. I had to learn to use one of the two at the beginning of the year, and I tried MatLab first. Didn't have a clue WTF was going on. No idea how to make it do anything, despite reading the help documentation. Mathematica was much more straightforward. The help documentation made everything obvious, and within a few minutes I'd managed to do a few basic things like plot graphs, algebraically manipulate expressions, etc.

I'd done some basic C coding a year or two before that, so I get the whole defining-variables thing, but MatLab really just made no sense to me at all.
 
  • #36
Crosson said:
The ability to recognize complex patterns (with our eyes) is an NP problem. You can write a polynomial algorithm to check if you have a match with a given memory. No one has an algorithm to search for a match to the pattern in less than an exponential number of steps in the size of the problem.

I'm well aware of that. But what makes you think the brain is solving this problem in polynomial time? I'm pretty sure it isn't. Neural networks which can solve problems of this general type have certainly been built. They require more than polynomial time. But if you have 30 billion processors working on a relatively small NP-hard problem, things go pretty fast.

These features are not amazing, but they do show how the idea of type checking parameters has evolved since the 1980s.

Sure, I'll grant you that. But this is just a way of making a tradeoff easier. You still have to actually make the tradeoff. You aren't getting both the reliability and the looseness/ease at the same time. You still have to choose one or the other when you write the code. Things are a little better because you can make this tradeoff per-function rather than per-program.

This reminds me of some C arguments I've had with some people. C gives you the same kind of freedom in some areas (I am certainly NOT trying to claim that C is on the level of a high-level language like Matlab or Mathematica, so don't even start down that path). It allows you to do some things the comparatively easy, loose, quick way or the more rigorous way. But where I work, we're required to do everything the rigorous way all the time! D'oh!

Xezlec said:
I'm going to have to create an example syntax that you can't figure out. Give me some time to work on it.

Given the direction this thread has taken, and my general lack of time (note how long it's been since last I posted), I'm abandoning this. It's not directly relevant to any of my points anyway. I'm not actually going to argue that Mathematica's syntax might be "too hard for the human mind". That would be obviously false since it's regularly used by lots of people.

AlphaNumeric said:
Maybe I'm just weird but I'm suprised people are saying MatLab is easier to use off the bat than Mathematica. I had to learn to use one of the two of them at the beginning of the year and I tried MatLab first. Didn't have a clue WTF was going on.

Weird. I got it almost immediately. I guess it depends on the person, or something. Maybe it helps if you used to code in BASIC back in the day. :cool:

Well, as a final note, I would really like to try out Mathematica. Unfortunately, I'm no longer a student, and $2500 is a lot of cash to scrape together. So it might not happen. :frown:
 
  • #37
The only thing that's ever really irked me about Mathematica (aside from the rather 'kludgy' programming language, which I don't use much anyway) is the licensing regarding UNIX systems. Wolfram does not consider licenses for UNIX systems to be applicable to the student, and thus does not include the UNIX versions under the student license. In order to legally use Mathematica on a UNIX system, you must have the full $2,500 license, which, of course, I've had to acquire to use Mathematica on my AIX and Solaris systems.
 
  • #38
Mathematica is lousy for making GUIs?

Weighing in very late. I'll be surprised if there's a reply, but here goes:

Where I work (gov't lab), a common situation is that an expert, who has developed a program to calculate something nontrivial, needs to put this program into a form that others can easily use. The expert is usually some sort of physical scientist, not a CS type. The "others" are physical scientists or technicians who only care about ease of use, and do not necessarily know how to use whatever package was used to develop the program. So, of course they want a GUI. These GUIs usually have lots of inputs and lots of functions. They are non-trivial. See attached jpg for example (and note that the other 3 tabs are equally complicated).

I have used VisualBasic, Visual C, Visual Fortran, and Matlab. My experience so far has been the following:
1) Nothing beats VB for making nice GUIs easily. Its built-in graphing is awful, though. However, you can buy inexpensive add-ons to give nice graphing, so that's not really an issue. The killer is interfacing it with C or Fortran, because VB itself is worthless for computing anything. (caveat: I haven't done this since VB4. Anyone know if interfacing has become easier?)
2) VC, VF, and Matlab are all about equally painful. Simply put, making user-friendly GUIs with them is not very user-friendly or GUI-driven. However, they do try. At least they have a nice GUI designer.
3) I have looked at Mathematica 6, and passed. Maybe I missed something huge when looking at the online documentation, but it looks to me like making cool GUIs for super-simple tasks is now amazingly simple, but real-life GUIs still need to be coded entirely by hand, line by line. (Corrections to this are welcome. If you're a CS-type or a Mathematica expert, don't worry about talking down to me -- I won't even realize it. I have a current project for which I'd like to use Mathematica, but this issue is a killer.)

In case this helps, my programming background is as follows: trained as physicist; started on mainframes with punch cards in F77 and on PDP-11s in assembly code; over the years have written programs (as part of my research) in F77, assembly, Pascal, F90, VB4, C++, Matlab, MathCad, and probably others I've forgotten about.
 

Attachments

  • ScreenShotMMIcalc.jpg (30 KB)
