After a week or two of writing and speed-profiling some Fortran code, I can honestly say that I'm extremely happy with its performance and its expressive power. As a matter of fact, my fairly simple least squares solver appears to be competitive with LAPACK's gels subroutine. Especially nice...
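For context, gels is LAPACK's QR-based least squares driver. A minimal sketch of calling the double-precision variant, dgels, from Fortran is below; the matrix sizes and random test data are just placeholders of mine, and you need to link against LAPACK (e.g. -llapack).

```fortran
! Sketch: solve an overdetermined system A x ~= b with LAPACK's dgels.
program gels_demo
  implicit none
  integer, parameter :: dp = kind(1.0d0)
  integer, parameter :: m = 100, n = 10    ! illustrative sizes only
  real(dp) :: A(m, n), b(m), wquery(1)
  real(dp), allocatable :: work(:)
  integer :: lwork, info

  call random_number(A)                    ! fill A and b with test data
  call random_number(b)

  ! Workspace query: lwork = -1 asks dgels for the optimal work size.
  call dgels('N', m, n, 1, A, m, b, m, wquery, -1, info)
  lwork = int(wquery(1))
  allocate(work(lwork))

  ! Actual solve; on exit the first n entries of b hold the solution x.
  call dgels('N', m, n, 1, A, m, b, m, work, lwork, info)
  if (info /= 0) then
     print *, 'dgels failed, info = ', info
  else
     print *, 'least-squares solution x = ', b(1:n)
  end if
end program gels_demo
```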
Back in the day I was using Cython to hook into Nvidia's CUDA. It's great for fine-grained parallelization, and thus works best when the same code is replicated over a lot of data. I might be able to shoehorn my problem into that, but the lowest-hanging fruit is simply to be able to do something...
Yes, I've already completed a good-sized project in Julia and it was pretty nice. Fortran is faster though, isn't it? Why can't Fortran be cutting edge? It seems that Fortran 2008 incorporates a bunch of modern idioms. Maybe the only thing they aren't doing is making functions first class...
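For what it's worth, here is a toy sketch of my own (any Fortran 2008 compiler should take it) showing a couple of those idioms: allocatable arrays, DO CONCURRENT, and procedure pointers, which are about as close as the language gets to first-class functions.

```fortran
module modern_demo
  implicit none
  abstract interface
     pure function map_fn(x) result(y)   ! interface any candidate function must match
        real, intent(in) :: x
        real :: y
     end function map_fn
  end interface
contains
  pure function square(x) result(y)
     real, intent(in) :: x
     real :: y
     y = x * x
  end function square
end module modern_demo

program demo
  use modern_demo
  implicit none
  procedure(map_fn), pointer :: f   ! procedure pointer: choose a function at run time
  real, allocatable :: v(:), w(:)
  integer :: i

  v = [1.0, 2.0, 3.0, 4.0, 5.0]     ! allocatable array, allocated on assignment
  allocate(w(size(v)))

  f => square                       ! functions as (almost) first-class values
  print *, f(3.0)                   ! indirect call through the pointer

  do concurrent (i = 1:size(v))     ! F2008: iterations declared independent
     w(i) = square(v(i))            ! pure function, so it is allowed here
  end do
  print *, w
end program demo
```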
It all depends on what you call "learn". As for many engineers/scientists back in the day, Fortran was the language they taught you in your intro programming course. It was not a pleasant experience, but punch cards, limited computer access, and a hostile, student-hating professor were part of...
Super appreciate your reply, StoneTemplePython. Over the years I have become leery of complete rewrites, so I'm somewhat partial to my Cython code, since I have reasonable confidence that it is correct. Although it would be reasonable to consider converting it back to Python and then optimizing hot...
I have a simulation that involves a lot of dense linear algebra on a lot of complex arrays. It was written with Python's NumPy and then sped up with Cython. Unfortunately I'm stuck with Python's GIL (global interpreter lock), which prevents me from using NumPy in multi-threaded code.
I'm...