
loukoumas


I hope you can shed some light on a problem I have been dealing with recently.

I have been working on a numerical solution of a heat diffusion problem, namely a numerical solution of the PDE

∂T/∂t = a*(∂^2T/∂x^2 + ∂^2T/∂y^2),

which describes heat diffusion in two dimensions. The numerical method is Crank-Nicolson: the temperature of every mesh point is given in terms of the temperatures of the four surrounding points and of those same points at the previous time level. (I guess you are familiar with all this, but I'm writing it just in case...)
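In Python/NumPy terms (just a sketch for illustration; my actual code is in MATLAB), a five-point-stencil update of this kind, written in its fully explicit (FTCS) form rather than the full Crank-Nicolson form (which also averages in the new time level and so requires solving a linear system per step), looks roughly like:

```python
import numpy as np

def ftcs_step(T, a, dt, dx, dy):
    """One explicit (FTCS) time step of the 2-D heat equation.

    T  : 2-D array of temperatures; boundary values are held fixed.
    a  : thermal diffusivity; dt, dx, dy : time and space steps.
    """
    Tn = T.copy()
    # Five-point stencil on the interior points only.
    Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + a * dt * (
        (T[2:, 1:-1] - 2 * T[1:-1, 1:-1] + T[:-2, 1:-1]) / dx**2 +
        (T[1:-1, 2:] - 2 * T[1:-1, 1:-1] + T[1:-1, :-2]) / dy**2
    )
    return Tn
```
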

So I use a mesh with spacings dx, dy and a time step dt to describe heat diffusion in an aluminium plate. The problem started when I tried to use smaller spatial divisions dx, dy: the results in MATLAB's workspace were NaN. The fix was to use a shorter time step; once the time step was short enough, my results became what I expected. The algorithm is written in MATLAB.
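For what it's worth, this NaN behaviour is exactly what the explicit-scheme stability limit dt <= 1/(2*a*(1/dx^2 + 1/dy^2)) would predict: halving dx and dy divides the allowed dt by four. A quick check with illustrative numbers (the diffusivity and grid spacing below are my assumptions, not my actual values):

```python
# Stability limit of the explicit 2-D diffusion scheme:
#   dt_max = 1 / (2 * a * (1/dx**2 + 1/dy**2))
# The values below are illustrative assumptions.
a = 9.7e-5        # thermal diffusivity of aluminium, m^2/s (approx.)
dx = dy = 1e-3    # 1 mm grid spacing (assumed)
dt_max = 1.0 / (2 * a * (1.0 / dx**2 + 1.0 / dy**2))
print(dt_max)     # halving dx and dy quarters this limit
```
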

Here are my questions and my problem!

My professor at the university, for whom I am writing these algorithms, claims that he can get results from an algorithm written in FORTRAN without depending on the time step. In fact, he never mentioned any dependence on the time step. And he actually sent me some results, close enough to mine, that were calculated with a time step longer than mine (0.01 s against 0.008 s, which is the longest I can use). How can this be possible?

I found and sent him a Wikipedia page about the CFL condition. His answer was that it refers to hyperbolic PDEs, while our equation is parabolic.

But there is something more important to me; here is my real problem. As I refine the mesh with smaller dx, dy, I also have to use a shorter time step dt. That means more calculations, because I use more points, and also many more calculations to advance the diffusion process, due to the short time step (I need 100,000 iterations to reach the 800th second with dt = 0.008 s). My MATLAB algorithm takes about half an hour for 2,000 iterations, which is pretty disappointing. Again, my professor says he needs about a minute for most of the calculations. My computer is a Toshiba notebook with 1 GB RAM and a 1.66 GHz processor (and my algorithm is about 120 lines, pretty simple I believe). Would it be much better if I wrote my code in FORTRAN? I am not familiar enough with that language.
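One thing I am checking before switching languages: whether my inner update is written with explicit for-loops over the grid points. In MATLAB (as in the NumPy sketch below, which I use only for illustration) replacing such loops with whole-array operations usually speeds things up dramatically; both versions here compute the same explicit update:

```python
import numpy as np

def step_loops(T, a, dt, dx):
    """Point-by-point update: simple, but slow in interpreted languages."""
    Tn = T.copy()
    n, m = T.shape
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            Tn[i, j] = T[i, j] + a * dt / dx**2 * (
                T[i+1, j] + T[i-1, j] + T[i, j+1] + T[i, j-1] - 4 * T[i, j])
    return Tn

def step_vectorized(T, a, dt, dx):
    """Same update as whole-array slicing: one pass in compiled code."""
    Tn = T.copy()
    Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + a * dt / dx**2 * (
        T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2]
        - 4 * T[1:-1, 1:-1])
    return Tn
```
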

Thanks a lot for your time!