Mathematica Animate function question

AI Thread Summary
The discussion revolves around a 2D heat spreading simulation using Mathematica, where the user encounters issues with the Animate function for visualizing density plots over time. Initially, the Animate function only displays the first time step, but the user resolves this by adjusting the ColorFunction to use a predefined temperature map. However, they still struggle with scaling colors correctly across all time steps. A suggested solution involves defining a separate function for color mapping based on the entire dataset, ensuring consistent color representation throughout the animation. This approach addresses the issue of color function dependency on individual frames, allowing for a uniform color scale across the simulation.
Munin
Hi

I'm doing a two-dimensional heat spreading simulation.
I've created an array with three indices, the first index being the time step and the other two being the element coordinates.


Code:
height = 20;
width = 4;
a = 0.5;             (* grid spacing *)
J = Round[height/a]; (* number of rows *)
L = Round[width/a];  (* number of columns *)
h = 0.1;             (* time step *)
roomT = 20;
T = 90;              (* initial interior temperature *)
t = 1000;            (* maximum number of time steps *)
Dev = Normal[ConstantArray[1, {t, J, L}]];  (* temperature field for every time step *)
u = T*Normal[ConstantArray[1, {J, L}]];     (* current temperature field *)

J*L  (* total number of grid points *)

v = Normal[ConstantArray[0, {J*L, 1} ]];    (* u flattened into a column vector *)
A = Normal[ConstantArray[0, {J*L, J*L}]];   (* discrete Laplacian *)

(* boundary points start at room temperature *)
u[[All, 1]] = roomT;
u[[All, L]] = roomT;
u[[1, All]] = roomT;
u[[J, All]] = roomT;

(* copy u into the column vector v, row by row *)
For[j = 1, j <= J, j++,
  For[l = 1, l <= L, l++,
    index = (j - 1)*L + l;
    v[[index, 1]] = u[[j, l]];
    ];
  ];

(* assemble the five-point Laplacian stencil *)
For[i = 1, i <= J*L, i++,
 
 A[[i, i]] = -4;
 
 If[i + L <= J*L, A[[i, i + L]] = 1, A[[i, i]] = A[[i, i]] + 1];
 
 If[i - L >= 1, A[[i, i - L]] = 1, A[[i, i]] = A[[i, i]] + 1];
 
 If[i + 1 <= J*L, A[[i, i + 1]] = 1, A[[i, i]] = A[[i, i]] + 1];
 
 If[i - 1 >= 1, A[[i, i - 1]] = 1, A[[i, i]] = A[[i, i]] + 1];
 ]
A = A/a^2;

(* implicit (backward) Euler step: v_new = (I - h A)^-1 . v *)
Ett = SparseArray[{i_, i_} -> 1, {J*L, J*L}];
Mtemp = Ett - h*A;
M = Inverse[Mtemp];

(* overwrite the boundary rows of M so boundary values stay fixed *)
For[i = 1, i <= L, i++,
  M[[i, All]] = 0;
  M[[(J - 1)*L + i, All]] = 0;
  M[[i, i]] = 1;
  M[[(J - 1)*L + i, (J - 1)*L + i]] = 1;
  
  ];
For[i = 1, i <= J, i++,
  M[[1 + L*(i - 1), All]] = 0;
  M[[1 + L*(i - 1), 1 + L*(i - 1)]] = 1;
  M[[L*i, All]] = 0;
  M[[L*i, L*i]] = 1;
  
  ];
k = 1;

(* time stepping: store each frame in Dev, stop when the centre has cooled below 21 *)
While[k < t + 1,
  For[j = 1, j <= J, j++,
   For[l = 1, l <= L, l++,
     index = (j - 1)*L + l;
     u[[j, l]] = v[[index, 1]];
     ];
   ];
  Dev[[k, All, All]] = u;
  If[u[[Round[J/2], Round[L/2]]] < 21,
   Break[]
   ];
  v = M.v;
  k++;
  ];
t = k;  (* number of frames actually computed *)
Print[t]

The simulation works fine, but when I try to run Animate over the time steps it only shows the density plot for the first time step. Is this the correct way to write an Animate for my density plot?

Code:
Animate[ListDensityPlot[Dev[[m, All, All]], 
  ColorFunction -> (RGBColor[1, 1 - #, 0] &)], {m, 1, t}, 
 AnimationRunning -> False]

As it is now, the colors are scaled relative to the highest value within each individual time step. Is there a way to scale the colors relative to the highest value across all time steps?
 
OK, I fixed the Animate problem with this:
Code:
Animate[ListDensityPlot[Dev[[m, All, All]], 
  ColorFunction -> (ColorData["TemperatureMap"]), 
  ColorFunctionScaling -> True], {m, 1, t, 1}, 
 AnimationRunning -> False]

But I still can't get the colors to scale correctly; any advice on how to do this would be very helpful.
 
Define a function outside your Animate, based on your entire dataset, that maps any value to the color you want, and then pass that function to ColorFunction inside your Animate.

If you define the argument to ColorFunction the way you wrote it inside your Animate, it only depends on the data available in that frame of the Animate. That is exactly what you observed: the color mapping changes as the Animate frame changes.
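As a minimal sketch of that suggestion (globalMin, globalMax, and globalColor are illustrative names, not part of your original code), you could compute the global range of Dev once, define a color function from it, and turn off per-frame scaling:

Code:
(* global range over the frames that were actually computed *)
globalMin = Min[Dev[[1 ;; t]]];
globalMax = Max[Dev[[1 ;; t]]];

(* map a raw temperature to a color on a fixed scale *)
globalColor[val_] := 
  ColorData["TemperatureMap"][(val - globalMin)/(globalMax - globalMin)];

Animate[
 ListDensityPlot[Dev[[m, All, All]], 
  ColorFunction -> globalColor, 
  ColorFunctionScaling -> False], (* pass raw values, not per-frame rescaled ones *)
 {m, 1, t, 1}, AnimationRunning -> False]

With ColorFunctionScaling -> False, ListDensityPlot hands the actual data values to globalColor, so every frame uses the same temperature-to-color mapping.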
 
