Mathematica Slows Down with Repeated Evaluation

  • Thread starter: michaelp7
  • Tags: Mathematica
In summary, the conversation discusses a code written to analyze a large collection of PDB files for cation-pi interactions. The code works as intended, but slows down significantly after multiple runs. The issue is identified as the need to close all open streams, and a solution is provided to do so. It is suggested that the code would be faster if each file was opened, processed, and closed individually.
  • #1
michaelp7
I'm trying to analyze a fairly large (order 10^3) collection of PDB files to look for cation-pi interactions for a class. This requires me to parse each PDB file into a table which gives the position of each atom in the file. To do this, I've written the following code:

Timing[
 Open[AllFiles[[2]]]; (*AllFiles is a list of the filenames in the directory. I intend to replace the 2 with an iteration index when the code is working*)
 Lines = Import[AllFiles[[2]], "Lines"];
 FullTable = {};
 Do[
  LineStream = StringToStream[Lines[[j]]];
  QATOM = Read[LineStream, Word];
  If[QATOM == "ATOM", (*This condition looks for lines that actually describe atoms, instead of other information*)
   ThisLine =
    Read[LineStream, {Number, Word, Word, Word,
      Number, {Number, Number, Number}}];
   If[Or[StringLength[ThisLine[[3]]] == 3,
      StringTake[ThisLine[[3]], 1] == "A"], (*This condition eliminates duplicate listings*)
    FullTable = Append[FullTable, ThisLine]]
   ],
  {j, 1, Length[Lines]}]
 ]

The code does what it's supposed to, but it slows down significantly each time I run it. The first run takes less than 0.2 seconds, but by the fifth run it's already above 25 seconds to parse the same file. Quitting the kernel session solves the speed problem, but of course that deletes all my data. CleanSlate, ClearAll, and adjusting $HistoryLength all had no effect. I haven't come across a solution on this forum yet, so I would appreciate any suggestions.
 
  • #2
Update-- I think the problem is that I need to close all these input streams. This post seems to address the same issue:

http://groups.google.com/group/comp...a/browse_thread/thread/5dc2bf7e4793418d?pli=1

I get some improvements when I close the streams in a modified version of this program, but when I try it on the original code, it still slows down. I think I'm missing some streams. Is there a command that would let me close ALL open streams?
 
  • #3
If you evaluate

s = Streams[]

you will see that s is the list of the open streams you have.

Unless you are fiddling around with stdout and stderr, it looks like

Map[Close, Drop[s, 2]]

will close everything except the first two, stdout and stderr.

BUT I would be very cautious with that. You might want to do more processing on the result from Streams[] to make sure you weren't closing something you didn't want to.
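One way to do that extra processing is to select streams by name before closing them. This is a cautious sketch, assuming the standard channels are the ones named "stdout" and "stderr" (the stream's name is the first part of each InputStream or OutputStream object):

```mathematica
(* Close every open stream except the standard output/error channels.
   First[stream] gives the stream's name: a file name, "String", etc. *)
Close /@ Select[Streams[], ! MemberQ[{"stdout", "stderr"}, First[#]] &]
```

Filtering by name rather than by position is safer than Drop[s, 2], since it does not assume stdout and stderr happen to be the first two entries.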
 
  • #4
Thanks! That sped it right up.
 
  • #5
You would likely find your code would be faster if you had

Open[]
Read[]
Close[]

and you did that for each individual file, rather than opening thousands of streams, crunching the data and then closing the thousands of streams.
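A minimal sketch of that open/read/close pattern, applied to the parsing loop from post #1 (assuming the same AllFiles list and ATOM-line format; ParseFile is a name introduced here for illustration):

```mathematica
(* Parse one PDB file into a list of atom rows, closing each
   string stream as soon as it has been read. *)
ParseFile[file_] := Module[{lines, stream, qatom, row, sown},
  lines = Import[file, "Lines"];
  sown = Reap[
     Do[
      stream = StringToStream[lines[[j]]];
      qatom = Read[stream, Word];
      If[qatom === "ATOM",
       row = Read[stream, {Number, Word, Word, Word,
          Number, {Number, Number, Number}}];
       If[StringLength[row[[3]]] == 3 || StringTake[row[[3]], 1] == "A",
        Sow[row]]];
      Close[stream], (* no streams survive past this iteration *)
      {j, Length[lines]}]][[2]];
  If[sown === {}, {}, First[sown]]]

FullTable = Join @@ (ParseFile /@ AllFiles);
```

Because every stream is closed inside the loop, Streams[] stays short no matter how many files are processed, and repeated runs should not degrade.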
 

1. Why does Mathematica slow down with repeated evaluation?

Mathematica can slow down with repeated evaluation because state accumulates in the session: by default the kernel retains past inputs and outputs (controlled by $HistoryLength), definitions pile up as you work, and leaked resources such as unclosed streams (the cause in this thread) grow with every run. As this accumulated state increases, memory usage and bookkeeping overhead grow, causing evaluations to slow down.

2. How can I prevent Mathematica from slowing down with repeated evaluation?

To prevent Mathematica from slowing down, you can use the Clear or ClearAll functions to clear all stored values and definitions. This will free up memory and allow for faster evaluations.
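For example, a generic sketch of clearing session state (this addresses accumulated definitions and history, though not leaked streams, which need Close as discussed above):

```mathematica
ClearAll["Global`*"];  (* remove all user-defined symbols in the Global` context *)
$HistoryLength = 0;    (* stop the kernel from retaining In[]/Out[] values *)
```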

3. Is there a way to speed up Mathematica when working with large datasets?

Yes, there are several ways to speed up Mathematica when working with large datasets. One option is to use the Compile function to create compiled versions of your code, which can significantly improve performance. Another option is to use parallel processing techniques, such as ParallelTable or ParallelMap, to distribute the workload across multiple cores or processors.
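An illustrative sketch of both techniques, assuming a purely numerical function that satisfies Compile's restrictions (the function here is arbitrary, and depending on your version DistributeDefinitions may be needed to send the compiled function to the subkernels):

```mathematica
cf = Compile[{{x, _Real}}, x^2 + Sin[x]];  (* compiled numeric kernel *)
DistributeDefinitions[cf];                 (* make cf available to subkernels *)
ParallelTable[cf[x], {x, 0., 10., 0.001}]; (* spread the work across cores *)
```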

4. Can inefficient coding cause Mathematica to slow down with repeated evaluation?

Yes, inefficient coding can contribute to Mathematica slowing down with repeated evaluation. This can include using unnecessary or redundant functions, not taking advantage of built-in functions, or using inefficient data structures. It is important to write efficient code to avoid slowing down the program.
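One concrete instance from this thread: building FullTable with Append inside a Do loop copies the growing list on every iteration, which is quadratic in the number of rows. Collecting rows with Sow/Reap is linear. A sketch, where KeepQ and ParseLine stand in for the thread's filtering and parsing logic (both names are hypothetical):

```mathematica
(* Reap[...] returns {result, {sownRows}}; Last@Last extracts the rows.
   Assumes at least one row is sown; otherwise Last@Reap[...] is {}. *)
FullTable = Last@Last@Reap[
    Do[
     If[KeepQ[lines[[j]]], Sow[ParseLine[lines[[j]]]]],
     {j, Length[lines]}]];
```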

5. Are there any settings or options I can change to improve Mathematica's performance?

Yes. The most relevant settings include limiting session history with $HistoryLength (retained In[]/Out[] values can keep large results alive in memory), bounding individual computations with MemoryConstrained or TimeConstrained, and choosing a working precision appropriate to the problem rather than higher than needed. It is worth experimenting with these to find the optimal performance for your specific task.
