Why Does My Perl Script Crash with Large Data Sets on Raspbian?

  • Thread starter: Borek

Discussion Overview

The discussion revolves around a Perl script running on Raspbian that crashes when processing large data sets. Participants explore potential causes for the crashes, including memory limitations and data formatting issues.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested

Main Points Raised

  • One participant notes that the script works for small data sets but fails for larger ones, suggesting a possible memory limitation.
  • Another participant observes that the script uses a significant amount of CPU but only a small percentage of RAM, indicating that memory may not be the issue.
  • A later reply identifies a potential cause related to line ending formats (CRLF vs LF) as the reason for the crashes.
  • One participant requests the script and data set for further analysis but acknowledges that the original poster has resolved the issue.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the initial cause of the crashes, as the discussion evolves from memory concerns to data formatting issues. The final resolution regarding line endings suggests a shift in understanding, but earlier disagreements about memory limitations remain unresolved.

Contextual Notes

Limitations include the lack of detailed information about the script's implementation and the specific nature of the data sets being processed. The discussion does not clarify whether other potential issues could contribute to the crashes.

Borek
Mentor
Disclaimer: the only thing I know about Perl is the language name.

Raspbian on Pi (512 MB RAM).

I have a Perl script that I want to use. And it works - sort of.

Problem is, it works OK for small data sets, but it fails for large ones. When it works, it runs for some time and then spits out the result. For known errors (like a wrong file name) it displays an error message (while I don't know Perl, I can see these are coded in the script). When it doesn't work, it starts running normally, then just ends, without any message.

My first idea was that it is limited by memory. For a large data set I see it (with top) using 98% of CPU and allocating more and more memory, but when it stops it is at 3% RAM, so just about 15 MB, not that much.

Any ideas what I can do? Is there a way of checking why it crashes? Can it be related to some memory limit per process? Or is there some limit set to amount of memory available for Perl?

I did some blind googling, but everything I find suggests memory should not be a problem here.
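One way to narrow down a silent exit like this is to check the exit status and the per-process limits. A minimal sketch (plain `sh` commands stand in for the actual Perl script, which we don't have):

```shell
# A process killed by signal N exits with status 128+N, which
# distinguishes a kill (e.g. by the kernel OOM killer) from a clean exit.
sh -c 'exit 0'
echo "clean exit status: $?"      # prints 0

sh -c 'kill -KILL $$'
echo "killed exit status: $?"     # prints 137 = 128 + 9 (SIGKILL)

# Per-process virtual memory limit; "unlimited" rules out a ulimit cap.
ulimit -v

# If the OOM killer did strike, the kernel logs it (may need root):
#   dmesg | grep -i 'killed process'
```

If the script exits with status 0 and no OOM-killer entry appears in the kernel log, the crash theory itself becomes suspect, and attention shifts to the input data.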
 
OK, for now ignore my message.

I did some more checks and it looks like the problem can't be memory related. The data sets are just lists of words. The script works for a fairly short data set that I entered manually, but it never works for more complicated ones prepared by another script (we are talking 3 words vs 84 words). That means there must be some other problem.
 
Sigh. CRLF vs just LF.

Case closed.
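For anyone hitting the same wall: stray carriage returns from a CRLF-producing tool are easy to spot with a character dump and easy to strip. A small sketch (the file names are made up):

```shell
# Build a sample word list with Windows-style CRLF line endings.
printf 'alpha\r\nbeta\r\n' > words.txt

# The carriage return shows up as a trailing \r byte on each line:
od -c words.txt

# Strip the carriage returns; `perl -pi -e 's/\r$//' words.txt`
# or dos2unix would do the same job in place.
tr -d '\r' < words.txt > words.lf.txt
od -c words.lf.txt
```

Inside a Perl script, replacing a plain `chomp` with `$line =~ s/\r?\n$//` handles both ending styles defensively.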
 
Can you post the script and the data set?

Edit: Just saw that you solved your problem.
 