Can a Large Matrix Allocation Cause a Windows Error in Circuit Simulation?

SUMMARY

The discussion centers on the limitations of matrix allocation in C++ during circuit simulation on Windows systems. A user encountered a crash when allocating a matrix with more than 28,000 steps, which was addressed by switching the variable type from "int" to "unsigned long int", allowing for a greater range of positive integers. The user also restructured the matrix into three single arrays, which improved stability up to a maximum of about 149,896 elements per array. The issue was identified as a segmentation fault, likely due to a memory allocation problem.

PREREQUISITES
  • Understanding of C++ data types, specifically "int" and "unsigned long int"
  • Knowledge of memory management and segmentation faults in programming
  • Familiarity with array allocation and manipulation in C++
  • Basic concepts of circuit simulation and its computational requirements
NEXT STEPS
  • Research C++ memory allocation techniques and best practices
  • Learn about handling segmentation faults in C++ applications
  • Explore the differences in memory management between Windows and Unix systems
  • Investigate dynamic memory allocation using "new" and "delete" in C++
USEFUL FOR

C++ developers, circuit simulation engineers, and anyone troubleshooting memory allocation issues in programming.

Phymath
Hey,

In doing a circuit simulation I want to allocate a matrix like this (C++):

double S1store[3][steps+1];

I get no compile errors, but if I make steps greater than 28000 the program gives me a Windows error and crashes; it runs fine on a Unix system. Is there a limit to the amount of data you can allocate at one time? I have 1 GB RAM and a 3.2 GHz processor, so I don't think that would fill up. Or do you think the compiler doesn't compile correctly when your matrix is that large?

Thanks for any help. I would have put this in programming, but I know there are better programmers in physics lol
 
There is a limit; it should be either 32768 or 65536, not 28000, but I'll assume you went from 28000 to 35000 or something, and that's when it crashed. Try declaring steps as "unsigned long int" instead of "int"; that should give you some more room. "Unsigned" tells the computer not to allow the variable to store negative numbers, doubling the number of positive values it can hold. "Long int" tells it to allocate more bits of storage, increasing the range it can represent (exactly how much depends on the compiler and operating system, which is probably why this works on *nix and not Windows).

~Lyuokdea
 
Yeah, thanks a lot, it's a segmentation fault. Good idea about the unsigned; I already have the long int. I also reduced the matrix to three single arrays:

SnstoX[steps], SnstoY[steps], SnstoZ[steps]. Now I'm maxing out at around steps = 149896. I'm thinking the program is trying to "point" to memory spots already taken up elsewhere in the program, as in SnstoX[1] could have the same memory spot as SnstoX[120000], and the computer is obviously segfaulting. I'm 99.5% positive I'm not accessing memory outside the arrays; that is, I'm not literally accessing SnstoX[steps], but SnstoX[steps-1], which is supposed to be the max. I'll try the unsigned thing, thanks. If you've got any more ideas, that'd be helpful.
 
