How is Int32 stored physically?

AI Thread Summary
Int32 variables in C# are typically stored in memory rather than directly in a 32-bit register, as registers are limited in number. When calculations are performed, the variable's value is moved from memory to a register for processing. Memory is organized in a hierarchy, with registers being the fastest and closest to the CPU, followed by cache and then RAM, which is slower and external to the CPU. The operating system manages this hierarchy, moving data between levels as needed for performance. Understanding this memory structure is crucial for optimizing application performance, especially when tuning for speed.
frenzal_dude
I understand that in 32-bit Windows, the registers are 32 bits long.
Does this mean that when you create an Int32 variable in C#, the actual number is physically stored in a 32-bit register?
 
Yes, and in a 32-bit memory location when you store it in memory.
 
frenzal_dude said:
I understand that in 32-bit Windows, the registers are 32 bits long.
Does this mean that when you create an Int32 variable in C#, the actual number is physically stored in a 32-bit register?
When you create an Int32 variable, it's almost certain to be stored in memory somewhere, not in a register, since there are only a small number of them. The value of this variable is copied to a register when some calculation using the variable needs to be done.
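
As a rough sketch in C (the question is about C#, but the mechanics are the same; the variable name and the instruction sequence in the comment are only illustrative and depend entirely on the compiler and optimization level):

Code:
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int32_t count = 41;   /* the variable itself lives in memory (here, on the stack) */

    /* An unoptimized compiler typically translates the next line roughly as:
         mov eax, [count]   ; copy the value from memory into a register
         add eax, 1         ; do the arithmetic in the register
         mov [count], eax   ; copy the result back to the variable's memory slot
       With optimization on, count may stay in a register the whole time,
       or be folded away entirely.                                           */
    count = count + 1;

    printf("count = %" PRId32 ", at address %p\n", count, (void *)&count);
    return 0;
}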
 
Mark44 said:
When you create an Int32 variable, it's almost certain to be stored in memory somewhere, not in a register, since there are only a small number of them. The value of this variable is copied to a register when some calculation using the variable needs to be done.

When you say stored in memory, doesn't that mean it will be stored in a physical register? Because memory is made up of registers?
 
Thinking of memory as lots of registers is not quite right. It is more of a high-speed parking lot for data. A register (which lives in the ALU of the CPU) can perform operations like shift left, OR, and XOR. In order to perform an operation on data in memory, the data is first moved into a register, then the operation is performed on the data. Next, the new data is placed back in the same parking spot in memory.

This is an awful oversimplification, but you see how memory is different from registers.

Please read this. It will explain things in enough detail to help you a lot:

http://literacy.kent.edu/Midwest/Materials/ndakota/complit/introbasics.html
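
To make the parking-lot picture concrete, here is a rough C sketch (the names are invented for illustration; the numbered comments describe the conceptual steps, not literal instructions):

Code:
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint32_t flags = 0x0F0F0F0Fu;   /* parked in a memory location */

    /* Conceptually, for each statement below the CPU:
       1. loads flags from its memory "parking spot" into a register,
       2. performs the operation (XOR, then shift left) in the ALU,
       3. stores the result from the register back to the same spot. */
    flags ^= 0xFFFFFFFFu;   /* XOR        */
    flags <<= 1;            /* shift left */

    printf("flags = 0x%08" PRIX32 "\n", flags);
    return 0;
}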
 
As a second thought, maybe the confusion you have comes from some computer languages.

Code:
register int foo = 0;

The above C language snippet asks the compiler to [please try to] store the foo variable in a register, because that is faster than having to move it back and forth between a register and memory. Is something like this what you've seen? Unless the system has lots of registers, this declaration is more of a mild suggestion to store foo in a register than a mandate.
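
For instance (a minimal sketch; the function and variable names are made up, and a modern optimizing compiler will usually make its own decision and may ignore the hint entirely):

Code:
#include <stdio.h>

static long sum_to(int n)
{
    register int i;   /* hint: please keep the loop counter in a register */
    long total = 0;

    for (i = 1; i <= n; i++)
        total += i;

    /* One thing the keyword does guarantee: you may not take the address
       of a register variable, so &i would be a compile-time error.       */
    return total;
}

int main(void)
{
    printf("sum 1..100 = %ld\n", sum_to(100));
    return 0;
}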
 
Where it goes when you create a variable depends on the computer architecture, compiler, and operating system.

There is a hierarchy of memory storage in a computer architecture: registers, cache, RAM. Moving left to right through that list, you go from faster/smaller to slower/bigger.

When you create a variable, most likely it is going into cache. As your operating system/architecture manages tasks/memory, the cache data may be moved into RAM and back as required to maximize performance. As you operate on the variable (like adding two together), the architecture/operating system will move the variable between cache and registers.
 
Floid said:
Where it goes when you create a variable depends on the computer architecture, compiler, and operating system.

There is a hierarchy of memory storage in a computer architecture: registers, cache, RAM. Moving left to right through that list, you go from faster/smaller to slower/bigger.

When you create a variable, most likely it is going into cache. As your operating system/architecture manages tasks/memory, the cache data may be moved into RAM and back as required to maximize performance. As you operate on the variable (like adding two together), the architecture/operating system will move the variable between cache and registers.

You mentioned registers, cache, and RAM, but RAM is just a stack of registers, and I think a cache would physically be made up of registers too, right?
 
frenzal_dude said:
You mentioned registers, cache, and RAM, but RAM is just a stack of registers, and I think a cache would physically be made up of registers too, right?

Usually by "register" we mean kind of a storage that is present inside processor, and is not only used for storing the information, but also to make operations on it. You can't make operations of information stored in RAM nor cache, you have to fetch it into processor register first. I guess it is only a nomenclature thing, but the distinction is well established.
 
frenzal_dude said:
You mentioned registers, cache, and RAM, but RAM is just a stack of registers, and I think a cache would physically be made up of registers too, right?
No and no. RAM, cache, and registers are forms of memory, but they're all different. As already noted, registers are part of the CPU and usually have names (e.g., AX, EAX, and others on Intel CPUs). Registers are physically very close to the ALU, so access is very fast. Cache memory is (I believe) also located in the CPU, so access is fast, but not as fast as register access. Cache memory is organized into what are called lines, each a string of consecutive memory locations.

RAM is located on RAM chips, which are external to the CPU, hence farther away, so access is slower. They are relatively slow, but make up for their slowness with huge numbers of addressable memory locations.

Another form of memory is the storage on hard disks and CD/DVD drives, in addition to read-only memory (ROM), which the computer uses to keep track of time and to perform basic operations such as loading an OS into memory when the computer starts up.
 
Computer memory is a bunch of layers, starting with the smallest, fastest memory and going to the largest, slowest. The layers are transparent to programmers (except at the machine-language level). The operating system and hardware move data up and down the hierarchy of layers as data is needed. The hierarchy looks something like this on most machines:
* registers
* cache (L1)
* cache (L2)
* RAM
* swap file on a disk

In order for data to be transformed or manipulated, it must move all the way up the hierarchy, through each layer, until it resides in a register. There it can be changed, then it moves back down the hierarchy. Much of the time, a data item or number is just sitting in "memory" not being used, and at those times it can be anywhere along the hierarchy, including out on the disk.

This is highly simplified, but as a programmer (except when programming close to the metal) you usually don't care too much about where the operating system stores a number. The time when you DO care is if you have to "tune" an application to run faster. And that is a very complex matter. You'd need to read an entire book on memory hierarchy (in a computer architecture book) to even begin to understand all the tricks employed in the OS and hardware to make this work well.
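
As one crude illustration of why the hierarchy matters for tuning (a C sketch; the array size, stride, and timings are only illustrative, and the ratio varies a lot from machine to machine):

Code:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (16 * 1024 * 1024)   /* 16 Mi ints (64 MB), larger than typical caches */

int main(void)
{
    int *a = malloc((size_t)N * sizeof *a);
    if (a == NULL) return 1;
    for (long i = 0; i < N; i++) a[i] = 1;

    /* Sequential pass: consecutive addresses, so every cache line fetched
       from RAM is fully used before the next one is needed.               */
    clock_t t0 = clock();
    long sum1 = 0;
    for (long i = 0; i < N; i++) sum1 += a[i];
    double seq = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Strided pass: the same additions, but each sweep touches only one int
       per cache line, so the same lines are fetched from RAM over and over. */
    t0 = clock();
    long sum2 = 0;
    for (long s = 0; s < 16; s++)
        for (long i = s; i < N; i += 16) sum2 += a[i];
    double strided = (double)(clock() - t0) / CLOCKS_PER_SEC;

    printf("sequential: %.3f s   strided: %.3f s   (sums %ld %ld)\n",
           seq, strided, sum1, sum2);
    free(a);
    return 0;
}

Both passes do the same amount of arithmetic; any difference in running time comes almost entirely from how the accesses interact with the cache.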

However, programmers generally DO care about how "big" the data item is, that is, 32 bits for an Int32. The data width determines the largest (or smallest) integer that can be represented.
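
For example, 32 bits gives a signed range of -2,147,483,648 to 2,147,483,647. A quick C check (the constants come from <stdint.h>):

Code:
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    /* A 32-bit signed integer spans -2^31 .. 2^31 - 1. */
    printf("size: %zu bytes\n", sizeof(int32_t));
    printf("min : %" PRId32 "\n", INT32_MIN);
    printf("max : %" PRId32 "\n", INT32_MAX);
    return 0;
}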

In real life, we do a similar thing with books. A book can only be read when it is out on the table and open ("in a register"). We keep some books on the bookshelf (i.e., in "RAM"), and go get them when we need to read them. We keep other, less-often-needed books, maybe in a box in the basement (i.e., "in the swap file on disk").
 