
Memory Management

  1. Oct 18, 2009 #1
    I'm learning about memory management in my programming languages class, and I thought it would be beneficial to post a small write-up detailing my current understanding so that more knowledgeable people could critique (and hopefully expand) it.

    Static: Static allocation refers to objects & variables whose "lifetimes" equal that of the running program (i.e. they exist from the beginning to the end of execution). Global variables are a classic example of static objects: they belong to the program as a whole (as opposed to a local variable or an instance field).
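    A minimal C sketch (my own example, not part of the course material) showing static storage duration, for both a global and a `static` local that survives between calls:

    ```c
    #include <stdio.h>

    /* File-scope (global) variable: statically allocated, alive for
     * the entire run of the program. */
    int global_hits = 0;

    /* A "static" local has static storage duration but function scope:
     * it is initialized once and keeps its value between calls,
     * unlike an ordinary local, which would live on the stack. */
    int count_calls(void) {
        static int calls = 0;   /* lives in static storage, not in a stack frame */
        return ++calls;
    }

    int main(void) {
        count_calls();
        count_calls();
        printf("calls = %d\n", count_calls());  /* prints "calls = 3" */
        return 0;
    }
    ```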

    Stack: Stack allocation usually refers to a call stack. The first function called (e.g. main) sits at the bottom of the stack. Any subroutines called by main are then pushed onto the stack while main waits for them to return. When a function is called, it is given a "frame" on the stack containing information about that call (e.g. local variables, arguments, the return address). A single stack corresponds to a single thread, so concurrency implies multiple stacks because concurrency involves multiple threads.
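    A small C illustration of frames being pushed and popped (again my own sketch): each recursive call gets its own frame with its own copy of `n`, and frames unwind in reverse order as the calls return.

    ```c
    #include <stdio.h>

    /* Each call to factorial gets its own stack frame holding its own
     * copy of the parameter n. Frames are pushed as we recurse and
     * popped as each call returns. */
    unsigned long factorial(unsigned int n) {
        if (n <= 1)
            return 1;                 /* base case: the deepest frame pops first */
        return n * factorial(n - 1); /* pushes a new frame for n - 1 */
    }

    int main(void) {
        printf("5! = %lu\n", factorial(5));  /* prints "5! = 120" */
        return 0;
    }
    ```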

    Heap: Heap allocation usually refers to the space reserved for dynamically created things (e.g. objects/class instances). A variety of algorithms are used to manage the space within the heap (e.g. first fit, best fit, etc.). Two problems usually crop up when managing the heap: internal & external fragmentation. Internal fragmentation occurs when a larger-than-required block of memory is assigned to an object, wasting the unused space inside the block because it cannot be used for anything else. External fragmentation occurs when the remaining free space is split into blocks that are too small to fit any objects that may be created in the future.
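    A heap-allocation sketch in C (my own example): the allocator hands out a block using some placement policy, and the caller is responsible for returning it with `free`.

    ```c
    #include <stdlib.h>

    /* Allocate an array of the first n squares on the heap.
     * malloc picks a free block using some placement policy
     * (first fit, best fit, ...); the caller must free() the result. */
    int *squares(size_t n) {
        int *values = malloc(n * sizeof *values);  /* heap allocation */
        if (values == NULL)
            return NULL;  /* the heap can run out (or be too fragmented) */
        for (size_t i = 0; i < n; i++)
            values[i] = (int)(i * i);
        return values;
    }

    int main(void) {
        int *a = squares(4);  /* a[3] == 9 */
        if (a != NULL)
            free(a);  /* return the block; forgetting this leaks heap memory */
        return 0;
    }
    ```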
  3. Oct 22, 2009 #2


    Gold Member

    that's a pretty good summary; some students do a lot of programming and never achieve this much insight. good job.
  4. Oct 22, 2009 #3


    Homework Helper

    From what I've been told by another programmer, the Windows .NET Framework has a collection/compaction algorithm that merges small allocated objects into common memory pages when other objects are released. This avoids the problem of a large number of small objects being allocated and released but leaving behind partially used pages of memory. I'm not sure which algorithm is used to keep track of the small allocated objects.

    Page size on an Intel CPU is 4096 bytes (large 4 MB pages exist but are rarely used), so this is mostly an issue for small objects.
  5. Oct 23, 2009 #4
    Also, most operating systems use virtual memory, where the addresses a program sees do not correspond to physical addresses in RAM. For example, a program can see its pages of virtual memory as one linearly increasing range, while they actually lie in totally different areas of RAM.
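    To make the page idea concrete, here is a small sketch (my own, assuming the standard 4096-byte x86 page mentioned above) that splits a virtual address into its page number and page offset; the OS maps each virtual page to some physical frame via page tables, and consecutive virtual pages need not be consecutive in RAM.

    ```c
    #include <stdio.h>
    #include <stdint.h>

    #define PAGE_SIZE 4096u  /* common x86 page size */

    /* Which virtual page an address falls in. */
    uint64_t page_number(uint64_t vaddr) { return vaddr / PAGE_SIZE; }

    /* Position of the address within that page; the offset is
     * carried through unchanged when the page is mapped to a frame. */
    uint64_t page_offset(uint64_t vaddr) { return vaddr % PAGE_SIZE; }

    int main(void) {
        uint64_t vaddr = 0x12345;  /* 74565: page 18, offset 837 */
        printf("page %llu, offset %llu\n",
               (unsigned long long)page_number(vaddr),
               (unsigned long long)page_offset(vaddr));
        return 0;
    }
    ```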