Loading .nii images while saving memory

In summary, the conversation discusses a project for a Deep Learning exam that generates 3D .nii images using the DCGAN algorithm, trained on real MRI scans of the brains of patients with Alzheimer's. The main issue is that the poster's old PC has only 4GB of RAM and 2GB of swap, which makes it impossible to load the three datasets at once. The poster asks for alternative solutions, and the thread discusses the possibility of using an iterator that loads the files on demand. There is also a mention of using Keras for the project.
  • #1
BRN
Hello everybody,
for my Deep Learning exam I have to develop a project that generates 3D images in .nii format using the DCGAN algorithm, trained on real MRI scans of the brains of patients with Alzheimer's.

I have a serious problem. I need to load three different datasets that weigh 3GB, 5GB and 8GB respectively. Unfortunately I am forced to use an old PC with only 4GB of RAM and 2GB of swap, so it is impossible for me to load all the files at once with this simple code:

loading files:
import os
import nibabel as nib

data_path = './data/ADNI_test/'
filenames = sorted(os.listdir(data_path))   # presumably the list of .nii files in the folder
# every volume is loaded into memory at once, which is what exhausts the RAM
train_data = []
for i in range(len(filenames)):
    mri_file = data_path + filenames[i]
    train_data.append(nib.load(mri_file).get_fdata())

Would any of you be able to suggest an alternative solution? Is there a way to load the files a little at a time without overloading the memory? In the DCGAN algorithm the batch size is set to 64 files, but I will certainly have to decrease it to 30.

Thank you!
 
  • #2
I wouldn't have thought it would be necessary to have all the image files in memory; can you provide the ML engine with an iterator that loads each file on demand?
 
Likes: BRN
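For example, such an on-demand iterator can be a plain Python generator that yields one batch of volumes at a time, so only 'batch_size' scans are ever in memory. This is only a minimal sketch under a few assumptions: the helper names ('load_mri', 'mri_batch_generator') are made up, the files are the .nii volumes in './data/ADNI_test/' from post #1, and all volumes are assumed to share the same shape so they can be stacked.

batch generator sketch:
import os
import numpy as np
import nibabel as nib

def load_mri(path):
    # read a single .nii volume only when it is needed;
    # get_fdata() returns float64 by default, asking for float32 halves the memory
    return nib.load(path).get_fdata(dtype=np.float32)

def mri_batch_generator(data_path, batch_size=30):
    # yield one batch of volumes at a time instead of the whole dataset
    filenames = sorted(f for f in os.listdir(data_path) if f.endswith('.nii'))
    for start in range(0, len(filenames), batch_size):
        batch_files = filenames[start:start + batch_size]
        yield np.stack([load_mri(os.path.join(data_path, f)) for f in batch_files])

# usage: only batch_size volumes are in memory at any moment
for batch in mri_batch_generator('./data/ADNI_test/', batch_size=30):
    print(batch.shape)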
  • #3
I hope it can be done, but at the moment I don't know how.
Should the iterator be implemented directly in the loss function of the GAN discriminator?

discriminator:
import tensorflow as tf
from tensorflow.keras import Sequential

# binary cross-entropy on raw logits, since the final Dense layer has no activation
cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def discriminator_model(strides, kernel_size, input_img, weight_initializer, downsample_layers):
    rate = 0.2
    filters = input_img.shape[1]   # initial number of filters, taken from the input shape

    model = Sequential()

    # Conv3D expects (filters, kernel_size) first; strides, padding and input_shape are keywords
    model.add(tf.keras.layers.Conv3D(filters, kernel_size, strides = strides, padding = 'same',
                                     input_shape = input_img.shape,
                                     kernel_initializer = weight_initializer))
    model.add(tf.keras.layers.LeakyReLU())
    model.add(tf.keras.layers.Dropout(rate = rate))

    # each downsampling block doubles the number of filters
    for l in range(downsample_layers - 1):
        filters = int(filters * 2)
        model.add(tf.keras.layers.Conv3D(filters, kernel_size, strides = strides, padding = 'same'))
        model.add(tf.keras.layers.LeakyReLU())
        model.add(tf.keras.layers.Dropout(rate = rate))

    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dense(1))   # single logit: real vs. generated
    return model

def discriminator_loss(real_output, fake_output):
    # real volumes should be classified as 1, generated volumes as 0
    real_loss = cross_entropy(tf.ones_like(real_output), real_output)
    fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)
    total_loss = real_loss + fake_loss
    return total_loss

where 'real_output' would be the real image that is compared with the artificially generated one.

Any ideas?
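One possible way to wire such an iterator in, as a sketch rather than a definitive answer: keep 'discriminator_loss' as it is and build the input pipeline around a generator like the one sketched after post #2, for example with tf.data.Dataset.from_generator. Everything below that is not in the thread is an assumption: the generator name, the added channel axis and the 'train_step' function are hypothetical, and batching still requires all volumes to share one shape.

tf.data pipeline sketch:
import os
import numpy as np
import nibabel as nib
import tensorflow as tf

data_path = './data/ADNI_test/'

def volume_generator():
    # yield one preprocessed volume at a time; nothing else stays in memory
    for name in sorted(os.listdir(data_path)):
        if name.endswith('.nii'):
            vol = nib.load(os.path.join(data_path, name)).get_fdata(dtype=np.float32)
            yield vol[..., np.newaxis]   # add a channel axis for Conv3D

dataset = (tf.data.Dataset
           .from_generator(volume_generator,
                           output_signature=tf.TensorSpec(shape=(None, None, None, 1),
                                                          dtype=tf.float32))
           .batch(30)      # the reduced batch size mentioned in post #1
           .prefetch(1))   # load the next batch while the current one trains

# hypothetical training loop: each 'real_images' batch is read from disk on demand
# for real_images in dataset:
#     train_step(real_images)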
 

1. How can I load .nii images while saving memory?

To load .nii images while saving memory, you can use a specialized library such as Nibabel or ITK to handle the reading and writing of NIfTI files. Nibabel in particular reads only the header when a file is opened and exposes the voxel data through a lazy array proxy, so the full volume is not pulled into memory until you actually access it.
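A small illustration of that lazy behaviour with Nibabel (the file name here is hypothetical):

lazy loading with Nibabel:
import numpy as np
import nibabel as nib

img = nib.load('subject_001.nii')   # hypothetical file; only the header is read here

# img.dataobj is a lazy array proxy: slicing it reads just that slab from disk
middle_slice = np.asarray(img.dataobj[:, :, img.shape[2] // 2])

# if the full array is needed, ask for float32 and skip nibabel's internal cache
volume = img.get_fdata(dtype=np.float32, caching='unchanged')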

2. Will loading .nii images use a lot of memory?

The amount of memory used depends on the dimensions of the volume and the data type it is loaded as; Nibabel's get_fdata(), for example, returns float64 by default, which costs eight bytes per voxel unless you request a smaller floating-point type. Using a specialized library and loading the data lazily helps keep the footprint down.

3. Can I reduce the memory usage when loading multiple .nii images?

Yes, you can reduce the memory usage when loading multiple .nii images by loading them in batches rather than all at once, as in the generator sketched after post #2: only the images in the current batch need to be held in memory, so the overall footprint stays bounded.

4. Are there any techniques to optimize memory usage when loading .nii images?

Yes, there are several techniques that can help optimize memory usage when loading .nii images: load the data lazily through a library such as Nibabel, read the images in smaller chunks or batches, downcast the voxel data (for example from float64 to float32), and store the files compressed as .nii.gz to save disk space. Note that compression shrinks the files on disk, but the data still expand to their full size in memory once loaded.
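For instance, downcasting and compressed saving can be combined with Nibabel as in the sketch below (the file names are hypothetical):

downcasting and compression sketch:
import numpy as np
import nibabel as nib

img = nib.load('subject_001.nii')            # hypothetical input file
data = img.get_fdata(dtype=np.float32)       # float32 instead of the default float64

out = nib.Nifti1Image(data, img.affine)
out.set_data_dtype(np.float32)               # store as float32 on disk as well
nib.save(out, 'subject_001_f32.nii.gz')      # .gz extension: gzip-compressed NIfTI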

5. How can I check the memory usage when loading .nii images?

You can check the memory usage while loading .nii images with system monitoring tools (top, htop, Task Manager) or with Python's own memory-measurement facilities, such as the standard tracemalloc or resource modules. This helps identify which step of the loading process is responsible for the peak usage.
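A quick way to do that from inside the script itself, as a sketch: it uses the standard 'resource' module, which is available on Linux and macOS only (on Linux ru_maxrss is reported in kilobytes, on macOS in bytes), and the file name is hypothetical.

checking peak memory usage:
import resource
import numpy as np
import nibabel as nib

def peak_rss_mb():
    # peak resident set size of this process (ru_maxrss is KB on Linux)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

print('before load: %.1f MB' % peak_rss_mb())
vol = nib.load('subject_001.nii').get_fdata(dtype=np.float32)   # hypothetical file
print('after load:  %.1f MB (volume alone: %.1f MB)' % (peak_rss_mb(), vol.nbytes / 1e6))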
