Python Loading .nii images while saving memory

  • Thread starter: BRN
  • Tags: Images, Memory
AI Thread Summary
The discussion centers on the challenge of loading large .nii datasets for a Deep Learning project on a PC with limited RAM and swap memory. The user seeks a way to load the datasets incrementally to avoid memory overload, suggesting the need for an iterator that loads files on demand. They also mention having to reduce the batch size of the DCGAN algorithm because of memory constraints. Suggestions include providing the model with an iterator that loads each file on demand, possibly via TensorFlow's data-loading utilities. The conversation emphasizes the importance of efficient memory management in deep learning tasks.
BRN
Hello everybody,
for my Deep Learning exam, I have to develop a project that generates 3D images in .nii format using the DCGAN algorithm, trained on real MRI scans of the brains of patients with Alzheimer's disease.

I have a serious problem. I need to load three different datasets of 3 GB, 5 GB and 8 GB respectively. Unfortunately, I am forced to use an old PC with only 4 GB of RAM and 2 GB of swap memory, so it is impossible for me to load all the files at once with this simple code:

[CODE lang="python" title="loading files"]
train_data = []
data_path = './data/ADNI_test/'
for i in range(len(filenames)):
mri_file = data_path + filenames
train_data.append(nib.load(mri_file).get_fdata())
[/CODE]

Does anyone know an alternative solution? Is there a way to load the files a little at a time without overloading memory? In the DCGAN algorithm the batch size is set to 64 files, but I will certainly have to decrease it to about 30.

Thank you!
 
I wouldn't have thought it would be necessary to have all the image files in memory; can you provide the ML engine with an iterator that loads each file on demand?
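For example, something along these lines might work. This is just a rough sketch: I'm reusing the data_path and filenames from your snippet, assuming nibabel is available and every scan can be read independently, and picking a small batch size that should fit in 4 GB of RAM.

[CODE lang="python" title="on-demand loading (sketch)"]
import os
import numpy as np
import nibabel as nib
import tensorflow as tf

data_path = './data/ADNI_test/'
filenames = sorted(os.listdir(data_path))

def mri_generator():
    # Yield one volume at a time; only the current file is held in memory.
    for name in filenames:
        img = nib.load(os.path.join(data_path, name))
        yield img.get_fdata(dtype=np.float32)

# Wrap the generator in a tf.data pipeline so TF pulls scans on demand.
dataset = tf.data.Dataset.from_generator(
    mri_generator,
    output_signature=tf.TensorSpec(shape=(None, None, None), dtype=tf.float32)
).batch(8)  # choose a batch size that fits in your RAM

for batch in dataset:
    ...  # feed each batch to the training step instead of a preloaded list
[/CODE]

The generator only ever holds the current volume, and tf.data handles the batching, so memory use stays at roughly one batch at a time rather than the whole dataset.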
 
I hope it can be done, but at the moment I don't know how.
Should the iterator be implemented directly in the loss function of the GAN discriminator?

[CODE lang="python" title="discriminator"]
def discriminator_model(strides, kernel_size, input_img, weight_initializer, downsample_layers):
rate = 0.2
filters = input_img.shape[1]

model = Sequential()

model.add(tf.keras.layers.Conv3D(strides, kernel_size, filters, input_img.shape, padding = 'same',
kernel_initializer = weight_initializer))
model.add(tf.keras.layers.LeakyReLU())
model.add(tf.keras.layers.Dropout(rate = rate))

for l in range(downsample_layers - 1):
filters = int(filters * 2)
model.add(tf.keras.layers.Conv3D(strides, kernel_size, filters, padding = 'same'))
model.add(tf.keras.layers.LeakyReLU())
model.add(tf.keras.layers.Dropout(rate = rate))

model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(1))
return model

def discriminator_loss(real_output, fake_output):
real_loss = cross_entropy(tf.ones_like(real_output), real_output)
fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)
total_loss = real_loss + fake_loss
return total_loss
[/CODE]

where 'real_output' would be the discriminator's response to a real image, compared with 'fake_output' for the artificially generated one.
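Or should the loss stay as it is, with the iterator only feeding batches to the training step? Something like this rough sketch is what I imagine, where 'dataset' would be the on-demand iterator, 'discriminator' the model returned by discriminator_model, and 'generator', 'generator_loss', the two optimizers and 'epochs' are the ones already defined in my project:

[CODE lang="python" title="training step (sketch)"]
noise_dim = 100  # size of the latent vector, just an example value

@tf.function
def train_step(real_images):
    # One optimization step per batch pulled from the on-demand iterator
    noise = tf.random.normal([tf.shape(real_images)[0], noise_dim])
    with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:
        fake_images = generator(noise, training=True)
        real_output = discriminator(real_images, training=True)
        fake_output = discriminator(fake_images, training=True)
        gen_loss = generator_loss(fake_output)
        disc_loss = discriminator_loss(real_output, fake_output)
    gen_grads = gen_tape.gradient(gen_loss, generator.trainable_variables)
    disc_grads = disc_tape.gradient(disc_loss, discriminator.trainable_variables)
    generator_optimizer.apply_gradients(zip(gen_grads, generator.trainable_variables))
    discriminator_optimizer.apply_gradients(zip(disc_grads, discriminator.trainable_variables))

for epoch in range(epochs):
    for real_batch in dataset:  # only the current batch is held in memory
        train_step(real_batch)
[/CODE]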

Any ideas?
 