Python: Loading .nii images while saving memory

  • Thread starter: BRN
  • Tags: Images, Memory
AI Thread Summary
The discussion centers on the challenge of loading large .nii datasets for a Deep Learning project on a PC with limited RAM and swap memory. The user seeks alternatives to load the datasets incrementally to avoid memory overload, suggesting the need for an iterator to load files on demand. They also mention the necessity to adjust the batch size for the DCGAN algorithm due to memory constraints. Suggestions include implementing an iterator in the loss function of the GAN discriminator and utilizing TensorFlow's image loading utilities. The conversation emphasizes the importance of efficient memory management in deep learning tasks.
BRN
Hello everybody,
For my Deep Learning exam I have to develop a project that generates 3D images in .nii format using the DCGAN algorithm, trained on real MRI scans of the brains of patients with Alzheimer's disease.

I have a serious problem. I need to load three different datasets that weigh 3 GB, 5 GB and 8 GB respectively. Unfortunately I am forced to use an old PC with only 4 GB of RAM and 2 GB of swap, so it is impossible for me to load all the files at once using this simple code:

[CODE lang="python" title="loading files"]
import nibabel as nib

train_data = []
data_path = './data/ADNI_test/'
for i in range(len(filenames)):
    mri_file = data_path + filenames[i]  # index into the list, not the whole list
    train_data.append(nib.load(mri_file).get_fdata())
[/CODE]

Would any of you know an alternative solution? Is there a way to load the files a little at a time without overloading memory? In the DCGAN algorithm the batch size is set to 64 files, but I will certainly have to decrease it to about 30.

Thank you!
 
I wouldn't have thought it would be necessary to have all the image files in memory; can you provide the ML engine with an iterator that loads each file on demand?
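Such an on-demand loader could be sketched like this. This is only a minimal sketch, assuming the `filenames` list and `data_path` from your post; the `load_fn` hook is my own addition so the batching logic can be tried out without nibabel installed:

```python
import os

def mri_batches(filenames, data_path, batch_size=30, load_fn=None):
    """Yield batches of volumes, loading each .nii file only when it is needed."""
    if load_fn is None:
        # imported lazily so the sketch stays self-contained
        import nibabel as nib
        load_fn = lambda path: nib.load(path).get_fdata()
    batch = []
    for name in filenames:
        batch.append(load_fn(os.path.join(data_path, name)))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch
```

Consumed this way, only one batch of volumes lives in memory at a time instead of the whole dataset, and a framework utility such as `tf.data.Dataset.from_generator` can wrap the generator directly for training.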
 
I hope it can be done, but at the moment I don't know how.
Should the iterator be implemented directly in the loss function of the GAN discriminator?

[CODE lang="python" title="discriminator"]
def discriminator_model(strides, kernel_size, input_img, weight_initializer, downsample_layers):
    rate = 0.2
    filters = input_img.shape[1]

    model = Sequential()

    # Conv3D takes (filters, kernel_size) positionally; strides is a keyword argument
    model.add(tf.keras.layers.Conv3D(filters, kernel_size, strides=strides, padding='same',
                                     input_shape=input_img.shape,
                                     kernel_initializer=weight_initializer))
    model.add(tf.keras.layers.LeakyReLU())
    model.add(tf.keras.layers.Dropout(rate=rate))

    for l in range(downsample_layers - 1):
        filters = int(filters * 2)
        model.add(tf.keras.layers.Conv3D(filters, kernel_size, strides=strides, padding='same'))
        model.add(tf.keras.layers.LeakyReLU())
        model.add(tf.keras.layers.Dropout(rate=rate))

    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dense(1))
    return model

def discriminator_loss(real_output, fake_output):
    real_loss = cross_entropy(tf.ones_like(real_output), real_output)
    fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)
    total_loss = real_loss + fake_loss
    return total_loss
[/CODE]

where 'real_output' is the discriminator's output on a real image, which is compared with its output on the artificially generated one.

Any ideas?
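One further saving, independent of batching: `get_fdata()` returns float64 by default, so every volume occupies twice the memory it needs for training; nibabel lets you request a smaller floating type with `nib.load(path).get_fdata(dtype=np.float32)`. A quick sketch of the difference, using a hypothetical 128×128×128 volume as an example:

```python
import numpy as np

# Memory footprint of one hypothetical 128x128x128 MRI volume
shape = (128, 128, 128)
voxels = int(np.prod(shape))

bytes_f64 = voxels * np.dtype(np.float64).itemsize  # get_fdata() default
bytes_f32 = voxels * np.dtype(np.float32).itemsize  # get_fdata(dtype=np.float32)

print(bytes_f64 // 2**20, "MiB as float64 vs", bytes_f32 // 2**20, "MiB as float32")
# → 16 MiB as float64 vs 8 MiB as float32
```

Halving the per-volume footprint also lets you keep the batch size closer to the original 64 before running out of RAM.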
 