MHB Maximising the difference between multiple distributions

AI Thread Summary
The discussion focuses on developing a parent loss function for a neural network model that utilizes the quad-tree compression algorithm to process images. The goal is to map a noise vector directly to an image while maximizing the Kullback-Leibler (KL) divergence between segments identified by the quad-tree method. This involves iteratively de-resolving the image to find new segments and optimizing the distance between them, with an emphasis on segments at higher resolutions. Additionally, the approach must account for each RGB channel separately. The intention is to create a loss function for each image in the training set, leveraging the quad-tree algorithm for parameter extraction.
moyo
I am trying to come up with a parent loss function for the following neural network model. An outline of the algorithm for processing an image would also be helpful.

The quad-tree compression algorithm recursively divides an image into increasingly small square segments, and stops subdividing a region when all of its pixels have the same value.
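A minimal sketch of one way this segmentation step might look, assuming a single-channel, square, power-of-two image with values in [0, 1]; the function name `quadtree_segments`, the minimum segment size, and the tolerance parameter are my own choices, not part of the original description:

```python
import numpy as np

def quadtree_segments(channel, x=0, y=0, size=None, min_size=2, tol=0.0):
    """Return a list of (x, y, size) boxes covering the channel."""
    if size is None:
        size = channel.shape[0]          # assumes a square, power-of-two image
    region = channel[y:y + size, x:x + size]
    # Stop when the region is (nearly) uniform or too small to split further.
    if size <= min_size or region.max() - region.min() <= tol:
        return [(x, y, size)]
    half = size // 2
    segments = []
    for dy in (0, half):
        for dx in (0, half):
            segments += quadtree_segments(channel, x + dx, y + dy, half,
                                          min_size, tol)
    return segments
```

For example, `boxes = quadtree_segments(img[:, :, 0])` would give the segment boxes for the red channel of an image `img`.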

I would like to map a noise vector directly to an image. In addition, the loss function should maximise the KL divergence between the segments found by running the quad-tree algorithm on the image.
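One concrete (assumed) reading of "maximise the KL divergence between the segments": treat each segment's pixel values as an empirical histogram and sum the symmetrised pairwise KL divergences, returning the negative so it can be minimised as a loss. The function names, the binning, and the symmetrisation are assumptions on my part:

```python
import numpy as np

def segment_histogram(channel, box, bins=32, eps=1e-8):
    """Empirical distribution of pixel values inside one quad-tree box."""
    x, y, size = box
    values = channel[y:y + size, x:x + size].ravel()
    hist, _ = np.histogram(values, bins=bins, range=(0.0, 1.0))
    hist = hist.astype(np.float64) + eps        # avoid log(0)
    return hist / hist.sum()

def pairwise_kl_loss(channel, boxes, bins=32):
    """Negative sum of symmetrised pairwise KL divergences (to be minimised)."""
    hists = [segment_histogram(channel, b, bins) for b in boxes]
    total = 0.0
    for i, p in enumerate(hists):
        for q in hists[i + 1:]:
            total += np.sum(p * np.log(p / q))   # KL(p || q)
            total += np.sum(q * np.log(q / p))   # symmetrise
    return -total
```

Note that hard histograms and hard quad-tree splits are not differentiable, so in an actual training loop a relaxed counterpart (e.g. soft binning) would be needed.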

This is involved because we have to iteratively de-resolve the image slightly and find the new segments after each step, and then maximise the distance between all of these segments at the same time, perhaps with a bias towards the segments found at the highest resolution.
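A sketch of how the iterative de-resolution could be combined with the segment loss, reusing the helpers from the sketches above; the number of levels, the block-averaging down-sampler, and the geometric weight that biases towards the highest resolution are all assumptions:

```python
def multires_loss(channel, levels=3, decay=0.5):
    """Accumulate the segment loss over successively de-resolved copies."""
    loss, weight = 0.0, 1.0
    current = channel
    for _ in range(levels):
        boxes = quadtree_segments(current)
        loss += weight * pairwise_kl_loss(current, boxes)
        weight *= decay                           # bias towards high resolution
        # Crude 2x down-sampling by block averaging ("de-resolving").
        h, w = current.shape[0] // 2, current.shape[1] // 2
        current = current[:2 * h, :2 * w].reshape(h, 2, w, 2).mean(axis=(1, 3))
    return loss
```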

Another consideration is that this should happen separately for each of the RGB channels.
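The per-channel handling could then simply sum the same loss over the three RGB channels, for example:

```python
def rgb_loss(image, levels=3):
    """image: H x W x 3 array with values in [0, 1]."""
    return sum(multires_loss(image[:, :, c], levels) for c in range(3))
```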

Thank you!
If this gets into a paper I will mention contributors :)
 
So there will be a loss function for each image in the training set, and we process each image with the quad-tree algorithm beforehand in order to obtain its parameters.
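If the quad-tree parameters are extracted from the training images ahead of time, a pre-processing pass along these lines (the dataset layout is an assumption) could cache the segment boxes once per image and per channel:

```python
def precompute_segments(dataset):
    """dataset: iterable of H x W x 3 arrays; returns per-image, per-channel boxes."""
    cache = []
    for image in dataset:
        cache.append([quadtree_segments(image[:, :, c]) for c in range(3)])
    return cache
```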
 