I have a set of stars, each of which has a magnitude in the K band (K) and a 1-sigma uncertainty (dK) associated with it. For a program I'm writing, I need to check whether each star lies between a maximum K magnitude (K_max) and a minimum K magnitude (K_min). I also need to find the probability (p) that each star lies between these two values. For example, if star 1 has magnitude K_1 with 1-sigma uncertainty dK_1, I need to find the probability p that K_1 lies between K_min and K_max.

Extra info: The reason I want this probability p is to use it as an uncertainty. For each star, I'm going to check whether its K lies between K_min and K_max, and assign a value F = 1 if it does and F = 0 if it does not. Then I want to use p to find the uncertainty of F, or dF. If F = 1, then I'll use dF = p. If F = 0, then I'll use dF = (1 - p). I want dF so I can use it in an adaptive binning program that someone has already written.
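In case it helps to make the question concrete, here is a minimal sketch of the calculation I have in mind, assuming the measurement error on K is Gaussian with standard deviation dK (the function name and the example numbers are just placeholders):

```python
import math

def prob_in_range(K, dK, K_min, K_max):
    """Probability that the true magnitude lies in [K_min, K_max],
    assuming a Gaussian error model: true K ~ N(measured K, dK**2)."""
    # Gaussian CDF evaluated via the error function (stdlib only)
    cdf = lambda x: 0.5 * (1.0 + math.erf((x - K) / (dK * math.sqrt(2.0))))
    return cdf(K_max) - cdf(K_min)

# Hypothetical star: measured K = 14.2 mag, dK = 0.1 mag
K, dK = 14.2, 0.1
K_min, K_max = 13.0, 15.0

p = prob_in_range(K, dK, K_min, K_max)
F = 1 if K_min <= K <= K_max else 0   # inside the range or not
dF = p if F == 1 else 1.0 - p          # the uncertainty rule described above
```

As a sanity check, for a star exactly one sigma inside both limits on each side (e.g. K = 0, dK = 1, limits at -1 and +1) this gives p ≈ 0.6827, the usual 1-sigma coverage.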