1. The problem statement, all variables and given/known data

If the intrinsic brightness of a class of radio galaxies fades as the universe ages, such that those we see farther away are brighter, with luminosity increasing linearly with distance r, what would be the slope of a graph of log N(S) versus log S for these objects?

2. Relevant equations

S = L/(4πr²)

3. The attempt at a solution

I know that the slope would usually be -1.5, but since the sources we see farther away are brighter, I would expect to see more objects above any flux threshold S. So I'm thinking the slope would not be -1.5. Instead of using S = L/(4πr²) with constant L, should I take L ∝ r, since the luminosity increases linearly with r?
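As a sanity check on the -1.5 figure quoted in the attempt, here is a minimal numerical sketch of the standard Euclidean source count: for sources of fixed luminosity L spread uniformly with number density n, a source exceeds flux S when r < r_max = sqrt(L/(4πS)), so N(>S) = n·(4/3)π·r_max³ ∝ S^(-3/2). The function name `n_above` and the unit values of L and n are illustrative choices, not anything from the problem statement.

```python
import numpy as np

def n_above(S, L=1.0, n=1.0):
    # Number of sources brighter than flux S, assuming fixed luminosity L
    # and uniform number density n in Euclidean space.
    # A source exceeds S when r < r_max = sqrt(L / (4 pi S)).
    r_max = np.sqrt(L / (4 * np.pi * S))
    return n * (4.0 / 3.0) * np.pi * r_max**3

# Slope of log N(>S) versus log S over three decades in flux.
S = np.logspace(-3, 0, 50)
slope = np.polyfit(np.log10(S), np.log10(n_above(S)), 1)[0]
print(round(slope, 3))  # -1.5 for the constant-luminosity case
```

The same script can be reused to test the modified problem by changing how L depends on r inside `n_above`.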