Feb19-10, 02:51 AM | #1
1. The problem statement, all variables and given/known data
If the intrinsic brightness of a class of radio galaxies fades as the universe ages, such that the ones we see farther away are brighter, with luminosity increasing linearly with distance r, what would be the slope of a graph of log N(S) versus log S for these objects?
2. Relevant equations
3. The attempt at a solution
I know that usually the slope would be -1.5, but since it says the objects we see farther away are brighter, I would expect to see more objects above a given flux threshold S. So I'm thinking the slope would not be -1.5.
Instead of using S = L/(4*pi*r^2) with constant L, should I take L proportional to r, since it says the luminosity increases linearly with r?
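One way to check this reasoning numerically: scatter sources uniformly in a sphere, assign each a luminosity proportional to its distance (the assumption stated in the problem), compute the fluxes S = L/(4*pi*r^2), and fit the slope of log N(>S) against log S. This is just a sanity-check sketch of the L proportional to r assumption, not a derivation; the constant of proportionality and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Uniform spatial density in a sphere of radius 1:
# P(distance < x) is proportional to x^3, so draw r by inverse transform.
r = rng.random(n) ** (1.0 / 3.0)

# Problem's assumption: luminosity grows linearly with distance, L = k*r
# (k = 1 here; the constant only shifts the curve, not its slope).
L = r
S = L / (4 * np.pi * r**2)  # observed flux

# Cumulative counts N(>S): sort fluxes from brightest to faintest,
# so the rank of each source is the number of sources brighter than it.
S_sorted = np.sort(S)[::-1]
N = np.arange(1, n + 1)

# Fit the slope of log N(>S) vs log S, skipping the bright end
# where Poisson noise in the small counts dominates.
mask = (N > 1000) & (N < n // 10)
slope, _ = np.polyfit(np.log10(S_sorted[mask]), np.log10(N[mask]), 1)
print(f"fitted slope: {slope:.2f}")
```

With L constant this same setup recovers the usual -1.5; with L proportional to r the fitted slope comes out steeper, consistent with the intuition that brightening distant sources pushes more objects above any flux cut.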