Hey guys, probably a stupid question, but I somehow don't get it. I'm reading an article about dark matter in elliptical galaxies. At one point they mention that while for detecting DM in spiral galaxies we measure the rotational velocity of the stars around the center, for detecting DM in elliptical galaxies we measure the velocity dispersion. They then write: "this gives us an estimate of how fast the stars move as a function of the distance from the center."

What I don't get is:

1. How do we get this estimate? As I understand dispersion, I could have a dispersion of 400 km/s but still have an average of 0. Or I could have a dispersion of 10 km/s with an average of 1000 km/s. How are the average and the dispersion connected?

2. Why do we talk about the dispersion and not the mean velocity then?

Thanks a lot - this is really important!

Tomer.
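To make my confusion concrete (a minimal sketch with made-up numbers, not from the article): if dispersion just means the standard deviation of the line-of-sight velocities, then the mean and the dispersion are independent statistics, which is exactly why I don't see how one gives you the other:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical line-of-sight velocities in km/s, purely illustrative:
# a "hot" system: large dispersion around a mean near zero
hot = rng.normal(loc=0.0, scale=400.0, size=100_000)
# a "cold but fast" system: small dispersion around a large mean
cold_fast = rng.normal(loc=1000.0, scale=10.0, size=100_000)

for name, v in [("hot", hot), ("cold_fast", cold_fast)]:
    print(f"{name}: mean = {v.mean():7.1f} km/s, dispersion (std) = {v.std():6.1f} km/s")
```

So the "hot" sample has a dispersion of about 400 km/s with a mean near 0, and the "cold but fast" one has a dispersion of about 10 km/s with a mean near 1000 km/s. Both situations are possible, which is my question 1 in numbers.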