Sure we are. Noise is the random variation of a signal away from a theoretical 'ideal' value. If I take an image of a star and count the number of electrons generated in the sensor's pixels by that star's light, I'll get some value, X. I then take another identical image and count the electrons again. This time X is a bit higher or lower than before. I take a third identical image and do the same thing, again getting a slightly different value for X.

Now let's say I keep taking images, and after each one I average all the values so far to get the mean electron count, Y. As the number of images grows towards infinity, Y approaches some particular value (what statisticians call the expected value), which I'll call the signal's theoretical perfect value. This theoretical perfect value is the value I'd get for every X in a world without any sources of noise. But noise causes X to vary around Y's value in each image. Is that any clearer?
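
A quick simulation makes this concrete. The sketch below (hypothetical numbers: a "perfect" count of 10,000 electrons per exposure) models shot noise with a Gaussian approximation to the Poisson distribution, where the spread is the square root of the mean. Each simulated exposure gives a different X, but the running mean Y settles towards the true value:

```python
import math
import random

random.seed(0)

TRUE_SIGNAL = 10_000  # hypothetical 'perfect' electron count per exposure

def take_exposure():
    # Shot noise is Poisson-distributed; for large counts it is well
    # approximated by a Gaussian with sigma = sqrt(mean).
    return random.gauss(TRUE_SIGNAL, math.sqrt(TRUE_SIGNAL))

counts = [take_exposure() for _ in range(10_000)]
running_mean = sum(counts) / len(counts)

print(counts[0], counts[1], counts[2])  # each X differs a little
print(running_mean)                     # Y sits very close to 10,000
```

With 10,000 exposures the standard error of the mean is about sqrt(10,000)/sqrt(10,000) = 1 electron, so Y lands within a few electrons of the true value even though any single X can be off by a hundred or more.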