
## Homework Statement

You observe a star with a count rate of 0.1 counts per second, and the background is 0.05 counts per second.

(a) How long do you need to observe in order to detect 100 counts from the star? [2] - 1000 seconds
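The arithmetic behind part (a) is just target counts divided by the source rate; a minimal sketch (all values taken from the problem statement):

```python
# Exposure time needed to accumulate a target number of source counts.
source_rate = 0.1    # counts per second from the star, given in the problem
target_counts = 100  # counts wanted from the star

exposure_time = target_counts / source_rate
print(exposure_time)  # 1000.0 seconds
```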

(b) Estimate the signal-to-noise reached in the exposure time from (a), assuming the background is the only source of noise.

For a snapshot I know that S/N = 0.1/0.05 = 2 sigma, which isn't really considered a result in astronomy.

However, if the exposure time is increased, the S/N ratio also increases. Really, I just need an equation so I can calculate it!

I know that S is proportional to t, and N is proportional to the square root of t.

A bit of help would be appreciated. I can't seem to find what I'm after on Google, and it seems like it's going to be really simple! I just need the equations!
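Putting those two scalings together (signal grows as t, background noise as the square root of t), a short sketch of the background-limited S/N calculation might look like this. The formula S/N = R_source · t / sqrt(R_background · t) is my assumption of what the exercise intends, using the rates from the problem and the 1000 s from part (a):

```python
import math

def signal_to_noise(source_rate, background_rate, t):
    """Background-limited S/N: signal scales as t, noise as sqrt(t)."""
    signal = source_rate * t                # total counts from the source
    noise = math.sqrt(background_rate * t)  # Poisson noise on the background counts
    return signal / noise

# Rates from the problem, with t = 1000 s from part (a)
snr = signal_to_noise(0.1, 0.05, 1000.0)
print(round(snr, 2))  # ~14.14
```

Note how the ratio scales as sqrt(t): quadrupling the exposure time doubles the S/N, which is why a longer exposure turns a marginal 2-sigma snapshot into a usable detection.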