Astronomy - Signal to noise

  • Thread starter leonmate

Homework Statement

You observe a star with a count rate of 0.1 counts per second, and the background
is 0.05 counts per second.
(a) How long do you need to observe in order to detect 100 counts from the
star? [2] (my answer: 1000 seconds)
(b) Estimate the signal-to-noise reached in the exposure time from (a) (assuming
the background is the only source of noise).


For a snapshot I know that S/N = 0.1/0.05 = 2 sigma, which isn't really considered a detection in astronomy.
However, if the exposure time is increased the S/N also increases, and really I just need an equation so I can calculate it!

I know that S is proportional to t and N is proportional to the square root of t.

A bit of help would be appreciated. I can't seem to find what I'm after on Google, and it seems like it's going to be really simple! Just need the damn equations!
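The two proportionalities above are enough to finish the problem: in the background-limited case, the signal is S = R_star · t and the noise is N = √(R_bg · t), so S/N grows as √t. A minimal sketch of that arithmetic, assuming Poisson counting statistics and that the background really is the only noise source (the variable names here are just illustrative):

```python
import math

# Given count rates (counts per second), from the problem statement
rate_star = 0.1    # source count rate
rate_bg = 0.05     # background count rate

# (a) Exposure time needed to collect 100 counts from the star
target_counts = 100
t = target_counts / rate_star          # 100 / 0.1 = 1000 s

# (b) Background-limited S/N: signal scales as t, noise as sqrt(t)
signal = rate_star * t                 # 100 counts
noise = math.sqrt(rate_bg * t)         # sqrt(50) counts of background noise
snr = signal / noise                   # about 14, comfortably above 2 sigma

print(f"t = {t:.0f} s, S/N = {snr:.1f}")
```

Note how doubling the exposure time would raise the S/N only by a factor of √2, which is why the snapshot estimate of 2 is so much lower than the value reached after 1000 s.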
 
