
Astronomy - Signal to noise

  1. Nov 6, 2012 #1
    1. The problem statement, all variables and given/known data

    You observe a star with a count rate of 0.1 counts per second and the background
    is 0.05 counts per second.
    (a) How long do you need to observe for in order to detect 100 counts from the
    star? [2] - 1000 seconds
    (b) Estimate the signal-to-noise reached in the exposure time from (a) (assuming
    the background is the only source of noise).


    For a snapshot I know that S/N = 0.1/0.05 = 2 sigma, which isn't really considered a detection in astronomy.
    However, if the exposure time is increased, the S/N ratio also increases; I just need an equation so I can calculate it!

    I know that S is proportional to t
    N is proportional to the square root of t
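
    Putting those two proportionalities together, a common form (assuming background-limited Poisson noise, as the question says) is S/N = S·t / √(B·t), where S and B are the source and background count rates. A quick sketch to check the numbers (variable names are my own):

    ```python
    import math

    # Given count rates (counts per second)
    signal_rate = 0.1      # star
    background_rate = 0.05 # background

    # (a) time needed to accumulate 100 counts from the star
    t = 100 / signal_rate  # -> 1000 s

    # (b) background-limited S/N: signal counts over sqrt(background counts),
    # since Poisson noise grows as the square root of the accumulated counts
    snr = signal_rate * t / math.sqrt(background_rate * t)

    print(t, snr)  # 1000.0, about 14.1
    ```

    With t = 1000 s that gives S/N = 100/√50 ≈ 14, which is why longer exposures turn a marginal 2-sigma snapshot into a solid detection.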

    A bit of help would be appreciated. I can't seem to find what I'm after on Google, and it seems like it's going to be really simple! Just need the damn equations!
     