1. The problem statement, all variables and given/known data

An optical communication system operating at λ = 1550 nm transmits pulses at 10 Gb/s. The magnitude of the optical pulses is the same. Calculate the number of photons received within each bit. Assume the received average optical power is 10 mW.

2. Relevant equations

E_p = hc/λ
E_b = Pτ
N = E_b/E_p

3. The attempt at a solution

I calculated the energy per photon: E_p = hc/λ → 1.28×10^-19 J.
Then I calculated the energy per bit: E_b = Pτ → (0.01)(10^-9) = 10^-11 J.
In class we did a similar example where the number of photons is N = E_b/E_p. Doing that here I get 78 million... that seems way too big to me. Can someone explain what I did wrong, if it's wrong?
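In case it helps to check the arithmetic, here is a minimal Python sketch of the steps in the attempt, using the same numbers from the post (τ = 10^-9 s is the bit slot used in the attempt; note that a 10 Gb/s stream actually has a bit period of 1/(10×10^9) s = 10^-10 s):

```python
# Sketch reproducing the calculation in the attempt above.
h = 6.626e-34      # Planck constant, J*s
c = 3.0e8          # speed of light, m/s
lam = 1550e-9      # wavelength, m
P = 10e-3          # received average optical power, W

E_p = h * c / lam  # energy per photon, ~1.28e-19 J as in the post

tau = 1e-9         # bit slot used in the attempt, s
# NB: at 10 Gb/s the bit period is 1 / 10e9 = 1e-10 s, not 1e-9 s
E_b = P * tau      # energy per bit
N = E_b / E_p      # photons per bit, ~7.8e7 (i.e. ~78 million)

print(f"E_p = {E_p:.3e} J, E_b = {E_b:.3e} J, N = {N:.3e} photons/bit")
```

Running this reproduces the 78-million figure, so the arithmetic itself is consistent; the question is whether the inputs (in particular the bit slot τ) match the stated bit rate.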