
Wireless Internet

  1. Sep 24, 2011 #1
    1. The problem statement, all variables and given/known data


    Consider the multi-path propagation phenomenon discussed in class. Assume the signal from a sender takes 4 paths to arrive at the receiver, and the delay along each path is 2, 6, 8, 12 (in microseconds), respectively. Each symbol is 1 bit long. Two symbols can be successfully received/detected at the receiver if their received impulses are at least 1 microsecond apart. What is the maximum allowed transmission rate from the sender to the receiver?

    2. Relevant equations

    The delay along each path is 2, 6, 8, and 12 microseconds, respectively.


    3. The attempt at a solution

    From the first path to the last path, the total delay spread is 12 − 2 = 10 microseconds. But I don't understand this sentence: "Two symbols can be successfully received/detected at the receiver if their received impulses are at least 1 microsecond apart."
     
  3. Sep 24, 2011 #2

    NascentOxygen

    Staff: Mentor

    If the direct signal is interfered with by a reflected signal, the receiver must still be able to detect the correct TRUE or FALSE condition for each symbol. It can manage this provided the symbols don't fully overlap: a separation of at least 1 microsecond allows each symbol to be correctly detected. (A Python sketch of this spacing condition follows the diagram below.)
    Code (Text):
          _______      
    _____|       |______
           _______                
    ______|       |______
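    A quick way to sanity-check this in Python (just a sketch, not from the thread: the variable names, and the reading of the 1-microsecond rule as a gap between the latest echo of one symbol and the earliest echo of the next, are my assumptions):
    Code (Python):
    PATH_DELAYS_US = [2, 6, 8, 12]   # delay along each path, in microseconds
    MIN_GAP_US = 1                   # required spacing between received impulses

    def symbols_detectable(bit_period_us):
        # Symbol n is sent at t = 0 and symbol n+1 at t = bit_period_us; each
        # arrives once per path. The tightest squeeze is the latest impulse of
        # symbol n against the earliest impulse of symbol n+1.
        latest_of_n = max(PATH_DELAYS_US)
        earliest_of_next = bit_period_us + min(PATH_DELAYS_US)
        return earliest_of_next - latest_of_n >= MIN_GAP_US

    # Smallest whole-microsecond bit period that keeps consecutive symbols apart.
    T_us = next(t for t in range(1, 100) if symbols_detectable(t))
    print(T_us, "microseconds per bit ->", 1e6 / T_us, "bits per second")
    Under that reading the search returns 11 microseconds per bit (about 90.9 kbit/s), but treat that as one interpretation of the spacing rule, not the definitive answer.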
     
  4. Sep 24, 2011 #3
    Thank you for your reply.
    But I still don't understand how to work out the maximum allowed transmission rate from the sender to the receiver.
     
  5. Sep 25, 2011 #4
    Bump.
     
  6. Sep 28, 2011 #5

    NascentOxygen

    Staff: Mentor

    You have an advantage over us, since you were present when this was discussed in class.

    But I'm guessing that the minimum time needed for the transmission of a single bit is double the maximum delay between the direct signal and its echo, plus the minimum detection window. See how I arrived at this:
    Code (Text):

    TTTTTTTTTTTTTTTTTTTDDD______________________   : direct signal
                          TTTTTTTTTTTTTTTTTTTDDD   : delayed by 10 microsecs
    TTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTDDD   : sum of above
    The string of consecutive T's represents the pulses that make up just one bit, and the leading blanks on the second line represent the delay. (Max delay is 10 microsecs.) The D's at the end represent the pulses that make up the minimum detection window. The bottom line represents what the receiver sees: the sum of the direct signal plus its worst echo.

    The above is speculative. I may be totally wrong. :smile:
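    Putting numbers to that guess (again speculative; this just restates the arithmetic in the diagram as Python):
    Code (Python):
    # Mirrors the guess above: bit time = 2 * (max delay spread) + detection window.
    max_echo_delay_us = 12 - 2    # spread between earliest and latest path: 10 us
    detect_window_us = 1          # minimum detection window from the problem

    bit_time_us = 2 * max_echo_delay_us + detect_window_us    # = 21 us per bit
    print(bit_time_us, "microseconds per bit ->", 1e6 / bit_time_us, "bits per second")
    # prints: 21 microseconds per bit -> ~47619 bits per second
    If the guess holds, the maximum rate would be roughly 47.6 kbit/s, but as noted above it may be wrong.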
     