Max Transmission Rate for Multi-Path Propagation with 4 Paths and Varying Delays

  • Thread starter: jessesiu
  • Tags: Internet, Wireless
AI Thread Summary
The discussion centers on calculating the maximum transmission rate for a signal experiencing multi-path propagation with delays of 2, 6, 8, and 12 microseconds. The total delay from the first to the last path is 10 microseconds, and symbols can only be detected if they are spaced at least 1 microsecond apart. To achieve the maximum allowed transmission rate, the minimum time for transmitting a single bit must account for both the maximum delay and the detection window. The proposed method suggests that the minimum time needed for transmission is double the maximum delay plus the detection window. Understanding these parameters is crucial for determining the effective transmission rate from sender to receiver.
jessesiu
1. Homework Statement


Consider the multi-path propagation phenomenon discussed in class. Assume the signal from a sender takes 4 paths to arrive at the receiver, and the delays along the paths are 2, 6, 8, and 12 micro-seconds, respectively. Each symbol is 1 bit long. Two symbols can be successfully received/detected at the receiver if their received impulses are at least 1 micro-second apart. What is the maximum allowed transmission rate from the sender to the receiver?

2. Homework Equations

The delays along the four paths are 2, 6, 8, and 12 micro-seconds, respectively.


3. The Attempt at a Solution

From the first path to the last path, the total delay is 10 micro-seconds. But I don't understand this sentence: "Two symbols can be successfully received/detected at the receiver if their received impulses are at least 1 micro-second apart."
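For reference, the 10-microsecond figure is just the spread between the fastest and slowest paths. A minimal Python sketch of that arithmetic (the variable names are illustrative, not from the thread):
Code:
# Path delays from the problem statement, in micro-seconds.
path_delays_us = [2, 6, 8, 12]

# Delay spread: gap between the first and last arrivals of the same impulse.
delay_spread_us = max(path_delays_us) - min(path_delays_us)
print(delay_spread_us)  # 10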
 
If the direct signal suffers interference from a reflected signal, the receiver must still be able to detect the correct TRUE or FALSE condition for each symbol. It can manage this provided the symbols don't fully overlap. A difference of at least 1 micro-second allows the symbol to be correctly detected.
Code:
      _______       
_____|       |______
       _______                 
______|       |______
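As a minimal Python sketch of that rule, assuming the four path delays from the problem statement, one can check that the echoes of a single impulse arrive far enough apart to be told apart:
Code:
# Arrival times of one transmitted impulse along the four paths,
# in micro-seconds (from the problem statement).
arrivals_us = sorted([2, 6, 8, 12])

# Detection rule from the problem: two received impulses are
# distinguishable if they are at least 1 micro-second apart.
window_us = 1
gaps = [later - earlier for earlier, later in zip(arrivals_us, arrivals_us[1:])]
print(gaps)                                   # [4, 2, 4]
print(all(gap >= window_us for gap in gaps))  # True: each echo is detectable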
 
NascentOxygen said:
If the direct signal suffers interference from a reflected signal, the receiver must still be able to detect the correct TRUE or FALSE condition for each symbol. It can manage this provided the symbols don't fully overlap. A difference of at least 1 micro-second allows the symbol to be correctly detected.
Code:
      _______       
_____|       |______
       _______                 
______|       |______

Thank you for your reply.
But I still don't understand how to work out the maximum allowed transmission rate from the sender to the receiver.
 
upupup...
 
You have an advantage over us, since you were present when this was discussed in class.

But I'm guessing that the minimum time needed for the transmission of a single bit is double the maximum delay between the direct signal and its echo, plus the minimum detection window. See how I arrived at this:
Code:
TTTTTTTTTTTTTTTTTTTTT____________________   : direct signal
__________TTTTTTTTTTTTTTTTTTTTT__________   : worst echo, delayed by 10 microsecs
TTTTTTTTTTTTTTTTTTTTTTTTTTTTTTT__________   : sum of above
Each character represents 1 microsecond. The string of consecutive T's on the first line represents the pulses that make up just one bit: 21 microsecs, i.e. double the 10-microsec maximum delay plus the 1-microsec minimum detection window. The underscores represent silence. The bottom line represents what the receiver sees: the sum of the direct signal plus its worst echo.

The above is speculative. I may be totally wrong. :smile:
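Taking that guess at face value, the numbers work out as follows; a minimal Python sketch, assuming the speculative bit-time formula above:
Code:
# Speculative formula from the post above: minimum bit time is double the
# maximum delay between the direct signal and its echo, plus the minimum
# detection window.  All times in micro-seconds.
max_delay_us = 12 - 2   # spread between slowest and fastest paths
window_us = 1           # minimum separation needed for detection

bit_time_us = 2 * max_delay_us + window_us   # 21 micro-seconds per bit
rate_bps = 1e6 / bit_time_us                 # micro-seconds -> bits per second
print(bit_time_us)                           # 21
print(round(rate_bps))                       # 47619, i.e. roughly 47.6 kbit/s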
 