Error injection using a PC-based modulator

SUMMARY

The discussion centers on the impact of error injection, using a PC-based modulator, on the Bit Error Rate (BER) of a transport stream. Participants noted that while the BER remained stable during testing, the count of transport-stream error packets increased. The conversation highlighted the role of the X.25 packet data protocol and cyclic redundancy checks (CRC) in managing data integrity, suggesting that error detection and correction mechanisms may obscure the true BER in noisy environments.

PREREQUISITES
  • Understanding of transport stream protocols
  • Familiarity with the X.25 packet data protocol
  • Knowledge of cyclic redundancy check (CRC) mechanisms
  • Basic concepts of Bit Error Rate (BER) measurement
NEXT STEPS
  • Research error injection techniques using PC-based modulators
  • Explore advanced error detection and correction methods in data transmission
  • Learn about the implications of the X.25 protocol for network performance
  • Investigate the relationship between transport stream errors and BER metrics
USEFUL FOR

Network engineers, telecommunications professionals, and anyone involved in data transmission and error management will benefit from this discussion.

kirubanithi
Does injecting errors into the transport stream using a PC-based modulator affect the BER of the stream? In my testing, the BER does not vary at all, but the transport-stream error packet count increases. Could anyone please explain the above behaviour?
 
Hi Kirubanithi
It has been a long time since my training; not only have I forgotten much, but much has changed. I also admit to some guesswork in what follows.

It sounds to me as though you are sending yourself some data, tampering with the data stream, and watching the error rate at the receive end of the bearer.
I believe any packets found to be corrupted at the receive end will be resent under the command of the application that wants the data.
The numbered packet has possibly gone missing because it was blocked at the X.25 level (packet data protocol) by the X.25 modems in the local concentrator, whose functions always include checking the CRC (cyclic redundancy check) sent as part of each packet.

After all, nobody is interested in corrupt data, and data corrupted in transit (by a faulty cable, for instance) can be prevented from uselessly loading the network by careful checking so that only valid data is forwarded.

(sorry no snappy signature yet)

Anyway, that's what I think I would have said 20 years ago.
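To make the CRC gating described above concrete, here is a minimal Python sketch. It uses zlib's CRC-32 purely as a stand-in for whatever polynomial the actual link layer uses; the framing is invented for illustration, not taken from any real X.25 implementation. A single injected bit error is enough to make the receiver discard the frame, which is why corrupted data never reaches the application:

```python
import zlib

def frame_with_crc(payload: bytes) -> bytes:
    """Append a CRC-32 to the payload, as a link layer might."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check_and_strip(frame: bytes):
    """Return the payload if the CRC matches, else None (frame dropped)."""
    payload, received = frame[:-4], int.from_bytes(frame[-4:], "big")
    return payload if zlib.crc32(payload) == received else None

good = frame_with_crc(b"transport stream packet")
bad = bytearray(good)
bad[3] ^= 0x01  # inject a single bit error, as the modulator would

assert check_and_strip(good) == b"transport stream packet"
assert check_and_strip(bytes(bad)) is None  # corrupted frame is discarded
```

The dropped frame would then be counted as an errored packet and retransmitted, exactly the behaviour described above.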
 
Following the above explanation, is there any possibility of an increase in the BER of the stream? Or, if there is no increase in the BER, could you please explain the cause?
 
Hi again Kirubanithi
If my notion above is correct, I would elaborate by mentioning that more than one way of measuring 'bit error rate' exists.
From the point of view of a high-level application sending pseudo-random data over a noisy packet link, the link appears slower but still noise-free, up to some limit. This is an artifact of the protocol, and in fact the whole reason for the packet protocol's existence.
So the error detection and correction in a layer of the communications system beyond the ken of your BER software may be responsible for the intriguing effect.
It's a suggestion, no more.
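That masking effect can be shown with a toy simulation. All the numbers here are assumptions (a 188-byte MPEG-TS packet, an arbitrary raw channel error probability, and an idealised detect-and-resend link layer): the link-layer counter keeps ticking up while the application, which only ever sees clean copies, measures a BER of zero.

```python
import random

random.seed(42)
PACKETS = 1000
PACKET_BITS = 188 * 8     # MPEG transport stream packet size (assumed)
BIT_ERROR_PROB = 1e-4     # raw channel bit error probability (assumed)

ts_error_count = 0        # what the TS analyser reports
app_bit_errors = 0        # what the application-level BER test sees

for _ in range(PACKETS):
    while True:
        # any bit error in the packet trips the CRC at the link layer
        corrupted = any(random.random() < BIT_ERROR_PROB
                        for _ in range(PACKET_BITS))
        if not corrupted:
            break             # a clean copy is delivered to the application
        ts_error_count += 1   # errored packet counted, then resent
    # app_bit_errors stays 0: corrupted packets never reach the application

app_ber = app_bit_errors / (PACKETS * PACKET_BITS)
print(f"TS error packets: {ts_error_count}, application BER: {app_ber}")
```

With these assumed numbers the raw channel corrupts a noticeable fraction of packets, so the TS error packet count grows, yet the application-level BER stays at exactly zero, which matches the behaviour reported in the original question.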
 
