Error injection using a PC-based modulator

AI Thread Summary
Injecting errors into a transport stream using a PC-based modulator does not appear to affect the Bit Error Rate (BER), although the count of transport stream error packets increases. The discussion suggests that corrupted packets may be detected and resent by the application, preventing an increase in BER. The presence of error-detection and correction mechanisms within the communication protocol could explain why BER remains stable despite the errors. This behavior highlights the effectiveness of packet protocols in managing data integrity. Overall, the interaction between error injection and BER is complex and influenced by underlying protocol functions.
kirubanithi
Does injecting errors into the transport stream with a PC-based modulator affect the BER of the stream? In my testing the BER does not vary at all, but the transport stream error packet count does increase. Could anyone please explain this behaviour?
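For context, the "transport stream error packet count" in a DVB test setup typically comes from the transport_error_indicator flag in the header of each 188-byte TS packet. A minimal sketch of counting that flag (the capture file name and the simple alignment handling are illustrative assumptions, not from this thread):

Code:
# Toy sketch: count MPEG transport-stream packets whose
# transport_error_indicator flag is set. Assumes a file of 188-byte
# packets aligned on the 0x47 sync byte; "capture.ts" is a made-up name.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def count_error_packets(path):
    total = 0
    errored = 0
    with open(path, "rb") as f:
        while True:
            packet = f.read(TS_PACKET_SIZE)
            if len(packet) < TS_PACKET_SIZE:
                break
            if packet[0] != SYNC_BYTE:
                continue  # skip misaligned data in this simple sketch
            total += 1
            # transport_error_indicator is the most significant bit of byte 1
            if packet[1] & 0x80:
                errored += 1
    return total, errored

total, errored = count_error_packets("capture.ts")
print(f"{errored} of {total} packets flagged with transport_error_indicator")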
 
Hi Kirubanithi
It is a long time since my training and not only have I forgotten much but much has changed. Also I admit to some guesswork in what follows.

It sounds to me as though you are sending yourself some data, messing with the data stream, and watching the error rate at the receive end of the bearer.
I believe any packets found to be corrupted at the receive end will be resent under command of the application that wants the data.
The numbered packet has probably gone missing because it was blocked at the X.25 level (packet data protocol) by the X.25 modems in the local concentrator, whose functions always include a test of the CRC (cyclic redundancy check) sent as part of the packet.

After all, nobody is interested in corrupt data, and data corrupted (by a faulty cable, for instance) can be kept from uselessly loading the network by careful checking so that only valid data is passed on.
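As a rough illustration of that kind of CRC screening (a toy sketch, not the actual X.25 frame check sequence; the frame layout and function names here are assumptions):

Code:
# Toy sketch of link-level CRC screening: the sender appends a CRC-32 to
# each frame, the receiver recomputes it and silently drops any frame that
# fails, so corrupt data never reaches the application (a retransmission
# scheme would then fill the gap).
import zlib
import struct

def frame(payload: bytes) -> bytes:
    crc = zlib.crc32(payload) & 0xFFFFFFFF
    return payload + struct.pack(">I", crc)

def deframe(received: bytes):
    payload, crc = received[:-4], struct.unpack(">I", received[-4:])[0]
    if zlib.crc32(payload) & 0xFFFFFFFF != crc:
        return None  # corrupt frame: discard, rely on a resend
    return payload

good = frame(b"clean data")
bad = bytearray(frame(b"clean data"))
bad[2] ^= 0x01  # inject a single bit error

print(deframe(bytes(good)))  # b'clean data'
print(deframe(bytes(bad)))   # None -> dropped before the application sees it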

(sorry no snappy signature yet)

Anyway, that's what I think I would have said 20 years ago.
 
Following on from the above explanation, is there any possibility of the BER of the stream increasing? And if there is no increase in the BER, could you please explain the cause?
 
Hi again Kirubanithi
If my notion above is correct, I would elaborate by mentioning that more than one means of measuring 'bit error rate' exists.
In the opinion of a high-level application sending pseudo-random code over a noisy packet link, the link will be slower but still noise-free up to some limit. This is an artifact of the protocol, and in fact is the whole reason for the packet protocol's existence.
So error-detection/correction in a shell of the communications system beyond the ken of your BER software may be responsible for the intriguing effect.
It's a suggestion, no more.
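One way to see how correction below the measurement point can hide injected errors is a toy simulation with a (3,1) repetition code; this is only a sketch of the principle, not the FEC a real DVB chain uses:

Code:
# Toy simulation: flip channel bits at random, then let a (3,1) repetition
# code vote the errors away. The raw (pre-correction) BER tracks the injected
# error rate, while the BER seen above the corrector stays near zero until
# the code is overwhelmed.
import random

random.seed(1)
N_BITS = 100_000
CHANNEL_ERROR_RATE = 0.01  # injected error probability per transmitted bit

data = [random.getrandbits(1) for _ in range(N_BITS)]
encoded = [b for b in data for _ in range(3)]            # repeat each bit 3x
received = [b ^ (random.random() < CHANNEL_ERROR_RATE) for b in encoded]
decoded = [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

raw_errors = sum(r != e for r, e in zip(received, encoded))
post_errors = sum(d != b for d, b in zip(decoded, data))

print(f"raw (pre-correction) BER : {raw_errors / len(encoded):.4f}")
print(f"post-correction BER      : {post_errors / N_BITS:.6f}")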
 