# Calculating throughput for serial vs parallel

ver_mathstats
Homework Statement:
We are given two choices for an interface: a serial interface with a latency of 20 microseconds, and a parallel interface with a width of 16 bits and a latency of 200 microseconds. How long would it take to transfer 16 bits over each interface?
Relevant Equations:
throughput = data item / latency
I know that a serial interface has a single data line, whereas a parallel interface can have several data lines. Could someone check over my work, please? Would it just be, for the serial interface: 16 bits x 20 microseconds for 320 bits per microsecond, and then for the parallel interface it would just take 200 microseconds to transfer 16 bits? So for this problem it would be better to go with the parallel interface?

Thank you, any help would be appreciated.

Mentor
Your serial answer should be 320 microseconds, not 320 bits per microsecond.

20 usecs/bit x 16 bits = 320 usecs

so yes, 200 usecs is faster than 320 usecs.
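A minimal Python sketch of the arithmetic above (the variable names are mine, not from the problem):

```python
# One bit per latency period on the serial line; one 16-bit word per
# latency period on the parallel bus.
SERIAL_LATENCY_US = 20     # microseconds per bit (serial)
PARALLEL_LATENCY_US = 200  # microseconds per 16-bit word (parallel)
MESSAGE_BITS = 16
PARALLEL_WIDTH_BITS = 16

serial_time_us = MESSAGE_BITS * SERIAL_LATENCY_US                 # 16 * 20 = 320
parallel_words = MESSAGE_BITS // PARALLEL_WIDTH_BITS              # 1 word
parallel_time_us = parallel_words * PARALLEL_LATENCY_US           # 1 * 200 = 200

print(serial_time_us, parallel_time_us)  # 320 200
```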

ver_mathstats
Your serial answer should be 320 microseconds, not 320 bits per microsecond.

20 usecs/bit x 16 bits = 320 usecs

so yes, 200 usecs is faster than 320 usecs.
Sorry, I just realized that, but thank you, I've got it now.

Gold Member
2022 Award
And @ver_mathstats just to be sure you know, this is a textbook problem and VERY unrealistic. It would be a very odd circuit technology indeed that could manage 20 microseconds for a serial line but require 200 microseconds each for the parallel lines. PLUS, the serial has to be turned from parallel into serial, transmitted, then turned back into parallel.

Gold Member
And @ver_mathstats just to be sure you know, this is a textbook problem and VERY unrealistic. It would be a very odd circuit technology indeed that could manage 20 microseconds for a serial line but require 200 microseconds each for the parallel lines. PLUS, the serial has to be turned from parallel into serial, transmitted, then turned back into parallel.
Is that the brief on SATA vs PATA? https://en.wikipedia.org/wiki/SATA

Homework Helper
Gold Member
Oh dear, this seems to have gone off-track. I think the problem starts here:

Relevant Equations: throughput = data item / latency
This is not correct. I don't know what "data item" means in this context, but throughput and latency are not related in this (or any similar) way.

Now
a parallel interface with a width of 16 bits and a latency of 200 microseconds
will take at least 200 μs to transmit a message of 16 bits, and
a serial interface with a latency of 20 microseconds
will take 20 μs to transmit a message of 1 bit, but in order to work out how long it will take to transmit a message of 16 bits we need to know its data rate, or throughput (also called, erroneously, bandwidth). If it has a data rate of 100 kbps (i.e. 1 bit every 10 μs), then it will take 16 × 10 μs + 20 μs = 180 μs to transfer 16 bits.
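The latency-plus-transmission model described above can be sketched in Python like this (the function name and the 100 kbps rate are just the illustrative assumption from the example, not given in the problem):

```python
def transfer_time_us(bits, latency_us, rate_bits_per_us):
    """Total transfer time = link latency + time to clock out all the bits."""
    return latency_us + bits / rate_bits_per_us

# Serial link: 20 us latency, 100 kbps = 0.1 bits per microsecond
# (i.e. 1 bit every 10 us), as in the example above.
t = transfer_time_us(16, 20, 0.1)  # 20 + 16 * 10 = 180 us
print(t)
```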

Last edited:
Homework Helper
Gold Member
And @ver_mathstats just to be sure you know, this is a textbook problem and VERY unrealistic. It would be a very odd circuit technology indeed that could manage 20 microseconds for a serial line but require 200 microseconds each for the parallel lines. PLUS, the serial has to be turned from parallel into serial, transmitted, then turned back into parallel.
In case @256bits' excellent piece of humour is lost on anyone, note that you can in general achieve much higher frequencies on serial cables than on parallel cables, because you don't have the problem of signals interfering with one another (crosstalk). Also, the "problem" of turning your data from (16 bits) parallel to serial and back again only exists if your data is 16 bits wide in the first place and ends up 16 bits wide at the other end.

ver_mathstats
Oh dear, this seems to have gone off-track. I think the problem starts here:

This is not correct. I don't know what "data item" means in this context, but throughput and latency are not related in this (or any similar) way.

Now

will take at least 200 μs to transmit a message of 16 bits, and

will take 20 μs to transmit a message of 1 bit, but in order to work out how long it will take to transmit a message of 16 bits we need to know its data rate, or throughput (also called, erroneously, bandwidth). If it has a data rate of 100 kbps (i.e. 1 bit every 10 μs), then it will take 16 × 10 μs + 20 μs = 180 μs to transfer 16 bits.
I did not use the formula "throughput = data item / latency" for this problem in particular, as I am told I have to calculate throughput in a different part. In my class, I was taught that the data item is denoted d_i and can be a bit, the latency is denoted Δt, and the throughput is then calculated using the formula |d_i|/Δt. You then get your answer as a throughput in bits per microsecond in this particular problem.
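As a small sketch of that class formula (the function name is mine, not from the course), applied to the serial line in this problem:

```python
def throughput_bits_per_us(data_item_bits, delta_t_us):
    """Class formula: throughput = |d_i| / Δt, in bits per microsecond."""
    return data_item_bits / delta_t_us

# One bit every 20 microseconds on the serial line:
r = throughput_bits_per_us(1, 20)  # 0.05 bits per microsecond
```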

Last edited: