SlurrerOfSpeech
Let's say I have a client that requests a file from a web server, which in turn fetches that file from another remote server (Azure Blob Storage, for example). Suppose the file can be transferred from blob storage to the web server at 60 MB/s and from the web server to the client at 40 MB/s.
Client <==== T1=40 MB/s ==== Web Server <===== T2=60 MB/s ==== Blob Storage
Is my intuition correct that 40 MB/s is essentially the total transfer rate, since the bytes are being produced faster than they can be consumed?
So, basically, the formula is like
- If T1 < T2, then T = T1
- If T1 >= T2, then T = T2

i.e., T = min(T1, T2).
Sorry if this sounds like a super-n00b question. It's only recently that I've dealt with these kinds of questions.
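That bottleneck intuition can be checked with a tiny discrete-time sketch (all names and numbers here are hypothetical, just mirroring the 60/40 example): the web server buffers incoming bytes, backpressure stops the inbound transfer whenever the buffer is full, and the long-run rate the client sees settles at min(T1, T2).

```python
def simulate(t_in=60, t_out=40, buf_cap=120, seconds=100):
    """Model a web server proxying a stream, one tick per second.

    Each tick it can receive up to t_in MB (limited by free buffer
    space, i.e. backpressure) and send up to t_out MB (limited by
    what is currently buffered).
    """
    buffered = 0
    delivered = 0
    for _ in range(seconds):
        received = min(t_in, buf_cap - buffered)  # backpressure caps intake
        buffered += received
        sent = min(t_out, buffered)               # can only send what we hold
        buffered -= sent
        delivered += sent
    return delivered / seconds                    # long-run MB/s at the client

print(simulate())  # -> 40.0
```

The buffer fills for the first few ticks (intake 60, drain 40), then backpressure kicks in and the whole pipeline runs at the slower link, so the client-side average is exactly 40 MB/s.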