This seems to suggest that error-limited performance is, very roughly,

 bandwidth = (sqrt(mss) * C) / (rtt * sqrt(ber))

where the units are:
:mss = maximum segment size, in bits
:C = dimensionless constant, approximately 0.9 (see the paper for details)
:rtt = round-trip time, in seconds
:ber = bit error rate, per bit
The result is then in bits per second. Note that this is a small-ber approximation, which assumes loss is dominated by drops of full-length packets. Needless to say, the bandwidth does not go to infinity as the ber goes to zero: packet drops will occur when the bandwidth tries to exceed the physical link bandwidth. According to the paper, this model fits reality pretty well.
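As a rough sketch of the formula above (the function name and the example numbers are my own, not from the paper): with per-bit error rate ber, the loss probability of a full-length packet is roughly p = mss * ber, and the familiar loss-limited form bandwidth = mss * C / (rtt * sqrt(p)) reduces to the expression given.

```python
from math import sqrt

def error_limited_bandwidth(mss_bits, rtt_s, ber, c=0.9):
    """Rough error-limited bandwidth in bits per second.

    Small-ber approximation: packet loss probability p ~= mss_bits * ber,
    so mss * C / (rtt * sqrt(p)) becomes sqrt(mss) * C / (rtt * sqrt(ber)).
    """
    return sqrt(mss_bits) * c / (rtt_s * sqrt(ber))

# Hypothetical example: 1500-byte segments, 50 ms RTT, ber of 1e-7
bw = error_limited_bandwidth(1500 * 8, 0.05, 1e-7)
print(f"{bw / 1e6:.1f} Mbit/s")  # on the order of a few Mbit/s
```

Plugging in these numbers gives roughly 6 Mbit/s; halving the rtt doubles the predicted bandwidth, while the ber enters only under a square root.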