How can I calculate the optimal UDP packet size for a data stream?

I have a short-range radio link to a data source that must deliver 1280 Kbps, running a stop-and-wait protocol over UDP on IPv6, with no other clients or noticeable noise sources in the area. How can I calculate the packet size that minimizes overhead?

UPDATE

I figured I should show my work so far: IPv6 has a 40-byte header, so counting the ACK response that is 80 header bytes per data packet. To satisfy the bandwidth requirement I need to send 1280 K / p packets per second, where p is the packet payload size.

By my calculations, that makes the total overhead (1280 K / p) * 80, and throwing this at Wolfram gives a function with no minimum, so there is no "optimal" value.
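For what it's worth, here is the same model as a runnable sketch (the candidate payload sizes are just illustrative; 1232 bytes is the largest UDP payload that fits IPv6's guaranteed 1280-byte minimum MTU after the 40-byte IPv6 and 8-byte UDP headers):

```python
# Sketch of the overhead model above: 80 header bytes per data packet
# (40-byte IPv6 header on the data packet + 40 on the ACK; the 8-byte
# UDP header is ignored, as in the model above). 1280 Kbps of payload
# = 160_000 bytes/s, so the packet rate is 160_000 / p.

def overhead_bytes_per_sec(p: int) -> float:
    """Header bytes spent per second for a payload size of p bytes."""
    packets_per_sec = 160_000 / p
    return packets_per_sec * 80

for p in (128, 256, 512, 1024, 1232):
    print(f"payload {p:5d} B -> overhead {overhead_bytes_per_sec(p):9.1f} B/s")
```

The function is strictly decreasing in p (its derivative, -160_000 * 80 / p**2, is negative everywhere), so it really has no interior minimum: with no loss model, the optimum is simply the boundary, i.e. the largest payload the MTU allows.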

I did a lot more math trying to factor in the per-bit error rate, but ran into the same thing; if there is no minimum, how do I choose an optimal value?
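For concreteness, here is a numeric sketch of the kind of error model I was attempting, using the textbook assumptions (independent bit errors at an assumed BER, errored packets fully retransmitted, ACKs never lost); in this form the curve does peak, so I may have dropped a term somewhere:

```python
BER = 1e-5    # assumed per-bit error probability, purely illustrative
HEADER = 80   # header bytes charged per data packet (data + ACK), as above

def efficiency(p: int) -> float:
    """Expected fraction of transmitted bytes that are useful payload."""
    n_bits = 8 * (p + HEADER)
    p_ok = (1 - BER) ** n_bits        # packet arrives with no bit errors
    return (p / (p + HEADER)) * p_ok  # payload fraction * success probability

best = max(range(1, 1233), key=efficiency)
print(f"optimal payload ~ {best} bytes, efficiency {efficiency(best):.3f}")
```

With these illustrative numbers the optimum lands near 960 bytes: larger packets amortize the headers better, but every extra bit also raises the chance the whole packet must be retransmitted.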

+4
2 answers

Your best bet is to use a network simulation framework. This is a complex problem with no easy closed-form answer.

NS-2 or SimPy can help you build discrete-event simulations to find the best operating point, provided you can model your packet loss.
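A minimal SimPy sketch of that idea, in case it helps you get started (every parameter here, the link rate, delay, BER, and timeout, is an assumption you would replace with your own measurements):

```python
import random
import simpy

LINK_BPS = 2_000_000   # raw link bit rate (assumed)
PROP_DELAY = 0.001     # one-way propagation delay, seconds (assumed)
BER = 1e-5             # assumed independent bit-error rate
HEADER = 80            # IPv6 header bytes for the data packet + ACK
TIMEOUT = 0.01         # retransmission timeout, seconds (assumed)

def sender(env, payload, n_packets, stats):
    """Stop-and-wait sender: transmit, await ACK, retransmit on loss."""
    for _ in range(n_packets):
        delivered = False
        while not delivered:
            bits = 8 * (payload + HEADER)
            yield env.timeout(bits / LINK_BPS)   # serialization delay
            yield env.timeout(2 * PROP_DELAY)    # data + ACK flight time
            if random.random() < (1 - BER) ** bits:
                delivered = True                 # packet survived the link
            else:
                yield env.timeout(TIMEOUT)       # wait out the timer, retry
        stats["bytes"] += payload

def goodput(payload, n_packets=2000, seed=1):
    random.seed(seed)
    env = simpy.Environment()
    stats = {"bytes": 0}
    env.process(sender(env, payload, n_packets, stats))
    env.run()                                    # run until sender finishes
    return 8 * stats["bytes"] / env.now          # useful bits per second

for p in (256, 512, 1024, 1232):
    print(f"payload {p:5d} B -> goodput {goodput(p) / 1000:8.1f} kbit/s")
```

Sweep the payload size (and the loss model) and pick the peak; the advantage over a closed-form analysis is that you can add queuing, ACK loss, or bursty interference without redoing any algebra.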

+1

Work with the largest packet size the network allows, then tune the link MTU in your deployment for the most reliable configuration.
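For example (a trivial sketch; the 1500-byte MTU is an assumed example, e.g. Ethernet, since IPv6 only guarantees 1280):

```python
# Largest UDP payload that fits a single link-layer frame, so the
# stack never fragments. MTU is an assumed example value.
MTU = 1500
IPV6_HEADER, UDP_HEADER = 40, 8
max_payload = MTU - IPV6_HEADER - UDP_HEADER
print(max_payload)  # 1452 for a 1500-byte MTU
```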

Also consider your latency requirements and how the payload is generated: do you need to wait for enough data to accumulate before sending a packet, or can you send immediately?

The radio channel is usually already engineered against noise at the link layer, keeping packet loss low; in practice you will have other implementation constraints, such as power: is it better to send in large bursts or to stream continuously?

0
