I have a problem that requires me to determine the maximum upload and download rates between a client and our server, and then limit my program's usage to a percentage of those maxima. However, I cannot come up with a good way to find the maximum values.
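To make the "percentage of the maximum" part concrete, here is a minimal sketch of the kind of throttling I have in mind (Python; the `RateLimiter` class, its names, and the 30% default are illustrative only, not existing code):

```python
import time

class RateLimiter:
    """Token-bucket style limiter: keep traffic at `fraction` of a
    measured maximum rate. `max_rate_bps` (bytes per second) comes from
    whatever measurement you trust; both names are illustrative."""
    def __init__(self, max_rate_bps, fraction=0.3):
        self.allowed_bps = max_rate_bps * fraction
        self._budget = 0.0
        self._last = time.monotonic()

    def wait_for(self, num_bytes):
        """Block until sending `num_bytes` more stays within the allowed rate."""
        now = time.monotonic()
        # replenish the budget for the elapsed time, allowing ~1 s of burst
        self._budget = min(self._budget + (now - self._last) * self.allowed_bps,
                           self.allowed_bps)
        self._last = now
        self._budget -= num_bytes
        if self._budget < 0:
            # overspent: sleep long enough to pay the deficit back
            time.sleep(-self._budget / self.allowed_bps)
            self._last = time.monotonic()
            self._budget = 0.0

# usage: limiter = RateLimiter(measured_bps, fraction=0.25)
#        limiter.wait_for(len(chunk)); sock.sendall(chunk)
```

The hard part is obtaining a trustworthy `max_rate_bps` in the first place, which is what this question is about.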
Currently, the only solution I can come up with is to transfer a few megabytes between the client and the server and time how long the transfer takes. However, this approach is very undesirable, because with 100,000 clients it would add far too much to our server's bandwidth usage (which is already very high).
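For reference, a rough sketch of that measurement approach (Python; the `SEND <n>` request line and the cooperating server that streams dummy data are assumptions made purely for illustration):

```python
import socket
import time

def measure_download_rate(host, port, num_bytes=2 * 1024 * 1024):
    """Crude downstream estimate: ask the server to stream `num_bytes`
    of dummy data and time the transfer. The "SEND <n>" line is a
    made-up protocol; a real server would have to implement something
    equivalent."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(f"SEND {num_bytes}\n".encode())
        received = 0
        start = time.monotonic()
        while received < num_bytes:
            chunk = sock.recv(65536)
            if not chunk:           # server closed the connection early
                break
            received += len(chunk)
        elapsed = time.monotonic() - start
    return received / elapsed if elapsed > 0 else 0.0  # bytes per second
```

Multiplied by 100,000 clients, this is exactly the traffic I am trying to avoid.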
Does anyone have a solution to this problem?
Please note that what interests me most is limiting the data transfer before it leaves the client's ISP network; I think that is where the bottleneck is likely to occur and where it would degrade the communication of other programs. Correct me if I am wrong.
EDIT: After further investigation, I do not think this is possible; there are too many variables to accurately measure the maximum transfer rate at the point where traffic exits the ISP's network. I will leave the question open in case someone comes up with an exact solution.