Rate-limiting a Ruby file stream

I am working on a project that involves uploading Flash video files to an S3 bucket from a number of geographically distributed nodes.

The video files are about 2-3 MB each, and we only send one file (per node) every ten minutes. However, the bandwidth used needs to be limited to roughly 20k/s, because these nodes also deliver streaming media to a CDN, and at some locations we can only get 512k of upload bandwidth at most.

I have looked at the aws-s3 gem, and while it does not offer any rate limiting, I know that you can pass it an IO stream. With that in mind, I am wondering whether it is possible to create a rate-limited stream that overrides the read method, adds the throttling logic (for example, in its simplest form, a call to sleep between reads), and then calls the overridden method via super.
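
In its simplest form, the kind of thing I have in mind is sketched below (ThrottledFile is a name I just made up, the rate is hard-coded, and I have not tested this against the gem):

    class ThrottledFile < File
      BYTES_PER_SEC = 20 * 1024  # example budget of ~20 KB/s, tune as needed

      def read(length = nil, buffer = nil)
        @started_at ||= Time.now
        @bytes_read ||= 0
        data = super
        if data
          @bytes_read += data.bytesize
          # If we are ahead of the budget, sleep off the difference.
          expected = @bytes_read / BYTES_PER_SEC.to_f
          elapsed  = Time.now - @started_at
          sleep(expected - elapsed) if expected > elapsed
        end
        data
      end
    end

The idea being that I could open the file with ThrottledFile.open(path) and hand that to the gem wherever it currently accepts a plain File or IO.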

Another option I have considered is monkey-patching Net::HTTP and adding the rate limiting to the send_request_with_body_stream method, which uses a while loop, but I am not entirely sure which of the two would be the better approach.
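
For that second option, the shape I have in mind is roughly the following (again only a sketch: SleepyReader is a made-up helper, the rate is hard-coded, and it assumes the streaming while loop lives in the private Net::HTTPGenericRequest#send_request_with_body_stream method with this signature, so the patch just wraps the stream it is given rather than rewriting the loop):

    require 'net/http'

    # Made-up helper: pauses after each chunk so the average rate stays
    # at or below the budget.
    class SleepyReader
      def initialize(io, bytes_per_sec)
        @io = io
        @bytes_per_sec = bytes_per_sec
      end

      def read(*args)
        chunk = @io.read(*args)
        sleep(chunk.bytesize.to_f / @bytes_per_sec) if chunk
        chunk
      end
    end

    class Net::HTTPGenericRequest
      alias_method :unthrottled_send_request_with_body_stream,
                   :send_request_with_body_stream

      # Hand the original copy loop a throttled view of the body stream.
      def send_request_with_body_stream(sock, ver, path, f)
        unthrottled_send_request_with_body_stream(sock, ver, path,
                                                  SleepyReader.new(f, 20 * 1024))
      end
    end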

I have tried extending the IO class, but that did not work at all; simply subclassing it with class ThrottledIO < IO did nothing.

Any suggestions would be greatly appreciated.

2 answers

You need to use a delegate if you want to "augment" IO. This puts a "facade" in front of your IO object that will be used by all external consumers of the object, but does not affect the operation of the object itself.
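
In its most bare-bones form you can build such a facade with the delegate library from the Ruby standard library (this is only a sketch of the shape, with a fixed sleep standing in for real rate logic, and it is not what the gem below does):

    require 'delegate'

    # Anything not defined here is forwarded untouched to the wrapped IO,
    # so consumers still see an ordinary IO object.
    class ThrottledIO < SimpleDelegator
      def read(*args)
        sleep 0.1                    # stand-in for real throttling logic
        __getobj__.read(*args)
      end
    end

    throttled = ThrottledIO.new(File.open("video.flv"))  # file name made up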

I have extracted this delegation approach into a gem, since it proved to be generally useful.

Here is an example for an IO that is read from:

http://rubygems.org/gems/progressive_io

There, an aspect is added to all the reading methods. I think you could extend that to do basic throttling. Once you are done, you can wrap it around, say, a File:

    throttled_file = ProgressiveIO.new(some_file) do |offset, size|
      # compute the rate and sleep() if needed
    end
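
Filling in that block with a simple average-rate check might look like this (a sketch only; I am assuming offset is the number of bytes read so far, and the 20k/s figure comes from the question):

    started_at = Time.now
    max_bytes_per_sec = 20 * 1024

    throttled_file = ProgressiveIO.new(some_file) do |offset, size|
      # If more bytes have gone out than the budget allows for the elapsed
      # time, sleep off the difference.
      expected = offset / max_bytes_per_sec.to_f
      elapsed  = Time.now - started_at
      sleep(expected - elapsed) if expected > elapsed
    end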

Source: https://habr.com/ru/post/1737001/

