Efficient, reliable, incremental software for uploading multiple files (or a whole directory) over HTTP

Imagine that you have a website to which you want to send a lot of data, say 40 files amounting to about 2 hours' worth of transfer. You expect around 3 connection losses along the way (think: mobile data connection, WLAN, microwave link). Retrying manually is not an option; it needs to be automated. Interruptions should not cause more data loss than necessary, and re-uploading a complete file is a waste of time and bandwidth.

So, here is the question: is there a software package or framework that

  • synchronizes the contents of a local directory with the server over HTTP,
  • is cross-platform (Windows XP / Vista / 7, Mac OS X, Linux),
  • can be shipped as a single standalone executable,
  • resumes partial file uploads after an interrupted network connection or a client restart,
  • can be generated on the server side, so that authentication tokens and upload targets can be baked in,
  • can be made dead simple to use

or, failing that, what would be a good way to build one?
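Failing a ready-made package, the core of such a tool is small. One common trick for resumable uploads over plain HTTP is a two-endpoint protocol: the client asks the server how many bytes of a file it already has, then sends the rest in small chunks. Below is a minimal client-side sketch in Python; the `/offset` and `/append` endpoints, the base URL, and the chunk size are all assumptions made up for illustration, not part of any existing package:

```python
import os
import time
import urllib.request
from urllib.error import URLError
from urllib.parse import quote

CHUNK = 256 * 1024  # 256 KiB pieces: at most one chunk is re-sent after a drop

def server_offset(base_url, name):
    # Hypothetical endpoint: returns how many bytes of `name` the server already has.
    with urllib.request.urlopen(f"{base_url}/offset?name={quote(name)}") as resp:
        return int(resp.read().decode())

def upload_resumable(base_url, path):
    name = os.path.basename(path)
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        while True:
            try:
                offset = server_offset(base_url, name)
                if offset >= size:
                    return  # file fully uploaded
                f.seek(offset)
                chunk = f.read(CHUNK)
                req = urllib.request.Request(
                    f"{base_url}/append?name={quote(name)}&offset={offset}",
                    data=chunk, method="PUT")
                urllib.request.urlopen(req)
            except URLError:
                time.sleep(5)  # connection dropped: wait, then resume where the server left off
```

Because the client re-asks the server for the current offset after every failure, a dropped connection costs at most one chunk of re-sent data, which matches the "no more loss than necessary" requirement above.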

The options I have found so far are:

  • Nicely packaged rsync. This requires an rsync (server) instance on the server side that is aware of the privilege system.
  • A custom Flash client. As far as I understand, Flash 10 can read a local file as a ByteArray (hinted at here) and, obviously, can talk HTTP to the origin server. Related is question 1344131 ("Upload image to server without uploading the entire image").
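For the build-it-yourself route, the server half of the scheme sketched above is equally small. Here is a sketch using only Python's standard library (same hypothetical `/offset` and `/append` endpoints; the authentication-token checking that a real deployment would need is omitted):

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

UPLOAD_DIR = "uploads"  # assumption: directory where partial files accumulate

class UploadHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # /offset?name=... -> current size of the partial file (0 if absent)
        q = parse_qs(urlparse(self.path).query)
        path = os.path.join(UPLOAD_DIR, os.path.basename(q["name"][0]))
        size = os.path.getsize(path) if os.path.exists(path) else 0
        body = str(size).encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_PUT(self):
        # /append?name=...&offset=... -> append the request body at that offset
        q = parse_qs(urlparse(self.path).query)
        path = os.path.join(UPLOAD_DIR, os.path.basename(q["name"][0]))
        offset = int(q["offset"][0])
        have = os.path.getsize(path) if os.path.exists(path) else 0
        data = self.rfile.read(int(self.headers["Content-Length"]))
        if offset == have:  # ignore duplicate chunks re-sent after a dropped connection
            with open(path, "ab") as f:
                f.write(data)
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    os.makedirs(UPLOAD_DIR, exist_ok=True)
    HTTPServer(("", 8000), UploadHandler).serve_forever()
```

A client like the one above, packaged with something like PyInstaller, would also cover the single-standalone-executable and cross-platform requirements, although a separate build per platform is still needed.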


Source: https://habr.com/ru/post/1740377/

