Large Data Transfer Strategy

I am writing my master's thesis in cooperation with a digital signage company, and the thesis deals with disseminating large amounts of data. I am looking for ideas or documented experience on transferring large amounts of data (in my case images and video of roughly 100 MB to 1 GB each, but any data will do; large data sets raise the same problems) to several clients.

Does anyone know of a method for approaching this in a structured way that I could write about, or can you at least point me in the right direction (another thesis, books, papers, people)?

My main approach right now is to solve a few things: 1. How can I make sure the data arrives intact (nothing corrupted in transit)? 2. How can I determine whether I have received all of the data? 3. ...
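For points 1 and 2, one widely used pattern is to publish a manifest that lists every file together with its size and a cryptographic hash, and to have each client check its downloads against that manifest: a hash mismatch answers question 1, a missing or truncated file answers question 2. Below is a minimal sketch of that idea in Python; the manifest layout, directory names, and function names are my own illustration, not taken from any particular signage product.

    import hashlib
    import json
    from pathlib import Path

    CHUNK = 1 << 20  # hash in 1 MiB chunks so a 1 GB video never sits in memory

    def sha256_of(path):
        # Stream the file through SHA-256 instead of reading it whole.
        h = hashlib.sha256()
        with Path(path).open("rb") as fh:
            for block in iter(lambda: fh.read(CHUNK), b""):
                h.update(block)
        return h.hexdigest()

    def build_manifest(content_dir):
        # Server side: record size and hash for every file to be distributed.
        return {
            p.name: {"bytes": p.stat().st_size, "sha256": sha256_of(p)}
            for p in sorted(Path(content_dir).iterdir()) if p.is_file()
        }

    def verify_download(download_dir, manifest):
        # Client side: return names that are missing, truncated, or corrupted.
        problems = []
        for name, meta in manifest.items():
            local = Path(download_dir) / name
            if not local.exists() or local.stat().st_size != meta["bytes"]:
                problems.append(name)   # missing or incomplete -> question 2
            elif sha256_of(local) != meta["sha256"]:
                problems.append(name)   # corrupted in transit  -> question 1
        return problems

    if __name__ == "__main__":
        manifest = build_manifest("server_content")
        print(json.dumps(manifest, indent=2))
        print("needs re-fetch:", verify_download("client_cache", manifest))

The transfer is complete exactly when verify_download returns an empty list, so the same manifest doubles as the completeness check; anything it flags just gets downloaded again.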

Any input is welcome. The current approach goes through web services, and I am also going to look into a BitTorrent (P2P) approach, but that does not seem to be the right strategy, since each client can display different content.
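Since each player can show different content, a single shared BitTorrent swarm does not map cleanly onto the problem; one alternative that keeps the web-service approach is to make distribution pull-based, where the server exposes a per-client manifest and every client periodically fetches its own manifest and downloads only what it is missing. Here is a minimal sketch of that polling loop, assuming hypothetical /manifest/<client-id> and /content/<name> endpoints and a made-up server address and client id:

    import json
    import shutil
    import time
    import urllib.request
    from pathlib import Path

    SERVER = "http://signage.example.com"   # hypothetical server address
    CLIENT_ID = "lobby-screen-01"           # hypothetical per-client identifier
    CACHE = Path("client_cache")

    def fetch_manifest():
        # Ask the server which files *this* client is supposed to display.
        with urllib.request.urlopen(f"{SERVER}/manifest/{CLIENT_ID}") as resp:
            return json.loads(resp.read())

    def sync_once():
        # Download whatever the manifest lists that is not cached yet.
        CACHE.mkdir(exist_ok=True)
        manifest = fetch_manifest()
        for name, meta in manifest.items():
            target = CACHE / name
            if target.exists() and target.stat().st_size == meta["bytes"]:
                continue                    # already complete, skip it
            with urllib.request.urlopen(f"{SERVER}/content/{name}") as resp, \
                 target.open("wb") as out:
                shutil.copyfileobj(resp, out)   # stream to disk, not into RAM

    if __name__ == "__main__":
        while True:
            sync_once()
            time.sleep(300)                 # poll the server every five minutes

After each sync, the SHA-256 check from the sketch above can be run over the cache, and clients that were offline for a while simply catch up on their next poll.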

Do any of you work at a digital signage company and could you tell me a little about how you approach this? Or, if you have experience moving large data sets from server to client, what is your approach?

One option worth looking at is combining XMPP with BitTorrent.

Source: https://habr.com/ru/post/1768880/

