Download large files from the Internet in Haskell

Are there any suggestions for downloading large files in Haskell? I gather that http-conduit is the library for this, but how do I use it for that? There is an example in its documentation, but it just downloads a file, and I am not sure it is suitable for large files:

import Data.Conduit.Binary (sinkFile)
import Network.HTTP.Conduit
import qualified Data.Conduit as C

main :: IO ()
main = do
    request <- parseUrl "http://google.com/"
    withManager $ \manager -> do
        response <- http request manager
        responseBody response C.$$+- sinkFile "google.html"

I want to download large files without running out of RAM, i.e. do it efficiently in terms of performance and so on. Preferably, I would also like to be able to continue a download later, meaning "one part now, another part later".

I also found the download-curl package on Hackage, but I'm not sure it is a good fit, or even that it downloads files chunk by chunk as I need.

1 answer

Network.HTTP.Conduit provides three functions for performing a request: simpleHttp, httpLbs, and http.

Of the three, the first two read the entire response body into memory. If you want to run in constant memory, use the http function. The http function gives you access to a streaming interface through a ResumableSource.
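To make the contrast concrete, here is a minimal sketch (not part of the original answer) of the in-memory approach using simpleHttp: the whole body is read into a lazy ByteString before anything is written to disk, which is exactly what you want to avoid for large files.

import qualified Data.ByteString.Lazy as L
import Network.HTTP.Conduit (simpleHttp)

-- simpleHttp reads the entire response body into memory before returning,
-- so this is fine for small pages but not for multi-gigabyte downloads.
main :: IO ()
main = do
    body <- simpleHttp "http://google.com/"
    L.writeFile "google.html" body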

The example you provided uses interleaved I/O to write the response body to a file in constant memory space. So you will not run out of memory when downloading a large file.
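The answer does not cover the "one part now, another part later" wish from the question, but resuming can be sketched with an HTTP Range header. The following is only an illustration under stated assumptions, not something from the answer: resumeDownload is a made-up helper name, it uses the higher-level Network.HTTP.Simple module shipped with the same http-conduit package, and it assumes the server honours Range requests and that the file on disk is an incomplete earlier download.

{-# LANGUAGE OverloadedStrings #-}
import qualified Data.ByteString.Char8 as S8
import Data.Conduit.Binary (sinkHandle)
import Network.HTTP.Simple (httpSink, parseRequest, setRequestHeader)
import System.Directory (doesFileExist, getFileSize)
import System.IO (IOMode (AppendMode), withFile)

-- Hypothetical helper: resume a download by requesting only the bytes
-- that are not on disk yet (requires server support for Range requests).
resumeDownload :: String -> FilePath -> IO ()
resumeDownload url path = do
    exists  <- doesFileExist path
    offset  <- if exists then getFileSize path else return 0
    request <- parseRequest url
    let ranged = setRequestHeader "Range"
                   [S8.pack ("bytes=" ++ show offset ++ "-")] request
    -- Stream the remaining bytes straight onto the end of the file,
    -- never holding more than one chunk in memory.
    withFile path AppendMode $ \h ->
        httpSink ranged (\_ -> sinkHandle h)

main :: IO ()
main = resumeDownload "http://google.com/" "google.html"

Running it again after an interrupted download appends only the missing bytes; a file that is already complete would need an extra check (for example, comparing its size against Content-Length) before re-requesting.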


Source: https://habr.com/ru/post/1012868/

