We have a problem: some of the files under our S3 prefix are around ~500 MiB, but many others are only kilobytes or even bytes in size. I want to combine all the small files into several large files on the order of ~500 MiB each.
What is the most efficient way to rewrite the data under the S3 prefix in place, instead of downloading the files, merging them locally, and uploading them back to S3? Is there a utility or AWS command I can use to achieve this?
S3 is a storage service and has no compute capability of its own, so the merging has to happen somewhere. S3 does offer a server-side multipart copy (UploadPartCopy) that can concatenate objects without downloading them, but every part except the last must be at least 5 MiB, so it cannot combine KiB-sized files. For objects that small, you cannot do what you want without downloading, merging, and re-uploading.
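The download-merge-upload approach can be sketched as follows. This is a minimal sketch using boto3; the bucket and prefix names are hypothetical, and the greedy batch-planning helper simply groups objects until the running total reaches the ~500 MiB target:

```python
TARGET = 500 * 1024 * 1024  # ~500 MiB per merged object


def plan_batches(objects, target=TARGET):
    """Greedily group (key, size) pairs into batches of roughly `target` bytes."""
    batches, current, current_size = [], [], 0
    for key, size in objects:
        if current and current_size + size > target:
            batches.append(current)
            current, current_size = [], 0
        current.append(key)
        current_size += size
    if current:
        batches.append(current)
    return batches


def merge_prefix(bucket, prefix, out_prefix):
    """List objects under `prefix`, download each batch, concatenate in memory,
    and upload one large object per batch under `out_prefix`."""
    import boto3  # imported here so the planning helper above stays stdlib-only

    s3 = boto3.client("s3")
    objects = []
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            objects.append((obj["Key"], obj["Size"]))
    for i, batch in enumerate(plan_batches(objects)):
        body = b"".join(
            s3.get_object(Bucket=bucket, Key=key)["Body"].read() for key in batch
        )
        s3.put_object(Bucket=bucket, Key=f"{out_prefix}/merged-{i:05d}", Body=body)
```

Note that concatenating whole batches in memory assumes ~500 MiB of RAM is acceptable; for tighter memory you could stream each object to a temporary file instead, or run the merge on an EC2 instance in the same region to avoid egress charges.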
Source: https://habr.com/ru/post/1692445/