With Rails, I followed this Heroku article to allow direct file uploads to an S3 bucket. The main reason I followed this article is that my previous implementation did not work for multipart uploads (and therefore large files). Once I implemented this method, large files uploaded just fine, except for really large ones.
I should note that I deviated slightly from the article, as I am using v1 of the aws-sdk gem because of the version of Rails we are on.
Here is how I have it configured:
S3_BUCKET = AWS::S3.new.buckets[ENV['S3_BUCKET_NAME']]

def set_s3_post_url
  @s3_media_post_url = S3_BUCKET.presigned_post(
    key: "product_media/#{SecureRandom.uuid}-${filename}",
    success_action_status: '201',
    acl: 'public-read'
  )
end
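For context, the controller just hands that presigned post to the client-side uploader. As an illustrative sketch (not my exact view code), these are the two things the aws-sdk v1 PresignedPost object exposes that the upload form needs:

@s3_media_post_url.url    # the bucket URL the browser POSTs the file to
@s3_media_post_url.fields # signed form fields ("key", "policy", "signature", "AWSAccessKeyId", ...)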
As already mentioned, this works for large files (~1 GB), but when I try to upload one that is, say, 10 GB, it gets most of the way through the upload and then randomly fails. Sometimes after 20 minutes, sometimes after an hour. I thought the signed URL might be expiring, so I explicitly set a long expiration time with expires: Time.now + 4.hours, but that didn't work.
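For completeness, this is roughly how I passed the expiration; :expires is the aws-sdk v1 presigned_post option, which as far as I know accepts a Time (or seconds since the epoch):

@s3_media_post_url = S3_BUCKET.presigned_post(
  key: "product_media/#{SecureRandom.uuid}-${filename}",
  success_action_status: '201',
  acl: 'public-read',
  expires: Time.now + 4.hours # policy expiration for the presigned POST
)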
I would really appreciate any help on this if anyone has any ideas!
Update
I tried @bbozo's answer of using maxChunkSize, but unfortunately that does not seem to be the issue. However, while watching the XHR requests in the console, I saw that the one that failed got the following XML response back from AWS:
<Error>
<Code>InternalError</Code>
<Message>We encountered an internal error. Please try again.</Message>
<RequestId>1231BD4A29EE5291</RequestId>
<HostId>f5muQPj2lT2Tmqi49ffqjT4ueLimYvrWUJL6WRW+F7vgm2rL1+FOD3pmsKOEYxFaSFXzLiEZjTg=</HostId>
</Error>