I am having problems uploading files to S3 from one of our servers. We use S3 to store our backups, and all of our servers run Ubuntu 8.04 with PHP 5.2.4 and libcurl 7.18.0. Whenever I try to upload a file, Amazon returns a RequestTimeout error. I know our current version of libcurl has a bug that prevents uploads of files larger than 200 MB, so we split our backups into smaller files.
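For reference, this is roughly how a backup could be split into ~50 MB pieces in PHP; it is only a sketch, and the file name, piece size, and ".NNNN" suffix are illustrative assumptions rather than our actual backup script:

<?php
// Split a large backup into fixed-size pieces before uploading.
// $source, $pieceSize and the numeric suffix are placeholder values.
$source    = '/path/to/backup_id.tar.gz';
$pieceSize = 50 * 1024 * 1024;   // bytes per piece
$readSize  = 1024 * 1024;        // read 1 MB at a time

$in      = fopen($source, 'rb');
$index   = 0;
$written = 0;
$out     = fopen(sprintf('%s.%04d', $source, $index), 'wb');

while (!feof($in)) {
    $data = fread($in, $readSize);
    if ($data === false || $data === '') {
        break;
    }
    fwrite($out, $data);
    $written += strlen($data);

    // Roll over to the next piece once the current one is full.
    if ($written >= $pieceSize && !feof($in)) {
        fclose($out);
        $out = fopen(sprintf('%s.%04d', $source, ++$index), 'wb');
        $written = 0;
    }
}
fclose($out);
fclose($in);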
We have servers hosted on Amazon EC2 and servers hosted in clients' "private clouds" (VMware ESX boxes behind the company's firewall). The specific server I'm having problems with is in a client's private cloud.
We are using the Amazon S3 PHP class from http://undesigned.org.za/2007/10/22/amazon-s3-php-class . I tried 200 MB, 100 MB and 50 MB files, all with the same result. To upload files we use the following:
$s3 = new S3($access_key, $secret_key, false);
$success = $s3->putObjectFile($local_path, $bucket_name, $remote_name, S3::ACL_PRIVATE);
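Since the failure is intermittent, one workaround I'm considering is wrapping that call in a simple retry loop. A rough sketch, where the attempt count and delay are arbitrary guesses rather than anything I've validated:

$s3 = new S3($access_key, $secret_key, false);

$success = false;
for ($attempt = 1; $attempt <= 3 && !$success; $attempt++) {
    // Retry the same upload a few times before giving up.
    $success = $s3->putObjectFile($local_path, $bucket_name, $remote_name, S3::ACL_PRIVATE);
    if (!$success) {
        sleep(5); // brief pause before the next attempt
    }
}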
I tried setting curl_setopt($curl, CURLOPT_NOPROGRESS, false); so I could watch the progress while it uploads the file. The first time I ran it with this option it worked, but every subsequent run failed. It seems to upload at about 3 Mbps for 5-10 seconds, then drop to 0. After 20 seconds sitting at 0, Amazon returns the "RequestTimeout - Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed." error.
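In case it helps anyone reproduce this, here is roughly how I've been poking at the curl options to make a stalled upload fail fast instead of waiting for Amazon's RequestTimeout. $curl stands for whatever handle performs the PUT (in our case, the one the S3 class builds internally), and the speed/time thresholds are arbitrary values I picked for testing:

curl_setopt($curl, CURLOPT_NOPROGRESS, false);      // show transfer progress
curl_setopt($curl, CURLOPT_LOW_SPEED_LIMIT, 10240); // bytes/sec considered "too slow"
curl_setopt($curl, CURLOPT_LOW_SPEED_TIME, 30);     // abort after 30 s below that rate
curl_setopt($curl, CURLOPT_VERBOSE, true);          // log the request/response on stderr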
I tried updating the S3 class to the latest version from GitHub, but that didn't make any difference. I also found the Amazon S3 Stream Wrapper class and tried the following code:
include 'gs3.php';
define('S3_KEY', 'ACCESSKEYGOESHERE');
define('S3_PRIVATE', 'SECRETKEYGOESHERE');

$local = fopen('/path/to/backup_id.tar.gz.0000', 'r');
$remote = fopen('s3://bucket-name/customer/backup_id.tar.gz.0000', 'w+r');

$count = 0;
while (!feof($local)) {
    $result = fwrite($remote, fread($local, (1024 * 1024)));
    if ($result === false) {
        fwrite(STDOUT, $count++.': Unable to write!'."\n");
    } else {
        fwrite(STDOUT, $count++.': Wrote '.$result.' bytes'."\n");
    }
}

fclose($local);
fclose($remote);
This code reads the file one MB at a time and writes it to S3. For a 50 MB file I get "1: Wrote 1048576 bytes" 49 times (the first number increments each time, of course), but on the last iteration of the loop I get the error "Notice: fputs(): send of 8192 bytes failed with errno=11 Resource temporarily unavailable in /path/to/http.php on line 230."
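Since errno=11 is EAGAIN ("Resource temporarily unavailable"), my guess is the socket send buffer filled up. A minimal sketch of a retrying write helper I've been experimenting with; the retry count and sleep interval are guesses, not tested values:

// Retry short or failed writes, assuming the socket buffer is full.
function write_with_retry($remote, $data, $maxRetries = 10)
{
    $offset = 0;
    $length = strlen($data);
    $tries  = 0;

    while ($offset < $length && $tries < $maxRetries) {
        $written = fwrite($remote, substr($data, $offset));
        if ($written === false || $written === 0) {
            $tries++;
            usleep(500000); // wait 0.5 s for the socket buffer to drain
            continue;
        }
        $offset += $written;
        $tries = 0; // reset after any progress
    }
    return $offset === $length;
}

The idea would be to call write_with_retry($remote, fread($local, 1024 * 1024)) in place of the bare fwrite() in the loop above.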
My first thought was that this is a network problem. We called the client, explained the problem, and asked them to look at their firewall to see if it was dropping anything. According to their network administrator, the traffic is flowing fine.
I'm at a loss as to what to try next. For now I've been running the backups manually, transferring them to another machine with SCP, and uploading them from there. That's obviously not ideal, and any help would be greatly appreciated.
Update - 06/23/2011
I tried many of the suggestions below, but they all gave the same result. I have found that even trying to SCP the file from the server in question to another server stalls immediately and eventually times out. However, I can SCP the same file from another machine without any problems. This makes me even more convinced that this is a network problem on the client's side; any further suggestions would be very welcome.