I'm trying to upload a large file (1.5GB) to Amazon S3 using the REST API and HTML5 file slicing. Here's what the upload code looks like (stripped down for readability):
// Normalize the vendor-prefixed Blob slice implementations across browsers
File.prototype.slice = File.prototype.webkitSlice || File.prototype.mozSlice || File.prototype.slice;
var length = u.settings.chunk_size; // 6MB
var start = chunk * length;
var end = Math.min(start + length, u.file.size);
var xhr = new XMLHttpRequest();
var path = "/" + u.settings.key;
path += "?partNumber=" + chunk + "&uploadId=" + u.upload_id; // multipart upload part
var method = "PUT";
var authorization = "AWS " + u.settings.access_key + ":" + signature; // S3 REST Authorization header: access key + HMAC signature
var blob = u.file.slice(start, end); // the 6MB slice for this part
xhr.upload.addEventListener("progress", progress_handler, true);
xhr.addEventListener("readystatechange", handler, true);
xhr.addEventListener("error", error_handler, true);
xhr.addEventListener("timeout", error_handler, true);
xhr.open(method, u.settings.host + path, true);
xhr.setRequestHeader("x-amz-date", date);
xhr.setRequestHeader("Authorization", authorization);
xhr.setRequestHeader("Content-Type", u.settings.content_type);
xhr.setRequestHeader("Content-Disposition", "attachment; filename=" + u.file.name);
xhr.send(blob);
chunk_size is 6MB. After a chunk finishes uploading, the next one follows, and so on. But sometimes (every 80 chunks or so), the PUT request fails, with e.type == "error", e.target.status == 0 (which surprises me), and e.target.responseText == "". After a chunk fails, the code re-attempts to upload it, and gets the exact same error. When I refresh the page and continue the upload (the same chunk!), it works like a charm (for another 80 chunks or so, until it gets stuck again). Here's how the request looks in Chrome dev tools:
Any ideas why this might happen, or how to debug something like this?
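For reference, the error handler that reports those values is essentially the following (a simplified sketch; retry_chunk is a placeholder name for the re-attempt logic, not the actual code):

function error_handler(e) {
    console.log(e.type);                // "error"
    console.log(e.target.status);       // 0 (no HTTP status at all)
    console.log(e.target.responseText); // ""
    retry_chunk(chunk);                 // re-attempt the same chunk; it fails identically until a page refresh
}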
EDIT: Here is the OPTIONS response:
2 Answers
I finally found the issue by sniffing packets. There are two problems:
1. For PUT requests that get a 4xx (I didn't test other non-2xx responses), the xhr request returns as aborted (status = 0); I still haven't found an explanation for that, check out "Why does a PUT 403 show up as Aborted?".
2. Amazon S3 responded with a 403 that said RequestTimeTooSkewed, because my signatures are generated when the upload starts, and after 15 minutes (the timeout that triggers the RequestTimeTooSkewed error) they become invalid and have to be regenerated. That 403 error is never seen in the dev tools console or by the JS code, because of the first problem.
After regenerating the signatures, everything works like a charm.
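For anyone hitting the same problem: the fix boils down to asking your backend for a fresh signature (and x-amz-date) right before each part is PUT, instead of signing everything when the upload starts. A minimal sketch, assuming a hypothetical /sign_part endpoint on your own server that returns { "signature": "...", "date": "..." }:

// Hypothetical helper: sign each part just before uploading it, so the
// signature is never old enough to trigger RequestTimeTooSkewed.
function sign_part(key, upload_id, part_number, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/sign_part?key=" + encodeURIComponent(key) +
                    "&uploadId=" + encodeURIComponent(upload_id) +
                    "&partNumber=" + part_number, true);
    xhr.onload = function () {
        var data = JSON.parse(xhr.responseText);
        callback(data.signature, data.date);
    };
    xhr.send();
}

// Usage: wrap the existing PUT in the callback so the Authorization and
// x-amz-date headers are always freshly generated.
sign_part(u.settings.key, u.upload_id, chunk, function (signature, date) {
    // ...build and send the PUT exactly as in the question, using this
    // signature and date for the headers.
});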
Did you verify whether the browser is making any OPTIONS request? If yes, what are the response headers?