
javascript - Amazon S3 CORS PUT fails - Stack Overflow


I'm trying to upload a large file (1.5GB) to Amazon S3 using the REST API and HTML5 file slicing. Here's what the upload code looks like (stripped down for readability):

File.prototype.slice = File.prototype.webkitSlice || File.prototype.mozSlice || File.prototype.slice;

var length = u.settings.chunk_size; // 6MB
var start = chunk * length;
var end = Math.min(start + length, u.file.size);

var xhr = new XMLHttpRequest();
var path = "/" + u.settings.key;

path += "?partNumber=" + chunk + "&uploadId=" + u.upload_id;

var method = "PUT";
var authorization = "AWS " + u.settings.access_key + ":" + signature;
var blob = u.file.slice(start, end);

xhr.upload.addEventListener("progress", progress_handler, true);
xhr.addEventListener("readystatechange", handler, true);
xhr.addEventListener("error", error_handler, true);
xhr.addEventListener("timeout", error_handler, true);

xhr.open(method, u.settings.host + path, true);

xhr.setRequestHeader("x-amz-date", date);
xhr.setRequestHeader("Authorization", authorization);
xhr.setRequestHeader("Content-Type", u.settings.content_type);
xhr.setRequestHeader("Content-Disposition", "attachment; filename=" + u.file.name);

xhr.send(blob);

chunk_size is 6MB. After a chunk finishes uploading, the next one follows, and so on. But sometimes (every 80 chunks or so) the PUT request fails, with e.type == "error", e.target.status == 0 (which surprises me), and e.target.responseText == "". After a chunk fails, the code re-attempts to upload it and gets exactly the same error. When I refresh the page and continue the upload (the same chunk!), it works like a charm for another 80 chunks or so, until it gets stuck again. A rough sketch of the part sequencing is below, and here's how the request looks in Chrome dev tools:
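The sequencing around the PUT above looks roughly like this (helper names here are simplified sketches, not the actual code):

// Rough sketch of the chunk sequencing: upload the parts one after another
// and retry the current part when its PUT fails.
function upload_chunk(u, chunk) {
  if (chunk * u.settings.chunk_size >= u.file.size) {
    complete_multipart_upload(u);        // simplified helper: sends the final "complete" request
    return;
  }
  put_part(u, chunk, function(err) {     // simplified wrapper around the XHR PUT shown above
    if (err) {
      upload_chunk(u, chunk);            // re-attempt the same part
    } else {
      upload_chunk(u, chunk + 1);        // move on to the next part
    }
  });
}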

Any ideas why this might happen, or how to debug something like this?

EDIT: Here is the OPTIONS response:



2 Answers


I finally found the cause by sniffing packets; there are actually two problems:

  1. For PUT requests that get a 4xx response (I didn't test other non-2xx responses), the XHR comes back as aborted (status = 0). I still haven't found an explanation for that; see Why does a PUT 403 show up as Aborted?

  2. Amazon S3 responded with a 403 RequestTimeTooSkewed error, because my signatures are generated once, when the upload starts. After 15 minutes (the skew limit that triggers RequestTimeTooSkewed), the requests start failing and the signatures have to be regenerated. That 403 is never visible in the dev tools console or to the JS code because of the first problem.

After regenerating the signatures, everything works like a charm.
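A rough sketch of the fix: request a fresh date and signature right before each part's PUT instead of signing everything when the upload starts. The /sign endpoint and its JSON response shape below are assumptions (some backend has to hold the secret key), not part of the original code or the S3 API.

// Sketch only: /sign and its response shape are hypothetical.
function sign_chunk(u, chunk, callback) {
  var path = "/" + u.settings.key +
             "?partNumber=" + chunk + "&uploadId=" + u.upload_id;
  var xhr = new XMLHttpRequest();
  // Hypothetical backend endpoint that signs "PUT <path>" using the current time
  xhr.open("GET", "/sign?method=PUT&path=" + encodeURIComponent(path), true);
  xhr.onload = function() {
    var r = JSON.parse(xhr.responseText);   // assumed shape: { date: "...", signature: "..." }
    callback(r.date, r.signature);          // feed these into x-amz-date and Authorization
  };
  xhr.send();
}

With per-part signing, the x-amz-date header never drifts far from the moment the request is actually sent, so the 15-minute skew limit is never hit.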

Did you verify whether the browser is making an 'OPTIONS' (preflight) request? If yes, what are the response headers?
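If the preflight is hard to inspect in the browser (the JS code only ever sees status 0), something like this Node.js sketch can reproduce it and print the CORS headers; the bucket URL, object key, and origin are placeholders for your own values:

// Sketch: replay the CORS preflight outside the browser to inspect the response.
const https = require("https");

const BUCKET_URL = "https://your-bucket.s3.amazonaws.com"; // placeholder
const OBJECT_KEY = "/your-object-key";                     // placeholder

const req = https.request(BUCKET_URL + OBJECT_KEY, {
  method: "OPTIONS",
  headers: {
    // Mimic what the browser sends before the chunked PUT
    "Origin": "https://your-app.example.com",
    "Access-Control-Request-Method": "PUT",
    "Access-Control-Request-Headers":
      "authorization,content-type,content-disposition,x-amz-date",
  },
}, (res) => {
  console.log("Status:", res.statusCode);
  // The bucket's CORS configuration must allow the origin, the PUT method,
  // and every requested header, otherwise the browser aborts the real PUT.
  console.log("Allow-Origin: ", res.headers["access-control-allow-origin"]);
  console.log("Allow-Methods:", res.headers["access-control-allow-methods"]);
  console.log("Allow-Headers:", res.headers["access-control-allow-headers"]);
});
req.on("error", (err) => console.error(err));
req.end();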
