
javascript - putObject() times out on files larger than ~1MB - Stack Overflow


When trying to upload a CSV file to my S3 bucket using the putObject() function, it times out if the file is larger than about 1 MB. The largest CSV file I've been able to successfully upload is 1048 KB (25,500 lines), and even that succeeded only four of the six times I tried. Any file smaller than 1 MB uploads very quickly without any failure. What would cause the function to get hung up and time out once it reaches the ~1 MB threshold?

Relevant code is below:

function writeToS3(){
    var params = {
        Bucket: bucketName,
        Key: fileName,
        Body: fs.createReadStream(fileName)
    };

    s3.putObject(params, function(err, data){
        if(err){
            console.log('Error uploading data to S3 bucket: ' + err);
            throw err;
        } else{
            console.log("Data uploaded to S3 bucket: ", util.inspect(data));
        }
    });
}


Below is the error message:

 Error uploading data to S3 bucket: TimeoutError: Connection timed out after 120000ms
    C:\Users\426782\documents\workspace-sts\lf-rti-file-copier-sql\node_modules\aws-sdk-proxy\node_modules\aws-sdk\lib\request.js:31
                throw err;
                ^

    TimeoutError: Connection timed out after 120000ms
        at ClientRequest.<anonymous> (C:\Users\426782\documents\workspace-sts\lf-rti-file-copier-sql\node_modules\aws-sdk-proxy\node_modules\aws-sdk\lib\http\node.js:83:34)
        at ClientRequest.g (events.js:292:16)
        at emitNone (events.js:86:13)
        at ClientRequest.emit (events.js:185:7)
        at Socket.emitTimeout (_http_client.js:630:10)
        at Socket.g (events.js:292:16)
        at emitNone (events.js:86:13)
        at Socket.emit (events.js:185:7)
        at Socket._onTimeout (net.js:338:8)
        at ontimeout (timers.js:386:11)
        at tryOnTimeout (timers.js:250:5)
        at Timer.listOnTimeout (timers.js:214:5)

    npm ERR! Windows_NT 10.0.14393
    npm ERR! argv "C:\\Program Files\\nodejs\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "run" "test" "65056"
    npm ERR! node v6.11.4
    npm ERR! npm  v3.10.10
    npm ERR! code ELIFECYCLE
    npm ERR! [email protected] test: `serverless invoke local -f dataRefresh -s dev -d  "65056"`
    npm ERR! Exit status 1
    npm ERR!
    npm ERR! Failed at the [email protected] test script 'serverless invoke local -f dataRefresh -s dev -d  "65056"'.
    npm ERR! Make sure you have the latest version of node.js and npm installed.
    npm ERR! If you do, this is most likely a problem with the lf-rti-file-copier package,
    npm ERR! not with npm itself.
    npm ERR! Tell the author that this fails on your system:
    npm ERR!     serverless invoke local -f dataRefresh -s dev -d  "65056"
    npm ERR! You can get information on how to open an issue for this project with:
    npm ERR!     npm bugs lf-rti-file-copier
    npm ERR! Or if that isn't available, you can get their info via:
    npm ERR!     npm owner ls lf-rti-file-copier
    npm ERR! There is likely additional logging output above.
asked Jan 9, 2018 at 19:20 by Robert; edited Jan 10, 2018 at 17:32 by Dez
  • "it will timeout" ... what exactly is timing out? What error is being thrown? – Michael - sqlbot Commented Jan 10, 2018 at 0:55
  • Hi @Michael, I've updated the question with the error message – Robert Commented Jan 10, 2018 at 16:23

1 Answer


I had a similar issue, and I suggest you try the upload() method instead: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property

Uploads an arbitrarily sized buffer, blob, or stream, using intelligent concurrent handling of parts if the payload is large enough.

Here is an example of how your code can be transformed:

function writeToS3() {
  fs.readFile(fileName, 'utf8', function (err, contents) {
    if (err) {
      console.log('Error during reading file: ' + err);
      throw err;
    } else {
      // Wrap the file contents in a Buffer to use as the upload body
      var body = Buffer.from(contents, 'binary');

      var params = {
        Bucket: bucketName,
        Key: fileName,
        Body: body
      };

      s3.upload(params, function (err, data) {
        if (err) {
          console.log('Error uploading data to S3 bucket: ' + err);
          throw err;
        } else {
          console.log("Data uploaded to S3 bucket: ", util.inspect(data));
        }
      });
    }
  });
}

UPDATE:

Also, please take a look at the Bluebird library; its promisification methods transform callback-style APIs into promises, which makes the code more readable.

CHANGE TIMEOUT

There is a setting in the Lambda dashboard's configuration panel where you can change the function's Timeout. Please raise it to 5 minutes.
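Since the stack trace shows the function is invoked through Serverless (`serverless invoke local`), the same timeout can also be set in `serverless.yml`; a sketch, with the handler path assumed:

```yaml
functions:
  dataRefresh:
    handler: handler.dataRefresh   # assumed handler path
    timeout: 300                   # seconds; the default is 6
```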
