
javascript - get a stream's content-length - Stack Overflow


In my node.js app, I'm using gm (a GraphicsMagick wrapper) and aws2js (an Amazon AWS client). The problem is that Amazon needs to know the content-length before I can put the stream on S3.

I upload an image to my app and read that file by creating a stream:

var fileStream = fs.createReadStream(file.path);

I pass that stream to gm, resize the image and then tell it to stream the result. I then want to put that stream to AWS:

gm(fileStream, "some.png")
    .identify({bufferStream: true}, function (err, info) {
        this.stream("png", function (err, stdout, stderr) {
            if (err) { console.log(err); cb(err); return; }

            aws.S3.putStream(path, stdout, 'public-read',
                {'content-length': ?????, 'content-type': 'image/png'},
                function (err, result) {
                    // .....
                });
        });
    });

The problem is that Amazon itself (not the library) needs to know the content-length to accept the stream. AWS doesn't support chunked uploads here.

Does anyone know how I could determine the content-length of a stream? Or would the only solution be to write it temporarily to disk, read the file back as a stream, and then put it to Amazon with the content-length of the temp file?


asked Feb 11, 2012 at 13:51 by japrescott; edited Feb 11, 2012 at 13:56 by JJJ
  • Did you ever figure this out? I'm trying to do exactly the same thing... – Bill, Aug 7, 2012 at 23:48
  • Bill: I buffer the result and get its size. Sadly, there is no other way of doing it, because Amazon needs to know the size, and a stream is, well, a stream — it has no fixed length. – japrescott, Aug 13, 2012 at 12:00
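The buffering approach described in the comment above can be sketched as follows. This is a minimal illustration, not part of gm or aws2js; `collectStream` is a hypothetical helper name. It assumes the images are small enough to hold fully in memory:

```javascript
// Buffer an entire readable stream into memory so its exact byte
// length is known before uploading. Only suitable for small payloads.
function collectStream(stream, callback) {
    var chunks = [];
    stream.on('data', function (chunk) {
        // Normalize string chunks to Buffers so we can concat them.
        chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
    });
    stream.on('error', callback);
    stream.on('end', function () {
        var buffer = Buffer.concat(chunks);
        // buffer.length is the content-length S3 needs.
        callback(null, buffer, buffer.length);
    });
}
```

With the gm output buffered this way, the buffer and its length can be passed to the S3 put call in place of the raw stream.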

2 Answers

gm(fileStream).filesize({bufferStream: true}, function (error, filesize) {
  this.stream(function (error, stdout, stderr) {
    aws.S3.putStream(path, stdout, 'public-read',
      {'content-length': filesize, 'content-type': 'image/png'},
      function (error, result) {
        // .....
      });
  });
});

It depends on how you are doing the upload. I'm using Express.js, and I get the length from the req.files object, like this: req.files[fileInputName].length.

I'm actually trying to do this exact same thing, but I'm having a problem with aws2js recognizing the stream. Let me know if you get it working.
