
javascript - How to upload a file to S3 via NodeJS and track progress in the browser? - Stack Overflow


I'm having some trouble wrapping my head around a couple of concepts when it comes to file management and whatnot within Node.js, and wondered if anyone can point me in the right direction.

I have a queue of uploads that run simultaneously as the user needs (allowing them to pause or delete an upload at will), and I need each one to hit the Node server to resize the image or manipulate the video as needed. From there I need Node to upload the video or image to S3 using the AWS SDK (I set up a fakeS3 server for general sandbox testing), and as that upload happens I need to track its progress in the browser.

I am using React with server-side rendering on Node, so I figured there must be a very straightforward way of handling this situation, but I cannot find anyone who has done it before. Here are a couple of concepts I was messing with:

Server.js (core input)

const path = require('path');
const express = require('express');
const multer = require('multer'); // note: rename / onFileUploadStart / inMemory are multer 0.x options
const cookieParser = require('cookie-parser');
const bodyParser = require('body-parser');

const server = express();

server.use(express.static(path.join(__dirname, 'public')));
server.use(multer({
    dest: './public/temp',
    limits: {
        fieldNameSize: 50,
        files: 1,
        fields: 5,
        fileSize: 1024 * 1024
    },
    rename: (fieldname, filename) => {
        return filename;
    },
    onFileUploadStart: (file) => {
        console.log('Starting file upload process.');
    },
    inMemory: true
}));
server.use(cookieParser());
server.use(bodyParser.urlencoded({ extended: true }));
server.use(bodyParser.json());
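As a side note, the `limits` block above caps each file at 1024 * 1024 bytes (1 MiB). A minimal sketch of the size gate that implies, using plain numbers only (nothing multer-specific):

```javascript
// The limits object from the config above; fileSize is in bytes.
const limits = { fileSize: 1024 * 1024 }; // 1 MiB

// Sketch of the per-file check a multipart parser applies.
function withinLimit(sizeInBytes, opts) {
  return sizeInBytes <= opts.fileSize;
}

console.log(withinLimit(512 * 1024, limits));      // true  (512 KiB fits)
console.log(withinLimit(2 * 1024 * 1024, limits)); // false (2 MiB is over)
```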

Intermediate route (/upload or something)

import AWS from 'aws-sdk';
import fs from 'fs';

export function uploadContent(req, res) {
    const config = {
        s3ForcePathStyle: true,
        accessKeyId: 'ACCESS_KEY_ID',
        secretAccessKey: 'SECRET_ACCESS_KEY',
        endpoint: new AWS.Endpoint('http://localhost:4567'), // TODO make live
    };

    const client = new AWS.S3(config);

    const params = {
        Key: 'Key',
        Bucket: 'Bucket',
        Body: fs.createReadStream(req.body.files[0])
    };

    client.upload(params, function uploadCallback (err, data) {
        console.log(err, data);
    });
}

This does not work: body-parser conflicts with the route, and multer is having a hard time as well (I'm open to other suggestions). Any information on how this could be accomplished would be awesome. I'm not looking for full code, just another idea to get me on the right path. I appreciate any help!


asked May 1, 2016 at 2:05 by Mike Huebner

1 Answer


You can open a socket channel (for example, Socket.IO) between your browser and your Node.js route, and emit start, end, and progress events driven by the SDK's `httpUploadProgress` event:

var s3obj = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myKey' } });

s3obj.upload({ Body: body })
    .on('httpUploadProgress', function (evt) {
        console.log(evt);
        // Emit your events here
    })
    .send(function (err, data) {
        console.log(err, data);
    });
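The `httpUploadProgress` event carries `loaded` and `total` byte counts. A minimal sketch of turning that into a percentage to push to the browser (the `socket` name in the comment is an assumption):

```javascript
// Convert an httpUploadProgress event into a whole-number percentage.
// `total` may be missing when the Body is a stream of unknown length,
// in which case no percentage can be computed.
function uploadPercent(evt) {
  if (!evt.total) return null;
  return Math.round((evt.loaded / evt.total) * 100);
}

// Inside the listener you would do something like:
//   socket.emit('uploadProgress', { percent: uploadPercent(evt) });
console.log(uploadPercent({ loaded: 512, total: 2048 })); // 25
console.log(uploadPercent({ loaded: 100 }));              // null
```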