javascript - Upload large files in chunks using nodejs - Stack Overflow

I'm trying to upload very large files to a server in chunks. The server side works great. In the function I've come up with, it tries to upload all the files at once. I'm new to async/await and am unsure how to proceed. The goal is to upload just one file at a time, in chunks.

Note that I've stripped this out of a class, updated it for clarity, and removed all the unnecessary bits.

Using node-fetch to add fetch().

Thanks for any help or advice :)

const fs = require('fs')
const fetch = require('node-fetch') // provides fetch() per the note above

let tree = []

const start = () => {
    scan('/foo/bar', err => { 
        tree.forEach(file => {
            upload(file)
        })
    })
}

const scan = (dir, callback) => { ... } // assigns array of files to tree

const upload = file => {
    const stream = fs.createReadStream(file) 
    stream.on('data', async chunk => {
        stream.pause()
        try {
            let response = await fetch('https://some/endpoint', {
                method: 'post',
                body: JSON.stringify({
                    file: file,
                    chunk: chunk.toString('base64')
                }) 
            })
            stream.resume()
        } catch(e) {
            // handle errors
        }
    })
    stream.on('end', () => {
        console.log('end', file)
    })
    stream.on('error', err => {
        console.log('error', err)
    })
}

asked Feb 11, 2022 at 0:40 by Bill Kervaski (edited Feb 11, 2022 at 0:52)
  • Just a note: this question is not really about async/await, it is more about streams, which is a very specialized topic. Maybe you might consider using a library for this? There are many, for example npmjs.com/package/formidable – user3658510 Commented Feb 11, 2022 at 1:05
  • Well, I won't learn anything by letting a library do the heavy lifting ... but thanks for the reply :) – Bill Kervaski Commented Feb 11, 2022 at 15:39

1 Answer

Array.prototype.forEach runs synchronously until all elements of an array have been processed, discarding any value returned by the argument function it calls.
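
For example, the promises an async callback returns are simply discarded, so every iteration's upload starts immediately. A small illustrative snippet (not from the question):

[1, 2, 3].forEach(async n => {
    // each callback starts right away; forEach never waits for it
    await new Promise(resolve => setTimeout(resolve, 1000))
    console.log('done', n)
})
console.log('forEach returned') // logs before any 'done' messages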

To upload one file at a time, try using await (inside an asynchronous function) to wait for each file to upload, in combination with returning a promise from the upload process that becomes fulfilled when an upload has been completed. Conceptually (with no graceful error handling) it might look like

const uploadTree = async tree => {
    for (const file of tree) {
        await uploadFile(file) // process files one at a time
    }
}

// wrap the existing upload code in a promise executor:

const uploadFile = file => new Promise((resolve, reject) => {
    const stream = fs.createReadStream(file)
    stream.on('data', async chunk => {
        stream.pause() // hold the stream while this chunk uploads
        try {
            await fetch('https://some/endpoint', {
                method: 'post',
                body: JSON.stringify({
                    file: file,
                    chunk: chunk.toString('base64')
                })
            })
            stream.resume() // ask for the next chunk
        } catch (e) {
            console.log('Error uploading chunk of %s', file)
            reject(e) // settle the promise on upload failure
        }
    })
    stream.on('end', resolve)
    stream.on('error', e => {
        console.log('Error reading %s', file)
        reject(e)
    })
})

// upload the complete tree array of files:

uploadTree(tree)
.then( () => console.log("tree uploaded successfully"))
.catch( err => console.log("An upload failed: ", err))
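
As an aside, on Node 10 and later readable streams are also async iterable, so the same one-chunk-at-a-time behaviour falls out of a plain for await...of loop, with no pause()/resume() bookkeeping. A minimal sketch of uploadFile in that style (same hypothetical endpoint as above):

const uploadFile = async file => {
    // the loop body is awaited before the stream yields the next chunk
    for await (const chunk of fs.createReadStream(file)) {
        await fetch('https://some/endpoint', {
            method: 'post',
            body: JSON.stringify({ file, chunk: chunk.toString('base64') })
        })
    }
}

A read error or a failed fetch rejects the promise uploadFile returns, so the uploadTree loop above works unchanged.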