
How can I track write progress when piping with Node.js?


I am trying to track the progress of a pipe from a read stream to write stream so I can display the progress to the user.

My original idea was to track progress when the data event is emitted as shown here:

const fs = require('fs');
let final = fs.createWriteStream('output');

fs.createReadStream('file')
    .on('close', () => {
        console.log('done');
    })
    .on('error', (err) => {
        console.error(err);
    })
    .on('data', (data) => {
        console.log("data");
        /* Calculate progress */
    })
    .pipe(final);

However, I realized that just because the data was read doesn't mean it was actually written. This can be seen by removing the pipe entirely: the 'data' event still fires.
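A minimal sketch of what I mean, with a placeholder filename: attaching a 'data' listener switches the stream into flowing mode, so the events fire even though nothing is being written anywhere.

const fs = require('fs');

fs.createReadStream('file') // no .pipe() destination at all
    .on('data', (data) => {
        // Fires anyway: the listener alone puts the stream in flowing mode
        console.log(`read ${data.length} bytes`);
    });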

How can I track write progress when piping with Node.js?


Comment from J-Cake (Jul 9, 2019): If it adds any meaningful value to this thread, you may want to look into Node.js buffers.

3 Answers


You can use a dummy Transform stream like this:

const fs = require('fs');
const stream = require('stream');

let totalBytes = 0;
stream.pipeline(
    fs.createReadStream(from_file),
    new stream.Transform({
        transform(chunk, encoding, callback) {
            // Count each chunk as it flows through to the writer
            totalBytes += chunk.length;
            console.log(totalBytes);
            this.push(chunk);
            callback();
        }
    }),
    fs.createWriteStream(to_file),
    err => {
        if (err)
            console.error(err); // handle the error as appropriate
    }
);
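A note on the choice of stream.pipeline() over chained .pipe() calls: pipeline (added in Node 10) forwards an error from any stage to the final callback and destroys all of the streams on failure, so the counter above can't keep running against a dead destination.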

Alternatively, you can do the piping manually and make use of the callback from writable.write():

callback <Function>: Callback for when this chunk of data is flushed.

const fs = require('fs');

let from_file = '<from_file>';
let to_file = '<to_file>';

let from_stream = fs.createReadStream(from_file);
let to_stream   = fs.createWriteStream(to_file);

// get total size of the file
let { size } = fs.statSync(from_file);

let written = 0;
from_stream.on('data', data => {
    // Do the piping manually: write each chunk ourselves so we can
    // observe the write-side callback, which fires once the chunk
    // has been flushed.
    to_stream.write(data, () => {
        written += data.length;
        console.log(`written ${written} of ${size} bytes (${(written/size*100).toFixed(2)}%)`);
    });
});
// Close the destination once the source is exhausted
from_stream.on('end', () => to_stream.end());
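One caveat with the manual approach: it ignores backpressure. writable.write() returns false when the destination's internal buffer is full, and the handler above keeps writing regardless. A sketch of the same 'data' handler with backpressure handling, reusing the variables from the snippet above:

from_stream.on('data', data => {
    const ok = to_stream.write(data, () => {
        written += data.length;
        console.log(`written ${written} of ${size} bytes`);
    });
    if (!ok) {
        // The writable's buffer is full; stop reading until it drains
        from_stream.pause();
        to_stream.once('drain', () => from_stream.resume());
    }
});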

Somehow I remember this thread being about memory efficiency. Anyway, I've rigged up a small script that's very memory-efficient and tracks progress very well. I tested it with a 230 MB file and the results speak for themselves: https://gist.github.com/J-Cake/78ce059972595823243526e022e327e4

The sample file I used was a bit odd, as the Content-Length header it reported was in fact off, but the program uses no more than 4.5 MiB of memory.
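The gist itself isn't reproduced here, but a minimal sketch of that general idea, assuming an HTTP download whose progress is measured against the response's Content-Length header (the URL and output filename are placeholders):

const https = require('https');
const fs = require('fs');
const stream = require('stream');

https.get('https://example.com/file.bin', res => {
    const total = Number(res.headers['content-length']) || 0;
    let seen = 0;
    stream.pipeline(
        res,
        new stream.Transform({
            transform(chunk, encoding, callback) {
                seen += chunk.length;
                if (total)
                    process.stdout.write(`\r${(seen / total * 100).toFixed(1)}%`);
                callback(null, chunk); // pass the chunk through unchanged
            }
        }),
        fs.createWriteStream('download.bin'),
        err => {
            if (err) console.error(err);
            else console.log('\ndone');
        }
    );
});

Because each chunk only ever sits briefly in the Transform's small internal buffers, memory use stays flat regardless of file size, and a wrong Content-Length only skews the percentage, not the transfer itself.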
