
NestJS Archiver Stream Fails for Large Files (Takes Too Long & Times Out)


I'm using NestJS and the archiver package to zip files in a streaming approach. The goal is to stream the zip archive either:

  • Directly to Google Cloud Storage (GCS)
  • Or to a Buffer (local testing)
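For context, the GCS variant is wired up roughly like this; the bucket name, object name, and the resumable: false option are placeholders for illustration, not my exact production code:

import { Storage } from '@google-cloud/storage';
import * as archiver from 'archiver';

// Illustrative only: 'my-bucket' and the object name are placeholders.
const storage = new Storage();

function zipToGcs(objectName: string): archiver.Archiver {
  const archive = archiver('zip', { zlib: { level: 9 } });
  // createWriteStream() returns a writable stream that uploads chunks
  // to GCS as the archive emits them.
  const gcsStream = storage
    .bucket('my-bucket')
    .file(objectName)
    .createWriteStream({ resumable: false });
  archive.pipe(gcsStream);
  return archive;
}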

Issue

  • For small files (<1MB), it works fine.
  • For larger files (>1MB, e.g. around 5MB), the process takes too long and eventually times out.
  • This only happens in local development. On live deployment, there is no issue.

Code Snippet

import { Injectable, Logger } from '@nestjs/common';
import * as archiver from 'archiver';
import { PassThrough } from 'stream';
import * as fs from 'fs';

@Injectable()
export class FileManagementService {
  private readonly logger = new Logger(FileManagementService.name);

  async zipFilesToStream(destinationPath: string, files: string[]) {
    try {
      const archive = archiver('zip', {
        zlib: { level: 9 },
        highWaterMark: 1024 * 1024 * 2, // 2MB
      });

      archive.on('error', (err) => {
        this.logger.error('Archiver error:', err);
        throw err;
      });

      // Testing with local file stream (to check if GCS is the issue)
      const passThrough = new PassThrough();
      const localWriteStream = fs.createWriteStream(destinationPath);

      passThrough.on('error', (err) => this.logger.error('PassThrough error:', err));
      localWriteStream.on('finish', () => this.logger.log('File written successfully.'));

      passThrough.pipe(localWriteStream);
      archive.pipe(passThrough);

      // Add files to the archive
      files.forEach((filePath) => {
        archive.append(fs.createReadStream(filePath), { name: filePath });
      });

      await archive.finalize(); // Finalizing the archive

    } catch (error) {
      this.logger.error('Error while zipping:', error);
      throw error;
    }
  }
}
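For completeness, the service is invoked roughly like this in my local tests (the paths here are placeholders, not my real inputs):

// Hypothetical local-test call; paths are placeholders.
const service = new FileManagementService();
await service.zipFilesToStream('/tmp/out.zip', [
  '/tmp/input/a.txt',
  '/tmp/input/b.pdf',
]);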

What I've Tried & Observed

  1. Tried Different highWaterMark Values
  • Used 1MB, 2MB, and 6MB → no significant improvement.
  2. Tried Writing to a Local File Instead of GCS
  • Even writing to a local file (fs.createWriteStream) behaves the same.
  • This confirms that GCS is not the issue.
  3. Checked for Memory Issues (process.memoryUsage())
  • No memory spikes, so memory isn't the bottleneck.
  4. Checked Stream Events (error, finish; instrumentation snippet after this list)
  • No errors are thrown before the timeout, but finish never fires for large files.
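The event instrumentation mentioned in point 4 looks roughly like this, added on the same archive instance (logging simplified to console for brevity):

// Debug-only listeners on the same `archive` instance.
// 'progress' reports cumulative entry and byte counts,
// 'entry' fires as each file's header is written to the archive,
// 'warning' surfaces non-fatal archiver issues.
archive.on('progress', (p) => {
  console.log(
    `progress: ${p.entries.processed}/${p.entries.total} entries, ` +
      `${p.fs.processedBytes} bytes`,
  );
});
archive.on('entry', (entry) => console.log(`appended: ${entry.name}`));
archive.on('warning', (err) => console.warn('archiver warning:', err));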

Question

Why does streaming work fine for small files but hang indefinitely for files larger than ~1MB in local development?

Any help is appreciated!
