I'm using NestJS with the archiver package to zip files in a streaming fashion. The goal is to stream the zip archive either:
- Directly to Google Cloud Storage (GCS)
- Or to a Buffer (local testing)
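For context, here is roughly what the GCS variant looks like (a sketch with placeholder bucket/object names, assuming the @google-cloud/storage client):

import { Storage } from '@google-cloud/storage';
import * as archiver from 'archiver';
import * as fs from 'fs';

async function zipToGcs(files: string[]): Promise<void> {
  // Open an upload write stream to the target object (names are placeholders)
  const gcsStream = new Storage()
    .bucket('my-bucket')
    .file('archives/output.zip')
    .createWriteStream({ metadata: { contentType: 'application/zip' } });

  const archive = archiver('zip', { zlib: { level: 9 } });
  archive.pipe(gcsStream);
  files.forEach((f) => archive.append(fs.createReadStream(f), { name: f }));

  // Wait for both the archive to finalize and the upload stream to finish
  await Promise.all([
    archive.finalize(),
    new Promise<void>((resolve, reject) =>
      gcsStream.on('finish', resolve).on('error', reject),
    ),
  ]);
}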
Issue
- For small files (<1MB), it works fine.
- For larger files (>1MB, e.g. around 5MB), the process takes too long and eventually times out.
- This only happens in local development; the live deployment shows no issue.
Code Snippet
import { Injectable, Logger } from '@nestjs/common';
import * as archiver from 'archiver';
import { PassThrough } from 'stream';
import * as fs from 'fs';

@Injectable()
export class FileManagementService {
  private readonly logger = new Logger(FileManagementService.name);

  async zipFilesToStream(destinationPath: string, files: string[]) {
    try {
      const archive = archiver('zip', {
        zlib: { level: 9 },
        highWaterMark: 1024 * 1024 * 2, // 2MB
      });

      archive.on('error', (err) => {
        this.logger.error('Archiver error:', err);
        throw err;
      });

      // Testing with a local file stream (to check whether GCS is the issue)
      const passThrough = new PassThrough();
      const localWriteStream = fs.createWriteStream(destinationPath);

      passThrough.on('error', (err) => this.logger.error('PassThrough error:', err));
      localWriteStream.on('finish', () => this.logger.log('File written successfully.'));

      passThrough.pipe(localWriteStream);
      archive.pipe(passThrough);

      // Add files to the archive
      files.forEach((filePath) => {
        archive.append(fs.createReadStream(filePath), { name: filePath });
      });

      await archive.finalize(); // Finalizing the archive
    } catch (error) {
      this.logger.error('Error while zipping:', error);
      throw error;
    }
  }
}
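For reference, this is roughly how I invoke the service in a local test (simplified; the file paths are illustrative):

// Simplified local invocation; file paths are placeholders
const service = new FileManagementService();
await service.zipFilesToStream('/tmp/out.zip', [
  './fixtures/small.pdf', // <1MB, works
  './fixtures/large.pdf', // ~5MB, hangs until timeout
]);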
What I've Tried & Observed
- Tried Different highWaterMark Values
  - Used 1MB, 2MB, and 6MB → no significant improvement.
- Tried Writing to a Local File Instead of GCS
  - Even writing to a local file (fs.createWriteStream) behaves the same, which suggests GCS itself is not the cause.
- Checked for Memory Issues (process.memoryUsage())
  - No memory spikes, so memory isn't the bottleneck (see the instrumentation sketch after this list).
- Checked Stream Events (error, finish)
  - No errors are thrown before the timeout, but finish never fires for large files.
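For completeness, this is roughly the instrumentation I added around the snippet above (illustrative; it hooks into the archive and localWriteStream variables from the code):

// Illustrative instrumentation around the existing archive / localWriteStream
const timer = setInterval(() => {
  const { rss, heapUsed } = process.memoryUsage();
  this.logger.log(`rss=${(rss / 1048576).toFixed(1)}MB heap=${(heapUsed / 1048576).toFixed(1)}MB`);
}, 1000);

// archiver emits 'progress' with entry and byte counters
archive.on('progress', (p) =>
  this.logger.log(`entries ${p.entries.processed}/${p.entries.total}, bytes ${p.fs.processedBytes}`),
);

localWriteStream.on('finish', () => clearInterval(timer)); // never reached for large files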
Question
Why does streaming work fine for small files but hang indefinitely for files larger than ~1MB in local development?
Any help is appreciated!