Context: I'm working on code which uses a read stream to download a file from an SFTP server and upload it to GCS via a write stream, using Node.js v10.15.3.
Due to an error in the SFTP library I'm working with, stream.pipe (that is, piping from the read stream the library produces) is actually broken in Node 10. Because of this, I am attempting to upload the file via the following code instead (where stream is the read stream and unnecessary information has been left out):
let acl = fileMode;
if (fileMode === 'public') {
  // options for acl are publicRead and private
  // need to add the Read if public
  acl += 'Read';
}
var options = {
  predefinedAcl: acl,
  destination: destPath,
  metadata: {
    contentType: contentType,
    cacheControl: 'no-cache'
  }
};

// Add in a check here for if the bucket exists
let file = new File(bucket, destPath);
let writeStream = file.createWriteStream(options);

writeStream.on('finish', () => {
  file.getMetadata()
    .then((metadata) => {
      console.log('metadata', metadata);
      return resolve(metadata);
    })
    .catch(error => {
      console.error('Error getting file metadata', error);
      return reject(error);
    });
});

stream.on('end', () => {
  try {
    writeStream.end();
  } catch (err) {
    console.error('Error closing writeStream', err);
    return reject(err);
  }
});

writeStream.on('error', error => {
  console.error('Error in writeStream', error);
  return reject(error);
});

stream.on('error', error => {
  console.error('Error in stream', error);
  return reject(error);
});

let data = stream.read();
while (data) {
  writeStream.write(data);
  data = stream.read();
}
When I use this while (data) approach to stream from our SFTP server to a local file on the filesystem, it works without incident. However, when I run this code to upload to our GCS bucket, I get the following error:
MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added. Use emitter.setMaxListeners() to increase limit
Error in writeStream Error: Retry limit exceeded
// stacktrace omitted
Error Uploading to GCS from a stream: Retry limit exceeded
Error: Retry limit exceeded
It seems like I must be doing something wrong here, but I have no idea why this isn't a valid approach, nor am I sure whether I'm missing some subtlety of streams (which I freely confess are pretty much a black box to me) or hitting an issue with GCS.
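For reference, one stream subtlety the manual loop above ignores is backpressure: writeStream.write() returns false when its internal buffer is full, and the writer expects you to wait for the 'drain' event before writing more. A minimal backpressure-aware sketch of the same copy, assuming the stream and writeStream variables above (and switching the read stream into flowing mode):

stream.on('data', (chunk) => {
  // write() returns false when the write buffer is full
  if (!writeStream.write(chunk)) {
    stream.pause(); // stop pulling from SFTP
    writeStream.once('drain', () => stream.resume()); // resume once GCS catches up
  }
});
stream.on('end', () => writeStream.end());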
EDIT: Okay, this actually appears to be completely unrelated to the SFTP issue. I've tried just uploading a file from the local filesystem using the recommended method, and am seeing the same error. The more 'streamlined' code I'm trying is:
// Add in a check here for if the bucket exists
let file = new File(bucket, destPath);
fs.createReadStream('sample_file.csv')
  .pipe(file.createWriteStream(options))
  .on('error', function(err) {
    console.error('err', err);
    return reject(err);
  })
  .on('finish', function() {
    console.log('resolving');
    return resolve({gcsUrl: url});
  });
Comments:
- Where do you run your code? Is it App Engine, Cloud Functions, Cloud Run, a Compute Engine VM instance, or a server outside of Google Cloud? – Pawel Czuczwara, Apr 23, 2019
- This error (Uploading to GCS from a stream: Retry limit exceeded) is escalated by the client library when a 404 error occurs more than 5 times, or when exponential backoff fails more than 5 times for errors 499 through 600. – Pawel Czuczwara, Apr 23, 2019
- Right now it's running via a local Docker setup outside of Google Cloud, which is where we develop and test our services locally before deploying them. – Amber B., Apr 24, 2019
- Did you configure CORS properly for the bucket? Did you configure your user/service account to have write access to the bucket? And can you verify that an upload using gsutil (Cloud SDK) works from the configuration that you use? – Pawel Czuczwara, Apr 24, 2019
- The answer to all three should be yes. Of note is that bucket.exists works, etc. – Amber B., Apr 24, 2019
2 Answers

As correctly pointed out by Alex Riquelme, this warning happens when you surpass the default maximum number of listeners for an event in Node.js, which is 10. You can change this value, but it's not recommended in this situation: it would be a waste of resources, as the leak would still be there.

The reason multiple listeners are created when uploading files to GCS is that resumable uploads are enabled by default in createWriteStream. In your case, since you are uploading a lot of small files, the recommended approach is to set options.resumable to false. That way you avoid the overhead caused by resumable uploads without having to allow more listeners to be created.
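For example, a minimal sketch of the same write-stream setup with resumable uploads disabled, reusing the options, bucket, and destPath names from the question:

var options = {
  predefinedAcl: acl,
  destination: destPath,
  resumable: false, // skip the resumable-upload session and its extra listeners
  metadata: {
    contentType: contentType,
    cacheControl: 'no-cache'
  }
};
let file = new File(bucket, destPath);
let writeStream = file.createWriteStream(options);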
This warning is actually expected. When you try to upload a file to GCS, the library tries to optimize the upload by splitting your file into chunks (normally of 1 MB each), so it creates multiple listeners to upload the file. By default, the maximum number of listeners in Node.js is 10 (take a look at this documentation). If you want to allow an unlimited number of listeners, call setMaxListeners(0) on the emitter; passing 0 removes the limit.
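A minimal sketch of that workaround, assuming the writeStream from the question; note this only silences the warning rather than addressing the listener growth itself:

// 0 removes the listener limit for this emitter
writeStream.setMaxListeners(0);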