I have set up a process for loading an S3 file, and I tested it with a 5 MB .gz file; it works perfectly.
Now I have to test the process with a large file. I uploaded a 2 GB .gz file, but it is not processing, and no error is shown anywhere.
I know the recommended file size, but is there a maximum size limit, or a default limit set somewhere that I need to change, e.g. on the integration, stage, or pipe?
Thanks,
asked Mar 17 at 7:43 by Md. Parvez Alam
Comment from NickW (Mar 17 at 20:04): Please update your question with the pipe definition. I'm assuming you're auto-ingesting; if you are, what happens if you trigger the pipe manually?
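For reference, a manual trigger along the lines of the comment above would look something like the following; my_pipe is a placeholder for the actual pipe name:

-- Check whether the pipe is running and whether any files are pending (my_pipe is a placeholder):
SELECT SYSTEM$PIPE_STATUS('my_pipe');

-- Manually queue staged files for loading (only picks up files staged within the last 7 days):
ALTER PIPE my_pipe REFRESH;

If SYSTEM$PIPE_STATUS shows the pipe as RUNNING with no pending files even after the refresh, the file most likely never reached the pipe (e.g. a notification or prefix issue) rather than hitting a size limit.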
1 Answer
There is no maximum file size limit; the documentation only says that "Loading very large files (e.g. 100 GB or larger) is not recommended.":
https://docs.snowflake.com/en/user-guide/data-load-considerations-prepare#general-file-sizing-recommendations
So 2 GB shouldn't be hitting any limit. Can you also try uploading an uncompressed version of the same file?
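Since no error is showing up anywhere, it is also worth checking the load history directly. A quick check might look like this, assuming the target table is called my_table and the pipe my_pipe (both placeholders):

-- Load attempts and errors for the target table in the last 24 hours (my_table is a placeholder):
SELECT file_name, status, first_error_message
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'MY_TABLE',
  START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));

-- Files the pipe tried to load but failed on in the same window (my_pipe is a placeholder):
SELECT *
FROM TABLE(VALIDATE_PIPE_LOAD(
  PIPE_NAME => 'my_pipe',
  START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));

If the 2 GB file does not appear in COPY_HISTORY at all, the pipe never received it, which points at the notification/stage setup rather than a size limit.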