
snowflake cloud data platform - snowpipe does not process 2GB .gz file - Stack Overflow


I have set up Snowpipe to process files from S3, and tested the process with a 5 MB .gz file; it works perfectly.

Now I need to test the process with a large file. I uploaded a 2 GB .gz file, but it is not being processed, and no error is shown anywhere.

I know the recommended file size, but is there a hard size limit, or a default limit set somewhere that I need to change, such as on the storage integration, the stage, or the pipe?

Thanks,


asked Mar 17 at 7:43 by Md. Parvez Alam
  • Please update your question with the pipe definition. I’m assuming you’re auto-ingesting; if you are, what happens if you trigger the pipe manually? – NickW, Mar 17 at 20:04
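The checks suggested in the comment above (inspect the pipe, trigger it manually) can be sketched as a short set of diagnostic queries to run in a Snowflake worksheet. `SYSTEM$PIPE_STATUS`, the `COPY_HISTORY` table function, and `ALTER PIPE ... REFRESH` are real Snowflake features; the pipe and table names below are placeholders, not from the question:

```python
# Build the SQL statements for diagnosing a Snowpipe that silently skips a file.
# MY_DB.MY_SCHEMA.MY_PIPE and MY_TABLE are hypothetical placeholder names.

def snowpipe_diagnostics(pipe: str, table: str) -> list[str]:
    """Return the diagnostic SQL statements for a silent Snowpipe."""
    return [
        # 1. Is the pipe RUNNING, and does it report pending files?
        f"SELECT SYSTEM$PIPE_STATUS('{pipe}');",
        # 2. Did a load fail silently? COPY_HISTORY records per-file errors.
        "SELECT * FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY("
        f"TABLE_NAME => '{table}', "
        "START_TIME => DATEADD(HOUR, -24, CURRENT_TIMESTAMP())));",
        # 3. Re-queue staged files in case the S3 event notification was missed.
        f"ALTER PIPE {pipe} REFRESH;",
    ]

for stmt in snowpipe_diagnostics("MY_DB.MY_SCHEMA.MY_PIPE", "MY_TABLE"):
    print(stmt)
```

If `COPY_HISTORY` shows the file with a load error, the failure was real but silent; if the file never appears at all, the S3 event notification likely never reached the pipe, which `REFRESH` works around.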

1 Answer


There is no hard maximum file size; the documentation only says that "Loading very large files (e.g. 100 GB or larger) is not recommended."

https://docs.snowflake.com/en/user-guide/data-load-considerations-prepare#general-file-sizing-recommendations

So a 2 GB file shouldn't hit any limit. Can you also try uploading an uncompressed version of the same file?
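That same doc recommends compressed files of roughly 100-250 MB for parallel loading, so even though 2 GB is within limits, splitting the export before compressing it is the usual fix. A minimal sketch, assuming a GNU `split` and hypothetical filenames (the `head` line just fabricates a small stand-in file):

```shell
# Sketch: pre-split a large export into chunks before gzip, per Snowflake's
# sizing guidance (~100-250 MB compressed per file). "data.csv" is a stand-in
# for the real export; on real data a chunk size like 250m is typical.
head -c 1000000 /dev/urandom > data.csv   # stand-in for the large source file
split -b 100m -d data.csv data_part_      # byte-sized chunks, numeric suffixes
gzip data_part_*                          # one .gz per chunk
ls data_part_*.gz
```

Snowpipe can then load the resulting `.gz` files in parallel, and a single bad chunk fails on its own instead of blocking the whole 2 GB load.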
