
How to package separate dependencies for driver and executor in pyspark?


I am looking at various approaches for PySpark package management, and I went through the PySpark package management documentation. As per my understanding, the zip file gets downloaded to both the driver and the executors under all of those methods. I am wondering whether it is possible to specify that certain packages be present only on the driver and not on the executors. Is my understanding wrong?
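For context, here is a minimal sketch of the documented mechanisms I mean; as far as I can tell, both ship the archive to the driver and to every executor (`deps.zip` and `app.py` are just placeholder names):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Submit-time variant (shell, shown as a comment): the archive is shipped
# to the driver and to every executor's working directory.
#   spark-submit --py-files deps.zip app.py

# Programmatic variant: addPyFile also distributes the archive
# cluster-wide, not just to the driver.
spark.sparkContext.addPyFile("deps.zip")
```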

My use case is that I need some packages at the driver end only. These packages can be fairly large, and they will not be used on the executors at all. I did not see a PySpark equivalent of the separate driver/executor classpath approach that exists on the Java side (see the sketch below). Can you recommend some best practices for PySpark dependency management?
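This is the JVM-side pattern I have in mind. The configuration keys `spark.driver.extraClassPath` and `spark.executor.extraClassPath` are real Spark settings, but the jar paths below are hypothetical, and I have not found a Python-package analogue:

```python
from pyspark.sql import SparkSession

# JVM dependencies can be scoped per process: the jar below would be
# visible only to the driver JVM, not to the executors. (In client mode
# the driver classpath must instead be passed via --driver-class-path,
# since the driver JVM has already started by the time this code runs.)
spark = (
    SparkSession.builder
    .config("spark.driver.extraClassPath", "/opt/jars/driver-only.jar")  # hypothetical path
    .config("spark.executor.extraClassPath", "/opt/jars/shared.jar")     # hypothetical path
    .getOrCreate()
)
```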

Thank you.
