
pyspark - What is the python equivalent of net.snowflake.spark.snowflake.Utils.runQuery - Stack Overflow


From PySpark I am trying to execute a create table ... as select statement in Snowflake. This means I can't use something like:

(
    spark.read.format('snowflake')
    .options(**sfOptions)
    .option("query", "create table ....")
    .load()
)

as that only supports select statements.

In Scala, one would use net.snowflake.spark.snowflake.Utils.runQuery.

In PySpark, I can reach the same utility through the py4j gateway: spark.sparkContext._jvm.net.snowflake.spark.snowflake.Utils.runQuery

This gets the job done, but it returns an opaque py4j "JavaObject id=..." handle rather than anything Pythonic.
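For context, the py4j call above can be wrapped in a small helper. A minimal sketch, assuming the spark-snowflake connector jar is on the driver classpath and that the options dict holds the usual Snowflake connection parameters (run_snowflake_query is a hypothetical name, not part of any API):

```python
# A minimal sketch: wrapping the connector's JVM utility in a helper.
# Assumes the spark-snowflake connector jar is on the classpath and that
# `sf_options` is the usual dict of Snowflake connection parameters.
# `run_snowflake_query` is a hypothetical name, not part of any API.

def run_snowflake_query(spark, sf_options, sql):
    """Invoke net.snowflake.spark.snowflake.Utils.runQuery via py4j.

    PySpark's gateway auto-converts the Python dict to a Java Map.
    The return value is an opaque py4j JavaObject (a result-set handle),
    which can normally be ignored for DDL statements.
    """
    utils = spark.sparkContext._jvm.net.snowflake.spark.snowflake.Utils
    return utils.runQuery(sf_options, sql)

# Usage (placeholder options):
# run_snowflake_query(spark, sfOptions,
#                     "create table my_table as select * from src")
```

This at least hides the JVM path behind one function, but the return value is still not usable from Python.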

Is there a more reasonable way to run a query against Snowflake from PySpark?

I've searched the Databricks website (e.g. here) but haven't found anything.

I guess I could use the Python Snowflake connector instead...
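That route looks like the following. A minimal sketch using snowflake-connector-python, where the connection parameters are placeholders and run_ddl is a hypothetical helper name:

```python
# A minimal sketch using snowflake-connector-python instead of Spark.
# Connection parameters are placeholders; `run_ddl` is a hypothetical name.

def run_ddl(sql, **conn_params):
    """Execute a single statement and return its Snowflake query id."""
    import snowflake.connector  # imported here to keep the dependency local

    # Both the connection and the cursor support the context-manager
    # protocol, so resources are closed even if execute() raises.
    with snowflake.connector.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            cur.execute(sql)
            return cur.sfqid  # query id, handy for finding it in QUERY_HISTORY

# Usage (placeholder credentials):
# run_ddl("create table my_table as select * from src",
#         account="...", user="...", password="...",
#         warehouse="...", database="...", schema="...")
```

Unlike the Spark read path, cursor.execute() accepts any statement Snowflake supports, not just select.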
