
mongodb - Unable to fetch newly added columns while reading mongo db streams using mongo-spark connector - Stack Overflow


I'm trying to read a MongoDB change stream using the Spark connector, but whenever I add a new field to any of the documents, the field is not populated in the resulting DataFrame. Need help here.

I tried setting the inferSchema option to true, but that didn't work.

Here is the code I'm using:

mongo_df = (spark.readStream
.format("mongodb")
.option("spark.mongodb.collection", collection)
.option("spark.mongodb.database", database_name)
.option("spark.mongodb.connection.uri", mongodb_uri)
.option("spark.mongodb.change.stream.publish.full.document.only", "true")
.load())
