I'm trying to read a MongoDB change stream using the Spark connector, but whenever I add a new field to any of the documents, the new field is not populated in the resulting DataFrame. I need help with this.
I tried setting the inferSchema option to true (see the variant after the code below), which didn't work.
Here is the code I'm using:
mongo_df = (spark.readStream
    .format("mongodb")
    # connection settings for the source collection
    .option("spark.mongodb.collection", collection)
    .option("spark.mongodb.database", database_name)
    .option("spark.mongodb.connection.uri", mongodb_uri)
    # publish only the changed document itself, not the whole change event
    .option("spark.mongodb.change.stream.publish.full.document.only", "true")
    .load())
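
For completeness, the schema-inference attempt was just the same reader with one extra option, and the sketch below also shows roughly how I check the result (the console sink and checkpoint path are only debugging placeholders, and I'm not even sure inferSchema is a supported option for this connector, which may be part of the problem):

# Same reader as above with the schema-inference option added;
# it made no visible difference.
mongo_df_inferred = (spark.readStream
    .format("mongodb")
    .option("spark.mongodb.collection", collection)
    .option("spark.mongodb.database", database_name)
    .option("spark.mongodb.connection.uri", mongodb_uri)
    .option("spark.mongodb.change.stream.publish.full.document.only", "true")
    .option("inferSchema", "true")
    .load())

# The newly added document fields do not show up here.
mongo_df_inferred.printSchema()

# Debug sink just to inspect the streaming output.
query = (mongo_df_inferred.writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/mongo_stream_checkpoint")
    .start())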