I am using examples/pi-with-spark-connect-plugin.yaml from the Apache Spark operator. Adding spark.plugins: "org.apache.spark.sql.connect.SparkConnectPlugin" should start the Connect server, but it is not happening. Any ideas? Thanks
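For context, here is a minimal sketch of how the plugin might be wired into a SparkApplication manifest for the operator. This is not the contents of the original example file; the image, jar path, service account, and resource values are placeholders. The point it illustrates is that org.apache.spark.sql.connect.SparkConnectPlugin lives in the separate spark-connect artifact, which may not be on the image's classpath, so spark.plugins alone is not enough unless that jar is also pulled in (here via spark.jars.packages).

```yaml
# Hypothetical SparkApplication sketch -- field values are placeholders,
# not the original examples/pi-with-spark-connect-plugin.yaml.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: pi-with-spark-connect
spec:
  type: Scala
  mode: cluster
  image: spark:3.5.1
  sparkVersion: 3.5.1
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.1.jar
  sparkConf:
    # Ask the driver to load the Connect server plugin at startup.
    "spark.plugins": "org.apache.spark.sql.connect.SparkConnectPlugin"
    # The plugin class ships in the spark-connect artifact; without this
    # (or the jar baked into the image) the class cannot be found.
    "spark.jars.packages": "org.apache.spark:spark-connect_2.12:3.5.1"
  driver:
    cores: 1
    memory: "1g"
    serviceAccount: spark-operator-spark
  executor:
    instances: 1
    cores: 1
    memory: "1g"
```

Even when the plugin does load, the Connect endpoint (gRPC on port 15002 by default) runs inside the driver pod, so reaching it from outside the cluster still needs a Service or port-forward targeting the driver.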
1 Answer
Try to use sbin/start-connect-server.sh.
The plugin way is for starting the Connect server alongside another application. For example, you can start the Spark Thrift server and the Connect server together from bash with: sbin/start-thriftserver.sh --conf spark.plugins=org.apache.spark.sql.connect.SparkConnectPlugin --packages=org.apache.spark:spark-connect_2.12:3.5.1
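As a concrete sketch of the standalone route above, assuming a stock Spark 3.5.x distribution (adjust the Scala and Spark versions in the Maven coordinate to match your build):

```bash
# Start the Spark Connect server on its own, pulling in the matching
# spark-connect artifact; it listens on port 15002 by default.
./sbin/start-connect-server.sh \
  --packages org.apache.spark:spark-connect_2.12:3.5.1
```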