
pyspark --packages spark mongodb connector gives me an error


I'm trying to connect to MongoDB using the mongo-spark-connector via pyspark, following the official guide document (.2/python/api/#python-spark-shell).
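For reference, once the shell starts, the read I ultimately want to run looks roughly like this (a minimal sketch against the 10.x connector API; spark here is the session the pyspark shell creates):

# Minimal read sketch for mongo-spark-connector 10.x (data source short name "mongodb").
# Assumes the shell was started with the --conf/--packages options shown below,
# so the connection URI is already set and the connector jars are on the classpath.
df = (
    spark.read.format("mongodb")
    .option("database", "test")
    .option("collection", "myCollection")
    .load()
)
df.printSchema()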

Whenever I try to execute the command below, I get the following error. What is the most likely root cause? I'm very new to Spark.

  • Spark version: 3.5.4
  • mongo-spark-connector version: 2.12:10.4.1 (updated; formerly 2.12:10.2.1)
  • Java version: OpenJDK 17.0.8 2023-07-18
  • macOS: 13.7.4 (Apple Silicon)
./bin/pyspark \
  --conf "spark.mongodb.read.connection.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
  --conf "spark.mongodb.write.connection.uri=mongodb://127.0.0.1/test.myCollection" \
  --packages org.mongodb.spark:mongo-spark-connector_2.12:10.2.1
Python 3.13.2 (main, Feb  4 2025, 14:51:09) [Clang 15.0.0 (clang-1500.1.0.2.5)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
:: loading settings :: url = jar:file:/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/jars/ivy-2.5.1.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: /Users/user/.ivy2/cache
The jars for the packages stored in: /Users/user/.ivy2/jars
org.mongodb.spark#mongo-spark-connector_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-d059da69-1de1-4318-a358-bd2d1f04d708;1.0
    confs: [default]
    found org.mongodb.spark#mongo-spark-connector_2.12;10.4.1 in central
    found org.mongodb#mongodb-driver-sync;5.1.4 in central
    [5.1.4] org.mongodb#mongodb-driver-sync;[5.1.1,5.1.99)
    found org.mongodb#bson;5.1.4 in central
    found org.mongodb#mongodb-driver-core;5.1.4 in central
    found org.mongodb#bson-record-codec;5.1.4 in central
:: resolution report :: resolve 3337ms :: artifacts dl 4ms
    :: modules in use:
    org.mongodb#bson;5.1.4 from central in [default]
    org.mongodb#bson-record-codec;5.1.4 from central in [default]
    org.mongodb#mongodb-driver-core;5.1.4 from central in [default]
    org.mongodb#mongodb-driver-sync;5.1.4 from central in [default]
    org.mongodb.spark#mongo-spark-connector_2.12;10.4.1 from central in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   5   |   1   |   0   |   0   ||   5   |   0   |
    ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-d059da69-1de1-4318-a358-bd2d1f04d708
    confs: [default]
    0 artifacts copied, 5 already retrieved (0kB/3ms)
25/03/13 16:53:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
25/03/13 16:53:35 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
25/03/13 16:53:35 WARN TransportChannelHandler: Exception in connection from /{myhost}:{myport}
java.lang.IllegalArgumentException: Too large frame: 5785721462337832960
    at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
    at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/03/13 16:53:35 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /{myhost}:{myport} is closed
25/03/13 16:53:35 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: Too large frame: 5785721462337832960
    at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
    at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/03/13 16:53:35 ERROR Utils: Uncaught exception in thread Thread-3
java.lang.NullPointerException: Cannot invoke "org.apache.spark.rpc.RpcEndpointRef.ask(Object, scala.reflect.ClassTag)" because the return value of "org.apache.spark.scheduler.local.LocalSchedulerBackend.localEndpoint()" is null
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:173)
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:144)
    at org.apache.spark.scheduler.SchedulerBackend.stop(SchedulerBackend.scala:33)
    at org.apache.spark.scheduler.SchedulerBackend.stop$(SchedulerBackend.scala:33)
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:103)
    at org.apache.spark.scheduler.TaskSchedulerImpl.$anonfun$stop$2(TaskSchedulerImpl.scala:992)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1375)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:992)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$stop$4(DAGScheduler.scala:2976)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1375)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2976)
    at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2258)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1375)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:2258)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:2211)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:706)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/03/13 16:53:35 WARN MetricsSystem: Stopping a MetricsSystem that is not running
25/03/13 16:53:35 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
py4j.Gateway.invoke(Gateway.java:238)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
py4j.ClientServerConnection.run(ClientServerConnection.java:106)
java.base/java.lang.Thread.run(Thread.java:840)
25/03/13 16:53:35 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
25/03/13 16:53:35 WARN TransportChannelHandler: Exception in connection from /{myhost}:{myport}
java.lang.IllegalArgumentException: Too large frame: 5785721462337832960
    at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
    at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/03/13 16:53:35 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /{myhost}:{myport} is closed
25/03/13 16:53:35 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: Too large frame: 5785721462337832960
    at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
    at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/03/13 16:53:35 ERROR Utils: Uncaught exception in thread Thread-3
java.lang.NullPointerException: Cannot invoke "org.apache.spark.rpc.RpcEndpointRef.ask(Object, scala.reflect.ClassTag)" because the return value of "org.apache.spark.scheduler.local.LocalSchedulerBackend.localEndpoint()" is null
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:173)
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:144)
    at org.apache.spark.scheduler.SchedulerBackend.stop(SchedulerBackend.scala:33)
    at org.apache.spark.scheduler.SchedulerBackend.stop$(SchedulerBackend.scala:33)
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:103)
    at org.apache.spark.scheduler.TaskSchedulerImpl.$anonfun$stop$2(TaskSchedulerImpl.scala:992)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1375)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:992)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$stop$4(DAGScheduler.scala:2976)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1375)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2976)
    at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2258)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1375)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:2258)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:2211)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:706)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/03/13 16:53:35 WARN MetricsSystem: Stopping a MetricsSystem that is not running
/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/shell.py:74: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")
Traceback (most recent call last):
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/shell.py", line 69, in <module>
    spark = SparkSession._create_shell_session()
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/sql/session.py", line 1145, in _create_shell_session
    return SparkSession._getActiveSessionOrCreate()
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/sql/session.py", line 1161, in _getActiveSessionOrCreate
    spark = builder.getOrCreate()
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/sql/session.py", line 497, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/context.py", line 515, in getOrCreate
    SparkContext(conf=conf or SparkConf())
    ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/context.py", line 203, in __init__
    self._do_init(
    ~~~~~~~~~~~~~^
        master,
        ^^^^^^^
    ...<10 lines>...
        memory_profiler_cls,
        ^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/context.py", line 296, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
                       ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/pyspark/context.py", line 421, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1587, in __call__
    return_value = get_return_value(
        answer, self._gateway_client, None, self._fqn)
  File "/opt/homebrew/Cellar/apache-spark/3.5.4/libexec/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
        "An error occurred while calling {0}{1}{2}.\n".
        format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalArgumentException: Too large frame: 5785721462337832960
    at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
    at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:840)

25/03/13 16:53:35 ERROR Utils: Uncaught exception in thread shutdown-hook-0
java.lang.ExceptionInInitializerError
    at org.apache.spark.executor.Executor.stop(Executor.scala:429)
    at org.apache.spark.executor.Executor.$anonfun$stopHookReference$1(Executor.scala:90)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928)
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.Try$.apply(Try.scala:213)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: java.lang.NullPointerException: Cannot invoke "org.apache.spark.SparkEnv.conf()" because the return value of "org.apache.spark.SparkEnv$.get()" is null
    at org.apache.spark.shuffle.ShuffleBlockPusher$.<init>(ShuffleBlockPusher.scala:499)
    at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala)
    ... 16 more
25/03/13 16:53:35 WARN ShutdownHookManager: ShutdownHook '' failed, java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:205)
    at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
Caused by: java.lang.ExceptionInInitializerError
    at org.apache.spark.executor.Executor.stop(Executor.scala:429)
    at org.apache.spark.executor.Executor.$anonfun$stopHookReference$1(Executor.scala:90)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928)
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.Try$.apply(Try.scala:213)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: java.lang.NullPointerException: Cannot invoke "org.apache.spark.SparkEnv.conf()" because the return value of "org.apache.spark.SparkEnv$.get()" is null
    at org.apache.spark.shuffle.ShuffleBlockPusher$.<init>(ShuffleBlockPusher.scala:499)
    at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala)
    ... 16 more

I've checked the compatibility between the MongoDB Spark Connector and my Spark version, and there doesn't seem to be any problem with the version combination in my local environment.

I can connect to MongoDB with the same connection URI from other tools, so the URI itself should not be the root cause.
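For example, a plain pymongo check along these lines succeeds against the same instance (just a sanity-check sketch, nothing Spark-specific):

# Same mongod, plain pymongo driver: connects and counts without any error.
from pymongo import MongoClient

client = MongoClient("mongodb://127.0.0.1")  # same host as in the Spark URIs
print(client["test"]["myCollection"].estimated_document_count())
client.close()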

When I tried the corresponding command with spark-shell (the Scala shell), it also failed.
