
Hadoop Pseudo-Distributed Mode Error: launch_container.sh: line 58: unexpected EOF while looking for matching `"'


I have tried Hadoop 3.3.6 and 3.4.0 on Ubuntu 22, with both Java 8 and Java 11, and I hit this error every time I run a job to validate the install. I also can't find the launch_container.sh file the error refers to. Is there a problem with my configuration, or could it be a user-permissions issue? (I'm not running as root, but all relevant directories have been `chown -R ubuntu:ubuntu`'d.)
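On the missing file: launch_container.sh is generated per container under `yarn.nodemanager.local-dirs`, and the NodeManager deletes the container's working directory as soon as the container exits, which is why it is usually gone by the time you look. While a container is still alive (or while cleanup is delayed), a sketch of where to look, assuming the local-dirs value from the yarn-site.xml in this question:

```shell
# find_launch_scripts: list generated launch_container.sh files under a
# NodeManager local dir (they only exist while the container is alive,
# or while cleanup is delayed).
find_launch_scripts() {
  find "$1" -name launch_container.sh 2>/dev/null
}

# In this setup yarn.nodemanager.local-dirs is /LogData/hdfs/yarn:
find_launch_scripts /LogData/hdfs/yarn || true
```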

Error:

ubuntu@ubuntu:/LogData/hadoop-3.4.0/etc/hadoop$ hadoop jar /LogData/hadoop-3.4.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.4.0.jar wordcount /input /output
2025-02-09 21:50:47,963 INFO client.DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at hadoop/192.168.150.129:8032
2025-02-09 21:50:48,763 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/ubuntu/.staging/job_1739108888914_0001
2025-02-09 21:50:49,402 INFO input.FileInputFormat: Total input files to process : 1
2025-02-09 21:50:49,587 INFO mapreduce.JobSubmitter: number of splits:1
2025-02-09 21:50:49,958 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1739108888914_0001
2025-02-09 21:50:49,958 INFO mapreduce.JobSubmitter: Executing with tokens: []
2025-02-09 21:50:50,266 INFO conf.Configuration: resource-types.xml not found
2025-02-09 21:50:50,267 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2025-02-09 21:50:51,012 INFO impl.YarnClientImpl: Submitted application application_1739108888914_0001
2025-02-09 21:50:51,091 INFO mapreduce.Job: The url to track the job: http://hadoop:8088/proxy/application_1739108888914_0001/
2025-02-09 21:50:51,093 INFO mapreduce.Job: Running job: job_1739108888914_0001
2025-02-09 21:50:56,254 INFO mapreduce.Job: Job job_1739108888914_0001 running in uber mode : false
2025-02-09 21:50:56,256 INFO mapreduce.Job:  map 0% reduce 0%
2025-02-09 21:50:56,294 INFO mapreduce.Job: Job job_1739108888914_0001 failed with state FAILED due to: Application application_1739108888914_0001 failed 2 times due to AM Container for appattempt_1739108888914_0001_000002 exited with  exitCode: 2
Failing this attempt.Diagnostics: [2025-02-09 21:50:55.445]Exception from container-launch.
Container id: container_1739108888914_0001_02_000001
Exit code: 2

[2025-02-09 21:50:55.450]Container exited with a non-zero exit code 2. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
/LogData/hdfs/yarn/usercache/ubuntu/appcache/application_1739108888914_0001/container_1739108888914_0001_02_000001/launch_container.sh: line 58: unexpected EOF while looking for matching `"'

[2025-02-09 21:50:55.451]Container exited with a non-zero exit code 2. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
/LogData/hdfs/yarn/usercache/ubuntu/appcache/application_1739108888914_0001/container_1739108888914_0001_02_000001/launch_container.sh: line 58: unexpected EOF while looking for matching `"'

For more detailed output, check the application tracking page: http://hadoop:8088/cluster/app/application_1739108888914_0001 Then click on links to logs of each attempt.
. Failing the application.
2025-02-09 21:50:56,335 INFO mapreduce.Job: Counters: 0
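To actually read line 58 of the failing script, you can ask the NodeManager to keep finished container directories around for a while before cleaning them up. A yarn-site.xml addition (the delay is in seconds; remove it again once you are done debugging):

```xml
<property>
    <name>yarn.nodemanager.delete.debug-delay-sec</name>
    <value>600</value>
</property>
```

With this set and the NodeManager restarted, the launch_container.sh at the path shown in the diagnostics survives for 10 minutes after the container exits, so you can open it and inspect line 58 directly.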

core-site.xml:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/LogData/hdfs/tmp</value>
    </property>
</configuration>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>hadoop:50090</value>
    </property>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/LogData/hdfs/namenode</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/LogData/hdfs/datanode</value>
    </property>
</configuration>

mapred-site.xml:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoop</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>hadoop:10020</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>hadoop:19888</value>
    </property>
    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=/LogData/hadoop-3.4.0</value>
    </property>
    <property>
        <name>mapreduce.map.env</name>
        <value>HADOOP_MAPRED_HOME=/LogData/hadoop-3.4.0</value>
    </property>
    <property>
        <name>mapreduce.reduce.env</name>
        <value>HADOOP_MAPRED_HOME=/LogData/hadoop-3.4.0</value>
    </property>
</configuration>

yarn-site.xml:

<configuration>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoop</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.local-dirs</name>
        <value>/LogData/hdfs/yarn</value>
    </property>
</configuration>
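An `unexpected EOF while looking for matching `"'` in a generated launch_container.sh usually means an unbalanced double quote made it into the script. Since YARN expands configuration and environment values into that script, a stray quote or half-pasted entity in any `<value>` element is one plausible culprit. A rough check over the four config files (the `check_values` helper here is ad hoc, not a Hadoop tool):

```shell
# check_values: print file:line for any <value> whose text contains a
# double quote or backtick -- characters that can end up unbalanced in
# the generated launch_container.sh.
check_values() {
  grep -Hn '<value>[^<]*["`]' "$@"
}

# Run it over the files from this question (paths assumed):
cd /LogData/hadoop-3.4.0/etc/hadoop 2>/dev/null || true
check_values core-site.xml hdfs-site.xml mapred-site.xml yarn-site.xml 2>/dev/null || true
```

If nothing turns up, an unbalanced quote in an exported variable in hadoop-env.sh or yarn-env.sh has the same effect, as do CRLF line endings introduced by editing a config file on Windows.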
