

How to resolve a "No such file or directory" error from spark2-submit on CDH Spark2

Published: 2021-12-17 11:54:09  Source: 億速云  Views: 551  Author: 柒染  Category: Big Data

This article walks through how to resolve a "No such file or directory" error raised by spark2-submit on CDH Spark2, with a step-by-step analysis of the cause and the fix.

Running the job:

On a test CDH Spark2 cluster, a Spark Streaming job was submitted with:

spark2-submit \
  --class com.telenav.dataplatform.demo.realtimecases.WeatherAlerts \
  --master yarn --deploy-mode cluster \
  /usr/local/sparkProject/realtimeCases-0.0.1-SNAPSHOT.jar



The error:

17/03/02 21:01:56 INFO cluster.YarnClusterScheduler: Adding task set 0.0 with 1 tasks
17/03/02 21:01:56 WARN net.ScriptBasedMapping: Exception running /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py 172.16.102.64
java.io.IOException: Cannot run program "/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py" (in directory "/yarn/nm/usercache/spark/appcache/application_1488459089260_0003/container_1488459089260_0003_01_000001"): error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:548)
    at org.apache.hadoop.util.Shell.run(Shell.java:504)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:786)
    at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:251)
    at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:188)
    at org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:119)
    at org.apache.hadoop.yarn.util.RackResolver.coreResolve(RackResolver.java:101)
    at org.apache.hadoop.yarn.util.RackResolver.resolve(RackResolver.java:81)
    at org.apache.spark.scheduler.cluster.YarnScheduler.getRackForHost(YarnScheduler.scala:37)
    at org.apache.spark.scheduler.TaskSetManager$$anonfun$org$apache$spark$scheduler$TaskSetManager$$addPendingTask$1.apply(TaskSetManager.scala:201)
    at org.apache.spark.scheduler.TaskSetManager$$anonfun$org$apache$spark$scheduler$TaskSetManager$$addPendingTask$1.apply(TaskSetManager.scala:182)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.TaskSetManager.org$apache$spark$scheduler$TaskSetManager$$addPendingTask(TaskSetManager.scala:182)
    at org.apache.spark.scheduler.TaskSetManager$$anonfun$1.apply$mcVI$sp(TaskSetManager.scala:161)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.scheduler.TaskSetManager.<init>(TaskSetManager.scala:160)
    at org.apache.spark.scheduler.TaskSchedulerImpl.createTaskSetManager(TaskSchedulerImpl.scala:222)
    at org.apache.spark.scheduler.TaskSchedulerImpl.submitTasks(TaskSchedulerImpl.scala:186)
    at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1058)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:933)
    at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:873)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1632)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1624)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1613)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
    at java.lang.ProcessImpl.start(ProcessImpl.java:134)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)

How to fix it:

1. Read the two key lines of the log:

17/03/02 21:01:56 WARN net.ScriptBasedMapping: Exception running /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py 172.16.102.64
java.io.IOException: Cannot run program "/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py" (in directory "/yarn/nm/usercache/spark/appcache/application_1488459089260_0003/container_1488459089260_0003_01_000001"): error=2, No such file or directory

They show that the machine at that IP does not have the topology.py file. (The stack trace makes the context clear: Hadoop's ScriptBasedMapping runs this script to resolve each host's rack for rack-aware scheduling, so it must exist on every node.)
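A quick way to confirm which nodes are missing the script before copying anything is to test for the file over ssh. This is a sketch, not from the original article; it assumes the worker host names hadoop-02 through hadoop-05 used later in this post and passwordless ssh as root:

```shell
# Report, for each given node, whether topology.py is present.
# Assumes passwordless root ssh to the hosts (hadoop-02..05 in this cluster).
check_topology() {
  file=/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py
  for host in "$@"; do
    # BatchMode avoids hanging on a password prompt; unreachable hosts
    # (or a missing file) fall through to the MISSING branch.
    if ssh -o BatchMode=yes -o ConnectTimeout=5 "root@${host}" test -f "${file}" 2>/dev/null; then
      echo "${host}: OK"
    else
      echo "${host}: MISSING ${file}"
    fi
  done
}
check_topology hadoop-02 hadoop-03 hadoop-04 hadoop-05
```

Any host reported as MISSING needs the configuration directory copied over, as done below.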

2. Verify this on the machine itself, then copy the configuration directory from node 01 to the other four nodes:
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-02:/etc/spark2/
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-03:/etc/spark2/
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-04:/etc/spark2/
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-05:/etc/spark2/
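The four scp commands above can be collapsed into a loop. A small sketch (the host names hadoop-02..05 are the ones used above; the function prints the commands for review, so pipe its output to sh to actually run them):

```shell
# Print the scp commands that sync the Spark2-on-YARN client config
# from node 01 to the other nodes (dry run; pipe to sh to execute).
print_sync_cmds() {
  conf_dir=/etc/spark2/conf.cloudera.spark2_on_yarn
  for host in hadoop-02 hadoop-03 hadoop-04 hadoop-05; do
    echo "scp -r ${conf_dir} root@${host}:/etc/spark2/"
  done
}
print_sync_cmds
```

Printing first makes it easy to check the host list before touching the cluster; once it looks right, run `print_sync_cmds | sh`.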

Verification:

Re-submit the job with the same spark2-submit command; the ScriptBasedMapping warning no longer appears and the job runs normally.

That is how to resolve the spark2-submit "No such file or directory" problem on CDH Spark2. If you are hitting a similar error, the analysis above should help you work through it. For more related content, follow the 億速云 industry news channel.



