
Spark: Exception in task 0.0 in stage 0.0

Viewed 3276 times · 2017-5-14 21:51

Hi everyone, I have a question. When I submit in local mode, the println output from my program appears on the console, but when I switch to submitting in standalone client mode, the println output is gone. Is this a memory problem, or something else?
My Spark cluster is built on virtual machines running on my laptop.
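A side note on the println question, separate from the disk error in the log below: in standalone mode, the closures passed to RDD operations run on the executors, so anything they print goes to each executor's stderr log (visible through the application's web UI), not to the spark-submit console. Only driver-side code prints locally. In local mode, driver and executors share one JVM, which is why both kinds of output showed up there. A minimal sketch to illustrate the split (the app name and data are made up):

import org.apache.spark.{SparkConf, SparkContext}

object PrintlnDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("println-demo"))
    // Executes inside the executor JVMs: output lands in each executor's stderr log.
    sc.parallelize(1 to 4).foreach(n => println(s"executor side: $n"))
    // Executes in the driver JVM: output appears on the spark-submit console.
    println("driver side")
    sc.stop()
  }
}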
17/05/14 19:20:24 INFO util.Utils: Fetching http://192.168.31.160:33039/jars/spark_study_java-0.0.1-SNAPSHOT-jar-with-dependencies.jar to /tmp/spark-446068a4-aaa4-4277-b009-908bf0d4ecac/executor-dcc3175b-7d19-4485-81e1-bf31a83a66b4/spark-3f849149-bcfb-44a2-90f6-c71f31098c30/fetchFileTemp8558073324247536081.tmp
17/05/14 19:20:24 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: No space left on device
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:345)
at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply$mcJ$sp(Utils.scala:286)
at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:252)
at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:252)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1206)
at org.apache.spark.util.Utils$.copyStream(Utils.scala:292)
at org.apache.spark.util.Utils$.downloadFile(Utils.scala:415)
at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:557)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:356)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
I also tried disabling the relevant mount in /etc/fstab, but that didn't help either.
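Before touching configuration, it's worth confirming which partition actually filled up (running df -h /tmp on each worker tells the same story). A tiny JVM-only check:

import java.io.File

object DiskCheck {
  def main(args: Array[String]): Unit = {
    val tmp = new File("/tmp")
    // getUsableSpace reports the bytes available to this JVM
    // on the partition backing /tmp.
    println(f"usable space on /tmp: ${tmp.getUsableSpace / 1e9}%.2f GB")
  }
}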
Solution:
The jar was too big. When Spark fetched it (Fetching http://192.168.31.160:33039/jars/spark_study_java-0.0.1-SNAPSHOT-jar-with-dependencies.jar to /tmp/spark-446068a4-aaa4-4277-b009-908bf0d4ecac/executor-dcc3175b-7d19-4485-81e1-bf31a83a66b4/spark-3f849149-bcfb-44a2-90f6-c71f31098c30/fetchFileTemp8558073324247536081.tmp), the download ran out of space.

The local disk directory Spark uses for these temp files was simply too small.
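If freeing /tmp on the workers isn't enough, Spark's scratch directory can be pointed at a larger partition. A sketch, assuming /data/spark-tmp exists on a disk with room (that path is an assumption, not from the post); note that in standalone mode a SPARK_LOCAL_DIRS environment variable set on the workers (e.g. in conf/spark-env.sh) takes precedence over this per-application setting:

import org.apache.spark.{SparkConf, SparkContext}

object BigScratchDir {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("big-scratch-dir")
      // Redirect fetched jars, shuffle spills, etc. away from the small /tmp.
      // "/data/spark-tmp" is a hypothetical directory on a larger partition.
      .set("spark.local.dir", "/data/spark-tmp")
    val sc = new SparkContext(conf)
    // ... job code ...
    sc.stop()
  }
}

Shrinking the fat jar also helps: if the Maven build marks the Spark dependencies as provided, they are left out of the jar-with-dependencies, since the cluster already ships them.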

From:
About云 group 7:
webform 】バ幸福De右岸ヤ
