spark-shell with yarn-client fails with java.io.FileNotFoundException, please help

lielies posted on 2015-9-6 23:49:50
Environment: CDH 5.4.5



[root@cdh122 conf.cloudera.spark_on_yarn]# spark-shell --master yarn-client
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.4.5-1.cdh5.4.5.p0.7/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.4.5-1.cdh5.4.5.p0.7/jars/avro-tools-1.7.6-cdh5.4.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/09/06 23:37:39 INFO SecurityManager: Changing view acls to: root
15/09/06 23:37:39 INFO SecurityManager: Changing modify acls to: root
15/09/06 23:37:39 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/09/06 23:37:39 INFO HttpServer: Starting HTTP Server
15/09/06 23:37:39 INFO Server: jetty-8.y.z-SNAPSHOT
15/09/06 23:37:39 INFO AbstractConnector: Started SocketConnector@0.0.0.0:40130
15/09/06 23:37:39 INFO Utils: Successfully started service 'HTTP class server' on port 40130.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_71)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/06 23:37:44 INFO SparkContext: Running Spark version 1.3.0
15/09/06 23:37:44 INFO SecurityManager: Changing view acls to: root
15/09/06 23:37:44 INFO SecurityManager: Changing modify acls to: root
15/09/06 23:37:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/09/06 23:37:44 INFO Slf4jLogger: Slf4jLogger started
15/09/06 23:37:44 INFO Remoting: Starting remoting
15/09/06 23:37:44 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@cdh122:40993]
15/09/06 23:37:44 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriver@cdh122:40993]
15/09/06 23:37:44 INFO Utils: Successfully started service 'sparkDriver' on port 40993.
15/09/06 23:37:44 INFO SparkEnv: Registering MapOutputTracker
15/09/06 23:37:44 INFO SparkEnv: Registering BlockManagerMaster
15/09/06 23:37:44 INFO DiskBlockManager: Created local directory at /tmp/spark-c8d015e4-55c1-4fea-8a16-1506d7b0ed5a/blockmgr-c0cf2406-5321-4b31-8e35-9cf42202ef5c
15/09/06 23:37:44 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/09/06 23:37:45 INFO HttpFileServer: HTTP File server directory is /tmp/spark-799a80dc-2c0a-4b9f-b39a-d2fec50b926f/httpd-05ad2090-8ce5-4a5a-8250-13d631e60211
15/09/06 23:37:45 INFO HttpServer: Starting HTTP Server
15/09/06 23:37:45 INFO Server: jetty-8.y.z-SNAPSHOT
15/09/06 23:37:45 INFO AbstractConnector: Started SocketConnector@0.0.0.0:49991
15/09/06 23:37:45 INFO Utils: Successfully started service 'HTTP file server' on port 49991.
15/09/06 23:37:45 INFO SparkEnv: Registering OutputCommitCoordinator
15/09/06 23:37:45 INFO Server: jetty-8.y.z-SNAPSHOT
15/09/06 23:37:45 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/09/06 23:37:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/09/06 23:37:45 INFO SparkUI: Started SparkUI at http://cdh122:4040
15/09/06 23:37:45 INFO ConfiguredRMFailoverProxyProvider: Failing over to rm120
15/09/06 23:37:45 INFO Client: Requesting a new application from cluster with 5 NodeManagers
15/09/06 23:37:45 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2048 MB per container)
15/09/06 23:37:45 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
15/09/06 23:37:45 INFO Client: Setting up container launch context for our AM
15/09/06 23:37:45 INFO Client: Preparing resources for our AM container
15/09/06 23:37:46 INFO Client: Setting up the launch environment for our AM container
15/09/06 23:37:46 INFO SecurityManager: Changing view acls to: root
15/09/06 23:37:46 INFO SecurityManager: Changing modify acls to: root
15/09/06 23:37:46 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/09/06 23:37:46 INFO Client: Submitting application 28 to ResourceManager
15/09/06 23:37:46 INFO YarnClientImpl: Submitted application application_1441534263937_0028
15/09/06 23:37:47 INFO Client: Application report for application_1441534263937_0028 (state: ACCEPTED)
15/09/06 23:37:47 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: root.root
     start time: 1441553866246
     final status: UNDEFINED
     tracking URL: http://cdh123:8088/proxy/application_1441534263937_0028/
     user: root
15/09/06 23:37:48 INFO Client: Application report for application_1441534263937_0028 (state: ACCEPTED)
15/09/06 23:37:49 INFO Client: Application report for application_1441534263937_0028 (state: ACCEPTED)
15/09/06 23:37:50 INFO Client: Application report for application_1441534263937_0028 (state: ACCEPTED)
15/09/06 23:37:50 INFO YarnClientSchedulerBackend: ApplicationMaster registered as Actor[akka.tcp://sparkYarnAM@cdh124:55266/user/YarnAM#154942810]
15/09/06 23:37:50 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> cdh122,cdh123, PROXY_URI_BASES -> http://cdh122:8088/proxy/application_1441534263937_0028,http://cdh123:8088/proxy/application_1441534263937_0028), /proxy/application_1441534263937_0028
15/09/06 23:37:50 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/09/06 23:37:51 INFO Client: Application report for application_1441534263937_0028 (state: RUNNING)
15/09/06 23:37:51 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: cdh124
     ApplicationMaster RPC port: 0
     queue: root.root
     start time: 1441553866246
     final status: UNDEFINED
     tracking URL: http://cdh123:8088/proxy/application_1441534263937_0028/
     user: root
15/09/06 23:37:51 INFO YarnClientSchedulerBackend: Application application_1441534263937_0028 has started running.
15/09/06 23:37:51 INFO NettyBlockTransferService: Server created on 42163
15/09/06 23:37:51 INFO BlockManagerMaster: Trying to register BlockManager
15/09/06 23:37:51 INFO BlockManagerMasterActor: Registering block manager cdh122:42163 with 265.4 MB RAM, BlockManagerId(<driver>, cdh122, 42163)
15/09/06 23:37:51 INFO BlockManagerMaster: Registered BlockManager
java.io.FileNotFoundException: /user/spark/applicationHistory/application_1441534263937_0028.inprogress (No such file or directory)
    at java.io.FileOutputStream.open(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:110)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:117)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:399)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:141)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:49)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1027)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

scala> 15/09/06 23:37:54 INFO YarnClientSchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@cdh126:54540/user/Executor#-22190952] with ID 1
15/09/06 23:37:54 INFO YarnClientSchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@cdh121:60900/user/Executor#1866389679] with ID 2
15/09/06 23:37:54 INFO BlockManagerMasterActor: Registering block manager cdh126:55226 with 530.3 MB RAM, BlockManagerId(1, cdh126, 55226)
15/09/06 23:37:55 INFO BlockManagerMasterActor: Registering block manager cdh121:56304 with 530.3 MB RAM, BlockManagerId(2, cdh121, 56304)



I've been stuck on this for a long time. Some posts online say it is a role/permission problem, but I still get the same error after switching roles. Changing the path didn't help either.


Has anyone run into this problem?

Replies (3)

leo_1989 posted on 2015-9-7 15:26:28
How did you configure this? In spark-defaults.conf, the event-log / history setting should carry the HDFS scheme:

spark.eventLog.dir=hdfs:///user/spark/applicationHistory

Without the hdfs:// prefix, the path is opened on the driver's local filesystem (note the java.io.FileOutputStream frames in your stack trace), which is why the .inprogress file under /user/spark/applicationHistory is reported as missing.
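For reference, a minimal spark-defaults.conf sketch with event logging pointed at HDFS. This is a sketch under CDH-default assumptions; spark.eventLog.enabled and spark.history.fs.logDirectory are standard Spark properties, but verify the paths against your own cluster:

    # spark-defaults.conf (sketch, assuming CDH defaults)
    spark.eventLog.enabled=true
    # scheme-qualified so the driver writes the .inprogress log to HDFS, not local disk
    spark.eventLog.dir=hdfs:///user/spark/applicationHistory
    # the history server reads the same directory
    spark.history.fs.logDirectory=hdfs:///user/spark/applicationHistory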

Comment

I tried your suggestion and it works. My configuration defaulted to spark.eventLog.dir=/user/spark/applicationHistory (no hdfs:// scheme); after changing it to spark.eventLog.dir=hdfs:///user/spark/applicationHistory as you described, the problem was solved.   Posted 2015-9-8 08:45
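If anyone hits this and the config change alone does not fix it, it is also worth confirming that the HDFS directory exists and is writable by the submitting user. A quick check using standard hadoop fs commands (the spark:spark ownership and 1777 mode are assumptions matching the usual CDH setup):

    # verify the event-log directory; create and open it up if missing
    hadoop fs -ls /user/spark/applicationHistory
    hadoop fs -mkdir -p /user/spark/applicationHistory
    hadoop fs -chown -R spark:spark /user/spark
    hadoop fs -chmod 1777 /user/spark/applicationHistory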
lielies posted on 2015-9-7 23:45:17
Quoting leo_1989 (2015-9-7 15:26):
How did you configure this? In spark-defaults.conf, the event-log / history setting ...
spark.eventLog.dir=hdfs:///user/ ...

My configuration is exactly the same as what you wrote; no difference.


(attached screenshot: QQ图片20150907234124.png)
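A possible explanation for the two configs looking identical: on a CDH cluster there can be more than one spark-defaults.conf, and spark-shell reads the Cloudera-managed client config (the conf.cloudera.spark_on_yarn directory visible in your shell prompt, usually linked from /etc/spark/conf). Assuming that standard layout, this shows the value actually in effect:

    # print the event-log settings spark-shell will actually pick up
    grep eventLog /etc/spark/conf/spark-defaults.conf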
