
Error when starting spark-shell. Could someone please help?

chsong888 posted on 2015-10-3 23:44:53
chsong@Master:/spark/spark-1.4.0-bin-hadoop2.6/bin$ ./spark-shell --master spark://Master:7077
15/10/03 08:25:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/10/03 08:25:25 INFO spark.SecurityManager: Changing view acls to: chsong
15/10/03 08:25:25 INFO spark.SecurityManager: Changing modify acls to: chsong
15/10/03 08:25:25 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(chsong); users with modify permissions: Set(chsong)
15/10/03 08:25:25 INFO spark.HttpServer: Starting HTTP Server
15/10/03 08:25:25 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/10/03 08:25:25 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:38410
15/10/03 08:25:25 INFO util.Utils: Successfully started service 'HTTP class server' on port 38410.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) Client VM, Java 1.7.0_09)
Type in expressions to have them evaluated.
Type :help for more information.
15/10/03 08:25:49 WARN util.Utils: Your hostname, Master resolves to a loopback address: 127.0.1.1; using 192.168.1.105 instead (on interface eth0)
15/10/03 08:25:49 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/10/03 08:25:49 INFO spark.SparkContext: Running Spark version 1.4.0
15/10/03 08:25:50 WARN spark.SparkConf: In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
15/10/03 08:25:50 INFO spark.SecurityManager: Changing view acls to: chsong
15/10/03 08:25:50 INFO spark.SecurityManager: Changing modify acls to: chsong
15/10/03 08:25:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(chsong); users with modify permissions: Set(chsong)
15/10/03 08:25:55 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/10/03 08:25:56 INFO Remoting: Starting remoting
15/10/03 08:25:57 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.105:42267]
15/10/03 08:25:57 INFO util.Utils: Successfully started service 'sparkDriver' on port 42267.
15/10/03 08:25:58 INFO spark.SparkEnv: Registering MapOutputTracker
15/10/03 08:25:58 INFO spark.SparkEnv: Registering BlockManagerMaster
15/10/03 08:25:58 INFO storage.DiskBlockManager: Created local directory at /spark/spark_workspace/spark-5ea209f8-4eed-4924-a59f-1baa208d3c1b/blockmgr-276bdb7a-e824-416b-90dc-1515a9183159
15/10/03 08:25:58 INFO storage.MemoryStore: MemoryStore started with capacity 267.3 MB
15/10/03 08:25:59 INFO spark.HttpFileServer: HTTP File server directory is /spark/spark_workspace/spark-5ea209f8-4eed-4924-a59f-1baa208d3c1b/httpd-63e70bd1-a7e3-424b-8e7f-573592aecda1
15/10/03 08:25:59 INFO spark.HttpServer: Starting HTTP Server
15/10/03 08:25:59 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/10/03 08:25:59 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:33233
15/10/03 08:25:59 INFO util.Utils: Successfully started service 'HTTP file server' on port 33233.
15/10/03 08:25:59 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/10/03 08:26:02 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/10/03 08:26:03 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/10/03 08:26:03 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/10/03 08:26:03 INFO ui.SparkUI: Started SparkUI at http://192.168.1.105:4040
15/10/03 08:26:04 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Master:7077/user/Master...
15/10/03 08:26:04 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@Master:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@Master:7077
15/10/03 08:26:04 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@Master:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: Master/127.0.1.1:7077
15/10/03 08:26:24 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Master:7077/user/Master...
15/10/03 08:26:24 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@Master:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@Master:7077
15/10/03 08:26:24 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@Master:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: Master/127.0.1.1:7077
15/10/03 08:26:44 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Master:7077/user/Master...
15/10/03 08:26:44 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@Master:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@Master:7077
15/10/03 08:26:44 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@Master:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: Master/127.0.1.1:7077
15/10/03 08:27:04 ERROR cluster.SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
15/10/03 08:27:04 WARN cluster.SparkDeploySchedulerBackend: Application ID is not initialized yet.
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/10/03 08:27:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/10/03 08:27:05 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.1.105:4040
15/10/03 08:27:05 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/10/03 08:27:05 INFO cluster.SparkDeploySchedulerBackend: Shutting down all executors
15/10/03 08:27:05 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to shut down
15/10/03 08:27:05 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Master:7077/user/Master...
15/10/03 08:27:05 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@Master:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@Master:7077
15/10/03 08:27:05 ERROR actor.OneForOneStrategy:
java.lang.NullPointerException
        at org.apache.spark.deploy.client.AppClient$ClientActor$$anonfun$receiveWithLogging$1.applyOrElse(AppClient.scala:160)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
        at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:59)
        at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
        at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
        at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
        at org.apache.spark.deploy.client.AppClient$ClientActor.aroundReceive(AppClient.scala:61)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
15/10/03 08:27:05 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@Master:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: Master/127.0.1.1:7077
15/10/03 08:27:06 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39962.
15/10/03 08:27:06 INFO netty.NettyBlockTransferService: Server created on 39962
15/10/03 08:27:06 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/10/03 08:27:06 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.1.105:39962 with 267.3 MB RAM, BlockManagerId(driver, 192.168.1.105, 39962)
15/10/03 08:27:06 INFO storage.BlockManagerMaster: Registered BlockManager
15/10/03 08:27:07 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
        at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
        at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1501)
        at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2005)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
        at $line3.$read$$iwC$$iwC.<init>(<console>:9)
        at $line3.$read$$iwC.<init>(<console>:18)
        at $line3.$read.<init>(<console>:20)
        at $line3.$read$.<init>(<console>:24)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.<init>(<console>:7)
        at $line3.$eval$.<clinit>(<console>)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/10/03 08:27:07 INFO spark.SparkContext: SparkContext already stopped.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
        at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
        at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1501)
        at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2005)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

java.lang.NullPointerException
        at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:193)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:71)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
        at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

scala> 15/10/03 08:27:25 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Master:7077/user/Master...
15/10/03 08:27:25 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@Master:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@Master:7077
15/10/03 08:27:25 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@Master:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: Master/127.0.1.1:7077
15/10/03 08:27:45 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Master:7077/user/Master...
15/10/03 08:27:45 WARN client.AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@Master:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@Master:7077
15/10/03 08:27:45 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@Master:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: Master/127.0.1.1:7077
15/10/03 08:29:05 INFO client.AppClient: Stop request to Master timed out; it may already be shut down.
15/10/03 08:29:05 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/10/03 08:29:05 INFO util.Utils: path = /spark/spark_workspace/spark-5ea209f8-4eed-4924-a59f-1baa208d3c1b/blockmgr-276bdb7a-e824-416b-90dc-1515a9183159, already present as root for deletion.
15/10/03 08:29:05 INFO storage.MemoryStore: MemoryStore cleared
15/10/03 08:29:05 INFO storage.BlockManager: BlockManager stopped
15/10/03 08:29:05 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/10/03 08:29:05 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/10/03 08:29:05 INFO spark.SparkContext: Successfully stopped SparkContext
15/10/03 08:29:05 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/10/03 08:29:05 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/10/03 08:29:05 ERROR util.SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[sparkDriver-akka.actor.default-dispatcher-4,5,main]
org.apache.spark.SparkException: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.
        at org.apache.spark.scheduler.TaskSchedulerImpl.error(TaskSchedulerImpl.scala:409)
        at org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend.dead(SparkDeploySchedulerBackend.scala:122)
        at org.apache.spark.deploy.client.AppClient$ClientActor.markDead(AppClient.scala:177)
        at org.apache.spark.deploy.client.AppClient$ClientActor$$anonfun$registerWithMaster$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(AppClient.scala:98)
        at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:1198)
        at org.apache.spark.deploy.client.AppClient$ClientActor$$anonfun$registerWithMaster$1.apply$mcV$sp(AppClient.scala:93)
        at akka.actor.Scheduler$$anon$5.run(Scheduler.scala:79)
        at akka.actor.LightArrayRevolverScheduler$$anon$2$$anon$1.run(Scheduler.scala:242)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
15/10/03 08:29:06 INFO util.Utils: Shutdown hook called
15/10/03 08:29:06 INFO util.Utils: Deleting directory /spark/spark_workspace/spark-5ea209f8-4eed-4924-a59f-1baa208d3c1b
15/10/03 08:29:06 INFO util.Utils: Deleting directory /tmp/spark-c7747a71-6fdd-46f0-8220-5a01b7b56318
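
The repeated warnings above point to the likely root cause: the hostname Master resolves to the loopback address 127.0.1.1, so the driver keeps trying Master/127.0.1.1:7077, where nothing is listening ("Connection refused"). A minimal hosts-file fix, assuming the master's LAN IP really is 192.168.1.105 as the log reports (apply it on every node, then restart the cluster):

    # /etc/hosts  (a sketch; the IP is taken from the log above)
    # 127.0.1.1      Master      <- comment out the loopback mapping
    192.168.1.105    Master

Alternatively, follow the log's own hint and export SPARK_LOCAL_IP=192.168.1.105 before launching, and verify the master process is actually running and bound to a reachable address (its web UI listens on port 8080 by default).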

7 replies

arsenduan posted on 2015-10-4 23:01:46
This shouldn't be hard; I'll take another look tomorrow.

mituan2008 posted on 2015-10-5 17:22:54
OP, try this command (the MASTER environment variable must be uppercase; centos.host1 is this replier's own hostname, so substitute your master's):
MASTER=spark://centos.host1:7077 ./spark-shell
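
If the hostname mapping is still broken, pointing spark-shell at the master's IP sidesteps name resolution entirely. A sketch using the LAN IP reported in the log above (note that a standalone master only accepts connections addressed exactly as it registered itself):

    ./spark-shell --master spark://192.168.1.105:7077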

chsong888 posted on 2015-10-5 20:25:35
It's solved now, but a new problem has come up. What physical location does the address hdfs://Master:9000/ point to? I need it for the spark.history.fs.logDirectory parameter I set, but the address can never be found. When I set the parameter to a physical (local filesystem) path instead, that worked, but then starting start-history-server.sh fails with an error saying the address cannot be found.
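
hdfs://Master:9000/ has no single physical directory: it is the root of the HDFS namespace served by the NameNode at Master:9000 (the fs.defaultFS value in core-site.xml), with the actual blocks spread across the DataNodes' dfs.datanode.data.dir directories. The history server therefore needs an HDFS URI that already exists. A minimal sketch, using a hypothetical directory name /spark-events:

    # Create the event-log directory in HDFS first (/spark-events is hypothetical)
    hdfs dfs -mkdir -p hdfs://Master:9000/spark-events

    # conf/spark-defaults.conf: point applications and the history server
    # at the same fully qualified URI
    spark.eventLog.enabled           true
    spark.eventLog.dir               hdfs://Master:9000/spark-events
    spark.history.fs.logDirectory    hdfs://Master:9000/spark-events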

天意1987 posted on 2016-1-21 10:51:49
Buddy, how did you solve this problem? Could you share? Thanks.

chimes298 posted on 2016-1-24 19:58:33
Quoting chsong888 (posted 2015-10-5 20:25):
It's solved now, but a new problem has come up. What physical location does hdfs://Master:9000/ point to? The spark.h ...

Do not use a relative path for the HDFS storage location.
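
For example (directory name hypothetical), a fully qualified URI is unambiguous, while a bare relative path is typically resolved against the running user's HDFS home directory and will not be found:

    spark.history.fs.logDirectory    hdfs://Master:9000/spark-events    # good
    spark.history.fs.logDirectory    spark-events                       # avoid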

bingyuac posted on 2016-9-22 10:34:06
OP, how did you solve this in the end? Please advise, many thanks.
