Spark Streaming reports an error when run, help needed

cookeem posted on 2015-3-31 10:35:17
On Spark 1.3, Spark Streaming reports an error while Spark SQL runs fine.

Environment: Spark 1.3 + Hadoop 2.6.0 + JDK 1.8

Running the official Streaming example produces the error below. Please help:

[root@hadoop1 lib]# spark-shell --master spark://hadoop1:7077
Spark assembly has been built with Hive, including Datanucleus jars on classpath
15/03/31 10:15:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/03/31 10:15:58 INFO spark.SecurityManager: Changing view acls to: root
15/03/31 10:15:58 INFO spark.SecurityManager: Changing modify acls to: root
15/03/31 10:15:58 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/03/31 10:15:58 INFO spark.HttpServer: Starting HTTP Server
15/03/31 10:15:58 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/31 10:15:58 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:42898
15/03/31 10:15:58 INFO util.Utils: Successfully started service 'HTTP class server' on port 42898.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_31)
Type in expressions to have them evaluated.
Type :help for more information.
15/03/31 10:16:04 INFO spark.SparkContext: Running Spark version 1.3.0
15/03/31 10:16:04 INFO spark.SecurityManager: Changing view acls to: root
15/03/31 10:16:04 INFO spark.SecurityManager: Changing modify acls to: root
15/03/31 10:16:04 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/03/31 10:16:04 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/31 10:16:05 INFO Remoting: Starting remoting
15/03/31 10:16:05 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@hadoop1:39011]
15/03/31 10:16:05 INFO util.Utils: Successfully started service 'sparkDriver' on port 39011.
15/03/31 10:16:05 INFO spark.SparkEnv: Registering MapOutputTracker
15/03/31 10:16:05 INFO spark.SparkEnv: Registering BlockManagerMaster
15/03/31 10:16:05 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-b6395623-6c92-4596-9384-bf0b6ca77f74/blockmgr-5120e8b1-45c3-487b-9ee8-39a065abe13e
15/03/31 10:16:05 INFO storage.MemoryStore: MemoryStore started with capacity 265.1 MB
15/03/31 10:16:05 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-112e01f4-7553-4178-8a0b-48faedceadd3/httpd-19f59ae7-4882-4b28-8764-911f8a77df57
15/03/31 10:16:05 INFO spark.HttpServer: Starting HTTP Server
15/03/31 10:16:05 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/31 10:16:05 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:46204
15/03/31 10:16:05 INFO util.Utils: Successfully started service 'HTTP file server' on port 46204.
15/03/31 10:16:05 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/03/31 10:16:05 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/31 10:16:05 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/03/31 10:16:05 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/03/31 10:16:05 INFO ui.SparkUI: Started SparkUI at http://hadoop1:4040
15/03/31 10:16:05 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@hadoop1:7077/user/Master...
15/03/31 10:16:06 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150331101606-0000
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor added: app-20150331101606-0000/0 on worker-20150331085919-hadoop1-33018 (hadoop1:33018) with 4 cores
15/03/31 10:16:06 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150331101606-0000/0 on hostPort hadoop1:33018 with 4 cores, 512.0 MB RAM
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor added: app-20150331101606-0000/1 on worker-20150331085918-hadoop2-33995 (hadoop2:33995) with 2 cores
15/03/31 10:16:06 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150331101606-0000/1 on hostPort hadoop2:33995 with 2 cores, 512.0 MB RAM
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor added: app-20150331101606-0000/2 on worker-20150331085918-hadoop3-51799 (hadoop3:51799) with 2 cores
15/03/31 10:16:06 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150331101606-0000/2 on hostPort hadoop3:51799 with 2 cores, 512.0 MB RAM
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor updated: app-20150331101606-0000/0 is now RUNNING
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor updated: app-20150331101606-0000/1 is now RUNNING
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor updated: app-20150331101606-0000/2 is now RUNNING
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor updated: app-20150331101606-0000/2 is now LOADING
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor updated: app-20150331101606-0000/1 is now LOADING
15/03/31 10:16:06 INFO client.AppClient$ClientActor: Executor updated: app-20150331101606-0000/0 is now LOADING
15/03/31 10:16:06 INFO netty.NettyBlockTransferService: Server created on 37354
15/03/31 10:16:06 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/03/31 10:16:06 INFO storage.BlockManagerMasterActor: Registering block manager hadoop1:37354 with 265.1 MB RAM, BlockManagerId(<driver>, hadoop1, 37354)
15/03/31 10:16:06 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/31 10:16:07 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
15/03/31 10:16:07 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/03/31 10:16:08 INFO repl.SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.


scala> import org.apache.spark._
import org.apache.spark._

scala> import org.apache.spark.streaming._
import org.apache.spark.streaming._

scala> import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.streaming.StreamingContext._

scala> val conf = new SparkConf().setAppName("NetworkWordCount")
conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@2e1c8027

scala> val ssc = new StreamingContext(conf, Seconds(1))
15/03/31 10:26:08 INFO spark.SparkContext: Running Spark version 1.3.0
15/03/31 10:26:08 INFO spark.SecurityManager: Changing view acls to: root
15/03/31 10:26:08 INFO spark.SecurityManager: Changing modify acls to: root
15/03/31 10:26:08 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/03/31 10:26:08 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/31 10:26:08 INFO Remoting: Starting remoting
15/03/31 10:26:08 INFO util.Utils: Successfully started service 'sparkDriver' on port 59294.
15/03/31 10:26:08 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@hadoop1:59294]
15/03/31 10:26:08 INFO spark.SparkEnv: Registering MapOutputTracker
15/03/31 10:26:08 INFO spark.SparkEnv: Registering BlockManagerMaster
15/03/31 10:26:08 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-b5d52baf-47d2-46f8-8a9a-cc17ad9afa74/blockmgr-34e2114b-5f6f-4587-9f0d-3bee06658fa3
15/03/31 10:26:08 INFO storage.MemoryStore: MemoryStore started with capacity 263.3 MB
15/03/31 10:26:08 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-13a714da-d5ab-4e57-93ff-99fa3eb813f9/httpd-ff4c1af3-95d1-4fc0-8f96-06d6f54486fe
15/03/31 10:26:08 INFO spark.HttpServer: Starting HTTP Server
15/03/31 10:26:08 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/31 10:26:08 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:38776
15/03/31 10:26:08 INFO util.Utils: Successfully started service 'HTTP file server' on port 38776.
15/03/31 10:26:08 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/03/31 10:26:08 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/31 10:26:08 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.Net.bind(Net.java:428)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.spark-project.jetty.server.Server.doStart(Server.java:293)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1832)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1823)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:307)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:307)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:307)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:642)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:49)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:51)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
        at $line37.$read$$iwC$$iwC$$iwC.<init>(<console>:55)
        at $line37.$read$$iwC$$iwC.<init>(<console>:57)
        at $line37.$read$$iwC.<init>(<console>:59)
        at $line37.$read.<init>(<console>:61)
        at $line37.$read$.<init>(<console>:65)
        at $line37.$read$.<clinit>(<console>)
        at $line37.$eval$.<init>(<console>:7)
        at $line37.$eval$.<clinit>(<console>)
        at $line37.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/03/31 10:26:08 WARN component.AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@3e5ed0d4: java.net.BindException: Address already in use
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.Net.bind(Net.java:428)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.spark-project.jetty.server.Server.doStart(Server.java:293)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1832)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1823)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:307)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:307)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:307)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:642)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:49)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:51)
        at $line37.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
        at $line37.$read$$iwC$$iwC$$iwC.<init>(<console>:55)
        at $line37.$read$$iwC$$iwC.<init>(<console>:57)
        at $line37.$read$$iwC.<init>(<console>:59)
        at $line37.$read.<init>(<console>:61)
        at $line37.$read$.<init>(<console>:65)
        at $line37.$read$.<clinit>(<console>)
        at $line37.$eval$.<init>(<console>:7)
        at $line37.$eval$.<clinit>(<console>)
        at $line37.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/03/31 10:26:08 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/03/31 10:26:08 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/03/31 10:26:08 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/31 10:26:08 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/03/31 10:26:08 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/03/31 10:26:08 INFO ui.SparkUI: Started SparkUI at http://hadoop1:4041
15/03/31 10:26:08 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@hadoop1:7077/user/Master...
15/03/31 10:26:08 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150331102608-0001
15/03/31 10:26:08 INFO netty.NettyBlockTransferService: Server created on 41350
15/03/31 10:26:08 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/03/31 10:26:08 INFO storage.BlockManagerMasterActor: Registering block manager hadoop1:41350 with 263.3 MB RAM, BlockManagerId(<driver>, hadoop1, 41350)
15/03/31 10:26:08 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/31 10:26:08 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
$iwC$$iwC.<init>(<console>:9)
$iwC.<init>(<console>:18)
<init>(<console>:20)
.<init>(<console>:24)
.<clinit>(<console>)
.<init>(<console>:7)
.<clinit>(<console>)
$print(<console>)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:483)
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1811)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1807)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1807)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1794)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1794)
        at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:1846)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:1753)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:642)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:49)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:51)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
        at $iwC$$iwC$$iwC.<init>(<console>:55)
        at $iwC$$iwC.<init>(<console>:57)
        at $iwC.<init>(<console>:59)
        at <init>(<console>:61)
        at .<init>(<console>:65)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Replies (12)

bioger_hit posted on 2015-3-31 11:33:11
Port 4040 is already occupied.

sudo netstat -ap | grep 4040

Kill the process occupying the port and try again.

cookeem posted on 2015-3-31 11:35:07
Port 4040 is Spark's own web UI; that is what is using it.

bioger_hit posted on 2015-3-31 11:43:25
Quoting cookeem (2015-3-31 11:35): Port 4040 is Spark's own web UI; that is what is using it.

WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use

The lines above show it conflicting with the UI port.


langke93 posted on 2015-3-31 14:51:21
Quoting bioger_hit (2015-3-31 11:43): WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindExcept ...

OP, if your English is up to it, take a look at this pull request:
https://github.com/apache/spark/pull/1019
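(For context: the error message above cites SPARK-2243, the restriction that only one SparkContext may run per JVM, and it itself names spark.driver.allowMultipleContexts as an escape hatch. A minimal sketch of that workaround in the shell; it only suppresses the check and is generally discouraged compared with reusing the existing sc:)

scala> // escape hatch named in the error message itself; use with care
scala> val conf = new SparkConf().setAppName("NetworkWordCount").set("spark.driver.allowMultipleContexts", "true")
scala> val ssc = new StreamingContext(conf, Seconds(1))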

chenhenry posted on 2015-5-13 11:45:50
You have probably started another job elsewhere; stop that job.

cookeem posted on 2015-5-13 12:03:18
The actual cause: by default, Spark Shell allows only one SparkContext instance, and spark-shell has already created one as sc. Just use that sc to create the ssc.
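A minimal sketch in spark-shell, following the official NetworkWordCount example (localhost:9999 is a placeholder for your own source):

scala> import org.apache.spark.streaming._
scala> // reuse the shell's pre-created SparkContext (sc) instead of building a new SparkConf
scala> val ssc = new StreamingContext(sc, Seconds(1))
scala> val lines = ssc.socketTextStream("localhost", 9999)
scala> val wordCounts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
scala> wordCounts.print()
scala> ssc.start()
scala> ssc.awaitTermination()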

sea_fei posted on 2015-5-19 11:52:41
Quoting cookeem (2015-5-13 12:03): The actual cause: by default, Spark Shell allows only one SparkContext instance, and spark-shell has already created one as sc; just use ...

OP, I ran into the same problem. How did you solve it? Thanks.

sea_fei posted on 2015-5-19 16:21:00
Quoting sea_fei (2015-5-19 11:52): OP, I ran into the same problem. How did you solve it? Thanks.

Thanks OP, I get it now; it works.

sxyqhyt posted on 2015-5-26 16:51:51
Quoting sea_fei (2015-5-19 16:21): Thanks OP, I get it now; it works.

Could you share how you solved it? Thanks.
