java.lang.OutOfMemoryError: GC overhead limit exceeded #115
Try to increase the parallelism (more partitions).
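For illustration, a minimal sketch of what "more partitions" could look like against the Spark 1.6 streaming API (the DStream name, element type, and partition count below are assumptions, not taken from this issue):

```scala
import org.apache.spark.streaming.dstream.DStream

// Sketch only: spread each micro-batch across more partitions so that
// per-task memory pressure (and GC pressure) drops.
// `messages` is a hypothetical DStream[String] from the RabbitMQ receiver,
// and 32 is an illustrative partition count.
def withMoreParallelism(messages: DStream[String]): DStream[String] =
  messages.repartition(32)
```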
@nelsou Thank you for your help. I tried increasing the parallelism, but it had no effect.
I set it, but the error still appears after the Spark Streaming job has run for 24 hours (maybe less):

```
Container exited with a non-zero exit code 1
17/09/13 15:32:12 WARN servlet.ServletHandler: Error for /storage/
Container exited with a non-zero exit code 1
17/09/13 15:32:12 INFO storage.BlockManagerMasterEndpoint: Removing block manager BlockManagerId(1, spark2, 38456)
Container exited with a non-zero exit code 1
```

Any help would be appreciated.
Because of `data.repartition(1)`.
@nelsou Thank you for your help. But I can control the size of the data.
@nelsou Any suggestions?
The `repartition(1)` is useless. Try removing it.
@nelsou Spark will generate many files if I remove it. I would like to generate one file each time.
After the `repartition(1)` you are doing `.mode(SaveMode.Append)`. Doesn't that work?
@nelsou It works. But look at this picture: Spark Streaming generates many files every batch if I remove `repartition(1)`. I just want it to generate one file per batch.
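For reference, the write pattern being discussed looks roughly like this (a sketch only; `data` and the output path are hypothetical stand-ins for the reporter's code):

```scala
import org.apache.spark.sql.{DataFrame, SaveMode}

// Sketch of the pattern under discussion: collapse each batch to a single
// partition, then append it to one output location. Every name here is a
// hypothetical stand-in, not the reporter's actual code.
def appendAsOneFile(data: DataFrame, path: String): Unit =
  data.repartition(1)
    .write
    .mode(SaveMode.Append)
    .text(path)
```

The cost of this pattern is that the single post-repartition task must hold and write the entire batch, which is one way a long-running streaming job can drift into GC trouble.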
In my opinion, it is not the role of Spark Streaming to control the number of output files. There are some tricky solutions (`saveAsHadoopFile` is one), but you should probably avoid that sort of thing.
@nelsou I just want to save text files, and it is better to generate one file per Spark Streaming batch. Do you have some examples?
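For context, one commonly used way to get a single text file per batch without keeping a standing `repartition(1)` on the whole pipeline is to coalesce inside `foreachRDD` and write each batch to its own timestamped directory. A minimal sketch against Spark 1.6, with all names hypothetical:

```scala
import org.apache.spark.streaming.dstream.DStream

// Sketch: write each micro-batch as a single part file under its own
// timestamped directory. `lines` and `baseDir` are hypothetical names.
def saveOneFilePerBatch(lines: DStream[String], baseDir: String): Unit =
  lines.foreachRDD { (rdd, time) =>
    if (!rdd.isEmpty()) {
      // coalesce(1) narrows only this batch's write, not the whole pipeline
      rdd.coalesce(1).saveAsTextFile(s"$baseDir/batch-${time.milliseconds}")
    }
  }
```

Each batch directory then holds one `part-00000` file; whether that layout is acceptable depends on what reads the output downstream.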
I use:
Spark 1.6.0
spark-rabbitmq 0.4.0
My program ran for 24 hours before the error was reported. Please help me.
My error log:
```
17/09/13 17:15:43 WARN spark.HeartbeatReceiver: Removing executor 6 with no recent heartbeats: 178850 ms exceeds timeout 120000 ms
17/09/13 17:15:43 ERROR cluster.YarnScheduler: Lost executor 6 on spark1: Executor heartbeat timed out after 178850 ms
17/09/13 17:15:43 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 7.0 (TID 75, spark1): ExecutorLostFailure (executor 6 exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 178850 ms
17/09/13 17:15:43 INFO scheduler.DAGScheduler: Executor lost: 6 (epoch 544)
17/09/13 17:15:43 INFO storage.BlockManagerMasterEndpoint: Trying to remove executor 6 from BlockManagerMaster.
17/09/13 17:15:45 INFO cluster.YarnClientSchedulerBackend: Requesting to kill executor(s) 6
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark2/192.168.155.3:50701
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark3/192.168.155.4:37321
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark3/192.168.155.4:37317
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark2/192.168.155.3:57252
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 INFO storage.BlockManagerMasterEndpoint: Removing block manager BlockManagerId(6, spark1, 54732)
17/09/13 17:15:52 INFO storage.BlockManagerMaster: Removed 6 successfully in removeExecutor
17/09/13 17:15:52 INFO storage.BlockManagerInfo: Added input-2-1505294133600 in memory on spark2:53361 (size: 9.1 KB, free: 1047.9 MB)
17/09/13 17:15:52 ERROR server.TransportRequestHandler: Error sending result RpcResponse{requestId=6104341188587013298, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=353 cap=353]}} to spark3/192.168.155.4:37317; closing connection
java.nio.channels.ClosedChannelException
17/09/13 17:15:52 INFO storage.BlockManagerInfo: Added input-3-1505294135000 in memory on spark3:34916 (size: 1568.0 B, free: 819.1 MB)
17/09/13 17:15:52 ERROR server.TransportRequestHandler: Error sending result RpcResponse{requestId=7579983330511639701, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=47 cap=47]}} to spark2/192.168.155.3:57252; closing connection
java.nio.channels.ClosedChannelException
17/09/13 17:15:52 ERROR server.TransportRequestHandler: Error sending result RpcResponse{requestId=5632111837789001813, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=47 cap=47]}} to spark3/192.168.155.4:37321; closing connection
java.nio.channels.ClosedChannelException
17/09/13 17:15:56 WARN server.TransportChannelHandler: Exception in connection from spark1/192.168.155.2:41920
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:56 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
17/09/13 17:15:56 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.155.2:4040
17/09/13 17:16:00 WARN server.TransportChannelHandler: Exception in connection from spark2/192.168.155.3:57212
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 4 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 1508 (start at SparkStremingReadRabbitMQ.scala:77) failed in 6207.798 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@1ab59f28)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 5 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.231 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@772c8704)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 3
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 2 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.642 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@4b86d4b2)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ShuffleMapStage 1599 (text at SparkStremingReadRabbitMQ.scala:72) failed in 1592.898 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@1ae3ac4)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 6 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.065 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@37d4a906)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 7 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83399.925 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@24531c5f)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 4 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.364 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@590d6f23)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 1601 (start at SparkStremingReadRabbitMQ.scala:77) failed in 698.533 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@290516dc)
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(4,1505294160255,JobFailed(org.apache.spark.SparkException: Job 4 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(6,1505294160255,JobFailed(org.apache.spark.SparkException: Job 6 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 6 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 5
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 995 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 6
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(995,1505294160257,JobFailed(org.apache.spark.SparkException: Job 995 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 3 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 2
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(3,1505294160258,JobFailed(org.apache.spark.SparkException: Job 3 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 5 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 4
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(5,1505294160261,JobFailed(org.apache.spark.SparkException: Job 5 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: Job 1055 failed: text at SparkStremingReadRabbitMQ.scala:72, took 1597.324995 s
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```