I have an sbt project with subprojects, all of which use Scala 2.11.4. In one of the subprojects (sparktest) I added spark-core:
name := """sparktest"""
version := "1.0-SNAPSHOT"
scalaVersion := "2.11.4"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.1"
    exclude("org.slf4j", "slf4j-log4j12")
)
sparktest depends on another sbt project, commons, which overrides akka-actor to 2.3.9.
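For reference, an override like the one described would typically be expressed in the commons build with sbt's dependencyOverrides setting. This is a hypothetical sketch (the commons build file isn't shown in the question, and the akka-actor organization/artifact names are assumed):

```scala
// commons/build.sbt — hypothetical sketch of the described akka override
dependencyOverrides += "com.typesafe.akka" %% "akka-actor" % "2.3.9"
```

In sbt 0.13, dependencyOverrides pins the version used during conflict resolution without adding the library as a direct dependency.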
The problem is that when I try to run the following snippet (taken from the Spark examples):
import org.apache.spark.{SparkContext, SparkConf}
import scala.math.random

object SparkSpike extends App {
  val conf = new SparkConf().setAppName("Spark Pi").setMaster("local")
  val spark = new SparkContext(conf)
  val slices = if (args.length > 0) args(0).toInt else 2
  val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
  val count = spark.parallelize(1 until n, slices).map { i =>
    val x = random * 2 - 1
    val y = random * 2 - 1
    if (x*x + y*y < 1) 1 else 0
  }.reduce(_ + _)
  println("Pi is roughly " + 4.0 * count / n)
  spark.stop()
}
I get the following error:
2015-02-19 17:03:31,429 INFO o.a.s.SecurityManager Changing view acls to: bar
2015-02-19 17:03:31,432 INFO o.a.s.SecurityManager Changing modify acls to: bar
2015-02-19 17:03:31,433 INFO o.a.s.SecurityManager SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(bar); users with modify permissions: Set(bar)
2015-02-19 17:03:31,957 INFO a.e.s.Slf4jLogger Slf4jLogger started
2015-02-19 17:03:32,052 INFO Remoting Starting remoting
2015-02-19 17:03:32,336 INFO Remoting Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.59.3:49236]
2015-02-19 17:03:32,350 INFO o.a.s.u.Utils Successfully started service 'sparkDriver' on port 49236.
2015-02-19 17:03:32,378 INFO o.a.s.SparkEnv Registering MapOutputTracker
2015-02-19 17:03:32,404 INFO o.a.s.SparkEnv Registering BlockManagerMaster
2015-02-19 17:03:32,440 INFO o.a.s.s.DiskBlockManager Created local directory at /var/folders/26/7b3b32gd4wx1h25vd2qm66q00000gp/T/spark-a594f880-f5d1-4926-b555-eabbe1728734/spark-4e8f77c4-8018-4e64-88e7-6ca060d9a35c
2015-02-19 17:03:32,447 INFO o.a.s.s.MemoryStore MemoryStore started with capacity 1891.5 MB
2015-02-19 17:03:32,948 WARN o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-02-19 17:03:33,100 INFO o.a.s.HttpFileServer HTTP File server directory is /var/folders/26/7b3b32gd4wx1h25vd2qm66q00000gp/T/spark-0f30ec72-f2aa-44ed-9e92-9931fba5ba39/spark-d7fa24ef-768a-4e05-9fa3-ce29eacc0c19
2015-02-19 17:03:33,109 INFO o.a.s.HttpServer Starting HTTP Server
2015-02-19 17:03:33,206 INFO o.e.j.s.Server jetty-8.1.14.v20131031
2015-02-19 17:03:33,229 INFO o.e.j.s.AbstractConnector Started SocketConnector@0.0.0.0:49237
2015-02-19 17:03:33,229 INFO o.a.s.u.Utils Successfully started service 'HTTP file server' on port 49237.
2015-02-19 17:03:33,420 INFO o.e.j.s.Server jetty-8.1.14.v20131031
2015-02-19 17:03:33,441 INFO o.e.j.s.AbstractConnector Started SelectChannelConnector@0.0.0.0:4040
2015-02-19 17:03:33,442 INFO o.a.s.u.Utils Successfully started service 'SparkUI' on port 4040.
2015-02-19 17:03:33,445 INFO o.a.s.u.SparkUI Started SparkUI at http://192.168.59.3:4040
2015-02-19 17:03:33,611 INFO o.a.s.e.Executor Starting executor ID <driver> on host localhost
2015-02-19 17:03:33,634 INFO o.a.s.u.AkkaUtils Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@192.168.59.3:49236/user/HeartbeatReceiver
2015-02-19 17:03:33,994 INFO o.a.s.n.n.NettyBlockTransferService Server created on 49238
2015-02-19 17:03:33,996 INFO o.a.s.s.BlockManagerMaster Trying to register BlockManager
2015-02-19 17:03:33,998 INFO o.a.s.s.BlockManagerMasterActor Registering block manager localhost:49238 with 1891.5 MB RAM, BlockManagerId(<driver>, localhost, 49238)
2015-02-19 17:03:34,001 INFO o.a.s.s.BlockManagerMaster Registered BlockManager
2015-02-19 17:03:34,297 INFO o.a.s.SparkContext Starting job: reduce at SparkSpike.scala:17
2015-02-19 17:03:34,321 INFO o.a.s.s.DAGScheduler Got job 0 (reduce at SparkSpike.scala:17) with 2 output partitions (allowLocal=false)
2015-02-19 17:03:34,322 INFO o.a.s.s.DAGScheduler Final stage: Stage 0(reduce at SparkSpike.scala:17)
2015-02-19 17:03:34,323 INFO o.a.s.s.DAGScheduler Parents of final stage: List()
2015-02-19 17:03:34,329 INFO o.a.s.s.DAGScheduler Missing parents: List()
2015-02-19 17:03:34,349 INFO o.a.s.s.DAGScheduler Submitting Stage 0 (MappedRDD[1] at map at SparkSpike.scala:13), which has no missing parents
2015-02-19 17:03:34,505 INFO o.a.s.s.MemoryStore ensureFreeSpace(1600) called with curMem=0, maxMem=1983365775
2015-02-19 17:03:34,507 INFO o.a.s.s.MemoryStore Block broadcast_0 stored as values in memory (estimated size 1600.0 B, free 1891.5 MB)
2015-02-19 17:03:34,588 INFO o.a.s.s.MemoryStore ensureFreeSpace(1171) called with curMem=1600, maxMem=1983365775
2015-02-19 17:03:34,588 INFO o.a.s.s.MemoryStore Block broadcast_0_piece0 stored as bytes in memory (estimated size 1171.0 B, free 1891.5 MB)
2015-02-19 17:03:34,591 INFO o.a.s.s.BlockManagerInfo Added broadcast_0_piece0 in memory on localhost:49238 (size: 1171.0 B, free: 1891.5 MB)
2015-02-19 17:03:34,592 INFO o.a.s.s.BlockManagerMaster Updated info of block broadcast_0_piece0
2015-02-19 17:03:34,594 INFO o.a.s.SparkContext Created broadcast 0 from broadcast at DAGScheduler.scala:838
2015-02-19 17:03:34,617 INFO o.a.s.s.DAGScheduler Submitting 2 missing tasks from Stage 0 (MappedRDD[1] at map at SparkSpike.scala:13)
2015-02-19 17:03:34,618 INFO o.a.s.s.TaskSchedulerImpl Adding task set 0.0 with 2 tasks
2015-02-19 17:03:34,659 INFO o.a.s.s.TaskSetManager Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1260 bytes)
2015-02-19 17:03:34,671 INFO o.a.s.e.Executor Running task 0.0 in stage 0.0 (TID 0)
2015-02-19 17:03:34,692 ERROR o.a.s.e.Executor Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078)
at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075)
... 20 common frames omitted
2015-02-19 17:03:34,704 INFO o.a.s.s.TaskSetManager Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1260 bytes)
2015-02-19 17:03:34,704 INFO o.a.s.e.Executor Running task 1.0 in stage 0.0 (TID 1)
2015-02-19 17:03:34,707 WARN o.a.s.s.TaskSetManager Lost task 0.0 in stage 0.0 (TID 0, localhost): java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078)
at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075)
... 20 more
2015-02-19 17:03:34,708 ERROR o.a.s.e.Executor Exception in task 1.0 in stage 0.0 (TID 1)
java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078)
at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075)
... 20 common frames omitted
2015-02-19 17:03:34,711 ERROR o.a.s.s.TaskSetManager Task 0 in stage 0.0 failed 1 times; aborting job
2015-02-19 17:03:34,731 INFO o.a.s.s.TaskSetManager Lost task 1.0 in stage 0.0 (TID 1) on executor localhost: java.io.IOException (java.lang.ClassNotFoundException: scala.collection.immutable.Range) [duplicate 1]
2015-02-19 17:03:34,733 INFO o.a.s.s.TaskSchedulerImpl Removed TaskSet 0.0, whose tasks have all completed, from pool
2015-02-19 17:03:34,742 INFO o.a.s.s.TaskSchedulerImpl Cancelling stage 0
2015-02-19 17:03:34,760 INFO o.a.s.s.DAGScheduler Job 0 failed: reduce at SparkSpike.scala:17, took 0.461929 s
[error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range
[error] at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078)
[error] at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[error] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.lang.reflect.Method.invoke(Method.java:606)
[error] at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
[error] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
[error] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
[error] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
[error] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
[error] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
[error] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
[error] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
[error] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
[error] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
[error] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
[error] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[error] at java.lang.Thread.run(Thread.java:745)
[error] Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range
[error] at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
[error] at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
[error] at java.security.AccessController.doPrivileged(Native Method)
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
[error] at java.lang.Class.forName0(Native Method)
[error] at java.lang.Class.forName(Class.java:274)
[error] at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
[error] at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
[error] at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
[error] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
[error] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
[error] at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
[error] at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
[error] at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
[error] at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075)
[error] ... 20 more
[error]
[error] Driver stacktrace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078)
at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075)
... 20 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
at scala.Option.foreach(Option.scala:256)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[trace] Stack trace suppressed: run last root/compile:runMain for the full output.
2015-02-19 17:03:34,793 ERROR o.a.s.ContextCleaner Error in cleaning thread
java.lang.InterruptedException: null
at java.lang.Object.wait(Native Method)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:136)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1550)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:133)
at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:65)
2015-02-19 17:03:34,812 ERROR o.a.s.u.Utils Uncaught exception in thread SparkListenerBus
java.lang.InterruptedException: null
at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:996)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303)
at java.util.concurrent.Semaphore.acquire(Semaphore.java:317)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:48)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1550)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last root/compile:runMain for the full output.
[error] (root/compile:runMain) Nonzero exit code: 1
[error] Total time: 31 s, completed Feb 19, 2015 12:03:35 PM
[foo-api] $ 2015-02-19 17:03:36,303 INFO o.a.s.s.BlockManager Removing broadcast 0
2015-02-19 17:03:36,311 INFO o.a.s.s.BlockManager Removing block broadcast_0
2015-02-19 17:03:36,313 INFO o.a.s.s.MemoryStore Block broadcast_0 of size 1600 dropped from memory (free 1983364604)
2015-02-19 17:03:36,313 INFO o.a.s.s.BlockManager Removing block broadcast_0_piece0
2015-02-19 17:03:36,313 INFO o.a.s.s.MemoryStore Block broadcast_0_piece0 of size 1171 dropped from memory (free 1983365775)
2015-02-19 17:03:36,315 INFO o.a.s.s.BlockManagerInfo Removed broadcast_0_piece0 on localhost:49238 in memory (size: 1171.0 B, free: 1891.5 MB)
2015-02-19 17:03:36,315 INFO o.a.s.s.BlockManagerMaster Updated info of block broadcast_0_piece0
2015-02-19 17:03:36,319 INFO o.a.s.ContextCleaner Cleaned broadcast 0
Note: the same setup works fine in a brand-new project. There must be a conflict with my existing project, which is the one I want to integrate Spark with.
Best Answer
Regarding scala - Spark: ClassNotFoundException when running hello world example in scala 2.11, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/28612837/
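The linked answer is not reproduced here, but a commonly reported workaround for a ClassNotFoundException on a core Scala class (here scala.collection.immutable.Range) when launching Spark from inside sbt is to fork the run task into a separate JVM: sbt's layered classloader can hide the Scala library from the classloader Spark's Java serializer uses to deserialize task partitions. A minimal sketch under that assumption (sbt 0.13 syntax; this is not necessarily the accepted answer's fix):

```scala
// sparktest/build.sbt — sketch: run Spark in a forked JVM instead of sbt's classloader
fork in run := true

// optional: give the forked JVM enough heap for Spark's local mode
javaOptions in run ++= Seq("-Xmx2G")
```

Forking also avoids related surprises, such as Spark's shutdown hooks interfering with the sbt session (the InterruptedException noise at the end of the log above is consistent with that).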