
Spark: ClassNotFoundException when running a hello world example in Scala 2.11

I have an sbt project with subprojects, all of which use Scala 2.11.4. To one of the subprojects (sparktest) I added spark-core:

name := """sparktest""" 

version := "1.0-SNAPSHOT" 

scalaVersion := "2.11.4" 

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.2.1" 
    exclude("org.slf4j", "slf4j-log4j12") 
) 

sparktest depends on another sbt project called common, which overrides akka-actor to 2.3.9.
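A minimal sketch of how that override is wired, assuming a plain sbt subproject; dependencyOverrides is one plausible mechanism, and the exact setup in common may differ:

// common/build.sbt -- illustrative sketch; the name and version come
// from the description above.

name := "common"

scalaVersion := "2.11.4"

// Pin akka-actor to 2.3.9, overriding whatever version transitive
// dependencies would otherwise pull in.
dependencyOverrides += "com.typesafe.akka" %% "akka-actor" % "2.3.9"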

The thing is, when I try to run the following snippet (pulled from the Spark examples):

import org.apache.spark.{SparkContext, SparkConf} 
import scala.math.random 

object SparkSpike extends App { 

    val conf = new SparkConf().setAppName("Spark Pi").setMaster("local") 
    val spark = new SparkContext(conf) 

    val slices = if (args.length > 0) args(0).toInt else 2 
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow 
    val count = spark.parallelize(1 until n, slices).map { i =>
        val x = random * 2 - 1
        val y = random * 2 - 1
        if (x*x + y*y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count/n) 
    spark.stop() 
} 

I get the following error:

2015-02-19 17:03:31,429 INFO o.a.s.SecurityManager Changing view acls to: bar 
2015-02-19 17:03:31,432 INFO o.a.s.SecurityManager Changing modify acls to: bar 
2015-02-19 17:03:31,433 INFO o.a.s.SecurityManager SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(bar); users with modify permissions: Set(bar) 
2015-02-19 17:03:31,957 INFO a.e.s.Slf4jLogger Slf4jLogger started 
2015-02-19 17:03:32,052 INFO Remoting Starting remoting 
2015-02-19 17:03:32,336 INFO Remoting Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.59.3:49236]
2015-02-19 17:03:32,350 INFO o.a.s.u.Utils Successfully started service 'sparkDriver' on port 49236. 
2015-02-19 17:03:32,378 INFO o.a.s.SparkEnv Registering MapOutputTracker 
2015-02-19 17:03:32,404 INFO o.a.s.SparkEnv Registering BlockManagerMaster 
2015-02-19 17:03:32,440 INFO o.a.s.s.DiskBlockManager Created local directory at /var/folders/26/7b3b32gd4wx1h25vd2qm66q00000gp/T/spark-a594f880-f5d1-4926-b555-eabbe1728734/spark-4e8f77c4-8018-4e64-88e7-6ca060d9a35c 
2015-02-19 17:03:32,447 INFO o.a.s.s.MemoryStore MemoryStore started with capacity 1891.5 MB 
2015-02-19 17:03:32,948 WARN o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
2015-02-19 17:03:33,100 INFO o.a.s.HttpFileServer HTTP File server directory is /var/folders/26/7b3b32gd4wx1h25vd2qm66q00000gp/T/spark-0f30ec72-f2aa-44ed-9e92-9931fba5ba39/spark-d7fa24ef-768a-4e05-9fa3-ce29eacc0c19 
2015-02-19 17:03:33,109 INFO o.a.s.HttpServer Starting HTTP Server 
2015-02-19 17:03:33,206 INFO o.e.j.s.Server jetty-8.1.14.v20131031 
2015-02-19 17:03:33,229 INFO o.e.j.s.AbstractConnector Started SocketConnector@0.0.0.0:49237
2015-02-19 17:03:33,229 INFO o.a.s.u.Utils Successfully started service 'HTTP file server' on port 49237. 
2015-02-19 17:03:33,420 INFO o.e.j.s.Server jetty-8.1.14.v20131031 
2015-02-19 17:03:33,441 INFO o.e.j.s.AbstractConnector Started SelectChannelConnector@0.0.0.0:4040
2015-02-19 17:03:33,442 INFO o.a.s.u.Utils Successfully started service 'SparkUI' on port 4040. 
2015-02-19 17:03:33,445 INFO o.a.s.u.SparkUI Started SparkUI at http://192.168.59.3:4040 
2015-02-19 17:03:33,611 INFO o.a.s.e.Executor Starting executor ID <driver> on host localhost 
2015-02-19 17:03:33,634 INFO o.a.s.u.AkkaUtils Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@192.168.59.3:49236/user/HeartbeatReceiver
2015-02-19 17:03:33,994 INFO o.a.s.n.n.NettyBlockTransferService Server created on 49238 
2015-02-19 17:03:33,996 INFO o.a.s.s.BlockManagerMaster Trying to register BlockManager 
2015-02-19 17:03:33,998 INFO o.a.s.s.BlockManagerMasterActor Registering block manager localhost:49238 with 1891.5 MB RAM, BlockManagerId(<driver>, localhost, 49238) 
2015-02-19 17:03:34,001 INFO o.a.s.s.BlockManagerMaster Registered BlockManager 
2015-02-19 17:03:34,297 INFO o.a.s.SparkContext Starting job: reduce at SparkSpike.scala:17 
2015-02-19 17:03:34,321 INFO o.a.s.s.DAGScheduler Got job 0 (reduce at SparkSpike.scala:17) with 2 output partitions (allowLocal=false) 
2015-02-19 17:03:34,322 INFO o.a.s.s.DAGScheduler Final stage: Stage 0(reduce at SparkSpike.scala:17) 
2015-02-19 17:03:34,323 INFO o.a.s.s.DAGScheduler Parents of final stage: List() 
2015-02-19 17:03:34,329 INFO o.a.s.s.DAGScheduler Missing parents: List() 
2015-02-19 17:03:34,349 INFO o.a.s.s.DAGScheduler Submitting Stage 0 (MappedRDD[1] at map at SparkSpike.scala:13), which has no missing parents 
2015-02-19 17:03:34,505 INFO o.a.s.s.MemoryStore ensureFreeSpace(1600) called with curMem=0, maxMem=1983365775 
2015-02-19 17:03:34,507 INFO o.a.s.s.MemoryStore Block broadcast_0 stored as values in memory (estimated size 1600.0 B, free 1891.5 MB) 
2015-02-19 17:03:34,588 INFO o.a.s.s.MemoryStore ensureFreeSpace(1171) called with curMem=1600, maxMem=1983365775 
2015-02-19 17:03:34,588 INFO o.a.s.s.MemoryStore Block broadcast_0_piece0 stored as bytes in memory (estimated size 1171.0 B, free 1891.5 MB) 
2015-02-19 17:03:34,591 INFO o.a.s.s.BlockManagerInfo Added broadcast_0_piece0 in memory on localhost:49238 (size: 1171.0 B, free: 1891.5 MB) 
2015-02-19 17:03:34,592 INFO o.a.s.s.BlockManagerMaster Updated info of block broadcast_0_piece0 
2015-02-19 17:03:34,594 INFO o.a.s.SparkContext Created broadcast 0 from broadcast at DAGScheduler.scala:838 
2015-02-19 17:03:34,617 INFO o.a.s.s.DAGScheduler Submitting 2 missing tasks from Stage 0 (MappedRDD[1] at map at SparkSpike.scala:13) 
2015-02-19 17:03:34,618 INFO o.a.s.s.TaskSchedulerImpl Adding task set 0.0 with 2 tasks 
2015-02-19 17:03:34,659 INFO o.a.s.s.TaskSetManager Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1260 bytes) 
2015-02-19 17:03:34,671 INFO o.a.s.e.Executor Running task 0.0 in stage 0.0 (TID 0) 
2015-02-19 17:03:34,692 ERROR o.a.s.e.Executor Exception in task 0.0 in stage 0.0 (TID 0) 
java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078) 
    at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62) 
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:274) 
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500) 
    at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74) 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075) 
    ... 20 common frames omitted 
2015-02-19 17:03:34,704 INFO o.a.s.s.TaskSetManager Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1260 bytes) 
2015-02-19 17:03:34,704 INFO o.a.s.e.Executor Running task 1.0 in stage 0.0 (TID 1) 
2015-02-19 17:03:34,707 WARN o.a.s.s.TaskSetManager Lost task 0.0 in stage 0.0 (TID 0, localhost): java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078) 
    at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62) 
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:274) 
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500) 
    at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74) 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075) 
    ... 20 more 

2015-02-19 17:03:34,708 ERROR o.a.s.e.Executor Exception in task 1.0 in stage 0.0 (TID 1) 
java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078) 
    at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62) 
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:274) 
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500) 
    at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74) 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075) 
    ... 20 common frames omitted 
2015-02-19 17:03:34,711 ERROR o.a.s.s.TaskSetManager Task 0 in stage 0.0 failed 1 times; aborting job 
2015-02-19 17:03:34,731 INFO o.a.s.s.TaskSetManager Lost task 1.0 in stage 0.0 (TID 1) on executor localhost: java.io.IOException (java.lang.ClassNotFoundException: scala.collection.immutable.Range) [duplicate 1] 
2015-02-19 17:03:34,733 INFO o.a.s.s.TaskSchedulerImpl Removed TaskSet 0.0, whose tasks have all completed, from pool 
2015-02-19 17:03:34,742 INFO o.a.s.s.TaskSchedulerImpl Cancelling stage 0 
2015-02-19 17:03:34,760 INFO o.a.s.s.DAGScheduler Job 0 failed: reduce at SparkSpike.scala:17, took 0.461929 s 
[error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
[error]  at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078) 
[error]  at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70) 
[error]  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
[error]  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
[error]  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
[error]  at java.lang.reflect.Method.invoke(Method.java:606) 
[error]  at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017) 
[error]  at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893) 
[error]  at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
[error]  at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
[error]  at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
[error]  at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
[error]  at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
[error]  at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
[error]  at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
[error]  at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62) 
[error]  at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87) 
[error]  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182) 
[error]  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[error]  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[error]  at java.lang.Thread.run(Thread.java:745) 
[error] Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
[error]  at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
[error]  at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
[error]  at java.security.AccessController.doPrivileged(Native Method) 
[error]  at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
[error]  at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
[error]  at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
[error]  at java.lang.Class.forName0(Native Method) 
[error]  at java.lang.Class.forName(Class.java:274) 
[error]  at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59) 
[error]  at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612) 
[error]  at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) 
[error]  at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) 
[error]  at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
[error]  at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
[error]  at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500) 
[error]  at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74) 
[error]  at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075) 
[error]  ... 20 more 
[error] 
[error] Driver stacktrace: 
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.io.IOException: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1078) 
    at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62) 
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.ClassNotFoundException: scala.collection.immutable.Range 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:274) 
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500) 
    at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74) 
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1075) 
    ... 20 more 

Driver stacktrace: 
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202) 
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) 
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) 
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696) 
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696) 
    at scala.Option.foreach(Option.scala:256) 
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696) 
    at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420) 
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465) 
    at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375) 
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) 
    at akka.actor.ActorCell.invoke(ActorCell.scala:487) 
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254) 
    at akka.dispatch.Mailbox.run(Mailbox.scala:221) 
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231) 
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 
[trace] Stack trace suppressed: run last root/compile:runMain for the full output. 
2015-02-19 17:03:34,793 ERROR o.a.s.ContextCleaner Error in cleaning thread 
java.lang.InterruptedException: null 
    at java.lang.Object.wait(Native Method) 
    at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135) 
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:136) 
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134) 
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134) 
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1550) 
    at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:133) 
    at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:65) 
2015-02-19 17:03:34,812 ERROR o.a.s.u.Utils Uncaught exception in thread SparkListenerBus 
java.lang.InterruptedException: null 
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:996) 
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303) 
    at java.util.concurrent.Semaphore.acquire(Semaphore.java:317) 
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:48) 
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47) 
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47) 
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1550) 
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46) 
java.lang.RuntimeException: Nonzero exit code: 1 
    at scala.sys.package$.error(package.scala:27) 
[trace] Stack trace suppressed: run last root/compile:runMain for the full output. 
[error] (root/compile:runMain) Nonzero exit code: 1 
[error] Total time: 31 s, completed Feb 19, 2015 12:03:35 PM 
[foo-api] $ 2015-02-19 17:03:36,303 INFO o.a.s.s.BlockManager Removing broadcast 0 
2015-02-19 17:03:36,311 INFO o.a.s.s.BlockManager Removing block broadcast_0 
2015-02-19 17:03:36,313 INFO o.a.s.s.MemoryStore Block broadcast_0 of size 1600 dropped from memory (free 1983364604) 
2015-02-19 17:03:36,313 INFO o.a.s.s.BlockManager Removing block broadcast_0_piece0 
2015-02-19 17:03:36,313 INFO o.a.s.s.MemoryStore Block broadcast_0_piece0 of size 1171 dropped from memory (free 1983365775) 
2015-02-19 17:03:36,315 INFO o.a.s.s.BlockManagerInfo Removed broadcast_0_piece0 on localhost:49238 in memory (size: 1171.0 B, free: 1891.5 MB) 
2015-02-19 17:03:36,315 INFO o.a.s.s.BlockManagerMaster Updated info of block broadcast_0_piece0 
2015-02-19 17:03:36,319 INFO o.a.s.ContextCleaner Cleaned broadcast 0 

Note: this exact setup worked in a brand-new project, so it must be a conflict with the existing project I'm trying to integrate Spark into.


This question describes the same symptoms, and the solution there was to downgrade to Scala 2.10. The thing is, Spark 1.2.0+ is supposed to support Scala 2.11: http://stackoverflow.com/questions/26351338/running-spark-scala-example-fails – luisobo


But did you build Spark for 2.11? –


Justin, I'm not building Spark myself; I pull the jars from Maven Central using sbt. spark-core was built for Scala 2.11 and published there: http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22spark-core_2.11%22 – luisobo
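The %% operator appends the Scala binary-version suffix automatically, so with scalaVersion set to 2.11.x the dependency in the question resolves to exactly that spark-core_2.11 artifact; a minimal sketch of the two equivalent spellings:

// %% appends the Scala binary suffix (_2.11 here) automatically:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"

// Identical, with the suffix written out explicitly:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.1"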

Answers


Apparently sbt was loading some classes compiled for Scala 2.10, for reasons I couldn't figure out (presumably because, without forking, sbt's run task executes the application inside sbt's own JVM, and sbt itself ran on Scala 2.10 at the time).

The solution was to set fork := true in the main build.sbt.

Source
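A minimal sketch of what the fixed root build.sbt could look like, assuming the subproject layout described in the question (project and file names are illustrative):

// build.sbt (root) -- sketch only; project names are assumptions.

scalaVersion in ThisBuild := "2.11.4"

lazy val common = project.in(file("common"))

lazy val sparktest = project
  .in(file("sparktest"))
  .dependsOn(common)
  .settings(
    libraryDependencies += ("org.apache.spark" %% "spark-core" % "1.2.1")
      .exclude("org.slf4j", "slf4j-log4j12"),
    // The fix: fork `run` into a separate JVM so the application gets a
    // clean classpath instead of sharing sbt's classloader.
    fork in run := true
  )

Forking plausibly matters here because Spark's executors deserialize task data with the thread's context classloader; inside an unforked sbt JVM that loader can resolve the wrong Scala library, which would explain the failed scala.collection.immutable.Range lookup above.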


I had the same issue: there were several libraries on my classpath, one requiring Scala 2.10.4 and the rest 2.10.6. Once I set scalaVersion at the project level, the problem went away:

scalaVersion := "2.10.6" 
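A sketch of pinning one Scala version for every subproject in a multi-project build, so no module drifts to a different binary version (project names are illustrative):

// build.sbt (root) -- sketch: one Scala version for the whole build.
scalaVersion in ThisBuild := "2.10.6"

lazy val core = project.in(file("core"))
lazy val api  = project.in(file("api")).dependsOn(core)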