Error building Spark 1.3.0 - Unresolved dependency path

2015-04-06

I ran into a problem while building Spark 1.3.0 (it did not happen when I built 1.0.2 or 1.2.0).

I have Scala 2.10.4 installed.

I am running sbt/sbt assembly (I also tried sbt/sbt clean update assembly, with the same result), and I keep getting the following error:

(...) 

[info] Resolving org.fusesource.jansi#jansi;1.4 ... 
[info] Done updating. 
[info] Updating {file:/usr/local/src/spark/spark-1.3.0/}network-shuffle... 
[info] Resolving org.fusesource.jansi#jansi;1.4 ... 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] ::   UNRESOLVED DEPENDENCIES   :: 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] :: org.apache.spark#spark-network-common_2.10;1.3.0: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.0: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.0 test 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] 
[warn] Note: Unresolved dependencies path: 
[warn]  org.apache.spark:spark-network-common_2.10:1.3.0 ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76) 
[warn]  +- org.apache.spark:spark-network-shuffle_2.10:1.3.0 
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.3.0: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.0: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.0 test 
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:278) 
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175) 
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157) 
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151) 
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151) 
    at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128) 
    at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56) 
    at sbt.IvySbt$$anon$4.call(Ivy.scala:64) 
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93) 
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78) 
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97) 
    at xsbt.boot.Using$.withResource(Using.scala:10) 
    at xsbt.boot.Using$.apply(Using.scala:9) 
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58) 
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48) 
    at xsbt.boot.Locks$.apply0(Locks.scala:31) 
    at xsbt.boot.Locks$.apply(Locks.scala:28) 
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:64) 
    at sbt.IvySbt.withIvy(Ivy.scala:123) 
    at sbt.IvySbt.withIvy(Ivy.scala:120) 
    at sbt.IvySbt$Module.withModule(Ivy.scala:151) 
    at sbt.IvyActions$.updateEither(IvyActions.scala:157) 
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318) 
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1315) 
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1345) 
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1343) 
    at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35) 
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1348) 
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1342) 
    at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45) 
    at sbt.Classpaths$.cachedUpdate(Defaults.scala:1360) 
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1300) 
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1275) 
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47) 
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40) 
    at sbt.std.Transform$$anon$4.work(System.scala:63) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17) 
    at sbt.Execute.work(Execute.scala:235) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159) 
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 
[error] (network-shuffle/*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.3.0: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.0: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.0 test 
[error] Total time: 3 s, completed Apr 6, 2015 7:43:13 PM 

I have no experience resolving this kind of dependency problem and would appreciate any hint on how to move forward with it.

Answer


Just try deleting the .ivy folder (the Ivy configuration/cache folder) and running the sbt/sbt build again.
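For reference, a minimal sketch of that cleanup from the source directory shown in the log above, assuming the cache lives under ~/.ivy2 (older Ivy versions may use ~/.ivy instead). Removing only the cached org.apache.spark entries is a more conservative variant than wiping the whole folder:

    # remove the cached Spark artifacts from the local Ivy cache
    # (~/.ivy2 is an assumption; use ~/.ivy if that is what exists on your machine)
    rm -rf ~/.ivy2/cache/org.apache.spark
    rm -rf ~/.ivy2/local/org.apache.spark

    # rebuild from the Spark 1.3.0 source directory
    cd /usr/local/src/spark/spark-1.3.0
    sbt/sbt clean assembly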


It may be ~/.ivy2, depending on your Ivy version – Murmel


Which path? You are not giving a solution, just restating what was already said –


@beyhan, the path is ~/.ivy or ~/.ivy2, depending on your current Ivy version. –
