2017-02-04

I am trying to connect Spark to a Cassandra database, but I get the error mentioned below. I think there is a version mismatch somewhere. Cassandra Spark Connector - NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef

Code:

    import org.apache.spark.SparkConf; 
    import org.apache.spark.api.java.JavaSparkContext; 
    import com.datastax.driver.core.PreparedStatement; 
    import com.datastax.driver.core.Session; 
    import com.datastax.spark.connector.cql.CassandraConnector; 

    SparkConf conf = new SparkConf().setAppName("kafka-sandbox").setMaster("local[2]"); 
    conf.set("spark.cassandra.connection.host", "192.168.34.1"); // connection for the Cassandra database 
    JavaSparkContext sc = new JavaSparkContext(conf); 
    CassandraConnector connector = CassandraConnector.apply(sc.getConf()); 
    final Session session = connector.openSession(); // error is thrown on this line 
    final PreparedStatement prepared = session.prepare("INSERT INTO spark_test5.messages JSON ?"); 
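For context, once the session opens, the prepared statement would be bound and executed roughly like this; a sketch, where the JSON payload and the column names inside it are illustrative assumptions, not taken from the question:

    // Bind the JSON document to the single ? placeholder and run the insert 
    session.execute(prepared.bind("{\"id\": 1, \"body\": \"hello\"}")); 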

error: 


    Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef; 
     at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala) 
     at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149) 
     at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149) 
     at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31) 
     at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56) 
     at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:82) 

pom.xml: 

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 
    <groupId>SparkPoc</groupId> 
    <artifactId>Spark-Poc</artifactId> 
    <version>0.0.1-SNAPSHOT</version> 
    <dependencies> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming_2.10</artifactId> 
     <version>2.0.0</version> 
     <scope>provided</scope> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-core_2.10</artifactId> 
     <version>2.0.1</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming-kafka-0-8_2.10</artifactId> 
     <version>2.0.0</version> 
    </dependency> 
    <dependency> 
     <groupId>com.datastax.spark</groupId> 
     <artifactId>spark-cassandra-connector_2.11</artifactId> 
     <version>2.0.0-M3</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-sql_2.11</artifactId> 
     <version>2.0.1</version> 
    </dependency> 
    </dependencies> 
<build> 
    <plugins> 
    <plugin> 
     <groupId>org.apache.maven.plugins</groupId> 
     <artifactId>maven-compiler-plugin</artifactId> 
     <version>3.3</version> 
     <configuration> 
      <source>1.8</source> 
      <target>1.8</target> 
     </configuration> 
    </plugin> 
    <plugin> 
     <groupId>org.apache.maven.plugins</groupId> 
     <artifactId>maven-assembly-plugin</artifactId> 
     <version>2.4.1</version> 
     <configuration> 
      <!-- get all project dependencies --> 
      <descriptorRefs> 
        <descriptorRef>jar-with-dependencies</descriptorRef> 
      </descriptorRefs> 
      <!-- MainClass in manifest makes an executable jar --> 
      <archive> 
        <manifest> 
          <mainClass>com.nwf.Consumer</mainClass> 
        </manifest> 
      </archive> 
     </configuration> 
     <executions> 
      <execution> 
        <id>make-assembly</id> 
        <!-- bind to the packaging phase --> 
        <phase>package</phase> 
        <goals> 
          <goal>single</goal> 
        </goals> 
      </execution> 
    </executions> 
    </plugin> 
    </plugins> 
</build> 
</project> 

Spark version: 2.0.0

Scala version: 2.11.8


Please show how you added the Cassandra connector to your Spark application. –


import com.datastax.spark.connector.cql.CassandraConnector; – sat


No, no. Maven, SBT, some '--packages' flag on your spark-submit? –
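For reference, the '--packages' route the commenter mentions would look like this; a sketch, using the connector coordinates from the pom.xml above (the jar name assumes Maven's default naming for this project):

    spark-submit --class com.nwf.Consumer \ 
     --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3 \ 
     target/Spark-Poc-0.0.1-SNAPSHOT.jar 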

Answers


According to your pom.xml, you are mixing Scala versions across your dependencies:

  • spark-streaming_2.10
  • spark-core_2.10
  • spark-streaming-kafka-0-8_2.10
  • spark-cassandra-connector_2.11
  • spark-sql_2.11

All dependencies must use the same Scala binary version. Please try changing them all to _2.11 (see the sketch below).
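One way to keep the suffixes from drifting apart again is to centralize the Scala binary version in a Maven property; a minimal sketch (the property name scala.binary.version is just a convention, and only one dependency is shown):

    <properties> 
     <scala.binary.version>2.11</scala.binary.version> 
    </properties> 
    <dependencies> 
     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-streaming_${scala.binary.version}</artifactId> 
      <version>2.0.0</version> 
      <scope>provided</scope> 
     </dependency> 
    </dependencies> 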


Thanks for the help. I resolved the error. – sat


zero() on scala.runtime.VolatileObjectRef was introduced in Scala 2.11. You most likely have a library compiled against Scala 2.11 running on a Scala 2.10 runtime.

See:

v2.10: https://github.com/scala/scala/blob/2.10.x/src/library/scala/runtime/VolatileObjectRef.java 
v2.11: https://github.com/scala/scala/blob/2.11.x/src/library/scala/runtime/VolatileObjectRef.java 
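A quick way to confirm which Scala runtime is actually on the classpath is to print it from the application itself; a minimal sketch in Java, calling the Scala standard library's scala.util.Properties object through its MODULE$ singleton field:

    // Prints the scala-library version actually loaded, e.g. "2.10.6" or "2.11.8" 
    System.out.println(scala.util.Properties$.MODULE$.versionNumberString()); 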

In my pom.xml I changed the Scala version from 2.10 to 2.11. Given below is the updated pom.xml: 


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 
    <groupId>SparkPoc</groupId> 
    <artifactId>Spark-Poc</artifactId> 
    <version>0.0.1-SNAPSHOT</version> 
    <dependencies> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming_2.11</artifactId> 
     <version>2.0.0</version> 
     <scope>provided</scope> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-core_2.11</artifactId> 
     <version>2.0.1</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming-kafka-0-8_2.11</artifactId> 
     <version>2.0.0</version> 
    </dependency> 
    <dependency> 
     <groupId>com.datastax.spark</groupId> 
     <artifactId>spark-cassandra-connector_2.11</artifactId> 
     <version>2.0.0-M3</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-sql_2.11</artifactId> 
     <version>2.0.1</version> 
    </dependency> 
    </dependencies> 
<build> 
    <plugins> 
    <plugin> 
     <groupId>org.apache.maven.plugins</groupId> 
     <artifactId>maven-compiler-plugin</artifactId> 
     <version>3.3</version> 
     <configuration> 
      <source>1.8</source> 
      <target>1.8</target> 
     </configuration> 
    </plugin> 
    <plugin> 
     <groupId>org.apache.maven.plugins</groupId> 
     <artifactId>maven-assembly-plugin</artifactId> 
     <version>2.4.1</version> 
     <configuration> 
      <!-- get all project dependencies --> 
      <descriptorRefs> 
        <descriptorRef>jar-with-dependencies</descriptorRef> 
      </descriptorRefs> 
      <!-- MainClass in manifest makes an executable jar --> 
      <archive> 
        <manifest> 
          <mainClass>com.nwf.Consumer</mainClass> 
        </manifest> 
      </archive> 
     </configuration> 
     <executions> 
      <execution> 
        <id>make-assembly</id> 
        <!-- bind to the packaging phase --> 
        <phase>package</phase> 
        <goals> 
          <goal>single</goal> 
        </goals> 
      </execution> 
    </executions> 
    </plugin> 
    </plugins> 
</build> 
</project> 
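With the versions aligned, the job can be built and submitted as usual; a sketch, assuming the jar name produced by the assembly configuration above:

    mvn clean package 
    spark-submit --class com.nwf.Consumer target/Spark-Poc-0.0.1-SNAPSHOT-jar-with-dependencies.jar 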