
java - Why does running an MLlib project in IntelliJ IDEA fail with "AssertionError: assertion failed: unsafe symbol CompatContext"?


I am trying to run the logistic regression example ( https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaLogisticRegressionWithElasticNetExample.java ).

Here is the code:

import org.apache.spark.ml.classification.LogisticRegression;
import org.apache.spark.ml.classification.LogisticRegressionModel;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public final class GettingStarted {

    public static void main(final String[] args) throws InterruptedException {
        System.setProperty("hadoop.home.dir", "C:\\winutils");

        SparkSession spark = SparkSession
            .builder()
            .appName("JavaLogisticRegressionWithElasticNetExample")
            .config("spark.master", "local")
            .getOrCreate();

        // $example on$
        // Load training data
        Dataset<Row> training = spark.read().format("libsvm")
            .load("data/mllib/sample_libsvm_data.txt");

        LogisticRegression lr = new LogisticRegression()
            .setMaxIter(10)
            .setRegParam(0.3)
            .setElasticNetParam(0.8);

        // Fit the model
        LogisticRegressionModel lrModel = lr.fit(training);

        // Print the coefficients and intercept for logistic regression
        System.out.println("Coefficients: "
            + lrModel.coefficients() + " Intercept: " + lrModel.intercept());

        // We can also use the multinomial family for binary classification
        LogisticRegression mlr = new LogisticRegression()
            .setMaxIter(10)
            .setRegParam(0.3)
            .setElasticNetParam(0.8)
            .setFamily("multinomial");

        // Fit the model
        LogisticRegressionModel mlrModel = mlr.fit(training);

        // Print the coefficients and intercepts for logistic regression with multinomial family
        System.out.println("Multinomial coefficients: " + mlrModel.coefficientMatrix()
            + "\nMultinomial intercepts: " + mlrModel.interceptVector());
        // $example off$

        spark.stop();
    }
}

I am also using the same data file as the example ( https://github.com/apache/spark/blob/master/data/mllib/sample_libsvm_data.txt ), but I get these errors:

Exception in thread "main" java.lang.AssertionError: assertion failed: unsafe symbol CompatContext (child of package macrocompat) in runtime reflection universe
at scala.reflect.internal.Symbols$Symbol.<init>(Symbols.scala:184)
at scala.reflect.internal.Symbols$TypeSymbol.<init>(Symbols.scala:2984)
at scala.reflect.internal.Symbols$ClassSymbol.<init>(Symbols.scala:3176)
at scala.reflect.internal.Symbols$StubClassSymbol.<init>(Symbols.scala:3471)
at scala.reflect.internal.Symbols$Symbol.newStubSymbol(Symbols.scala:498)
at scala.reflect.internal.pickling.UnPickler$Scan.readExtSymbol$1(UnPickler.scala:258)
at scala.reflect.internal.pickling.UnPickler$Scan.readSymbol(UnPickler.scala:284)
at scala.reflect.internal.pickling.UnPickler$Scan.readSymbolRef(UnPickler.scala:649)
at scala.reflect.internal.pickling.UnPickler$Scan.readType(UnPickler.scala:417)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725)
at scala.reflect.internal.pickling.UnPickler$Scan.at(UnPickler.scala:179)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.completeInternal(UnPickler.scala:725)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.complete(UnPickler.scala:749)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1489)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:162)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:162)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.info(SynchronizedSymbols.scala:162)
at scala.reflect.internal.Mirrors$RootsBase.ensureClassSymbol(Mirrors.scala:94)
at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
at scala.reflect.internal.Mirrors$RootsBase.getClassIfDefined(Mirrors.scala:114)
at scala.reflect.internal.Mirrors$RootsBase.getClassIfDefined(Mirrors.scala:111)
at scala.reflect.internal.Definitions$DefinitionsClass.BlackboxContextClass$lzycompute(Definitions.scala:496)
at scala.reflect.internal.Definitions$DefinitionsClass.BlackboxContextClass(Definitions.scala:496)
at scala.reflect.runtime.JavaUniverseForce$class.force(JavaUniverseForce.scala:305)
at scala.reflect.runtime.JavaUniverse.force(JavaUniverse.scala:16)
at scala.reflect.runtime.JavaUniverse.init(JavaUniverse.scala:147)
at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:78)
at scala.reflect.runtime.package$.universe$lzycompute(package.scala:17)
at scala.reflect.runtime.package$.universe(package.scala:17)
at org.apache.spark.sql.catalyst.ScalaReflection$.<init>(ScalaReflection.scala:40)
at org.apache.spark.sql.catalyst.ScalaReflection$.<clinit>(ScalaReflection.scala)
at org.apache.spark.sql.catalyst.encoders.RowEncoder$.org$apache$spark$sql$catalyst$encoders$RowEncoder$$serializerFor(RowEncoder.scala:74)
at org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(RowEncoder.scala:61)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67)
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:415)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:172)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:156)
at GettingStarted.main(GettingStarted.java:95)

Do you know where I went wrong?

Edit: I am running this in IntelliJ IDEA as a Maven project, and I have added these dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>2.2.0</version>
</dependency>

Best answer

tl;dr As soon as you see Scala-internal errors that mention the runtime reflection universe, suspect incompatible Scala versions.

The Scala versions of your libraries do not match each other (2.10 vs. 2.11):

You should align all of them to a single Scala version.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>  <!-- This is Scala v2.11 -->
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>  <!-- This is Scala v2.10 -->
    <version>2.2.0</version>
</dependency>
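
For example, a sketch of the corrected dependency section, assuming you stay on Scala 2.11 (the _2.xx suffix in each artifactId is the Scala version the artifact was built against, so every Spark-related artifact must carry the same suffix):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>  <!-- was _2.10; now matches the rest -->
    <version>2.2.0</version>
</dependency>

After the change, you can check that no mixed Scala versions remain on the classpath by inspecting the output of mvn dependency:tree.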

Source question on Stack Overflow: https://stackoverflow.com/questions/46346536/
