
java - Apache Spark - JavaSparkContext cannot be converted to SparkContext error


I'm having considerable difficulty turning the Spark examples into runnable code (as evidenced by my previous question here).

The answer provided there got me past that particular example, but now I'm trying to experiment with the Multilayer Perceptron example, and I immediately run into an error.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel;
import org.apache.spark.ml.classification.MultilayerPerceptronClassifier;
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator;
import org.apache.spark.ml.param.ParamMap;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.util.MLUtils;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

public class SimpleANN {
    public static void main(String[] args) {
        String path = "file:/usr/local/share/spark-1.5.0/data/mllib/sample_multiclass_classification_data.txt";
        SparkConf conf = new SparkConf().setAppName("Simple ANN");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Load training data
        JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();
        ...
        ...
    }
}

This fails with the following error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project simple-ann: Compilation failure
[ERROR] /Users/robo/study/spark/ann/src/main/java/SimpleANN.java:[23,61] incompatible types: org.apache.spark.api.java.JavaSparkContext cannot be converted to org.apache.spark.SparkContext

Best Answer

The error occurs because MLUtils.loadLibSVMFile expects a Scala org.apache.spark.SparkContext, not the Java wrapper. If you need a SparkContext from a JavaSparkContext, you can use the static method:

JavaSparkContext.toSparkContext(yourJavaSparkContext)

So you have to change your code from:

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();

to:

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(
        JavaSparkContext.toSparkContext(sc),
        path).toJavaRDD();
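
Equivalently, JavaSparkContext exposes the underlying Scala context through its sc() instance method, which reads a bit more tersely. Below is a minimal sketch of the corrected class under the same setup as the question (the variable name jsc and the omission of the rest of the pipeline are mine):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.util.MLUtils;

public class SimpleANN {
    public static void main(String[] args) {
        String path = "file:/usr/local/share/spark-1.5.0/data/mllib/sample_multiclass_classification_data.txt";
        SparkConf conf = new SparkConf().setAppName("Simple ANN");
        JavaSparkContext jsc = new JavaSparkContext(conf);
        // jsc.sc() returns the underlying org.apache.spark.SparkContext,
        // which is the type MLUtils.loadLibSVMFile expects.
        JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(jsc.sc(), path).toJavaRDD();
    }
}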

Regarding the java - Apache Spark - JavaSparkContext cannot be converted to SparkContext error, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32693752/
