
java - Getting java.lang.IllegalArgumentException: requirement failed while calling Spark MLlib StreamingKMeans from a Java application

Reposted · Updated: 2023-10-31 20:27:21

I'm new to Spark and MLlib, and I'm trying to call StreamingKMeans from my Java application, but I'm hitting an exception I can't make sense of. Here is the code that transforms my training data:

JavaDStream<Vector> trainingData = sjsc.textFileStream("/training")
        .map(new Function<String, Vector>() {
            public DenseVector call(String line) throws Exception {
                String[] lineSplit = line.split(",");
                double[] doubleValues = new double[lineSplit.length];
                for (int i = 0; i < lineSplit.length; i++) {
                    doubleValues[i] = Double.parseDouble(
                            lineSplit[i] != null && !"".equals(lineSplit[i]) ? lineSplit[i] : "0");
                }
                DenseVector denseV = new DenseVector(doubleValues);
                if (denseV.size() != 16) {
                    throw new Exception("All vectors are not the same size!");
                }
                System.out.println("Vector length is:" + denseV.size());
                return denseV;
            }
        });

And here is where I call the trainOn method:

int numDimensions = 18;
int numClusters = 2;
StreamingKMeans model = new StreamingKMeans();
model.setK(numClusters);
model.setDecayFactor(.5);
model.setRandomCenters(numDimensions, 0.0, Utils.random().nextLong());

model.trainOn(trainingData.dstream());

This is the exception I'm getting:

java.lang.IllegalArgumentException: requirement failed
at scala.Predef$.require(Predef.scala:221)
at org.apache.spark.mllib.util.MLUtils$.fastSquaredDistance(MLUtils.scala:292)
at org.apache.spark.mllib.clustering.KMeans$.fastSquaredDistance(KMeans.scala:485)
at org.apache.spark.mllib.clustering.KMeans$$anonfun$findClosest$1.apply(KMeans.scala:459)
at org.apache.spark.mllib.clustering.KMeans$$anonfun$findClosest$1.apply(KMeans.scala:453)
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:73)
at org.apache.spark.mllib.clustering.KMeans$.findClosest(KMeans.scala:453)
at org.apache.spark.mllib.clustering.KMeansModel.predict(KMeansModel.scala:35)
at org.apache.spark.mllib.clustering.StreamingKMeans$$anonfun$predictOnValues$1.apply(StreamingKMeans.scala:258)
at org.apache.spark.mllib.clustering.StreamingKMeans$$anonfun$predictOnValues$1.apply(StreamingKMeans.scala:258)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$15.apply(PairRDDFunctions.scala:674)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$15.apply(PairRDDFunctions.scala:674)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.rdd.RDD$$anonfun$33.apply(RDD.scala:1177)
at org.apache.spark.rdd.RDD$$anonfun$33.apply(RDD.scala:1177)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)

As you can see in the code above, I'm checking to make sure my vectors are all the same size, and they appear to be, even though the error suggests they are not. Any help would be greatly appreciated!

Best Answer

This exception can be thrown when not all vectors have the same dimension. Note that in the code above the random centers are initialized with numDimensions = 18, while the map function enforces vectors of size 16 — so the 18-dimensional centers can never match the 16-dimensional data points, which is exactly the requirement that fastSquaredDistance checks.

In my experience, another possible cause is a vector containing NaN values: none of the values in any vector may be NaN.
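Putting both checks together, here is a minimal sketch of a defensive line parser in plain Java (no Spark dependency) that rejects rows with the wrong dimension or NaN values before they ever reach the model. The class and method names (VectorCheck, parseLine) are illustrative, not from the original post:

```java
import java.util.Arrays;

public class VectorCheck {

    // Parse one CSV line into a fixed-size double array. Rows with the
    // wrong number of fields or NaN values are rejected up front, so bad
    // input fails loudly here instead of deep inside KMeans.findClosest.
    static double[] parseLine(String line, int expected) {
        // limit -1 keeps trailing empty fields, so "1,2," still has 3 parts
        String[] parts = line.split(",", -1);
        if (parts.length != expected) {
            throw new IllegalArgumentException(
                    "Expected " + expected + " fields, got " + parts.length);
        }
        double[] values = new double[expected];
        for (int i = 0; i < expected; i++) {
            String s = parts[i].trim();
            double d = s.isEmpty() ? 0.0 : Double.parseDouble(s);
            if (Double.isNaN(d)) {
                throw new IllegalArgumentException("NaN at index " + i);
            }
            values[i] = d;
        }
        return values;
    }

    public static void main(String[] args) {
        // Empty field becomes 0.0, matching the original code's intent
        System.out.println(Arrays.toString(parseLine("1.0,2.5,,4.0", 4)));
        // Double.parseDouble("NaN") yields NaN, which is rejected
        try {
            parseLine("1.0,NaN,3.0", 3);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

With a helper like this in the map function (passing the same dimension used for setRandomCenters), a dimension or NaN problem surfaces as a clear message at ingestion time rather than as an opaque `requirement failed` inside the training job.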

Regarding java - Getting java.lang.IllegalArgumentException: requirement failed while calling Spark MLlib StreamingKMeans from a Java application, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/30737361/
