
hadoop - Cannot instantiate the type Cluster in Mahout, KMeans clustering example


Hello, I am trying to run the KMeans clustering example in Mahout, but I am getting an error in the sample code. The error occurs in the following snippet:

Cluster cluster = new Cluster(vec, i, new EuclideanDistanceMeasure());

The error is:

Cannot instantiate the Type Cluster

(which is an interface, as I understand it). I want to run k-means on my own sample data set; can anyone guide me?

I have included the following JARs in my Eclipse IDE:

mahout-math-0.7-cdh4.3.0.jar

hadoop-common-2.0.0-cdh4.2.1.jar

hadoop-hdfs-2.0.0-cdh4.2.1.jar

hadoop-mapreduce-client-core-2.0.0-cdh4.2.1.jar

mahout-core-0.7-cdh4.3.0.jar

Please check whether I am missing any required JARs; I will be running this on Hadoop CDH 4.2.1.

My full code is attached here, taken from GitHub:

package tryout;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.mahout.math.RandomAccessSparseVector;
import org.apache.mahout.math.Vector;
import org.apache.mahout.math.VectorWritable;
import org.apache.mahout.clustering.Cluster;
import org.apache.mahout.clustering.classify.WeightedVectorWritable;
import org.apache.mahout.clustering.kmeans.KMeansDriver;
import org.apache.mahout.common.distance.EuclideanDistanceMeasure;

public class SimpleKMeansClustering {
    public static final double[][] points = { {1, 1}, {2, 1}, {1, 2},
                                              {2, 2}, {3, 3}, {8, 8},
                                              {9, 8}, {8, 9}, {9, 9} };

    public static void writePointsToFile(List<Vector> points, String fileName,
                                         FileSystem fs, Configuration conf) throws IOException {
        Path path = new Path(fileName);
        SequenceFile.Writer writer = new SequenceFile.Writer(fs, conf, path,
                LongWritable.class, VectorWritable.class);
        long recNum = 0;
        VectorWritable vec = new VectorWritable();
        for (Vector point : points) {
            vec.set(point);
            writer.append(new LongWritable(recNum++), vec);
        }
        writer.close();
    }

    public static List<Vector> getPoints(double[][] raw) {
        List<Vector> points = new ArrayList<Vector>();
        for (int i = 0; i < raw.length; i++) {
            double[] fr = raw[i];
            Vector vec = new RandomAccessSparseVector(fr.length);
            vec.assign(fr);
            points.add(vec);
        }
        return points;
    }

    public static void main(String args[]) throws Exception {
        int k = 2;
        List<Vector> vectors = getPoints(points);

        File testData = new File("testdata");
        if (!testData.exists()) {
            testData.mkdir();
        }
        testData = new File("testdata/points");
        if (!testData.exists()) {
            testData.mkdir();
        }

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        writePointsToFile(vectors, "testdata/points/file1", fs, conf);

        Path path = new Path("testdata/clusters/part-00000");
        SequenceFile.Writer writer = new SequenceFile.Writer(fs, conf, path,
                Text.class, Cluster.class);
        for (int i = 0; i < k; i++) {
            Vector vec = vectors.get(i);
            Cluster cluster = new Cluster(vec, i, new EuclideanDistanceMeasure());
            writer.append(new Text(cluster.getIdentifier()), cluster);
        }
        writer.close();

        KMeansDriver.run(conf, new Path("testdata/points"), new Path("testdata/clusters"),
                new Path("output"), new EuclideanDistanceMeasure(), 0.001, 10,
                true, false);

        SequenceFile.Reader reader = new SequenceFile.Reader(fs,
                new Path("output/" + Cluster.CLUSTERED_POINTS_DIR + "/part-m-00000"), conf);
        IntWritable key = new IntWritable();
        WeightedVectorWritable value = new WeightedVectorWritable();
        while (reader.next(key, value)) {
            System.out.println(value.toString() + " belongs to cluster " + key.toString());
        }
        reader.close();
    }
}

Please also advise how I should proceed if I have my own data set.
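For that last part, a minimal sketch of one possible approach, assuming the custom data set is a plain CSV file with one numeric point per line (the CsvPointReader class name, the file format, and the comma delimiter are assumptions, not something from the original thread):

// Sketch: parse a CSV file such as "1.0,2.0" (one point per line) into Mahout vectors.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.mahout.math.RandomAccessSparseVector;
import org.apache.mahout.math.Vector;

public class CsvPointReader {
    public static List<Vector> readPoints(String csvFile) throws IOException {
        List<Vector> points = new ArrayList<Vector>();
        BufferedReader reader = new BufferedReader(new FileReader(csvFile));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.trim().isEmpty()) {
                    continue; // skip blank lines
                }
                String[] fields = line.split(",");
                double[] values = new double[fields.length];
                for (int i = 0; i < fields.length; i++) {
                    values[i] = Double.parseDouble(fields[i].trim());
                }
                // same vector construction as in getPoints() above
                Vector vec = new RandomAccessSparseVector(values.length);
                vec.assign(values);
                points.add(vec);
            }
        } finally {
            reader.close();
        }
        return points;
    }
}

The resulting list can then be passed to writePointsToFile and clustered exactly as in the example above.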

Best Answer

I have also been trying to get this example from the Mahout in Action book to work, and I finally managed it. Here is what I did:

SequenceFile.Writer writer = new SequenceFile.Writer(fs, conf, path, Text.class, Kluster.class);
for (int i = 0; i < k; i++) {
    Vector vec = vectors.get(i);
    Kluster cluster = new Kluster(vec, i, new EuclideanDistanceMeasure());
    writer.append(new Text(cluster.getIdentifier()), cluster);
}
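As a follow-up sketch (the SeedClusterWriter class name and the hard-coded path are my own, assuming Mahout 0.7), this is roughly what that corrected seed-writing step looks like in context, including the import of the concrete Kluster class from the k-means package in place of the Cluster interface:

// Sketch: write the k initial seed clusters using the concrete Kluster class.
import java.io.IOException;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.mahout.clustering.kmeans.Kluster;
import org.apache.mahout.common.distance.EuclideanDistanceMeasure;
import org.apache.mahout.math.Vector;

public class SeedClusterWriter {
    public static void writeSeedClusters(List<Vector> vectors, int k,
                                         FileSystem fs, Configuration conf) throws IOException {
        Path path = new Path("testdata/clusters/part-00000");
        SequenceFile.Writer writer = new SequenceFile.Writer(fs, conf, path,
                Text.class, Kluster.class);
        try {
            for (int i = 0; i < k; i++) {
                // use the first k input points as the initial centroids, as in the example
                Vector vec = vectors.get(i);
                Kluster cluster = new Kluster(vec, i, new EuclideanDistanceMeasure());
                writer.append(new Text(cluster.getIdentifier()), cluster);
            }
        } finally {
            writer.close();
        }
    }
}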

I could not believe the code in the book was wrong. I also managed to get it working without using Maven. I describe this more fully here, but basically I did it with a user library: Using mahout in eclipse WITHOUT USING MAVEN

Update: OK, the book is not wrong, just old. This page contains a link to the updated code for the book:

http://alexott.blogspot.co.uk/2012/07/getting-started-with-examples-from.html

Regarding "hadoop - Cannot instantiate the type Cluster in Mahout, KMeans clustering example", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/16954154/
