java - Apache Spark Java heap space error during matrix multiplication


I'm using Spark 2.0.1 with two workers (one executor per worker), 20 GB each, and running the following code:

JavaRDD<MatrixEntry> entries = ...; // filling the data
CoordinateMatrix cmatrix = new CoordinateMatrix(entries.rdd());
BlockMatrix matrix = cmatrix.toBlockMatrix(100, 1000); // rowsPerBlock = 100, colsPerBlock = 1000
BlockMatrix cooc = matrix.transpose().multiply(matrix); // fails with OOM here

My matrix contains 10,000,000 non-empty cells (each equal to 1.0) and has approximately 3,000 columns. The data is not that big, but during the multiplication I always get:

17/01/24 08:03:10 WARN TaskMemoryManager: leak 1322.6 MB memory from org.apache.spark.util.collection.ExternalAppendOnlyMap@649e7019
17/01/24 08:03:10 ERROR Executor: Exception in task 1.0 in stage 57.0 (TID 83664)
java.lang.OutOfMemoryError: Java heap space
at org.apache.spark.mllib.linalg.DenseMatrix$.zeros(Matrices.scala:453)
at org.apache.spark.mllib.linalg.Matrix$class.multiply(Matrices.scala:101)
at org.apache.spark.mllib.linalg.SparseMatrix.multiply(Matrices.scala:565)
at org.apache.spark.mllib.linalg.distributed.BlockMatrix$$anonfun$23$$anonfun$apply$9$$anonfun$apply$11.apply(BlockMatrix.scala:483)
at org.apache.spark.mllib.linalg.distributed.BlockMatrix$$anonfun$23$$anonfun$apply$9$$anonfun$apply$11.apply(BlockMatrix.scala:480)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.immutable.List.map(List.scala:285)
at org.apache.spark.mllib.linalg.distributed.BlockMatrix$$anonfun$23$$anonfun$apply$9.apply(BlockMatrix.scala:480)
at org.apache.spark.mllib.linalg.distributed.BlockMatrix$$anonfun$23$$anonfun$apply$9.apply(BlockMatrix.scala:479)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at org.apache.spark.util.collection.CompactBuffer$$anon$1.foreach(CompactBuffer.scala:115)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at org.apache.spark.util.collection.CompactBuffer.foreach(CompactBuffer.scala:30)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at org.apache.spark.util.collection.CompactBuffer.flatMap(CompactBuffer.scala:30)
at org.apache.spark.mllib.linalg.distributed.BlockMatrix$$anonfun$23.apply(BlockMatrix.scala:479)
at org.apache.spark.mllib.linalg.distributed.BlockMatrix$$anonfun$23.apply(BlockMatrix.scala:478)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:192)
at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

By now I have even tried using only a single core per executor. What could be the problem? And how can I debug it and find the root cause? Thanks.

Update: details of the failed stage:

org.apache.spark.rdd.RDD.flatMap(RDD.scala:374)
org.apache.spark.mllib.linalg.distributed.BlockMatrix.multiply(BlockMatrix.scala:478)
MyClass.generate(SimilarityGenerator.java:57)
MyClass.main(GenerateSimilarity.java:54)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Best Answer

It appears that sparse matrix multiplication is not implemented the way I assumed: Spark simply multiplies the block matrices as dense blocks, even when almost all of their cells are zero. We implemented our own multiplication. Here is the Scala code (so it was also copied from somewhere):

def multiply(left: CoordinateMatrix, right: CoordinateMatrix): CoordinateMatrix = {
  // Key the left entries by column and the right entries by row, so the join
  // pairs up exactly the (i, j) and (j, k) entries whose product contributes
  // to output cell (i, k).
  val leftEntries = left.entries.map({ case MatrixEntry(i, j, v) => (j, (i, v)) })
  val rightEntries = right.entries.map({ case MatrixEntry(j, k, w) => (j, (k, w)) })

  val productEntries = leftEntries
    .join(rightEntries)
    .map({ case (_, ((i, v), (k, w))) => ((i, k), v * w) })
    .reduceByKey(_ + _) // sum the partial products for each output cell
    .map({ case ((i, k), sum) => MatrixEntry(i, k, sum) })

  new CoordinateMatrix(productEntries)
}
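
For illustration, here is a minimal usage sketch showing how this helper could replace the BlockMatrix pipeline from the question. The local-mode setup and the tiny entries RDD are invented for the example (only the multiply helper and the A-transpose-times-A computation come from the post); CoordinateMatrix.transpose() simply swaps the row and column index of every entry, so the whole pipeline stays in sparse COO form:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.distributed.{CoordinateMatrix, MatrixEntry}

// Hypothetical local context, just to make the sketch self-contained;
// in spark-shell, `sc` already exists.
val sc = new SparkContext(new SparkConf().setAppName("sparse-multiply").setMaster("local[*]"))

// Toy stand-in for the question's 10M-entry RDD: a 3 x 2 matrix of ones.
val entries = sc.parallelize(Seq(
  MatrixEntry(0, 0, 1.0), MatrixEntry(1, 1, 1.0), MatrixEntry(2, 0, 1.0)))

val a = new CoordinateMatrix(entries)

// Same computation as matrix.transpose().multiply(matrix) in the question,
// using the entry-level multiply defined above instead of BlockMatrix.
val cooc = multiply(a.transpose(), a)
cooc.entries.collect().foreach(println)
// prints MatrixEntry(0,0,2.0) and MatrixEntry(1,1,1.0) for this toy input

The reason this avoids the OutOfMemoryError is visible in the stack trace above: BlockMatrix.multiply allocates a dense output block for every pair of blocks it multiplies (the DenseMatrix$.zeros frame), so with 1000-wide blocks each partial product is a dense array of doubles on the order of 8 MB regardless of sparsity, and many of these are buffered per task before the shuffle. The join/reduceByKey version only ever materializes the non-zero products.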

Regarding "java - Apache Spark Java heap space error during matrix multiplication", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41834636/
