
scala - How do I use spark-testing-base with Maven?


I want to learn how to test my Spark code. Searching on Google, I found spark-testing-base. Now I would like to try it out, but I can't get it to run with Maven.

First, I have Scala code that packages fine with mvn package, and the project has the following structure:

pom.xml
src/main/scala/com/test/spark/mycode.scala
src/test/scala/com/test/spark/test.scala
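For completeness, mycode.scala is just ordinary Spark code. The real file is not shown here; a hypothetical minimal version, only for illustration, could look like this:

// src/main/scala/com/test/spark/mycode.scala -- hypothetical placeholder, not the actual code
package com.test.spark

import org.apache.spark.{SparkConf, SparkContext}

object mycode {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("mycode")
    val sc = new SparkContext(conf)
    // trivial job: count a small RDD
    val count = sc.parallelize(List(1, 2, 3, 4)).count()
    println(s"count = $count")
    sc.stop()
  }
}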

The problem is that when I run mvn test, it does not run the test under src/test/scala/com/test/spark/test.scala. The test is actually the first example from the spark-testing-base wiki. Maven's output is:

[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ mycode ---
[INFO] Nothing to compile - all classes are up to date

And my pom.xml looks like this:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.test.spark</groupId>
<artifactId>mycode</artifactId>
<version>0.0.1</version>
<name>${project.artifactId}</name>
<description>Simple wtest</description>
<inceptionYear>2017</inceptionYear>

<!-- change from 1.6 to 1.7 depending on Java version -->
<properties>
<maven.compiler.source>1.6</maven.compiler.source>
<maven.compiler.target>1.6</maven.compiler.target>
<encoding>UTF-8</encoding>
<scala.version>2.11.5</scala.version>
<scala.compat.version>2.11</scala.compat.version>
<spark.version>1.6.1</spark.version>
</properties>

<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>

<!-- Spark dependency -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- Spark sql dependency -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- Spark hive dependency -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- spark-testing-base dependency -->
<dependency>
<groupId>com.holdenkarau</groupId>
<artifactId>spark-testing-base_2.11</artifactId>
<version>${spark.version}_0.6.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalactic</groupId>
<artifactId>scalactic_2.11</artifactId>
<version>3.0.1</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.11</artifactId>
<version>3.2.0-SNAP5</version>
<scope>test</scope>
</dependency>
</dependencies>

<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<!-- Create JAR with all dependencies -->
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
<plugin>
<!-- see http://davidb.github.com/scala-maven-plugin -->
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
<scalaCompatVersion>${scala.compat.version}</scalaCompatVersion>
</configuration>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- for testing scala code -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.20</version>
<configuration>
<argLine>-Xmx2048m -XX:MaxPermSize=2048m</argLine>
</configuration>
</plugin>
</plugins>
</build>
</project>

Any help is appreciated. Thanks!

Edit 1:

I added the following as test.scala, written so that the test should not pass:

import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

class test extends FunSuite with SharedSparkContext {
  test("test initializing spark context") {
    val list = List(1, 2, 3, 4)
    val rdd = sc.parallelize(list)
    val list2 = List(1, 2, 3, 4, 5)

    // deliberately wrong: the RDD has 4 elements, list2 has 5
    assert(rdd.count === list2.length)
  }
}

But I still get:

[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ mycode ---
[INFO] Nothing to compile - all classes are up to date
[INFO]

[INFO] --- maven-surefire-plugin:2.20:test (default-test) @ mycode ---
[INFO]------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO]------------------------------------------------------------------------

Even though the comparison in the assert is false, the build succeeds, as if the test never ran. :(

Edit 2: I changed the pom to use this plugin configuration, and still nothing changed:

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.20</version>
<configuration>
<includes>
<include>**/*Test.scala</include>
</includes>
<argLine>-Xmx2048m -XX:MaxPermSize=2048m</argLine>
</configuration>
</plugin>

Edit 3: I added <goal>testCompile</goal> to the net.alchim31.maven scala-maven-plugin so that the test classes get compiled (see the snippet below). The tests still do not run, though. What is missing to finally get them running?
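The execution added is the same idea as the scala-test-compile execution that also appears in the Edit 4 pom below, roughly:

<execution>
  <id>scala-test-compile</id>
  <phase>process-test-resources</phase>
  <goals>
    <goal>testCompile</goal>
  </goals>
</execution>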

Edit 4: Thanks for the comments. I changed the pom to the following:

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.test.spark</groupId>
<artifactId>testJavaAndScala</artifactId>
<version>1.0-SNAPSHOT</version>
<name>Test for Java + Scala compilation</name>
<description>Test for Java + Scala compilation</description>
<inceptionYear>2017</inceptionYear>

<properties>
<scala.version>2.11.5</scala.version>
<scala.compat.version>2.11</scala.compat.version>
<spark.version>1.6.1</spark.version>
</properties>

<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<!-- Spark dependency -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- Spark sql dependency -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- Spark hive dependency -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- Spark-testing-base -->
<dependency>
<groupId>com.holdenkarau</groupId>
<artifactId>spark-testing-base_${scala.compat.version}</artifactId>
<version>${spark.version}_0.6.0</version>
<scope>test</scope>
</dependency>
</dependencies>

<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.1</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.0.2</version>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>

</project>

And this is the mvn test output:

[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Test for Java + Scala compilation 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ testJavaAndScala ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory /home/src/main/resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.1:add-source (scala-compile-first) @ testJavaAndScala ---
[INFO] Add Source directory: /home/src/main/scala
[INFO] Add Test Source directory: /home/src/test/scala
[INFO]
[INFO] --- scala-maven-plugin:3.2.1:compile (scala-compile-first) @ testJavaAndScala ---
[INFO] /home/src/main/scala:-1: info: compiling
[INFO] Compiling 1 source files to /home/target/classes at 1497938635103
[INFO] prepare-compile in 0 s
[INFO] compile in 3 s
[INFO]
[INFO] --- maven-compiler-plugin:2.0.2:compile (default-compile) @ testJavaAndScala ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-compiler-plugin:2.0.2:compile (default) @ testJavaAndScala ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ testJavaAndScala ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 0 resource
[INFO]
[INFO] --- scala-maven-plugin:3.2.1:testCompile (scala-test-compile) @ testJavaAndScala ---
[INFO] /home/src/test/scala:-1: info: compiling
[INFO] Compiling 1 source files to /home/target/test-classes at 1497938639022
[INFO] prepare-compile in 0 s
[INFO] compile in 5 s
[INFO]
[INFO] --- maven-compiler-plugin:2.0.2:testCompile (default-testCompile) @ testJavaAndScala ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ testJavaAndScala ---
[INFO] Surefire report directory: /home/target/surefire-reports

-------------------------------------------------------
T E S T S
-------------------------------------------------------

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14.854 s
[INFO] Finished at: 2017-06-20T08:04:05+02:00
[INFO] Final Memory: 28M/360M
[INFO] ------------------------------------------------------------------------

I can find my test class in the target directory, but it does not seem to be executed. Why is that? As I said, this is just a minimal example; it does not depend on my actual main code.

Best Answer

Surefire will not find those ScalaTest suites by itself. You need to use the scalatest-maven-plugin; instructions are provided on their website.

First, modify your pom like this:

<!-- disable surefire -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.7</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<!-- enable scalatest -->
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>1.0</version>
<configuration>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<junitxml>.</junitxml>
<filereports>WDF TestSuite.txt</filereports>
</configuration>
<executions>
<execution>
<id>test</id>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
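With Surefire skipped and the scalatest-maven-plugin bound to the test goal, mvn test should now execute the suites. As a sanity check, a minimal suite along these lines (essentially the Edit 1 example with the lengths actually matching; the class name MyCodeTest is arbitrary) should then show up in the report:

import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

class MyCodeTest extends FunSuite with SharedSparkContext {
  test("parallelize preserves the number of elements") {
    val list = List(1, 2, 3, 4)
    val rdd = sc.parallelize(list)
    // compare against the same list this time, so the test passes
    assert(rdd.count === list.length)
  }
}

Note that ScalaTest itself still has to be on the test classpath: either keep the scalatest dependency from the first pom, or rely on spark-testing-base, which should pull it in transitively.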

Regarding "scala - How do I use spark-testing-base with Maven?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44536150/
