
java - Maven: Building Scala and Java Code Together into a Fat JAR

Reposted. Author: 行者123. Updated: 2023-12-02 03:15:03

I have a Scala project that builds into a fat JAR. Today I needed to add some Java classes to the project, but now my Maven build fails.

My project structure looks (roughly) like this:

.
├── src
│   └── main
│       ├── resources
│       │   └── Log4j.properties
│       ├── java
│       │   └── com
│       │       └── myorg
│       │           └── myproject
│       │               └── MyPublicJavaClass.java
│       └── scala
│           └── com
│               └── myorg
│                   └── myproject
│                       └── spark
│                           └── Main.scala
└── pom.xml
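The question never shows the contents of MyPublicJavaClass.java, so here is a minimal, purely hypothetical sketch of the kind of plain Java class the Scala code might reference (only the class name comes from the tree above; the constructor and method are invented placeholders):

```java
// Hypothetical placeholder for src/main/java/com/myorg/myproject/MyPublicJavaClass.java.
// In the real project this file would begin with `package com.myorg.myproject;`;
// the package line is dropped here so the snippet compiles standalone.
public class MyPublicJavaClass {
    private final String name;

    public MyPublicJavaClass(String name) {
        this.name = name;
    }

    // A plain accessor the Scala side could call,
    // e.g. `new MyPublicJavaClass("spark").greet()` from Main.scala.
    public String greet() {
        return "Hello, " + name;
    }
}
```

Any public Java class like this is callable from Scala with no interop glue; the build failure below is purely about Maven not compiling the Java sources, not about the language boundary.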

This is what my POM file looks like:

<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <name>myproject</name>
  <url>http://maven.apache.org</url>

  <groupId>com.myorg</groupId>
  <artifactId>myproject</artifactId>
  <packaging>jar</packaging>
  <version>0.1.0-RELEASE</version>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.tools.version>2.11</scala.tools.version>
    <scala.version>2.11.8</scala.version>
    <spark.version>2.4.0</spark.version>
    <aws.sdk.version>1.11.553</aws.sdk.version>
  </properties>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <resources>
      <resource>
        <directory>src/main/resources</directory>
      </resource>
    </resources>
    <testResources>
      <testResource>
        <directory>src/test/resources</directory>
      </testResource>
    </testResources>

    <plugins>
      <!-- I added this plugin today to try and make it compile the Java code -->
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <version>3.0.0</version>
        <executions>
          <execution>
            <phase>generate-sources</phase>
            <configuration>
              <sources>
                <source>src/main/java</source>
              </sources>
            </configuration>
          </execution>
        </executions>
      </plugin>

      <!-- This compiles the Scala code -->
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <version>2.15.2</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

      <!-- This builds the fat JAR -->
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <archive>
            <manifest>
              <mainClass>com.myorg.myproject.spark.Main</mainClass>
            </manifest>
          </archive>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.tools.version}</artifactId>
      <version>${spark.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_${scala.tools.version}</artifactId>
      <version>${spark.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-core</artifactId>
      <version>${aws.sdk.version}</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-dynamodb</artifactId>
      <version>${aws.sdk.version}</version>
    </dependency>
    <dependency>
      <groupId>com.amazon.emr</groupId>
      <artifactId>emr-dynamodb-hadoop</artifactId>
      <version>4.8.0</version>
    </dependency>
    <dependency>
      <groupId>com.github.scopt</groupId>
      <artifactId>scopt_${scala.tools.version}</artifactId>
      <version>3.7.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-scala</artifactId>
      <version>11.0</version>
      <type>pom</type>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.8.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>

Before I added the Java class and the build-helper-maven-plugin today, Maven built this project without any problems. But now it seems I haven't configured that plugin correctly, or perhaps I'm not using the right plugin at all?

My Scala code tries to use an object of type MyPublicJavaClass, so the build error I now see from Maven looks like this:

[ERROR] ~/src/main/scala/com/myorg/myproject/spark/Main.scala:227: error: not found: type MyPublicJavaClass

...

[ERROR] Failed to execute goal org.scala-tools:maven-scala-plugin:2.15.2:compile (default) on project myproject: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1(Exit value: 1) -> [Help 1]

I thought build-helper-maven-plugin would add the Java source directory to the list of sources to build before compilation, but apparently it doesn't. How can I fix this?

Best Answer

You are using a very old plugin for Scala compilation (its latest version, 2.15.2, was released on Feb 6, 2011).

I suggest you first upgrade to a newer plugin, such as scala-maven-plugin (latest version 4.0.2, released May 11, 2019).

Then you can find an example of mixing Scala/Java sources in the docs. In this case there is no need to use build-helper-maven-plugin, nor to configure sourceDirectory or testSourceDirectory. Check this simple pom.xml with that plugin (when I reproduced the problem locally, I just removed the unused dependencies from the example you provided):

<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <name>myproject</name>
  <url>http://maven.apache.org</url>

  <groupId>com.myorg</groupId>
  <artifactId>myproject</artifactId>
  <packaging>jar</packaging>
  <version>0.1.0-RELEASE</version>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.tools.version>2.11</scala.tools.version>
    <scala.version>2.11.8</scala.version>
  </properties>

  <build>
    <plugins>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>4.0.2</version>
        <executions>
          <execution>
            <id>scala-compile-first</id>
            <phase>process-resources</phase>
            <goals>
              <goal>add-source</goal>
              <goal>compile</goal>
            </goals>
          </execution>
          <execution>
            <id>scala-test-compile</id>
            <phase>process-test-resources</phase>
            <goals>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

      <!-- This builds the fat JAR -->
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <archive>
            <manifest>
              <mainClass>com.myorg.myproject.spark.Main</mainClass>
            </manifest>
          </archive>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</project>
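With a POM like this, the build and run steps would look roughly as follows. This is a sketch: the artifact name is derived from the artifactId and version above, and because scala-library (and, in the full POM, the Spark dependencies) are marked `provided`, the fat JAR is intended for spark-submit rather than plain `java -jar`:

```shell
# Build; the assembly plugin's jar-with-dependencies descriptor
# appends a classifier suffix to the artifact name
mvn clean package

# Expected artifact path (derived from artifactId + version in the POM)
ls target/myproject-0.1.0-RELEASE-jar-with-dependencies.jar

# "provided" dependencies are supplied by the Spark runtime,
# so submit the JAR to Spark instead of running it directly
spark-submit \
  --class com.myorg.myproject.spark.Main \
  target/myproject-0.1.0-RELEASE-jar-with-dependencies.jar
```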

Regarding "java - Maven: Building Scala and Java Code Together into a Fat JAR", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/56386320/
