
java - Running a project with Spark and Maven

Reposted. Author: 行者123. Updated: 2023-12-02 07:41:40

I am trying to run my Java class, which reads a GML file using TinkerPop's GMLReader. The problem is that when I try to run it with Spark, it throws an exception.
I wrote some simple code as a test:

import java.io.IOException;

import com.tinkerpop.blueprints.impls.tg.TinkerGraph;
import com.tinkerpop.blueprints.util.io.gml.GMLReader;

public static void main(String[] args) throws IOException {
    TinkerGraph graphs = new TinkerGraph();
    String in = "/home/salma/Desktop/celegansneural.gml";
    GMLReader.inputGraph(graphs, in);
    System.out.println(graphs);
}
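Before fixing anything, it can help to confirm that the Blueprints classes really are missing from the runtime classpath rather than from the code. A minimal diagnostic sketch (`ClasspathCheck` is a hypothetical helper class, not part of the original project; the class name checked is the one that appears in the stack trace below):

```java
// Diagnostic sketch: check whether a class is visible to the running JVM.
// Run it with the same spark-submit command as the real job; if it prints
// "NOT on the classpath", the jar containing TinkerGraph was never shipped.
public class ClasspathCheck {
    public static void main(String[] args) {
        String cls = "com.tinkerpop.blueprints.impls.tg.TinkerGraph";
        try {
            // Class.forName triggers loading via the current classloader
            Class.forName(cls);
            System.out.println(cls + " is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println(cls + " is NOT on the classpath");
        }
    }
}
```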

The command I use to run the class:

root@salma-SATELLITE-C855-1EQ:/usr/local/spark# ./bin/spark-submit --class graph_example.WordCount --master local[2] ~/workspace/graph_example/target/graph_example-0.0.1-SNAPSHOT.jar

The error:

Exception in thread "main" java.lang.NoClassDefFoundError:
com/tinkerpop/blueprints/impls/tg/TinkerGraph
at graph_example.WordCount.main(WordCount.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.tinkerpop.blueprints.impls.tg.TinkerGraph
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

Best Answer

You have to provide the dependency that contains the TinkerGraph implementation. If I remember correctly, you need the blueprints-core jar.

Then run spark-submit as usual, but add --jars /some/location/blueprints-core-2.6.0.jar
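Putting the two pieces together, the corrected invocation would look roughly like this. The jar path is a placeholder taken from the answer; point it at wherever blueprints-core-2.6.0.jar actually lives on your machine:

```shell
# Sketch of the full corrected command: the original spark-submit invocation
# plus the --jars flag, which ships the listed jars to the driver and executors.
./bin/spark-submit \
  --class graph_example.WordCount \
  --master local[2] \
  --jars /some/location/blueprints-core-2.6.0.jar \
  ~/workspace/graph_example/target/graph_example-0.0.1-SNAPSHOT.jar
```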

This is explained in the official documentation:

When using spark-submit, the application jar along with any jars included with the --jars option will be automatically transferred to the cluster.

Regarding "java - Running a project with Spark and Maven", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/39793858/
