apache-spark - Spark packages flag vs. jars directory?


In Spark, what is the difference between adding a JAR to the classpath via the --packages argument and placing the JAR directly in the $SPARK_HOME/jars directory?

Best Answer

TL;DR: --jars is for local or remote jar files specified by URL and does not resolve dependencies; --packages is for Maven coordinates and does resolve their transitive dependencies. A minimal sketch of both invocations follows the quoted docs. From the docs:

  • --jars

    When using spark-submit, the application jar along with any jars included with the --jars option will be automatically transferred to the cluster. URLs supplied after --jars must be separated by commas. That list is included in the driver and executor classpaths. Directory expansion does not work with --jars.

  • --packages

    Users may also include any other dependencies by supplying a comma-delimited list of Maven coordinates with --packages. All transitive dependencies will be handled when using this command. Additional repositories (or resolvers in SBT) can be added in a comma-delimited fashion with the flag --repositories. (Note that credentials for password-protected repositories can be supplied in some cases in the repository URI, such as in https://user:password@host/.... Be careful when supplying credentials this way.) These commands can be used with pyspark, spark-shell, and spark-submit to include Spark Packages.
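As a concrete illustration, here is a minimal sketch of the two invocations. The application class, jar path, and Maven coordinate below are placeholders chosen for the example, not taken from the original question:

```sh
# --jars: ship the listed jar(s) as-is; no transitive dependencies are resolved.
# /opt/libs/my-udfs.jar and com.example.App are hypothetical names.
spark-submit \
  --class com.example.App \
  --jars /opt/libs/my-udfs.jar \
  app.jar

# --packages: resolve a Maven coordinate plus all of its transitive
# dependencies from Maven Central (or repositories given via --repositories).
spark-submit \
  --class com.example.App \
  --packages org.apache.spark:spark-avro_2.12:3.5.0 \
  app.jar
```

By contrast, a jar copied into $SPARK_HOME/jars lands on the classpath of every application launched from that installation, again with no dependency resolution, so any transitive dependencies would have to be copied there by hand as well.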

  • Regarding "apache-spark - Spark packages flag vs. jars directory?", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/50333750/
