
apache-spark - How to view aggregated logs for a Spark standalone cluster

Reposted. Author: 行者123. Updated: 2023-12-04 04:35:47

When Spark runs on YARN, I can simply use yarn logs -applicationId <appId> to view the aggregated logs after a Spark job finishes. What is the equivalent for a Spark standalone cluster?

Best answer

Via the web interface:

Spark’s standalone mode offers a web-based user interface to monitor the cluster. The master and each worker has its own web UI that shows cluster and job statistics. By default you can access the web UI for the master at port 8080. The port can be changed either in the configuration file or via command-line options.
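
For example, the master web UI port can be changed either in conf/spark-env.sh or on the command line when starting the master. A minimal sketch, assuming a Spark standalone installation under $SPARK_HOME; the port value 8090 is only illustrative:

    # conf/spark-env.sh — choose a different port for the master web UI
    export SPARK_MASTER_WEBUI_PORT=8090

    # or pass the port when starting the master manually
    $SPARK_HOME/sbin/start-master.sh --webui-port 8090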

In addition, detailed log output for each job is also written to the work directory of each slave node (SPARK_HOME/work by default). You will see two files for each job, stdout and stderr, with all output it wrote to its console.
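
As a rough sketch of getting an "aggregated" view of those per-application files, you could collect them from each worker over ssh. This assumes passwordless ssh to the workers, a worker list in conf/workers (conf/slaves on older releases), and a placeholder application id that you would replace with the one shown in the master web UI:

    # Placeholder application id — take the real one from the master web UI
    APP_ID=app-20231204123456-0000

    # Dump that application's executor logs from every worker host
    while read -r host; do
      echo "==== $host ===="
      ssh "$host" "cat \$SPARK_HOME/work/$APP_ID/*/stdout \$SPARK_HOME/work/$APP_ID/*/stderr"
    done < "$SPARK_HOME/conf/workers"

Each application gets a directory under SPARK_HOME/work on every worker that ran one of its executors, with one subdirectory per executor containing stdout and stderr.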



For more information, see Monitoring and Instrumentation.

Regarding apache-spark - How to view aggregated logs for a Spark standalone cluster, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/46004528/
