
Sonarqube 6.1 com.mysql.jdbc.PacketTooBigException : Packet for query is too large

Reposted · Author: 行者123 · Updated: 2023-12-04 05:06:19

Dear SonarQube community,

Since updating to SonarQube 6.1, we have been getting an error in SonarQube saying that the packet for a query is too large.
Our setup: Jenkins checks out the PHP source code, then the SonarQube Scanner analyzes the code and communicates with the SonarQube server. The process fails with the following log output in Jenkins:
org.sonarqube.ws.client.HttpException: Error 500 on http://URL-TO-SONAR/sonar/api/ce/submit?projectKey=lhind.php.PRJName&projectName=PRJName : {"errors":[{"msg":"Fail to insert data of CE task AViRLtiaB_5m8twj_1J3"}]}

  • Jenkins version: 2.19.3
  • SonarQube version: 6.1
  • SonarQube Scanner: 2.8
  • MySQL version: 5.6.34
  • Driver: MySQL Connector/J
  • Driver version: mysql-connector-java-5.1.39
  • MySQL variable max_allowed_packet = 16M (increased from 4M)
  • MySQL variable innodb_log_file_size = 128M (increased from 48M)
  • Sonar JDBC connection string:
    sonar.jdbc.url=jdbc:mysql://DB-URL:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance&maxAllowedPacket=16777216

  • We have already increased the maximum packet size and innodb_log_file_size. We did not run into this problem with the same amount of code before SonarQube 6.1.
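As a side note, the my.cnf shorthand in the list above maps to the raw byte counts used in the JDBC URL. A minimal sketch of that conversion (`mysql_size_to_bytes` is a hypothetical helper for illustration, not part of MySQL or SonarQube):

```python
# Convert MySQL size shorthand (as used in my.cnf) to raw bytes.
# Hypothetical helper for illustration only.
def mysql_size_to_bytes(value: str) -> int:
    units = {"K": 1024, "M": 1024 ** 2, "G": 1024 ** 3}
    suffix = value[-1].upper()
    if suffix in units:
        return int(value[:-1]) * units[suffix]
    return int(value)

# "16M" in my.cnf is the same limit as maxAllowedPacket=16777216 in the JDBC URL.
print(mysql_size_to_bytes("16M"))   # 16777216
print(mysql_size_to_bytes("128M"))  # 134217728
```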

    Any ideas?

    On the SonarQube side, we get the following exception in the sonar.log file:

    2016.11.23 12:35:16 ERROR web[][o.s.s.w.WebServiceEngine] Fail to process request http://SONAR-URL.de:8443/sonar/api/ce/submit?projectKey=lhind.php.PRJName&projectName=PRJName
    java.lang.IllegalStateException: Fail to insert data of CE task AViRLtiaB_5m8twj_1J3
    at org.sonar.db.ce.CeTaskInputDao.insert(CeTaskInputDao.java:56) ~[sonar-db-6.1.jar:na]
    (deleted because too much text ...)
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
    **Caused by: com.mysql.jdbc.PacketTooBigException: Packet for query is too large (24313938 > 16777216). You can change this value on the server by setting the max_allowed_packet' variable.**
    at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:3671) ~[mysql-connector-java-5.1.39.jar:5.1.39]
    (deleted because too much text ...)
    at org.sonar.db.ce.CeTaskInputDao.insert(CeTaskInputDao.java:53) ~[sonar-db-6.1.jar:na]
    ... 34 common frames omitted

    Best Answer

    Increase the maximum allowed packet size in MySQL on both the client and the server to resolve this problem.

    Server

    See the "Resolution" section at the link below for details on how to do this on the server. A value of 256MB is recommended there. In the stack trace above, the packet size is about 24MB.

    https://confluence.atlassian.com/confkb/exceeds-max-allowed-packet-for-mysql-179443425.html
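Plugging in the numbers from the stack trace confirms why the write fails and why 256MB is a comfortable ceiling (a quick check; the byte values are taken directly from the error message above):

```python
# Sizes from the PacketTooBigException above, in bytes.
packet_size = 24_313_938           # the rejected CE task payload (~23.2 MiB)
server_limit = 16_777_216          # max_allowed_packet = 16M at the time
suggested_limit = 256 * 1024 ** 2  # the 256MB value recommended above

print(packet_size > server_limit)      # True -> the 16M limit rejects the packet
print(packet_size <= suggested_limit)  # True -> a 256MB limit would accept it
```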

    I like the link above because it describes how to increase the value without stopping the database (in case that matters to you).
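The runtime change described in that link can be sketched as follows (a sketch, assuming a MySQL 5.6 server and a user with SUPER privilege; 268435456 bytes is the 256MB value recommended above):

```sql
-- Check the current server-side limit (in bytes)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it at runtime, without restarting the database.
-- 268435456 bytes = 256MB; takes effect for NEW connections only.
SET GLOBAL max_allowed_packet = 268435456;
```

To make the change survive a server restart, also set `max_allowed_packet = 256M` in the `[mysqld]` section of my.cnf.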

    Client

    On the client side, increase the value of the maxAllowedPacket parameter in the SonarQube JDBC URL.
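For the setup shown in the question, that amounts to raising maxAllowedPacket in the connection string to match the new server limit (a sketch; the 268435456 value assumes the 256MB server setting recommended above):

```
sonar.jdbc.url=jdbc:mysql://DB-URL:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance&maxAllowedPacket=268435456
```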

    References

    See the following links in the MySQL documentation for more details.

  • http://dev.mysql.com/doc/refman/5.7/en/packet-too-large.html

  • Both the client and the server have their own max_allowed_packet variable, so if you want to handle big packets, you must increase this variable both in the client and in the server.


  • https://dev.mysql.com/doc/connector-j/5.1/en/connector-j-reference-configuration-properties.html

  • maxAllowedPacket

    Maximum allowed packet size to send to server. If not set, the value of system variable 'max_allowed_packet' will be used to initialize this upon connecting. This value will not take effect if set larger than the value of 'max_allowed_packet'. Also, due to an internal dependency with the property "blobSendChunkSize", this setting has a minimum value of "8203" if "useServerPrepStmts" is set to "true".

    Default: -1

    Since version: 5.1.8

    Regarding Sonarqube 6.1 com.mysql.jdbc.PacketTooBigException : Packet for query is too large, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40766080/
