
seo - Very frequent GoogleBot crawling is overloading the server


My site has about 500,000 pages. I created a sitemap.xml and listed all the pages in it (I know about the 50,000-link limit per file, so I have 10 sitemaps). I submitted the sitemaps in Webmaster Tools and everything seems fine (no errors, and I can see the submitted and indexed links). However, I keep running into crawling problems: GoogleBot crawls the same page 4 times a day, even though in sitemap.xml I indicated that the page only changes yearly.

Here is an example:

<url>
<loc>http://www.domain.com/destitution</loc>
<lastmod>2015-01-01T16:59:23+02:00</lastmod>
<changefreq>yearly</changefreq>
<priority>0.1</priority>
</url>

1) So how can I tell GoogleBot not to crawl so frequently, since it is overloading my server?
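
A note on question 1: as far as I know, robots.txt cannot slow GoogleBot down directly, because Google ignores the Crawl-delay directive; Google's crawl rate is adjusted through the Webmaster Tools site settings mentioned in the answer below. What robots.txt can do is stop crawling of URL patterns that don't need to be indexed at all. A minimal sketch (the /search/ path is purely a hypothetical example):

User-agent: *
# Crawl-delay is honored by some bots (e.g. Bing), but GoogleBot ignores it
Crawl-delay: 10
# Block crawling of URL patterns that don't need to be indexed (hypothetical path)
Disallow: /search/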

2) The site has multiple pages like http://www.domain.com/destitution1 , http://www.domain.com/destitution2 ... and I point their canonical URL to http://www.domain.com/destitution . Could that be the reason for the repeated crawling?
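
For reference on question 2, a canonical hint is usually expressed as a link element in the head of each variant page; a sketch using the question's URLs is below. Note that GoogleBot still has to fetch each variant to see the tag, so canonicalization consolidates indexing signals but does not by itself reduce crawl volume:

<!-- placed in the <head> of http://www.domain.com/destitution1, /destitution2, ... -->
<link rel="canonical" href="http://www.domain.com/destitution" />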

Best Answer

You can report this to Google's crawl team; see here:

In general, specific Googlebot crawling-problems like this are best handled through Webmaster Tools directly. I'd go through the Site Settings for your main domain, Crawl Rate, and then use the "Report a problem with Googlebot" form there. The submissions through this form go to our Googlebot team, who can work out what (or if anything) needs to be changed on our side. They generally won't be able to reply, and won't be able to process anything other than crawling issues, but they sure know Googlebot and can help tweak what it does.

https://www.seroundtable.com/google-crawl-report-problem-19894.html
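
Besides the report form, Google's crawler documentation also describes returning 500, 503, or 429 responses as a way to make GoogleBot back off temporarily when the server is overloaded. A sketch of such a response follows (the Retry-After header is optional and its value here is an arbitrary example); keep in mind that serving 503s for an extended period can cause pages to drop out of the index, so this is only an emergency measure:

HTTP/1.1 503 Service Unavailable
Retry-After: 3600
Content-Type: text/html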

Regarding "seo - Very frequent GoogleBot crawling is overloading the server", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/29965131/
