Googlebot’s adjustable crawl speed function is scrapped

January 12, 2024 · Posted by Maisie Lloyd · News, Round-Up

On January 8th, 2024, Google announced that the adjustable crawl rate limiter tool for its Googlebot crawler had been removed.

In the statement, Google explained that the tool was no longer needed as the crawler has evolved, and that site owners still have a range of other controls available to them.

Googlebot is designed to crawl only a few pages on a server at a time, so that it does not consume all of the available bandwidth. For sites undergoing heavy crawling across a large number of pages, the server may struggle to handle regular visitors at the same time. The adjustable speed tool allowed site owners to slow the crawler so that user traffic could return to normal.

Why?

The removal of the adjustable function comes down to the crawler's ability to slow itself. Google has stated that its algorithm can recognise when a server is reaching its limit, and the crawler automatically slows down to accommodate the server's user traffic.

Google previously noted that usage of the tool was limited, and that those who did use it typically set it to the lowest rate.

Googlebot has also been set to crawl at a slower default speed than before, helping site servers avoid reaching capacity. This is a significant change from 2008, when the tool was first introduced as a solution for publishers whose bandwidth was being maxed out.

What does this mean for site owners?

For site owners who still face issues after the feature's removal, Google has provided documentation on reducing the Googlebot crawl rate. Users who try to access the old tool are now redirected to a page noting that it is offline.

Site owners can still limit the crawl rate in urgent situations, though only for a limited period, typically between a few hours and one to two days.

This is done by responding to crawl requests with a 503, 429, or 500 HTTP status code instead of 200. These codes should only be served for a brief period: if Googlebot sees them for an extended period (more than two days), the affected URLs risk being removed from Google's index.
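
To illustrate the idea, here is a minimal sketch using Python's standard http.server module. The user-agent check, port, and Retry-After value are illustrative assumptions for this sketch, not anything prescribed by Google; a real site would apply this at the web server or CDN level.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class ThrottleHandler(BaseHTTPRequestHandler):
    """Sketch: temporarily answer Googlebot with 503 during an overload."""

    def do_GET(self):
        user_agent = self.headers.get("User-Agent", "")
        if "Googlebot" in user_agent:
            # 503 signals a temporary overload. Keep this in place for
            # hours, not days, or the URLs risk being dropped from the index.
            self.send_response(503)
            self.send_header("Retry-After", "3600")  # hint: retry in an hour
            self.end_headers()
            return
        # Regular visitors are served as normal.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Hello</body></html>")

if __name__ == "__main__":
    HTTPServer(("", 8000), ThrottleHandler).serve_forever()
```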

Google has also stated that if a site is being crawled too fast, this is likely indicative of a problem with the website's infrastructure.
