Google May Be Planning to Reduce Web Page Crawl Rate

As Google pays more attention to the sustainability of crawling and indexing, it may reduce how often web pages are crawled. John Mueller, Martin Splitt, and Gary Illyes from Google’s Search Relations team addressed this topic in the most recent episode of Search Off The Record.

The three talked about what to anticipate from Google in 2022. One of the topics they discussed was crawling and indexing, which, as SEO experts and website owners noticed, became less frequent last year.

Google’s goal this year is to conserve computing resources to make crawling more sustainable. Here’s what it means for the SEO community and their websites’ performance in the search results.

Sustainability of Crawling and Indexing in SEO

Because Googlebot crawls and indexes virtually, most people assume it does not affect the environment. However, Illyes pointed out that computing is not inherently sustainable.

He noted that Bitcoin mining, for example, has a measurable environmental impact, especially when the electricity comes from coal-fired or other less sustainable power plants.

Illyes also said that Google has been carbon-neutral for well over a decade. However, the company is still aiming to further decrease its environmental footprint, and crawling is one area where it can cut back. In particular, Google can reduce unnecessary crawling of web pages that have not changed recently.

Google’s Plan to Make Crawling More Sustainable

Illyes explained that reducing the amount of crawling is one way to make it more sustainable. Googlebot performs two types of crawling: discovery crawls, which find new content, and refresh crawls, which revisit known content to pick up updates. Google is planning to reduce the latter.

This is how Google refreshes updated content: Googlebot first visits a URL and crawls it, then returns after some time to re-crawl it and see whether the publisher made any changes. That return visit is known as a refresh crawl, and every revisit of a known URL is one.
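
Publishers can make refresh crawls of unchanged pages cheaper by supporting HTTP conditional requests, so a revisit costs only a header exchange. Below is a minimal sketch of the idea using only Python's standard library; the function name and logic are illustrative, not Googlebot's actual implementation.

```python
# Sketch of a refresh crawl using an HTTP conditional request.
# Assumes the server honours If-Modified-Since; the crawler logic
# here is a simplified illustration, not Googlebot's real behaviour.
import urllib.error
import urllib.request


def refresh_crawl(url, last_modified=None):
    """Re-fetch a URL, asking the server to skip the body if unchanged."""
    req = urllib.request.Request(url)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    try:
        with urllib.request.urlopen(req) as resp:
            # 200: content changed (or server ignored the header) -> re-index
            return resp.status, resp.headers.get("Last-Modified")
    except urllib.error.HTTPError as e:
        if e.code == 304:
            # 304 Not Modified: nothing to re-index, bandwidth saved
            return 304, last_modified
        raise
```

A first fetch returns 200 plus a `Last-Modified` date; passing that date back on the next visit lets the server answer 304 with no body at all.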

The question is, how often does Google revisit a single URL?

Illyes then mentioned that specific types of websites need their pages re-crawled more compared to others. For instance, a news website’s homepage will almost always update; therefore, it requires a lot of refresh crawls.

However, the same news website won’t modify its About page as often, so Google doesn’t need to perform refresh crawls on those kinds of pages. Illyes also admitted that they often can’t estimate how frequently they perform refresh crawls.

He thinks that revisiting the same page repeatedly is a waste of time and resources. Sometimes, they revisit 404 pages for no good reason. So, there is plenty of room for improvement in terms of reducing Google’s footprint.

The search engine giant has yet to confirm whether it will reduce refresh crawls, but if it does, the change could affect websites in several ways.

The Effect of Crawl Rate Reduction on Websites

Mueller then asked Illyes about the idea that having a high crawl rate is a positive SEO signal. Some SEOs believe that it is good if Google crawls their website frequently, even if they rarely update their content. According to Illyes, this is a complete misconception, as web pages do not receive ranking bonuses if Google frequently crawls them.

Mueller then said that SEOs and site owners should not force re-crawls for their existing content if there haven’t been any changes since doing so does not give any ranking bonuses.

Anticipating a Reduction in Crawl Rate

Google has not said definitively whether it will reduce its refresh crawls, but the team is currently considering it. If Google implements the change, it should not harm site rankings; after all, more crawling does not guarantee higher positions in the search results.

In addition, the goal is to figure out which pages require refresh crawls and which don’t. That implies that pages publishers actually update will most likely continue to be refreshed in the search results.

Tips on Improving Indexation

Most site owners stop worrying about crawl budget once a site is live or has been around for a while, assuming that as long as they keep adding new blog articles, the site will keep ranking in Google’s search results.

But after a certain point, they may lose search rankings due to poor technical site structure, a crawling problem, thin content, or a new algorithm update. To stay competitive among the hundreds of trillions of web pages in Google’s index, site owners must therefore make the most of their crawl budget. Here are a few quick tips:

1. Use Google Search Console to track crawl status

Checking crawl status regularly, roughly every 30 to 60 days, is critical for detecting problems that affect the site’s overall search performance; it is the foundation the rest of a site’s SEO rests on. If a web page returns a 404 error or has been temporarily redirected, its URL can be removed from the search results directly via Search Console.
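
As a complement to the Search Console reports, a simple script can flag 404s and redirects across a list of URLs before they waste refresh crawls. Here is a minimal sketch using only Python's standard library; the `check_urls` function and the URL list you pass it are hypothetical, not part of any Google tooling.

```python
# Sketch: pre-flight check that reports each URL's HTTP status code
# (200, 301, 404, ...) without following redirects. Illustrative only.
import urllib.error
import urllib.request


class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Report redirects as their own status code instead of following them."""
    def redirect_request(self, *args, **kwargs):
        return None


def check_urls(urls):
    """Return {url: HTTP status code}, or None if the host is unreachable."""
    opener = urllib.request.build_opener(_NoRedirect())
    results = {}
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with opener.open(req) as resp:
                results[url] = resp.status
        except urllib.error.HTTPError as e:
            results[url] = e.code   # 404s and unfollowed redirects land here
        except urllib.error.URLError:
            results[url] = None     # DNS failure, refused connection, etc.
    return results
```

HEAD requests keep the check lightweight, since only headers travel over the wire.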

2. Create mobile-friendly web pages

SEOs must optimise their pages to show mobile-friendly versions on the mobile index. Here are some good technical tweaks to try:

  • insert the viewport meta tag in the page’s head
  • implement responsive web design
  • tag web pages with the AMP cache
  • optimise and compress images to load them faster
  • reduce the size of on-page UI elements
  • minify on-page resources, like JS and CSS
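
The first tweak above can be verified programmatically. Below is a minimal sketch that scans a page's HTML for the viewport meta tag using only Python's standard library; the helper names are illustrative.

```python
# Sketch: detect whether a page declares the responsive viewport meta
# tag, one of the mobile-friendly tweaks listed above.
from html.parser import HTMLParser


class _ViewportChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "meta" and attributes.get("name") == "viewport":
            self.has_viewport = True


def has_viewport_meta(html: str) -> bool:
    """Return True if the HTML contains a <meta name="viewport"> tag."""
    checker = _ViewportChecker()
    checker.feed(html)
    return checker.has_viewport
```

Pages missing the tag render at desktop width on phones, so a False result here is a quick signal that the mobile version needs work.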

3. Update content regularly

If a website produces fresh content regularly, Google will crawl its pages more frequently. This is especially advantageous for publishers who need fresh articles updated and indexed regularly.

If a site is continuously improving and producing fresh articles, Google therefore crawls it more often, helping new content reach its target audience sooner.
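
One way to signal which pages actually changed is an XML sitemap with accurate lastmod dates. Here is a minimal sketch that builds one with Python's standard library; the page data is hypothetical.

```python
# Sketch: build a sitemap whose <lastmod> dates tell crawlers which
# pages changed and when, so refresh crawls can be spent wisely.
import xml.etree.ElementTree as ET


def build_sitemap(pages):
    """pages: iterable of (url, ISO-date) pairs -> sitemap XML string."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = loc
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

Regenerating the sitemap whenever an article is published or edited keeps those dates trustworthy.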

Webmix Networks SEO Helps Improve Your Site Indexation

We help optimise your web pages to boost your chances of getting indexed and ranking higher in Google’s search results. With our help, you can boost your visibility online and increase traffic to your website!

Webmix Networks SEO helps you improve website performance and accessibility, conduct sitemap installations and positive link building, and more. We also provide a thorough SEO audit and free phone consultation. Contact us today for more information on how to take your business’ SEO to the next level!