When is it advisable to work on the Crawl Budget?
For small websites, i.e. those with fewer than 100 pages, there is no need to work on this aspect, since Google has more than enough resources to crawl a site of that size.
Based on our experience, we recommend optimizing the Crawl Budget on sites with more than 5,000 URLs, or on sites where parameterized pages are generated en masse, for example.
What is my Crawl Budget?
To know your Crawl Budget, Google Search Console is your best ally. Simply access the Crawl Stats report through the side menu of your property:
[Screenshot: accessing the Crawl Stats report from the Search Console side menu]
When you access it, you will see a report like this:
[Screenshot: example Crawl Stats report]
Google's official documentation explains each of these metrics in detail, but the most important chart of the three is the “pages crawled per day” chart. Simply compare this KPI against the number of pages on your site (and do the same against the number of pages that are actually relevant to your business) to see whether there is a problem. If Google crawls fewer pages per day than your site actually has, it's time to find out what's going on.
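If you want to run this comparison outside the Search Console interface, a minimal sketch like the one below can help. It assumes you have exported the “pages crawled per day” series to a CSV file (the file name and column names here are hypothetical, so adjust them to your actual export) and that you know roughly how many URLs your site has.

```python
import csv

# Hypothetical export of the "pages crawled per day" chart from the Crawl Stats report.
# Assumed columns: "date", "pages_crawled" - adjust to match your actual export.
CRAWL_STATS_CSV = "crawl_stats.csv"
TOTAL_PAGES = 12000      # total URLs on the site (e.g. from your CMS or a crawl)
RELEVANT_PAGES = 8000    # URLs that actually matter for the business

with open(CRAWL_STATS_CSV, newline="") as f:
    rows = list(csv.DictReader(f))

daily = [int(r["pages_crawled"]) for r in rows]
avg_per_day = sum(daily) / len(daily)

print(f"Average pages crawled per day: {avg_per_day:.0f}")
print(f"Days to cover all {TOTAL_PAGES} URLs: {TOTAL_PAGES / avg_per_day:.1f}")
print(f"Days to cover the {RELEVANT_PAGES} relevant URLs: {RELEVANT_PAGES / avg_per_day:.1f}")

# Rough rule of thumb from the article: if the crawl rate is well below the
# number of pages the site actually has, it is worth investigating further.
if avg_per_day < RELEVANT_PAGES:
    print("Crawl rate is lower than the number of relevant pages - worth auditing.")
```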
How do I audit my Crawl Budget?
The easiest way to audit your Crawl Budget is through log analysis, since server logs show exactly how Google's crawlers interact with your website at all times.
There are several ways to analyze logs, but roughly speaking, the two most common are:
Using preconfigured tools such as Screaming Frog Log File Analyser or Botify
Building your own reports with tools such as Kibana, Elasticsearch, BigQuery, Data Studio, etc. (see the sketch below for the simplest possible DIY starting point)
If you are just getting started with log analysis, we recommend starting with Screaming Frog Log File Analyser: with a quick setup, you will have excellent reports that let you draw conclusions.
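If you prefer to roll your own instead of (or before) loading logs into one of those tools, the simplest first step is isolating Googlebot requests. The sketch below is only an illustration: it assumes a standard combined-format access log at a hypothetical path, and keeps only lines whose user agent contains “Googlebot” so the later analysis works on a much smaller file.

```python
# Minimal pre-filter: keep only Googlebot lines from a combined-format access log.
# The input and output paths are hypothetical - point them at your own logs.
ACCESS_LOG = "access.log"
GOOGLEBOT_LOG = "googlebot.log"

kept = total = 0
with open(ACCESS_LOG, encoding="utf-8", errors="replace") as src, \
        open(GOOGLEBOT_LOG, "w", encoding="utf-8") as dst:
    for line in src:
        total += 1
        # Quick first pass: match the Googlebot token in the user-agent field.
        if "Googlebot" in line:
            dst.write(line)
            kept += 1

print(f"Kept {kept} Googlebot requests out of {total} total log lines.")
```

Keep in mind that user agents can be spoofed; for strict verification Google recommends confirming Googlebot requests with a reverse DNS lookup.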
When you're digging through the logs, try to answer questions like the following (the sketch after this list shows one way to pull these numbers out of a raw access log):
How often does Googlebot visit my website?
Which pages or sections does it visit most?
Which pages or sections does it not visit?
How long does it take to download each type of page?
Which pages is it requesting that return errors?
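As a rough illustration of how to answer most of these questions, the sketch below builds on the hypothetical googlebot.log filtered earlier and assumes the Apache/Nginx “combined” log format (adapt the regex if your format differs). It tallies Googlebot hits per day, per top-level section, and per status code.

```python
import re
from collections import Counter

# Parses the hypothetical googlebot.log produced by the filter above.
# Assumes the Apache/Nginx "combined" format; adapt the regex if yours differs.
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+):[^\]]+\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+'
)

hits_per_day = Counter()
hits_per_section = Counter()
hits_per_status = Counter()

with open("googlebot.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LINE_RE.match(line)
        if not m:
            continue
        hits_per_day[m["day"]] += 1
        # First path segment as a rough "section" (e.g. /blog/post-1 -> /blog).
        section = "/" + m["path"].lstrip("/").split("/", 1)[0].split("?", 1)[0]
        hits_per_section[section] += 1
        hits_per_status[m["status"]] += 1

print("How often does Googlebot visit?", dict(hits_per_day))
print("Most crawled sections:", hits_per_section.most_common(10))
print("Status codes seen by Googlebot:", dict(hits_per_status))

# URLs wasting crawl on errors (the 4xx case mentioned below).
error_hits = {s: c for s, c in hits_per_status.items() if s.startswith(("4", "5"))}
print("Error responses served to Googlebot:", error_hits)
```

Note that the “how long does it take to download each type of page” question needs a response-time field in your log format (for example Apache's %D or Nginx's $request_time), which the standard combined format does not include.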
By applying common sense and answering these questions you are sure to find things that are not right, such as:
Google is allocating crawl time to pages with 4xx errors
There are sections or pages that receive a very large share of the crawl, but they are not representative in terms of business.