Google’s 15MB crawl limit is a crucial factor to consider when optimizing your website.
But what exactly does this limit mean?
And how does it affect your website’s performance in search results?
In this post, we’ll delve into these questions and more, providing you with a comprehensive understanding of Google’s 15MB crawl limit and how to effectively manage it.
What is Google’s 15MB Crawl Limit?
This limit refers to the maximum size of a web page that Googlebot will download and process for indexing. If a web page exceeds this limit, Googlebot will only process the first 15MB of the page.
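To make the limit concrete, here is a minimal sketch (standard library only) of checking a page's raw HTML size against the 15MB truncation point. The helper names and the report structure are illustrative, not part of any Google tooling:

```python
import urllib.request

# 15MB, the documented point at which Googlebot stops processing a page
GOOGLEBOT_LIMIT_BYTES = 15 * 1024 * 1024

def html_size_report(html: bytes) -> dict:
    """Report how a page's raw HTML size compares to the 15MB limit."""
    size = len(html)
    return {
        "size_bytes": size,
        "exceeds_limit": size > GOOGLEBOT_LIMIT_BYTES,
        "percent_of_limit": round(100 * size / GOOGLEBOT_LIMIT_BYTES, 2),
    }

def fetch_html(url: str) -> bytes:
    """Download a page's raw HTML (what Googlebot receives before rendering)."""
    with urllib.request.urlopen(url) as resp:
        # No need to read past the limit: Googlebot wouldn't either.
        return resp.read(GOOGLEBOT_LIMIT_BYTES + 1)

# Example: a typical ~200KB page sits far below the limit.
report = html_size_report(b"<html>" + b"x" * 200_000 + b"</html>")
```

Most pages come in well under 1% of the limit; the pages that hit it are usually ones with huge inlined scripts, styles, or data blobs.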
The primary reason for Google implementing this rule is to manage the resources used during the crawling and indexing process. While it helps Googlebot crawl and index the vast number of web pages on the internet, it’s not always a win for your site.
Is the 15MB Limit the Same as Crawl Budget?
The limit is separate from but related to Google’s crawl budget. Your crawl budget refers to the number of pages Googlebot can crawl on your site within a certain timeframe. If a page is close to or exceeds the 15MB limit, Googlebot may use up more of your allocated crawl budget to download and process that page. This leaves fewer resources for crawling other pages on your site.
While images and videos are not counted towards the 15MB limit (Googlebot fetches them separately), large media files can still slow a page's load time, which affects how efficiently Googlebot can crawl it.
Related: PageSpeed Explained
Strategies and Techniques to Avoid the 15MB Limit
1. Server-Side Rendering (SSR)
With server-side rendering, a page's HTML is generated on the server before it is sent to the browser, so Googlebot receives fully rendered content without having to download and execute large JavaScript bundles. However, it's important to note that server-side rendering is not the optimal choice for every website: it can be complex to implement and adds load to your servers.
Dynamic rendering offers similar benefits at a fraction of the cost. A tool like Prerender, for example, helps Google easily crawl and index a website by generating a static HTML version of each page and serving it to search engine bots.
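The core of dynamic rendering is a routing decision: bots get a static HTML snapshot, human visitors get the normal JavaScript app. A minimal sketch of that decision, with an illustrative (not exhaustive) list of bot signatures and placeholder response names:

```python
# Hypothetical bot-detection logic; real services like Prerender maintain
# much more complete user-agent lists and serve cached HTML snapshots.
BOT_SIGNATURES = ("googlebot", "bingbot", "yandexbot", "duckduckbot")

def is_bot(user_agent: str) -> bool:
    """Crude check: does the user agent match a known crawler signature?"""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def choose_response(user_agent: str) -> str:
    # Crawlers receive a pre-generated static HTML snapshot...
    if is_bot(user_agent):
        return "prerendered-html"
    # ...while regular browsers get the normal JavaScript bundle.
    return "spa-shell"
```

Because the heavy JavaScript never reaches the bot, the HTML it processes stays small and well within the 15MB limit.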
Determining and Tracking Your Website’s Size
You can determine the size of your pages using various tools and techniques. Google Search Console provides detailed reports on how Googlebot interacts with your site, and crawlers like Screaming Frog mimic Googlebot's behavior, letting you audit page sizes and diagnose potential issues.
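Once you have per-page HTML sizes (for example, exported from a crawler like Screaming Frog), flagging the heavy pages is straightforward. A minimal sketch; the 80% warning threshold here is an illustrative choice, not a Google rule:

```python
LIMIT_BYTES = 15 * 1024 * 1024  # Googlebot's 15MB truncation point

def flag_heavy_pages(sizes: dict[str, int], warn_ratio: float = 0.8) -> list[str]:
    """Return URLs whose HTML size exceeds warn_ratio of the 15MB limit."""
    threshold = warn_ratio * LIMIT_BYTES
    return [url for url, size in sizes.items() if size > threshold]

# Hypothetical crawl export: URL -> raw HTML size in bytes
pages = {
    "/": 250_000,                      # 250KB: typical, well under the limit
    "/huge-report": 13 * 1024 * 1024,  # 13MB: over 80% of the limit
}
flagged = flag_heavy_pages(pages)  # → ["/huge-report"]
```

Running a check like this on each crawl lets you catch pages drifting toward the limit before Googlebot starts truncating them.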
Make Use of Embedded or Linked SVGs
Linking SVGs via image tags can help manage a page's size, as the image data is not embedded in the HTML itself. However, this increases the number of HTTP requests the page makes, which can impact load time. The best approach depends on your website's needs and constraints.
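A quick illustration of the trade-off, using hypothetical markup: embedding an SVG puts all of its path data into the HTML, while linking it replaces that data with a short img tag plus one extra HTTP request.

```python
# Inline SVG: every path's data counts toward the page's HTML size.
inline_svg = '<svg viewBox="0 0 24 24">' + '<path d="M4 4 L20 20 Z"/>' * 200 + '</svg>'

# Linked SVG: only this short tag lives in the HTML; the image data is a
# separate file fetched with its own request.
linked_svg = '<img src="/icons/pattern.svg" alt="pattern">'

# The HTML shrinks by roughly this many bytes when the SVG is linked instead.
bytes_saved_in_html = len(inline_svg) - len(linked_svg)
```

For a handful of small icons, inlining is usually fine; for large or repeated vector graphics, linking keeps the HTML lean.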
In addition to keeping pages under the 15MB limit, increasing your crawl budget helps ensure your most important pages get crawled and indexed by Google.
Struggling to get indexed? Get started with 1,000 URLs free.