If you’ve read about SEO, you’ve probably heard of “black hat” techniques like cloaking. Even if you’re not interested in using it, from the outside it’s easy to assume that cloaking is automatically a shady strategy.
In this article, we’ll demystify one of the most misunderstood techniques in SEO. We’ll go over the common misconceptions around cloaking, explain how it differs from dynamic rendering, and share auditing tips so you can avoid being penalized for unintentional mistakes.
🔍 Want to boost your SEO? Download the free technical SEO guide to crawl budget optimization and learn how to improve your site’s visibility on search engines.
What is Cloaking?
Cloaking is a black hat SEO technique that consists of presenting different content to search engines and users in order to manipulate Google’s algorithm and mislead users. In essence, a website that cloaks uses the user-agent or IP address of a request to identify Googlebot and serve it a different page than the one users see.
This is not only against Google’s guidelines, but it could be harmful too! For example, people could use cloaking to trick Google into ranking a page with illegal content or irrelevant keywords.
Using cloaking could easily get your website deindexed or even permanently banned.
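To make the pattern concrete, here’s a minimal sketch of what cloaking looks like in code. This is a hypothetical TypeScript/Express example, shown only so you can recognize the technique, and it’s exactly what you should not do:

```typescript
import express from "express";

const app = express();

app.get("/", (req, res) => {
  const ua = req.headers["user-agent"] ?? "";

  // Cloaking: sniff the user-agent and serve crawlers a different,
  // keyword-stuffed page than the one real visitors receive.
  if (/googlebot/i.test(ua)) {
    res.send("<h1>Keyword-stuffed content shown only to crawlers</h1>");
  } else {
    res.send("<h1>Entirely different content shown to humans</h1>");
  }
});

app.listen(3000);
```

The defining problem is that the two branches return meaningfully different content: what the crawler indexes is not what visitors actually get.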
What is Dynamic Rendering?
Dynamic rendering is a technique used to serve a fully rendered version of your JavaScript website (or single page applications) to search engines and bots that have problems executing JavaScript.
Search engines like Google and Bing often have problems executing your website’s JavaScript, which can lead to several indexing issues. When using dynamic rendering, your website identifies crawlers like Googlebot and Bingbot and serves them a static HTML version of the requested pages with all JavaScript content already executed.
This is essentially the process Prerender follows:
- Our middleware – installed on your server – identifies search engine crawlers and sends a request to our servers
- Prerender fetches all necessary data from your page and creates a snapshot of the fully rendered page
- Lastly, it sends the static page back to the crawler through your server and caches it for later
Resource: Learn how Prerender crawls and renders your pages
On the other hand, if the request comes from a human visitor’s browser, it goes through the regular route, and the user lands on your website as usual.
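As a rough illustration of that routing decision (this is not Prerender’s actual middleware; the rendering endpoint, bot list, and domain below are placeholder assumptions), a dynamic rendering layer in a TypeScript/Express app might look like this:

```typescript
import express from "express";

const app = express();

// Hypothetical values: a real setup would point at your prerendering
// service and authenticate with your account token.
const RENDER_SERVICE = "https://render.example.com/render?url=";
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot/i;

app.use(async (req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";

  if (BOT_PATTERN.test(ua)) {
    // Crawler: return a fully rendered HTML snapshot of the same URL.
    const target = `https://www.example.com${req.originalUrl}`;
    const snapshot = await fetch(RENDER_SERVICE + encodeURIComponent(target));
    res.status(snapshot.status).send(await snapshot.text());
    return;
  }

  // Human visitor: fall through to the normal JavaScript application.
  next();
});

app.listen(3000);
```

The key contrast with the cloaking example above: both branches ultimately deliver the same page content, and the only difference is where the JavaScript gets executed.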
Prerendering is Not Cloaking—According to Google
We’re well aware that, on the surface, these two techniques look practically identical. After all, you’re sending Google and users different versions of your site. So how is it that dynamic rendering is not considered cloaking?
Well, according to Google:
“Googlebot generally doesn’t consider dynamic rendering as cloaking. As long as your dynamic rendering produces similar content, Googlebot won’t view dynamic rendering as cloaking.”
That’s the main difference between the two.
Cloaking is not just about the process itself, but about what it’s actually used for.
When using a service like Prerender, you’re generating a static version of your page, but the content stays the same for search engines and users alike. You’re just moving the rendering work off Google’s servers and onto your own. However, that also means that even without bad intentions, you can run into cloaking penalties if you use prerendering or dynamic rendering the wrong way.
How to Avoid Undeserved Cloaking Penalties
Because of the way dynamic rendering works, several things can go wrong and cause your website to breach Google’s cloaking guidelines unintentionally.
Here are some of the details you’ll need to keep in mind to stay in the clear:
1. Check for Hacks
A common tactic among attackers is to hack websites with decent traffic and then cloak their pages to funnel that traffic to their own sites.
If you’ve experienced a hack recently or you’re not sure why you’re receiving a cloaking penalty, this may be one of the reasons.
Check your website for any weird redirects and do a full audit of your backend to ensure no leftover scripts are serving cloaked content to your users.
Note: It’s also a common practice for hackers to use cloaking to make it harder for you to detect the hack. So it’s a good habit to check for cloaking processes regularly.
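One quick way to run this kind of check yourself is to request the same URL with a normal browser user-agent and with Googlebot’s user-agent, then compare what comes back. Here’s a sketch, assuming Node 18+ (for the built-in fetch) and run as an ES module:

```typescript
// audit-cloaking.ts – request a page as a browser and as Googlebot,
// then compare the responses. Big differences deserve a closer look.
const url = process.argv[2] ?? "https://www.example.com/";

const userAgents = {
  browser:
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36",
  googlebot:
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
};

async function fetchAs(ua: string): Promise<string> {
  const res = await fetch(url, {
    headers: { "User-Agent": ua },
    redirect: "follow",
  });
  return res.text();
}

const [browserHtml, botHtml] = await Promise.all([
  fetchAs(userAgents.browser),
  fetchAs(userAgents.googlebot),
]);

console.log(`Browser response:   ${browserHtml.length} bytes`);
console.log(`Googlebot response: ${botHtml.length} bytes`);
```

A size gap on its own is expected when you use dynamic rendering (the bot receives pre-rendered HTML), and some setups verify crawlers by IP address, so a spoofed user-agent won’t always receive the rendered version. What you’re looking for are unexpected redirects, spam links, or content that has nothing to do with your page.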
2. Check for Hidden Text
More often than not, hidden text issues come from your JavaScript changing some of your text elements’ attributes. Crawlers still pick up these elements and may treat them as keyword-stuffing attempts, which can potentially turn into ranking penalties.
It can also be considered cloaking if enough hidden elements make your dynamically rendered page different from what users can see.
The good news is that you can use a tool like Screaming Frog to look for hidden text on your pages.
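For a quick spot-check without a full crawl, you can also inspect the rendered page straight from the browser console. Below is a rough sketch that flags elements hidden via inline styles but still containing text; a thorough audit would also need to cover computed styles, off-screen positioning, and zero-sized fonts:

```typescript
// Run in the browser console on the fully rendered page.
// Flags elements hidden via inline styles that still contain text.
const hidden = Array.from(
  document.querySelectorAll<HTMLElement>(
    '[style*="display:none"], [style*="display: none"], ' +
      '[style*="visibility:hidden"], [style*="visibility: hidden"]'
  )
).filter((el) => (el.textContent ?? "").trim().length > 0);

console.table(
  hidden.map((el) => ({
    tag: el.tagName,
    text: (el.textContent ?? "").trim().slice(0, 80),
  }))
);
```

Keep in mind that plenty of legitimate UI patterns (menus, tabs, accordions) hide text too; the red flag is hidden blocks stuffed with keywords that users are never meant to see.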
3. Partially Rendered Pages
Cloaking is about the difference between what search engines and users see. The problem with partially rendered pages is that some of the content will be missing, which may make Google think you’re trying to trick it. When using Prerender, there are two common reasons why this happens:
- Page rendering times out – by default, Prerender has a 20-second render timeout, though it can be configured to allow more time. If your page fails to render completely within that window, Prerender will save the page in its current state, resulting in a partially rendered page.
- Page errors – things like your CDN blocking all non-standard requests (and therefore blocking Prerender) or geo-blocking will cause rendering problems, as Prerender won’t be able to fetch the necessary files from your server.
Check our documentation for a complete walkthrough of what causes empty or partially rendered pages when using Prerender.
(Fun fact: we recently worked through a customer case on exactly this topic. The client’s page was so slow that Google always gave up on it. With our tool’s longer timeout and cache warming, the case was solved.)
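If slow asynchronous calls are what’s delaying your render, one mitigation Prerender documents is the window.prerenderReady flag: set it to false when the page starts loading and flip it to true once your critical content is in the DOM, so the snapshot isn’t taken too early. A minimal sketch, where the API endpoint and render helper are hypothetical:

```typescript
// Assumed helper that injects the fetched data into the DOM.
declare function renderProductList(data: unknown): void;

// Tell Prerender the page isn't ready to be snapshotted yet.
(window as any).prerenderReady = false;

async function loadCriticalContent(): Promise<void> {
  // Hypothetical data call; replace with whatever builds your main content.
  const res = await fetch("/api/products");
  renderProductList(await res.json());

  // Critical content is in place: the snapshot can now be taken.
  (window as any).prerenderReady = true;
}

loadCriticalContent();
```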
4. Uncached Changes
Keeping in mind the main distinction between dynamic rendering and cloaking, there’s a clear issue that could come up with constantly changing pages.
Because of the way Prerender works, once the middleware is installed, it only fetches and renders your page the first time a search crawler requests it. This first rendering pass takes a few seconds, resulting in a high response time.
From then on, Prerender will use the cached version of the page to avoid hurting your site’s performance.
However, what happens when you change the page partially or even completely? Google will receive an old version of it, which could be interpreted as cloaking.
To avoid this issue, it’s important to set a cache-expiration interval using Prerender’s recaching functionality. Once the interval lapses, Prerender will automatically fetch all necessary elements from your server and create a fresh snapshot of the updated content.
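You can also trigger a recache on demand whenever you publish a change instead of waiting for the interval to lapse. At the time of writing, Prerender exposes a recache endpoint for this; the sketch below assumes Node 18+, an ES module, and a token stored in an environment variable (check Prerender’s docs for the current API details):

```typescript
// Ask Prerender to re-render a page right after its content changes,
// so crawlers never receive a stale snapshot.
async function recachePage(url: string): Promise<void> {
  const res = await fetch("https://api.prerender.io/recache", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prerenderToken: process.env.PRERENDER_TOKEN, // your account token
      url,
    }),
  });

  if (!res.ok) {
    throw new Error(`Recache failed: ${res.status} ${res.statusText}`);
  }
}

await recachePage("https://www.example.com/updated-page");
```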
Final Thoughts
Cloaking issues can be a real pain if they go unresolved.
Once your website gets banned, there’s not much you can do about it – even if you put everything in order, there’s no guarantee that Google will lift the penalty.
If your website runs heavily on JavaScript – whether or not you’re using prerendering for it – it’s important to audit for cloaking issues regularly.
Add these quick steps to your next site audit checklist:
- Navigate to Google Search Console > Crawl > Fetch as Google and compare the content in your website’s browser version against the version fetched by Google.
- Check for weird redirects sending users to unwanted, broken, or sneaky destinations.
- Use a tool like SiteChecker to scan for cloaked pages.
- If you’re using Prerender, check for any sign of partially rendered pages in the crawl or recache stats dashboard.
(Timeout errors are highlighted in red, but you should check any page with a response time close to or above the 20-second mark.)
SEO is a constant process with a lot of moving parts, so it’s easy to make mistakes. However, if you’re diligent and follow Google’s guidelines, your website will be well positioned to succeed.
If you have any questions or concerns, don’t hesitate to contact us. Our team is ready to help you!