Cloaking in SEO is the practice of displaying different versions of content to visitors based on variables such as their user-agent, browser, IP address, or other determining factors. This is sometimes done unintentionally: when JavaScript or other code renders one way for users, certain content may become inaccessible to crawlers and bots. In other cases, user-interface elements are deliberately made inaccessible to bots and crawlers, even though users can access them, with the goal of preventing wasteful crawling by those programs.
Google’s Preferred Definition Of Cloaking (For Deceptive Purposes)
Google’s preferred description of this practice describes it as “deceptively displaying” different content to search engines like Google than what is shown to your website visitors, with the intent of artificially improving your site’s rankings or search traffic. This practice is against Google’s webmaster spam guidelines, and can cause your site to receive a penalty or be removed entirely from Google’s index of sites eligible to receive traffic.
Not recently, but it's all fairly standard. Detect UA/IP – at server, script or JS, serve different content. I think the main difference is modern cloaking tends to hide stuff from G, rather than from users?
— Darth Autocrat (Lyndon NA) (@darth_na) August 28, 2022
Examples Of SEO Cloaking For The Purpose of Improving Rankings
Google uses a couple of examples to illustrate the concept of cloaking. The first is a website that displays a page full of images to its users, but when Googlebot arrives, it is served a page full of text intended to convince Google that the page is relevant to a specific topic. The automated injection of keywords into a page only when the requesting user-agent is Googlebot is also listed as a classic example of cloaking.
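The mechanism behind both examples is the same: the server branches on the requesting User-Agent header and returns a different page variant to crawlers than to people. A minimal sketch of that logic is below, purely for illustration; the crawler tokens and content variants are hypothetical placeholders, and serving crawler-specific content this way is exactly what Google's spam guidelines prohibit.

```python
# Illustrative sketch of user-agent-based cloaking logic.
# Doing this on a real site violates Google's spam guidelines;
# the content strings and crawler list here are hypothetical.

CRAWLER_TOKENS = ("googlebot", "bingbot")

def is_crawler(user_agent: str) -> bool:
    """Naive check: does the User-Agent header name a known crawler?"""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def select_content(user_agent: str) -> str:
    """Return a different page variant depending on who is asking."""
    if is_crawler(user_agent):
        return "keyword-stuffed text variant"  # shown only to crawlers
    return "image-heavy variant"               # shown to human visitors

print(select_content("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(select_content("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))
```

The same branching can happen at the CDN, in a server-side script, or in client-side JavaScript, which is why Google evaluates the content each audience actually receives rather than how the switch is implemented.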