A Google filter hits you

What are Google Filters? #

Google Filters describe demotions or suppressions in search results due to algorithmic reasons that Google won’t disclose.

Even if there is no “official” Google Penalty applied, a website can still suffer from suppression in search results due to a Google Filter.

Google has no interest in disclosing what their algorithm likes or dislikes.

This is because every little detail of information that Google gives out can be reverse-engineered, and often is.

An example of that is the link penalties and link filters that helped train LRT and Link Detox Genesis for over a decade.

Some of the following examples of Google filters are better known than others; some of them will never be confirmed by Google, no matter how often we SEOs ask.

Examples of Google Filters:

Duplicate content filter #

A page on a website may not rank (at all) when its content duplicates other results. Very often, these filtered results are on scraper sites. However, domains with a lot of trust often aggregate content from lower-trust websites and outrank them. Google has claimed to be able to identify the original publisher, but we often see different results in the SERPs.

When SEO experts were polled on whether Google can identify the original publisher/source of a document (and correctly filter all copies of it in the SERPs), the answers were quite clear:

duplicate content detection working or not?
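Google's actual duplicate-detection method is undisclosed; as a purely illustrative sketch, a textbook approach compares word shingles of two documents with Jaccard similarity (the function names and the shingle size `k=3` are our assumptions, not anything Google has confirmed):

```python
def shingles(text, k=3):
    """Split text into a set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "google filters describe demotions or suppressions in search results"
scraped = "google filters describe demotions or suppressions in the search results"
print(jaccard(original, original))  # identical text scores 1.0
print(jaccard(original, scraped))   # a near-duplicate scores in between
```

Production systems use far more robust techniques (e.g. SimHash fingerprints over many shingles), but the scoring idea is the same: the closer the score is to 1.0, the more likely one document is a copy of the other.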

URL demotions #

Page demotions based on link spam detected by the Google Penguin algorithm.

If the Google Penguin algorithm flags a page, folder, keyword (group), or whole domain, then its rankings can be (partially) suppressed, pushed back by, e.g., 30 positions, or removed from the search results altogether.
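The exact mechanics are unknown; as a purely hypothetical toy model, a "push back by 30 positions" demotion could be pictured like this (the function, the list of URLs, and the default penalty are illustrative assumptions, not Google's implementation):

```python
def apply_demotion(rankings, flagged_url, penalty=30):
    """Toy model: push a flagged URL back by `penalty` positions.

    `rankings` is an ordered list of result URLs; URLs that are not
    flagged keep their relative order.
    """
    urls = list(rankings)
    if flagged_url in urls:
        i = urls.index(flagged_url)
        urls.pop(i)
        urls.insert(min(i + penalty, len(urls)), flagged_url)
    return urls

serp = ["a.com", "b.com", "c.com", "d.com"]
print(apply_demotion(serp, "a.com", penalty=2))  # ['b.com', 'c.com', 'a.com', 'd.com']
```

A demotion of 30 positions on a short result list simply pushes the URL to the end, which matches the observation that filtered pages often land far beyond the first pages of results.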

Website Speed/UX filters #

The speed at which a page on a website responds is crucial for the user experience (UX), which is what Google ultimately optimizes for. Users happy with results mean more usage of the search engine. Pages with slow response times are demoted in the SERPs for the simple reason that users apparently will not like them as much as pages that load fast.

Bounce Rate filter/UX #

Pages that have high bounce rates signal to Google that something is missing, that the page does not answer the query. Because of that, it makes sense for Google to show such pages less often.

DMCA/Copyright violations filter #

If a copyright claim is filed against your website via the DMCA mechanism, then the affected pages are not shown in the search results.

Google should notify webmasters about DMCA takedown notices, which sometimes seem to go wrong.

https://twitter.com/dannysullivan/status/1365051924106539008

DMCA takedowns have therefore been abused for negative SEO since at least 2003.

https://twitter.com/jenstar/status/1365177219316768769

Google provides an impressive list of these removals to the public.

EU Privacy removal #

The right-to-be-forgotten law in the EU lets people remove certain content from the search results. These pages are then filtered in EU countries but remain accessible directly on the website, or even in Google search results outside the EU.

Content such as phishing, violence, or explicit material may also violate Google's policies and be removed.

Comparison to Google Penalties #

While Google filters are never confirmed or officially documented by Google, and are often the cause of much guesswork and discussion among SEOs, Google penalties (or Manual Actions) are well documented.

While Google filters often cause a soft decline in rankings, penalties often mean a total drop for a domain, folder, page, or keyword group. Google intends to make it as hard as possible to reverse-engineer the effect of Google filters.

Special Case Google Penguin #

Google Penguin started as a harsh penalty issued by Google in 2012. Websites lost all their rankings overnight when that penalty hit them, which made reverse-engineering the reasons possible.

When Google Penguin 4.0 was integrated into the core algorithm, Google also reduced the number of manual actions (penalties) and started demoting parts of a website based on the spam patterns it detected.

That transformed the Google Penguin penalties into Google Penguin filters, to the point that some people question whether such devaluations of websites, their links, and keyword rankings still take place today.
