Recognition of unfair competition in SEO

Go to Triggers → New → Custom Event and enter the event name ipevent.

Everything is ready: switch to preview mode, and an entry with the visitor's IP address will appear there.

The second way: through the server

It all depends on your hosting: you can collect data, including users' IP addresses, from the server logs.

When using Apache behind nginx, you can collect real user sessions by first installing the rpaf module, so the logs record the client's IP rather than the proxy's.
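As an illustrative sketch (not from the original article), the Apache side of such a setup might look like the fragment below. Directive names vary between mod_rpaf versions: the ones here follow the gnif/mod_rpaf fork, while older builds use RPAFenable, RPAFproxy_ips, and so on. nginx must also be configured to pass the client address upstream, e.g. `proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;`.

```apacheconf
# Load mod_rpaf so Apache logs the real client IP instead of the nginx proxy IP.
LoadModule rpaf_module modules/mod_rpaf.so

RPAF_Enable      On
RPAF_SetHostName On
# Address of the nginx proxy in front of Apache (substitute your own):
RPAF_ProxyIPs    127.0.0.1
RPAF_Header      X-Forwarded-For
```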

Logs can be viewed here: /var/log/apache2/access.log.

We filter out the suspicious IP addresses, check each one here in the search bar, get a list of subnets, and write it to a file.
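The filtering step can be sketched in Python. This is an illustration, not the author's script: it counts requests per IP in an Apache combined-format access log and flags addresses above a request threshold (the threshold value is an assumption for the example).

```python
import re
from collections import Counter

# The first field of a combined-format Apache log line is the client IP.
LOG_LINE = re.compile(r"^(\S+) ")

def suspicious_ips(log_lines, threshold=100):
    """Return the IPs that made more than `threshold` requests, sorted."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m:
            hits[m.group(1)] += 1
    return sorted(ip for ip, n in hits.items() if n > threshold)

# Usage against the real log:
# with open("/var/log/apache2/access.log") as f:
#     print(suspicious_ips(f, threshold=1000))
```

From here, each flagged address can be looked up manually to find its subnet.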

The third way: block all countries except the one where the site is being promoted

We take a list of IP networks broken down by country here:

We proceed to blocking.

First, find out which server your site is running on, down to the version: the best way to block parasitic traffic depends on it.

I'll take Apache 2.4 as the example, since many sites run on it.

So, we have a list of IP networks and a list of unwanted User-Agents to which we want to block access to our site.

Open the .htaccess file in the site root and add the following:

Blocking IP networks, example:


<RequireAll>
Require all granted
# Placeholder subnets from documentation ranges; substitute your own list:
Require not ip 192.0.2.0/24
Require not ip 198.51.100.0/24
Require not ip 203.0.113.0/24
</RequireAll>


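The Require lines do not have to be written by hand. A short Python sketch (an illustration, not part of the original article) can validate a list of subnets, for example one taken from the per-country IP database mentioned above, and emit the whole block; the subnets shown are documentation-range placeholders.

```python
import ipaddress

def require_block(subnets):
    """Build an Apache 2.4 <RequireAll> block that denies the given subnets."""
    lines = ["<RequireAll>", "Require all granted"]
    for s in subnets:
        net = ipaddress.ip_network(s)  # raises ValueError on a malformed subnet
        lines.append(f"Require not ip {net}")
    lines.append("</RequireAll>")
    return "\n".join(lines)

print(require_block(["198.51.100.0/24", "203.0.113.0/24"]))
```

Validating the subnets before writing them out avoids a single typo silently breaking the whole .htaccess file.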
Blocking unwanted User-Agents, example:

<IfModule mod_setenvif.c>
SetEnvIf User-Agent ^-?$ bad
SetEnvIfNoCase User-Agent Embedly bad
SetEnvIfNoCase User-Agent TweetmemeBot bad
SetEnvIfNoCase User-Agent discobot bad
SetEnvIfNoCase User-Agent Linux bad
SetEnvIfNoCase User-Agent PaleMoon bad
SetEnvIfNoCase User-Agent "Pale Moon" bad

Order Allow,Deny
Allow from all
Deny from env=bad
</IfModule>


The line SetEnvIf User-Agent ^-?$ bad is responsible for blocking access to the site for sessions with an empty User-Agent (or one consisting of just a dash).
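The `^-?$` pattern matches only a completely empty User-Agent or a bare `-`; a quick check in Python:

```python
import re

# Same pattern as in the SetEnvIf line above.
empty_ua = re.compile(r"^-?$")

# Empty or "-" values match (and would be marked "bad");
# real User-Agent strings do not.
for ua in ["", "-", "Mozilla/5.0", "curl/8.0"]:
    print(repr(ua), bool(empty_ua.match(ua)))
```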

Blocking referral spam (traffic from other sites to your site), example:

Suppose we have recorded parasitic traffic from the sites semalt.com and buttons-for-website.com:

RewriteEngine On

RewriteCond %{HTTP_REFERER} semalt\.com [NC,OR]

RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]

RewriteRule .* - [F]

Redirecting users with certain IP addresses to another site

If blocking access to the site for certain subnets is likely to cut off real users, then I recommend redirecting traffic from suspicious IP addresses to another site, for example a landing page that is not being promoted in SEO at all.

It is done like this:

RewriteEngine On

RewriteCond expr "-R '192.0.2.0/24'"

RewriteRule .* https://example.com/ [R=301,L]

Here 192.0.2.0/24 is the suspicious subnet and https://example.com/ is the site to which those users will be automatically redirected; both values are placeholders, substitute your own.

How to check that everything is configured correctly

Check the status of the server response here, relevant if access is blocked for certain User-Agents:

The screenshot shows that the server returns a 403 error (access denied) when a user with User-Agent: Linux tries to open the site.

Also, check the status of the server response here /; it should look like this:

Code 200 indicates that the request has been successfully processed and the site is available to the main Yandex robot.
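If you want to automate this check, here is a self-contained Python sketch (an illustration, not the author's tool). It uses only the standard library, and a local mock server stands in for the real site, imitating the configured behavior: 403 for the blocked User-Agent "Linux", 200 otherwise. To check a live setup, point `check_status` at your own URL instead.

```python
import http.server
import threading
import urllib.error
import urllib.request

class MockHandler(http.server.BaseHTTPRequestHandler):
    """Imitates a site that denies the blocked User-Agent "Linux"."""
    def do_GET(self):
        blocked = "linux" in self.headers.get("User-Agent", "").lower()
        self.send_response(403 if blocked else 200)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

def check_status(url, user_agent):
    """Return the HTTP status code the server gives this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # urlopen raises on 4xx/5xx; the code is what we want

server = http.server.HTTPServer(("127.0.0.1", 0), MockHandler)
base = f"http://127.0.0.1:{server.server_address[1]}/"
threading.Thread(target=server.serve_forever, daemon=True).start()

print(check_status(base, "Linux"))        # blocked User-Agent
print(check_status(base, "Mozilla/5.0"))  # normal User-Agent
```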

How to cut off unwanted IP addresses in contextual advertising

Yandex.Direct: you can block a maximum of 25 IP addresses per campaign, in the “Special Settings” section.

A workaround for cutting off bots in Yandex.Direct:

Create a segment in Yandex.Metrica of visitors who spent less than 5–10 seconds on the site, and save it.

Create an audience based on that segment in the Yandex.Audience service.

In the campaign settings, find the “Bid adjustments” item and add the saved segment with a −100% adjustment.


Google Ads: everything you need can be disabled properly, including any IP addresses (IPv6 as well) and subnets, under the campaign's “Settings” → “IP exclusions”.

You can cut off 100% of unwanted traffic in contextual advertising only by turning the advertising off entirely.

In Google Ads, I recommend analyzing time-to-conversion and repeat visits from ads, and then cutting the rest off via remarketing lists.

Additional protection methods

Setting up /: I recommend it for high-load sites, so as not to miss an attack, since the data in GA arrives with a delay.

A useful service with a database of spam IP, subnets, email, domains — .

If the site is already nearing the top in a super-competitive niche, think ahead about closing off unwanted User-Agents, countries, and IP networks, or about smart redirection to other sites: landing pages that are not promoted in SEO at all, or your competitors' sites 🙂

Instead of concluding

With the arrival of the pandemic, the collapse of oil prices, and the absence of direct financial aid to people and businesses, dark times have come to our country. It is precisely at such times that scammers and criminals become most active: crooks crawl out of everywhere, and crime grows, not only on the street but also on the web.

It is possible that ad click fraud will become more frequent and that negative behavioral factors will be artificially inflated to spite competitors. I am against such methods.

We need to compete by developing our service and product; that is the only right way.
