My website is often down because a spider is accessing too many resources. That is what my hosting provider told me, and they told me to ban these IP addresses:
46.229.164.98
46.229.164.100
46.229.164.101
But I have no idea how to do that.
I've googled a bit and have now added these lines to the .htaccess file in the site root:
# allow all except those indicated here
<Files *>
    Order Allow,Deny
    Allow from all
    Deny from 46.229.164.98
    Deny from 46.229.164.100
    Deny from 46.229.164.101
</Files>
Is this 100% correct? What else could I do? Please help; I really don't know how to handle this.
Based on these Project Honey Pot reports:
https://www.projecthoneypot.org/ip_46.229.164.98
https://www.projecthoneypot.org/ip_46.229.164.100
https://www.projecthoneypot.org/ip_46.229.164.101
it looks like the bot is SemrushBot (http://www.semrush.com/bot.html).
If that is actually the robot, its page says:
To remove our bot from crawling your site simply insert the following lines to your
"robots.txt" file:
User-agent: SemrushBot
Disallow: /
Of course that does not guarantee the bot will obey the rules. You can block it in several ways; .htaccess is one of them, just as you did.
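A side note: the Order/Allow/Deny directives in your question are the Apache 2.2 style. If your host runs Apache 2.4, a minimal sketch of the same IP block using the newer mod_authz_core/mod_authz_host syntax (same three IPs, and assuming those modules are enabled) would be:
# Apache 2.4 style: allow everyone except the three reported IPs
<RequireAll>
    Require all granted
    Require not ip 46.229.164.98
    Require not ip 46.229.164.100
    Require not ip 46.229.164.101
</RequireAll>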
You can also use a little trick: deny ANY IP address whose User-Agent string contains "SemrushBot":
Options +FollowSymLinks
RewriteEngine On
RewriteBase /
# flag any request whose User-Agent starts with one of these strings
SetEnvIfNoCase User-Agent "^SemrushBot" bad_user
SetEnvIfNoCase User-Agent "^WhateverElseBadUserAgentHere" bad_user
# refuse every request that was flagged above
Deny from env=bad_user
This way you will also block other IP addresses that the bot may use.
See more on blocking by User-Agent string here: https://stackoverflow.com/a/7372572/953684
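Again, if you are on Apache 2.4, here is a sketch of the same User-Agent trick with Require instead of Deny (assuming mod_setenvif and mod_authz_host/mod_authz_core are available):
# same SetEnvIfNoCase flag as above, refused with 2.4-style directives
SetEnvIfNoCase User-Agent "^SemrushBot" bad_user
<RequireAll>
    Require all granted
    Require not env bad_user
</RequireAll>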
I should add that if a single spider can bring your site down, it usually means you have a badly written script or a very weak server.
Edit:
This line:
SetEnvIfNoCase User-Agent "^SemrushBot" bad_user
matches only if the User-Agent begins with the string SemrushBot (the caret ^ means "beginning with"). If you want to match SemrushBot ANYWHERE in the User-Agent string, simply remove the caret so it becomes:
SetEnvIfNoCase User-Agent "SemrushBot" bad_user
The above matches if the User-Agent contains the string SemrushBot anywhere (yes, there is no need for .*).
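To make the difference concrete, here is a small sketch; the quoted User-Agent is only an assumed example of what SemrushBot might send (based on the bot page linked above):
# Assumed example of a SemrushBot request header:
#   User-Agent: Mozilla/5.0 (compatible; SemrushBot/...; +http://www.semrush.com/bot.html)
# Would NOT flag the request above, because the User-Agent begins with "Mozilla/5.0":
SetEnvIfNoCase User-Agent "^SemrushBot" bad_user
# Would flag it, because the pattern may match anywhere in the string:
SetEnvIfNoCase User-Agent "SemrushBot" bad_user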
You are doing the right thing, BUT that blocking code has to go in the .htaccess file, not in the robots.txt file.
If you want to disallow a particular search engine from crawling your site via robots.txt, the rules should look like this:
User-agent: Googlebot
Disallow: /
That asks Google's crawler not to crawl your site.
I would prefer the .htaccess method, by the way.