I am using CodeIgniter with sessions stored in my database. Over a short period of time, a large number of sessions are created by bots, spiders, etc.
Is there a way of preventing this? Perhaps via .htaccess?
First and foremost, you should create a robots.txt file in the web root of the domain to address two issues. First, it lets you control the rate at which the website is crawled, which helps prevent a bot or spider from creating a massive number of database sessions in a short window. Second, it lets you block specific bots from crawling the website at all. Use the defaults below as a starting point; you will likely want to add or remove denied user agents and adjust the crawl rate for your site.
Sample Code:
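A minimal robots.txt sketch along those lines. The crawl delay and the blocked user agents are illustrative assumptions, not a definitive list, and note that not every crawler honors Crawl-delay (Googlebot, for example, ignores it):

    # Ask well-behaved crawlers to wait 10 seconds between requests
    # (honored by e.g. Bing and Yandex; Googlebot ignores Crawl-delay)
    User-agent: *
    Crawl-delay: 10
    Disallow:

    # Deny specific bots entirely (example user agents -- adjust for your site)
    User-agent: MJ12bot
    Disallow: /

    User-agent: AhrefsBot
    Disallow: /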
There are two important considerations when using /robots.txt: