How can I prevent someone from using curl or file_get_contents to fetch my page's HTML?
For example, my domain is:
www.domain.com
and someone's PHP page does:
<?php
$info = file_get_contents('http://www.domain.com/theinfo.php');
?>
I can try to check the User-Agent header, but that's not a reliable way to do it.
What is the best way to detect when someone is trying to scrape the page content?
What I built contains information that many people will try to copy to their own websites, and that could overload my server.
The user agent can indeed be changed through curl, but it's pretty much the only signal you have for telling whether someone is accessing your site through curl or not; nothing else in the request reliably distinguishes between them.
That being said, you could look for header fields that file_get_contents() by default leaves out, such as Accept and Accept-Language. A minimal sketch of such a check (exactly which headers a given browser sends varies, so treat the list below as an assumption about typical browsers):
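<?php
// Headers that mainstream browsers normally send but a bare
// file_get_contents('http://...') request omits (assumption:
// typical desktop browsers; unusual clients may differ).
$expected = array('HTTP_ACCEPT', 'HTTP_ACCEPT_LANGUAGE', 'HTTP_ACCEPT_ENCODING');

foreach ($expected as $header) {
    if (!isset($_SERVER[$header])) {
        // The request is missing a header a browser would send;
        // it's likely an automated client, so refuse it.
        header('HTTP/1.1 403 Forbidden');
        exit;
    }
}
?>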
Though you do run a slight risk of false negatives: a scraper that sets these headers explicitly (for example, curl with -H options) will slip through the check.
Use .htaccess to deny requests from the IP address of the server hosting the site that is copying your content. Paste code like the following into your .htaccess; this is a minimal sketch assuming Apache 2.2-style access control, with 123.45.67.89 standing in as a placeholder for the offending server's IP:
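Order Allow,Deny
Allow from all
Deny from 123.45.67.89

On Apache 2.4 the equivalent uses Require directives instead:

<RequireAll>
    Require all granted
    Require not ip 123.45.67.89
</RequireAll>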
If you are concerned about anyone, not just a specific IP or domain, taking your content, you should implement some kind of registration process for your site. Filtering with Apache will probably cause more problems than it is worth. Ask yourself whether what you are putting on the internet is really meant for every person and machine out there to do with as they please; if it isn't, it should either be behind a login or not on the internet at all.
Here is a very simple-to-use PHP library for implementing a login and/or registration system: https://github.com/panique/php-login
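Independent of that library, the gate itself can be as small as a session check at the top of the protected page. A minimal sketch, assuming your login code sets a $_SESSION['user_id'] key on success (both that key name and /login.php are placeholders; use whatever your registration system actually stores):

<?php
session_start();

// Only serve the content to visitors who have logged in.
// Assumes the login script sets $_SESSION['user_id'] on success;
// adjust the key and redirect target to match your own system.
if (empty($_SESSION['user_id'])) {
    header('Location: /login.php');
    exit;
}

// ... protected content of theinfo.php goes here ...
?>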