How to protect against direct access to images?

Posted 2019-01-14 03:27

I would like to create a web site with many images, but I want to protect them against direct access, e.g. hotlinking to the images without visiting the web site.

What is the preferred way to do this? And what are the alternatives with Pros and cons?

I have some ideas (I don't know if they are possible):

  • File permissions
  • PHP Sessions
  • Temporary file names or URLs
  • HTTP Redirection?

Maybe this isn't common practice on many web sites? E.g. I tried to access a private photo on Facebook without being logged in, and I could still view the photo.

The platform will probably be an Ubuntu machine with Nginx and PHP.

8 Answers
乱世女痞
#2 · 2019-01-14 03:38

I use both methods: checking the user agent and the referrer. The user agent is checked in .htaccess, and the referrer is checked in the PHP file. You can see it at http://coloring-4kids.com

Here is my code:

RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?pinterest\.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?pinterest\.com$ [NC]

RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot|msnbot|yahoo-mmcrawler|YandexImages) [NC]
RewriteCond %{HTTP_USER_AGENT} !googlebot-image [NC]
RewriteCond %{HTTP_USER_AGENT} !googlebot [NC]
RewriteCond %{HTTP_USER_AGENT} !googlebot-news [NC]
RewriteCond %{HTTP_USER_AGENT} !googlebot-video [NC]
RewriteCond %{HTTP_USER_AGENT} !googlebot-mobile [NC]
RewriteCond %{HTTP_USER_AGENT} !mediapartners-google [NC]
RewriteCond %{HTTP_USER_AGENT} !mediapartners [NC]
# RewriteCond %{HTTP_USER_AGENT} !adsbot-google [NC]
RewriteCond %{HTTP_USER_AGENT} !bingbot [NC]
RewriteCond %{HTTP_USER_AGENT} !facebookexternalhit [NC]
RewriteCond %{HTTP_USER_AGENT} !baiduspider [NC]
RewriteCond %{HTTP_USER_AGENT} !yandex [NC]
RewriteCond %{HTTP_USER_AGENT} !sogou [NC]
RewriteCond %{HTTP_USER_AGENT} !twitterbot [NC]
RewriteCond %{HTTP_USER_AGENT} !pinterest [NC]


RewriteRule (^.*\.(gif)$) /watermark.php?src=$1 [L]

watermark.php

<?php

// watermark.php: invoked by the RewriteRule above for .gif requests.
// Hotlinked copies get a watermark; requests from this site (and requests
// whose referrer contains "media") are served unchanged.

$referer       = getenv("HTTP_REFERER");
$refererPrefix = substr($referer, 0, 25);   // long enough to hold http://coloring-4kids.com

// The script outputs a GIF, so send the matching content type.
header('Content-Type: image/gif');

$image     = imagecreatefromgif($_GET['src']);
$watermark = imagecreatefromgif('watermark.gif');

$watermark_width  = imagesx($watermark);
$watermark_height = imagesy($watermark);

// Position the watermark near the top-right corner of the image.
$dest_x = imagesx($image) - $watermark_width;
$dest_y = imagesy($image) - $watermark_height;

// Referrers containing "media" in their first characters (e.g. Pinterest media
// fetches) are not watermarked.
$fromMediaProxy = (strpos($refererPrefix, 'media') !== false);

// Watermark only if the request did not come from this site, the image is
// wider than 400 px, and it was not a media-proxy fetch.
if ($refererPrefix != 'http://coloring-4kids.com' && imagesx($image) > 400 && !$fromMediaProxy) {
    imagecopymerge($image, $watermark, $dest_x - 5, 5, 0, 0, $watermark_width, $watermark_height, 100);
}

imagegif($image);

imagedestroy($image);
imagedestroy($watermark);

?>
贼婆χ
#3 · 2019-01-14 03:39

You could use a PHP script to retrieve the images using something like:

<img src="http://mysite.com/getimage.php?id=001" />

and have the PHP script return the image data only after confirming that the domain of the HTTP_REFERER is yours.

If you have an account-oriented site, I suggest using PHP sessions as you stated and have the PHP script verify the session before returning the image data.
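A minimal sketch of such a getimage.php for an account-oriented site; the session key, the image folder (outside the document root) and the id-to-file mapping below are assumptions you would adapt:

<?php
// getimage.php (sketch): serve an image only when the session is valid and,
// if a referrer was sent, it points at this site. Paths, the session key and
// the id format are placeholders for illustration.
session_start();

if (empty($_SESSION['user_id'])) {                 // assumed session key
    http_response_code(403);
    exit('Forbidden');
}

// Browsers sometimes omit the Referer header, so only reject a mismatch.
$referer = $_SERVER['HTTP_REFERER'] ?? '';
if ($referer !== '' && parse_url($referer, PHP_URL_HOST) !== 'mysite.com') {
    http_response_code(403);
    exit('Forbidden');
}

// Map the numeric id to a file stored outside the document root.
$id   = preg_replace('/\D/', '', $_GET['id'] ?? '');
$file = '/var/www/private-images/'.$id.'.jpg';

if ($id === '' || !is_file($file)) {
    http_response_code(404);
    exit('Not found');
}

header('Content-Type: image/jpeg');
header('Content-Length: '.filesize($file));
readfile($file);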

太酷不给撩
#4 · 2019-01-14 03:43

This is going to be hard to do. For your clients' web browsers to display the pictures, the files have to be readable by the web server that serves them, so file permissions alone won't help. And once an image has reached the browser, you can't stop someone from downloading it and doing something with it.

If you only want to stop direct linking, you can change the filenames on a regular basis and update your own pages to match; any external page that hotlinks an old name will then have a broken link.
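A rough sketch of this idea in PHP, assuming a shared secret, a daily rotation period, and some rewrite rule or front controller that maps the rotated name back to the real file on disk (all names below are hypothetical):

<?php
// Sketch of periodic filename rotation. Your templates always emit the
// current rotated name; links copied yesterday stop working once the
// rotation period rolls over.
const ROTATION_SECRET = 'change-me';   // placeholder secret
const ROTATION_PERIOD = 86400;         // rotate once per day

function rotated_name(string $realName): string {
    $bucket = intdiv(time(), ROTATION_PERIOD);                  // changes every period
    $token  = hash_hmac('sha256', $realName.'|'.$bucket, ROTATION_SECRET);
    return substr($token, 0, 16).'-'.$realName;                 // e.g. 3f2a9c...-cat.jpg
}

// In your page templates:
echo '<img src="/img/'.rotated_name('cat.jpg').'">';

The serving side should accept only the current bucket's name (or the current and previous one, so pages rendered just before a rollover don't break).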

够拽才男人
#5 · 2019-01-14 03:54

Add a simple .htaccess file to your site folder with the following lines:

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http://www\.your-domain\.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www\.your-domain\.com$ [NC]
RewriteRule .*\.(wav|swf|jpg|jpeg|gif|png|bmp|js|css)$ - [F,NC,L]

Note that I also included js and css files, even though it seems unlikely that anyone would try to hotlink those.

混吃等死
#6 · 2019-01-14 03:54

This might be useful: Allow/deny image hotlinking with .htaccess

Edit: One thing to note about this method is that some browser, antivirus, or firewall software strips the Referer header, which would cause potentially legitimate users to be treated as hotlinkers.

If your site already uses some kind of authentication or session system, then it would be better to use the method given in @Mark Baijens' answer.

Update: Nginx configuration to prevent hotlinking:

location ~* (\.jpg|\.png|\.css)$ {
    valid_referers blocked mydomain.com www.mydomain.com;
    if ($invalid_referer) {
        return 444;
    }
}
迷人小祖宗
#7 · 2019-01-14 03:56

You can dynamically protect a folder using .htaccess and the user's IP.

Add a .htaccess file to your images folder with the following lines:

order deny,allow
deny from all

Then use PHP to insert the user's IP into the .htaccess file when they log in, like this:

<?php
// Run this from your login script (after session_start()), so the
// logged-in user's IP is whitelisted for the image folder.
$ip = $_SERVER['REMOTE_ADDR'];

// Validate the IP so the user cannot inject anything else into .htaccess.
if (!filter_var($ip, FILTER_VALIDATE_IP)) exit();

$file = $_SERVER["DOCUMENT_ROOT"].'/YOUR_IMAGE_FOLDER/.htaccess';

// Append "allow from <ip>", tagged with the user's id so it can be removed at logout.
$line = "allow from ".$ip." #".$_SESSION['id']."\n";
file_put_contents($file, $line, FILE_APPEND | LOCK_EX);
?>

The folder will then be blocked for any IP that is not logged in.

Notice that I checked that the IP is valid. It is important that you give the user no way to inject their own code into your .htaccess file.

Also notice that I put the user's id in a comment to the right of the IP in the .htaccess file. When the user logs out, you can search the .htaccess file for that comment and remove the user's IP.
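A minimal sketch of that logout cleanup, assuming the same folder path and session key as in the login snippet above:

<?php
// Remove the "allow from <ip> #<id>" line that was added for this user.
session_start();

$file  = $_SERVER["DOCUMENT_ROOT"].'/YOUR_IMAGE_FOLDER/.htaccess';
$lines = file($file, FILE_IGNORE_NEW_LINES);

// Keep every line that is not tagged with this user's id.
$kept = array_filter($lines, function ($line) {
    return strpos($line, '#'.$_SESSION['id']) === false;
});

file_put_contents($file, implode("\n", $kept)."\n", LOCK_EX);
?>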

You can update this on every request so that users on dynamic IPs don't get locked out.

I use this method for my entire members area; it is an excellent added layer of security. Just make sure you put your login scripts outside the members folder.
