Nginx: Prevent direct access to static files

Published 2020-07-16 08:54

I've been searching for a while now but didn't manage to find anything that fits my needs. I don't need hotlinking protection so much as I'd like to prevent people from directly accessing my files. Let's say:

My website.com requests website.com/assets/custom.js — that should work, but I'd like visitors who visit this file directly to get a 403 status code or something. I really have no idea if it's possible, and I don't have any logical steps in mind.

Regards!

Tags: nginx

3 Answers
霸刀☆藐视天下
#2 · 2020-07-16 09:12

You can use the nginx referer module: http://nginx.org/en/docs/http/ngx_http_referer_module.html. Something like this:

server {
    listen 80;
    server_name website.com;
    root /var/www/website.com/html;
    location /assets/ {
        valid_referers website.com website.com/index.html website.com/some_other_good_page.html;
        # "deny" is not valid inside an "if" block, so return 403 directly
        if ($invalid_referer) {
            return 403;
        }
    }
}

This config guards the assets directory. But remember that this is not a guarantee and only works against browsers: anybody can emulate a valid request with curl or telnet, since the Referer header is set by the client. For true safety you need dynamically generated pages with dynamically generated links.

You do not need to create the variable $invalid_referer yourself, as it is set by the referer module.
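A common variant of the same idea, as a sketch: valid_referers also accepts the special values blocked (for referers stripped by proxies or firewalls) and server_names (any name listed in this server's server_name directive), plus wildcard entries. The *.website.com wildcard below is illustrative:

```nginx
location /assets/ {
    # blocked: Referer present but mangled by a proxy/firewall
    # server_names: any hostname from this server block's server_name
    valid_referers blocked server_names *.website.com;
    if ($invalid_referer) {
        return 403;
    }
}
```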

Luminary・发光体
#3 · 2020-07-16 09:22

You can simply deny access to any folder or file by adding a block like this with your folders' names:

location ~ ^/(no_access_folder|folder_2)/
{
    # deny all answers every request with 403; anchoring the regex with ^
    # keeps it from matching these names deeper in the URI
    deny all;
}
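If the goal is to serve these files only when your own application hands them out, nginx's internal directive is also worth noting. A minimal sketch, with illustrative paths: an internal location can only be reached via an internal redirect (e.g. a backend response header X-Accel-Redirect), and direct external requests to it get a 404:

```nginx
location /protected/ {
    # only reachable via internal redirects such as
    # "X-Accel-Redirect: /protected/custom.js" from a backend;
    # a direct browser request to /protected/custom.js returns 404
    internal;
    alias /var/www/website.com/private/;
}
```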
The star
#4 · 2020-07-16 09:28

If your nginx-powered development instances are showing up in Google search results, there is a quick and easy way to prevent search engines from crawling your site. Add the following line to the location block of the virtual host configuration for the site you want to keep out of search results.

add_header  X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
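For context, a sketch of where that line goes (dev.website.com and the root path are hypothetical):

```nginx
server {
    listen 80;
    server_name dev.website.com;  # hypothetical development vhost

    location / {
        # attached to every response from this location
        add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
        root /var/www/dev.website.com/html;
    }
}
```

Note that headers set with add_header apply only to successful responses (and a few redirect codes) unless the always parameter is added.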