I am trying to configure nginx so that whenever there is a bad gateway response, it fetches static HTML content from an S3 bucket.
The URL structure of the request is some_bucket/folder1/folder2/text,
and the data is stored in the S3 bucket with the directory structure s3.amazonaws.com/some_bucket/folder1/folder2/folder1_folder2.html.
I am not able to determine the values of folder1 and folder2, so I cannot build the HTML file name dynamically and use proxy_pass. I also tried try_files, but I don't think that works for URLs.
Any idea how to tackle this problem?
Thanks.
An nginx S3 proxy can handle dynamically built URLs; it can also hide a directory and even part of a private URL, such as an AWS key. For instance, the base URL requested by the client can be mapped to a different resulting URL on S3, and the configuration is not difficult (see the full configuration for more details).
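As a rough sketch (the /files/ path, the private_prefix, and the resolver below are illustrative assumptions, not values from the original configuration), a dynamically built S3 URL that hides the bucket path from the client could look like this:

    # Sketch: the client requests /files/report.html and nginx maps it to a
    # bucket path that never appears in the public URL.
    location ~* ^/files/(.*)$ {
        # a resolver is required because proxy_pass contains a variable
        resolver 8.8.8.8;
        proxy_set_header Host s3.amazonaws.com;
        # hide S3-specific response headers from clients
        proxy_hide_header x-amz-id-2;
        proxy_hide_header x-amz-request-id;
        proxy_pass http://s3.amazonaws.com/some_bucket/private_prefix/$1;
    }

Note that a plain proxy_pass like this only serves objects that are publicly readable; proxying private objects would additionally require signed requests.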
This is what I did, for anyone (probably a newbie) who may encounter this problem.
~* means a case-insensitive regex match, ^ anchors the match, and () is used for capturing parameters.
For example, a user enters www.example.com/some_bucket/folder1/folder2/text
Then it is processed as follows:
~* ensures a case-insensitive match (for a case-sensitive match, drop the *, i.e. just use ~).
^ anchors the match at the start of the request path (everything after www.example.com).
/some_bucket/ is then matched literally.
.* means any number of any characters (to match only digits, replace it with [0-9]*).
() ensures that the matched value gets captured.
So $1 captures folder1
and $2 captures folder2.
Then the final .* without parentheses matches any characters but does not capture them.
Now the captured values can be used to find the file in the Amazon S3 bucket using
proxy_pass http://s3.amazonaws.com/some_bucket/$1/$2/$1_$2.html
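Putting it together, a minimal location block implementing this might look like the sketch below; the regex is reconstructed from the explanation above, and the resolver line is needed because the proxy_pass URL contains variables:

    location ~* ^/some_bucket/(.*)/(.*)/.* {
        # $1 = folder1 and $2 = folder2, captured by the two (.*) groups
        resolver 8.8.8.8;
        proxy_pass http://s3.amazonaws.com/some_bucket/$1/$2/$1_$2.html;
    }

With this in place, a request for www.example.com/some_bucket/folder1/folder2/text is proxied to http://s3.amazonaws.com/some_bucket/folder1/folder2/folder1_folder2.html.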
This article on nginx server and location block selection algorithms can be helpful: https://www.digitalocean.com/community/tutorials/understanding-nginx-server-and-location-block-selection-algorithms