I'd like to automate deploying our site to AWS S3. I've written a Node script that builds and uploads the site, but I'd like the script to run automatically whenever the master branch of our repo is updated on GitHub.
I looked into AWS CodeDeploy, but it looks like that's specifically for deploying to EC2. I've also looked at AWS Lambda, but there doesn't seem to be a clear way to pull a copy of the repo with git so I can run the script.
Any services (preferably tied to AWS) that I can use?
I have used Deploy Bot in the past and I have been quite happy with it.
It can push to S3 (or even FTP) via Git, run your build scripts, and even send a Slack notification for you.
https://deploybot.com/
You can set this up with a very simple 2-step CodePipeline. Basically, you just fill out the different sections in the AWS Console. There's no need for a separate CI tool and its added complexity.
In the first step of the pipeline, pull from GitHub and store in S3. You can easily set this up through the AWS Console.
In the next CodeDeploy step, you can use the AWS CLI (pre-installed in CodeDeploy) to do an `aws s3 sync` of your build output to your bucket.
You'll have to set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables so that the AWS CLI can run during your deploy step; that can also be done in the AWS Console for CodeDeploy, in the Advanced section under "environment variables". Once the environment variables have been set, and provided that AWS user has the correct permissions, you can run any aws-cli command you want inside your CodeDeploy step.
Once this is done, when you check in to Github, CodePipeline will kick off, and a few minutes later your files will be on S3.
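As a sketch, the deploy step can boil down to a small shell script like the one below. The bucket name, build directory, and the npm build command are placeholders standing in for your own script's equivalents:

```shell
#!/bin/bash
# deploy.sh -- run as a CodeDeploy hook after the source has been pulled.
# Assumes AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are set in the
# CodeDeploy environment variables as described above.
set -euo pipefail

BUILD_DIR="build"                 # placeholder: your build output directory
BUCKET="s3://my-site-bucket"      # placeholder: your S3 bucket

npm install                       # placeholder: however your site builds
npm run build

# Mirror the build output to S3; --delete removes remote files that
# no longer exist locally.
aws s3 sync "$BUILD_DIR" "$BUCKET" --delete
```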
I know it's not git deploy... but instead of setting up a CI box, I just used s3cmd.
http://s3tools.org/s3cmd
Executing this command syncs my build directory with s3.
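A typical invocation (the local path and bucket name here are illustrative) looks like:

```shell
# Mirror the local build directory to the bucket; --delete-removed
# deletes remote objects whose local files are gone, and --acl-public
# makes the uploaded objects readable by anyone.
s3cmd sync --delete-removed --acl-public build/ s3://my-site-bucket/
```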
I'm using it on Linux. Not sure what their OSX and Windows stories are.
If you're really after a git push solution, you could set up a timed job which pulls your git repo to a folder and then executes this against it. I do this elsewhere on a cheap Linux VM. Unless you're going to go full CI though, there's probably not much point.
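A minimal sketch of such a timed job, assuming a local clone at a placeholder path and an npm build (swap in your own build command):

```shell
#!/bin/bash
# poll-deploy.sh -- run from cron (e.g. "*/5 * * * *"); pulls the repo
# and re-syncs to S3 only when origin/master has new commits.
set -euo pipefail

cd /home/deploy/site              # placeholder: local clone of the repo

git fetch origin
if [ "$(git rev-parse HEAD)" != "$(git rev-parse origin/master)" ]; then
    git merge --ff-only origin/master
    npm run build                 # placeholder: your build command
    s3cmd sync --delete-removed build/ s3://my-site-bucket/
fi
```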
I also recommend Codeship: simple and easy, but you need to create an IAM user with the proper permissions (i.e. a policy) for the S3 bucket.
The basic plan for Codeship is free.
One caveat: as far as I can see, Codeship will not remove files from S3 when you remove them on GitHub; after all, S3 is not a Git repo. Still, the putObject operations for the usual GitHub updates work well enough for me.
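For reference, a minimal policy for such an IAM user might look like this (the bucket name is a placeholder, and the exact actions you need depend on what your CI tool does):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-site-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-site-bucket"
    }
  ]
}
```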
Rather than using an AWS service directly (as you say they nearly all expect a much more complicated setup, deploying to EC2 etc), you might be better off using a CI provider such as Shippable, Codeship or Wercker.
These all have the ability to fire from `git` updates, run build commands, install utilities into their CI images/containers, and copy files to S3. There's probably some startup which has built an exact tool for your purpose, but they haven't appeared on my radar yet :-)
I had the same goal some time ago and have now released a little tool which solves the problem, at least for me. It uses AWS Lambda and deploys a specific branch of the repository to S3 after each push. You can take full advantage of a GitHub deploy key, which has fewer permissions than a personal access token and can be configured per repository. Please take a look at github-bucket; it might help you too.