GitLab CI S3 Deployment

Jun 12, 2016 in TIL using tags gitlab, aws, s3, hugo

I’ve just set up a new static site: a single HTML page with a couple of images, built with the wonderful Hugo static site generator.

I want to host it on Amazon S3, behind the Amazon CloudFront CDN, using a free SSL certificate from AWS Certificate Manager (ACM). This should give me near-infinite scaling, solid infrastructure and a very minimal price.

In the past I’ve used a more heavy-duty static site generator, Middleman, and stuck a couple of commands from middleman-s3_sync into a Makefile. The disadvantage of this is that it runs locally: it’s another thing to remember to do.

Since this static site’s code lives on GitLab.com, I have the free GitLab CI available to me.

I chose not to use GitLab Pages for this but I did adopt a similar technique.

Here’s how I did it:

  1. Set up the S3 bucket (as a website), create a CloudFront distribution pointing at that website endpoint as its origin, and issue a new SSL cert via ACM in us-east-1 for the correct domain names. Then point your DNS at the CloudFront distribution’s domain name (CNAME or ALIAS record type).
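
     If you’d rather script this step than click through the console, the bucket and cert parts look roughly like this with the AWS CLI (a sketch: the bucket name and region are placeholders, and the CloudFront distribution itself is simpler to create in the console):

    # create the bucket and enable static website hosting
    aws s3api create-bucket --bucket rjoc-example-bucket.com --region eu-west-1 \
        --create-bucket-configuration LocationConstraint=eu-west-1
    aws s3 website s3://rjoc-example-bucket.com/ --index-document index.html
    # ACM certs for CloudFront must be requested in us-east-1
    aws acm request-certificate --domain-name rjoc-example-bucket.com --region us-east-1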

  2. Create a bucket policy. Since this is a website, let’s make it all public:

    {
        "Version": "2008-10-17",
        "Statement": [
            {
                "Sid": "AllowPublicRead",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "*"
                },
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::rjoc-example-bucket.com/*"
            }
        ]
    }
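
     You can paste this into the bucket’s Permissions tab in the S3 console, or push it with the AWS CLI (assuming the JSON above is saved as policy.json):

    aws s3api put-bucket-policy --bucket rjoc-example-bucket.com --policy file://policy.json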
    
  3. Create an IAM user with access to the bucket (this could, and should, be much more specific, but it will do as an example):

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Stmt1418378429000",
                "Effect": "Allow",
                "Action": [
                    "s3:*"
                ],
                "Resource": [
                    "arn:aws:s3:::rjoc-example-bucket.com/*",
                    "arn:aws:s3:::rjoc-example-bucket.com"
                ]
            }
        ]
    }
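
     The console works here too, but scripted it looks roughly like this (the user name is a placeholder, and it assumes the policy above is saved as deploy-policy.json):

    aws iam create-user --user-name gitlab-ci-deploy
    aws iam put-user-policy --user-name gitlab-ci-deploy \
        --policy-name s3-deploy --policy-document file://deploy-policy.json
    # prints the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY pair used in the next step
    aws iam create-access-key --user-name gitlab-ci-deploy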
    
  4. Take the AWS_SECRET_ACCESS_KEY, AWS_ACCESS_KEY_ID, AWS_REGION (of the bucket) and AWS_BUCKET values. Add these as secret variables in the GitLab project settings (so they stay secure rather than being committed to git). Note that the AWS CLI itself reads the region from AWS_DEFAULT_REGION, so you may prefer that name.
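
     The CI job’s environment then effectively contains something like this (values are placeholders):

    export AWS_ACCESS_KEY_ID=AKIAEXAMPLE
    export AWS_SECRET_ACCESS_KEY=examplesecretkey
    export AWS_DEFAULT_REGION=eu-west-1   # the name the AWS CLI reads
    export AWS_BUCKET=rjoc-example-bucket.com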

  5. Create a .gitlab-ci.yml file in the project root and push:

    # .gitlab-ci.yml
    image: publysher/hugo
    
    
    production:
      type: deploy
      before_script:
      - apt-get update
      - apt-get -qq install python python-pip ca-certificates
      - pip install awscli
      script:
      - hugo
      - aws s3 sync public/ s3://$AWS_BUCKET
      artifacts:
        paths:
        - public
      only:
      - master
    

This file instructs CI, on pushes to master, to run the job using the publysher/hugo Docker image.

CI then installs the AWS CLI (and Python, as it’s a dependency).

$ hugo is then run: this compiles the static site and dumps it to ./public.

Then the aws CLI is used to sync ./public to our S3 bucket. The CLI pulls the access credentials and region from the configured environment variables.
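
A couple of optional extras if you want them (the distribution ID is a placeholder, and older awscli releases need the CloudFront commands enabled before use):

    # also remove files from the bucket that no longer exist locally
    aws s3 sync public/ s3://$AWS_BUCKET --delete
    # bust CloudFront’s cache after a deploy
    aws cloudfront create-invalidation --distribution-id E1EXAMPLE123 --paths "/*"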

And there we have it: we’ve very quickly got our static site onto S3, hosted behind an SSL certificate and a global CDN, for a few cents a month (depending on activity), and all automated on a $ git push!