written by Nicolas Untz on 2017-04-01

Travis CI can automatically deploy your Lektor content if it has access to your repository; this is well documented here. You simply need to update the .travis.yml file to include a deploy statement that tells Travis to run the lektor deploy command:

deploy:
  provider: script
  script: "lektor deploy"

For this web site, we publish to AWS S3 using lektor-s3, so we need to provide the AWS credentials to Travis as environment variables (in Travis, go to More options → Settings to define them).
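lektor-s3 relies on boto3 for authentication, so the standard AWS variable names are what you would define in the Travis settings; the values below are placeholders, not real keys:

```shell
# Standard boto3 credential variables, picked up by lektor-s3 at deploy time.
AWS_ACCESS_KEY_ID=AKIA...         # access key of the dedicated IAM user
AWS_SECRET_ACCESS_KEY=...         # its secret key (mark it hidden in Travis)
```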

Note: we use dedicated AWS IAM credentials and policies to limit what Travis can do. First, a policy that restricts S3 access to this site's bucket:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1491117107000",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::nuntz.com"
            ]
        },
        {
            "Sid": "Stmt1491117177000",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::nuntz.com/*"
            ]
        },
        {
            "Sid": "Stmt1491117192000",
            "Effect": "Allow",
            "Action": [
                "s3:ListAllMyBuckets"
            ],
            "Resource": [
                "arn:aws:s3:::*"
            ]
        }
    ]
}

Second, because we use the CloudFront invalidation feature of lektor-s3, a policy that allows creating invalidations:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1491117901000",
            "Effect": "Allow",
            "Action": [
                "cloudfront:CreateInvalidation"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}
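For reference, policies like these can be attached to the dedicated IAM user from the AWS CLI instead of the console; the user name and file names below are hypothetical:

```shell
# Attach the S3 and CloudFront policies as inline policies on the deploy user.
aws iam put-user-policy --user-name travis-deploy \
    --policy-name s3-deploy --policy-document file://s3-policy.json
aws iam put-user-policy --user-name travis-deploy \
    --policy-name cloudfront-invalidate --policy-document file://cloudfront-policy.json
```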

For reference, here is the full .travis.yml file for this web site:

language: python
python: 2.7
cache:
  directories:
    - $HOME/.cache/pip
    - $HOME/.cache/lektor/builds
install: "pip install Lektor"
script: "lektor build"
deploy:
  provider: script
  script: "lektor deploy"

written by Nicolas Untz on 2017-02-06

I currently host this web site on an AWS nano instance. An alternative is S3 static website hosting combined with CloudFront, which I suspect is more cost effective. The nice thing about the S3 option is that you no longer have to manage a server. CloudFront is AWS's CDN solution: it distributes copies of the content to edge nodes for faster client access, and it also supports SNI TLS.

In order to test this configuration I created a secondary domain, nuntz.net, hosted on S3/CloudFront, which you can test by using https://nuntz.net/ instead of https://nuntz.com/.

S3 Deployment

Here is the procedure to deploy the site on S3 using Lektor:

1. Create the S3 bucket (nuntz.net here) and attach a bucket policy that allows public reads:

{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AddPerm",
    "Effect": "Allow",
    "Principal": "*",
    "Action": ["s3:GetObject"],
    "Resource": ["arn:aws:s3:::nuntz.net/*"]
  }]
}

2. Install the lektor-s3 plugin:

$ lektor plugins add lektor-s3

3. Declare the S3 server in the Lektor project file:

[servers.s3]
name = S3
enabled = yes
target = s3://nuntz.net

4. Deploy:

$ lektor deploy s3
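If you prefer the command line to the S3 console, the bucket policy can be applied with the AWS CLI (the policy file name is hypothetical):

```shell
# Attach the public-read policy to the website bucket.
aws s3api put-bucket-policy --bucket nuntz.net --policy file://bucket-policy.json
```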

Faster Access with CloudFront

We can speed up access to the site using AWS's CDN (CloudFront) by putting a distribution in front of the S3 bucket.

That's it, the site is now available.

Performance Comparison

So, how does S3/CloudFront compare to the nano EC2 instance? Well, it is slightly faster, but not by much; see for yourself:

In conclusion, given the simplicity of the solution, I will likely switch to S3/CloudFront in the future...

written by Nicolas Untz on 2017-01-22

After Docker, I am now experimenting with Kubernetes, and more specifically Minikube. Minikube allows you to run Kubernetes locally inside a VM.

I am trying to set up, using Kubernetes, a configuration similar to what I did earlier with Docker Compose.

diagram

More details here.

written by Nicolas Untz on 2016-12-30

I have been using Grafana for a while now to track our bandwidth usage on our Ubiquiti ER-X router. Grafana is great: it supports a number of data sources, including Graphite, Prometheus, and InfluxDB.

Screenshot

I chose InfluxDB and its sibling, Telegraf, to collect data from the router over SNMP. The environment was running on a Raspberry Pi, which was a real challenge to maintain, especially when building Grafana on ARM. Since I was also running the UniFi controller on the same Raspberry Pi, overall performance was becoming an issue, so I decided to upgrade to an Intel NUC for better performance and the ability to deploy standard Docker containers.

Installing Ubuntu 16.04 LTS and Docker was straightforward, as documented in this other page. Then I created a Docker Compose file to define the various container images required, and their configuration. The biggest challenge was the missing SNMP tools in the Telegraf official image. I ended up creating a new Docker Hub repository, with a new image that includes Net-SNMP. I documented the whole procedure here.
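The image change boils down to a small Dockerfile; here is a minimal sketch, assuming the Debian-based official Telegraf image (snmp-mibs-downloader lives in Debian's non-free component, so it may need that repository enabled):

```dockerfile
# Start from the official Telegraf image and add the Net-SNMP tools
# (snmpget, snmpwalk, MIBs) that the SNMP input expects to find.
FROM telegraf
RUN apt-get update && \
    apt-get install -y --no-install-recommends snmp snmp-mibs-downloader && \
    rm -rf /var/lib/apt/lists/*
```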

Bonus: I am also now running UniFi controller as a Docker container.

written by Nicolas Untz on 2016-12-02

I just found this cool little trick on Stack Overflow to send data from the Python interpreter to the Mac pasteboard. The use case for me was analyzing some large JSON objects and going back and forth between the interpreter and Vim, for my unicli project.

In your interpreter just add:

import subprocess

def write_to_clipboard(output):
    # Pipe the text into pbcopy, the macOS pasteboard utility.
    process = subprocess.Popen(
        'pbcopy', env={'LANG': 'en_US.UTF-8'}, stdin=subprocess.PIPE)
    process.communicate(output.encode('utf-8'))

Then, for example, to send the text of a response from the requests library:

write_to_clipboard(r.text)

Then switch to your favorite text editor to paste the content of the pasteboard. That's how I was able to easily use JSBeautify on the JSON objects to make them easier to read.
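If all you need is readable JSON, the standard library can do the re-indenting before the text ever reaches the pasteboard; a small sketch (the r.text value is hypothetical):

```python
import json

def prettify(raw):
    # Re-serialize a JSON string with indentation and stable key order.
    return json.dumps(json.loads(raw), indent=2, sort_keys=True)

raw = '{"name": "unicli", "tags": ["json", "cli"]}'
print(prettify(raw))
```

Combined with the function above, write_to_clipboard(prettify(r.text)) puts the formatted version straight on the pasteboard.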

written by Nicolas Untz on 2016-11-24

I just enabled SSL on the site, using the Let's Encrypt service, which provides free SSL certificates with auto-renewal.

I documented the process here.

written by Nicolas Untz on 2016-11-23

I built this site to document my various projects, using Lektor. It is definitely a bit more complex than a simple Google Site, or even GitHub Pages, but it looks fun and interesting. We shall see.