Continuous Deployment of Jigsaw Blog to Digital Ocean Droplet Using Bitbucket Pipelines
Author Name • August 24, 2018 12:20 PM
Well, after my last post about updating my Jekyll blog, I got on a roll and went ahead and ported my blog over to Jigsaw. As I mentioned in that post, I’ve enjoyed Jekyll, and I think it’s a great blogging engine. But Jigsaw (and Laravel) has been a breath of fresh air, and something new to play with. Along the way, I tweaked the design again, but more importantly, I pieced together a workflow for deploying my Jigsaw blog to Digital Ocean automatically via Bitbucket Pipelines, and I want to share it with anyone else who is interested.
But first, some background:
I’m listening to some different podcasts these days, mostly shifting from the Apple ecosystem to the web development world. In particular, I think these podcasts are pretty good:

- Syntax
- The Laravel Podcast
- Laravel News Podcast
- Full Stack Radio
Several of these podcasts have been sponsored by Netlify, and the hosts talk about how simple it is to deploy a static blog with Netlify. Then I watched Michael Dyrynda’s screencast about deploying a Jigsaw blog with Netlify. I tried it out, and sure enough, it was mind-blowingly simple. But, I already have a Digital Ocean droplet for other reasons, so I wanted to find out how to deploy to this droplet in the same way Netlify can deploy to its CDN.
Back to Jekyll... I use Bitbucket for source control, and I'd been hearing about Bitbucket Pipelines at a pretty high level for a while. It seemed like the right fit for automating deployments. After a bit of searching around, this post by Ayush Sharma gave me the tools I needed to deploy my Jekyll blog using Pipelines. It took some tinkering, but I finally got Pipelines to build my Jekyll blog and push it to Digital Ocean.
Back to Jigsaw, again... This whole time, I had Jigsaw building my blog locally on my Mac, so now I just needed to commit it to Bitbucket and have Pipelines deploy it the same way it deploys my Jekyll blog. I'll spare you the details, but the end product is quite simple. Once you have your deployment SSH keys in place in Bitbucket and on your remote host, and have defined an environment variable for your remote host, here are the script steps from the bitbucket-pipelines.yml file that runs when you push to master:
```yaml
- apt-get update && apt-get install -y unzip
- apt-get install -y rsync openssh-client
- curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
- composer install
- ./vendor/bin/jigsaw build production
- rsync -a build_production/ root@$PRODUCTION_HOST:/var/www/path_to_web_root/ --exclude=bitbucket-pipelines.yml --chown=www-data:www-data
```
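For context, those script steps live inside the usual Pipelines scaffolding. Here's a sketch of what the complete file might look like; the php:7.2 image name is my assumption (any image with PHP and a shell should work), and the rest of the structure is standard Bitbucket Pipelines layout:

```yaml
# Sketch of a complete bitbucket-pipelines.yml.
# Assumption: the stock php:7.2 Docker image; swap in whatever image fits.
image: php:7.2

pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update && apt-get install -y unzip
            - apt-get install -y rsync openssh-client
            - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
            - composer install
            - ./vendor/bin/jigsaw build production
            - rsync -a build_production/ root@$PRODUCTION_HOST:/var/www/path_to_web_root/ --exclude=bitbucket-pipelines.yml --chown=www-data:www-data
```

With this in the repository root, a push to master triggers the build and the rsync automatically. `$PRODUCTION_HOST` is the environment variable mentioned above, defined as a repository variable in the repo's Pipelines settings.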
This pipeline has been working flawlessly for a couple of weeks now. If I've left out any crucial details, please get in touch. Also, if anyone knows of any tweaks to make this process even better, I'd love that feedback. There may already be a Docker image that includes PHP 7.2 along with Composer and rsync; if not, maybe building one will be my next step to improve this process.
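One footnote on the SSH key setup mentioned earlier: on the droplet side, it boils down to authorizing the public key that Bitbucket generates under the repository's Pipelines SSH keys settings. A sketch, with the key value as a placeholder and `$HOME` standing in for the deploy user's home directory (root's, in my rsync command above):

```shell
# Append the Bitbucket Pipelines public key (copied from the repository's
# Pipelines SSH keys settings page) to the deploy user's authorized_keys.
# KEY is a placeholder; paste your real public key.
KEY='ssh-rsa AAAA...example-key pipelines@bitbucket'
mkdir -p "$HOME/.ssh"
echo "$KEY" >> "$HOME/.ssh/authorized_keys"
# SSH refuses keys with loose permissions, so lock these down.
chmod 700 "$HOME/.ssh"
chmod 600 "$HOME/.ssh/authorized_keys"
```

After that, the rsync step in the pipeline can connect without a password prompt.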