Moving a database driven side project to Netlify

I recently moved a side project to Netlify. Netlify, if you aren’t familiar with it, is a static file hosting service with a great workflow, free SSL certificates, and a built-in CDN. I made the move because the side project was a database driven site but didn’t really use any other server side interactivity (no user generated content, etc). I haven’t invested much in the side project over the past couple of years, but it still gets tens of hits a day and is useful to users. It’s a top hit on Google for certain keywords. I didn’t want to spend time updating the underlying application, but I did want the site to be more secure. The site is only updated over a month or so every year; it is read only for the other eleven months, as it provides information on a very seasonal service.

Moving to Netlify (or, frankly, any other static site host) provides the following benefits:

  • extremely low cost (Netlify is free for the level of traffic I have).
  • faster browsing experience for the end user (due to being behind a CDN).
  • free SSL certificates, with none of the hassle of setting up Let’s Encrypt myself.
  • indirection–the source site can now live anywhere. I could put it on an EC2 instance that I only boot up when I need to push a new build, or I could even host it locally. This means I don’t need to worry about updating the source application.

I could have chosen to do this with S3/CloudFront/AWS Certificate Manager/Lambda (or with technologies from some other cloud provider). But even though I’m familiar with all the steps involved, it was a lot of work. Netlify was free and had already done the grunt work of bringing all these technologies together (or similar ones; I’m not familiar with the tech behind their site, but they do write about it a bit).

What steps did I take to move a database driven directory site to Netlify? Enough that I wanted to document this for my future self.

  • Change the DNS for the site to have a lower TTL. During the transition from your site to Netlify, users may be served either the old site or the new one, and you want to minimize that window.
  • Remove all forms and other server side interactivity that your site has. You can replace them with javascript driven interactivity (like Disqus for comments). Netlify has some support for functions, but I didn’t explore that.
  • Set up Netlify using a default URL (they provide it, it is something like ‘foo-bar-123.netlify.com’). I deploy from Bitbucket. (I know some people hate on Bitbucket, and I don’t like all of their UI, but they are free for private repos, which is a win for me.) This let me understand how to deploy a simple one page site.
  • Download the site. I used wget: wget -mk http://mysite.com/. The -m switch sets up mirroring and the -k switch converts all links to be relative.
  • Move the site to a subdirectory (I used web) and configure Netlify to deploy the HTML from that subdirectory. This lets you keep other scripts outside of the webroot that can help you build the site. (A netlify.toml sketch for this follows the script below.)
  • Check in the site with the downloaded content and push it up to Bitbucket.
  • Watch the site be deployed to the Netlify default URL.
  • Access the site via SSL: ‘https://foo-bar-123.netlify.com’ and look in the console for “mixed active content” errors. Fix those as applicable. (If this were for a client, rather than a side project, I’d solve all of them.) Note that if the source site is http only, you may need to hardcode https URLs; a search and replace sketch for this follows the script below.
  • See that “pretty” html pages like ‘https://foo-bar-123.netlify.com/about’ are rendered as text.
  • Ask on Stack Overflow about the problem.
  • End up solving it by generating a _headers file after downloading the site (the format is shown after the script below). Update the Stack Overflow question with the answer.
  • Test again that the site at the default URL looks good.
  • Update the webserver config and DNS for the database driven site. I wanted it to answer at both the old addresses (mysite.com and www.mysite.com) and a new address (generator.mysite.com).
  • Update DNS to point mysite.com and www.mysite.com to the Netlify site; Netlify provides instructions for this.
  • Wait for DNS to propagate. When it does, check to make sure that SSL works.
  • Add a password for generator.mysite.com (a webserver sketch covering this follows the script below).
  • Test the download process with the password (the wget command changes to wget --user=USER --password=PASS -mk http://generator.mysite.com/).
  • Write a script to do the download process.
#!/bin/sh
# mirror the (now password protected) generator site, converting links to relative
wget -mk --user=USER --password=PASS http://generator.mysite.com/
mv generator.mysite.com mysitecom
cd mysitecom
# list the extensionless "pretty" pages; the dot in list.tmp keeps the
# list file itself out of its own results
find `pwd` -type f | grep -v \\. | sed "s#`pwd`/##" > list.tmp
# build the Netlify _headers file so those pages are served as HTML
for i in `cat list.tmp`; do echo "/$i" >> _headers; echo "  Content-Type: text/html" >> _headers; done
rm list.tmp
cd ..
# swap the fresh mirror in as the webroot and commit it
rm -rf web
mv mysitecom web
git add web
git commit -m "pull in latest from generator.mysite.com"
# could do a git push here as well if you wanted
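
A few sketches to flesh out the steps above. First, telling Netlify to publish from the web subdirectory can be done in the web UI, or with a minimal netlify.toml at the root of the repo (this sketch assumes the repo layout described above):

[build]
  publish = "web"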
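
Second, for hardcoding https URLs to fix mixed content, a blunt search and replace over the mirrored files works. This sketch assumes GNU sed, with mysite.com standing in for your domain:

# rewrite every http:// reference to our own domain as https://
grep -rl "http://mysite.com" web/ | xargs sed -i "s#http://mysite.com#https://mysite.com#g"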
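
Third, the _headers file the script generates ends up looking like this (the paths here are just illustrative):

/about
  Content-Type: text/html
/contact
  Content-Type: text/html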
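
Finally, the webserver config depends on what you run; here’s a rough nginx sketch, with the root path as an assumption (your app may be proxied instead). The auth_basic lines add the password, and belong only once mysite.com and www.mysite.com have moved to Netlify and only generator.mysite.com points at this box (create the password file with htpasswd -c /etc/nginx/.htpasswd USER):

server {
    listen 80;
    # answers at the old addresses during the transition and the new one after
    server_name mysite.com www.mysite.com generator.mysite.com;
    root /var/www/mysite;
    # basic auth for the generator; add after the public hostnames move to Netlify
    auth_basic "generator";
    auth_basic_user_file /etc/nginx/.htpasswd;
}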

Now, any time I make changes to the site, I need to run this script and push to Netlify. One downside is that I self-host my analytics code and that site is not SSL enabled, so I’ll need to make a change if I want to continue to get statistics. I could put the script in cron or a scheduled lambda if I thought I was going to make changes often; for now, I’m content to run it manually.
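
If I ever do automate it, a crontab entry like the following would do the job (a sketch; the repo path and script name are hypothetical):

# rebuild and push the static site every Monday at 3am
0 3 * * 1 cd /home/me/mysite && ./pull_site.sh && git push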

So, in the end I have a fast, secure site that is hosted outside my infrastructure and that is easy to update. This process, while not trivial, was easy enough that it has me thinking about where else I could use it (this blog, for instance). Highly recommend.


Where I’m Writing

It’s been a bit quiet around here, but since I joined Culture Foundry I’ve been writing over on their site more. I’ve written on a number of topics, but this one, about how to move files between servers in order to scale a traditional CMS application horizontally, is one of my favs:

Compared to the other methods of syncing files, this works well with non cloud native applications. It also has the virtue of using old tested tools. You can use this system on prem or in the cloud, anywhere with SSH and rsync. This system works well with large numbers of small objects because rsync can be configured to only push new objects.
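
The heart of that approach is plain rsync over SSH. A minimal sketch, with hypothetical hosts and paths, where --ignore-existing is what limits each run to pushing new objects:

# push uploaded assets to a second web node, skipping files it already has
rsync -az --ignore-existing /var/www/uploads/ deploy@web2.example.com:/var/www/uploads/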

I’ll be splitting my blogging time between this site and the Culture Foundry site in the future. See you there.



Lessons from curating a link blog

I maintain a link blog about Colorado food and local food in general.  I use Tumblr, but I’m only incidentally interested in Tumblr traffic.  Tumblr hooks up to Facebook and Twitter, and pushes links there.  (I realize that I am missing interaction on Twitter and Facebook by using these networks as broadcast only, but I don’t have time to fully engage, so I thought a limited presence was better than nothing.)

Having maintained this link blog for over two years, I have learned a few things.

  • It is easy to start a project like this, but hard to finish.  There’s always more to do.  I think I’ll stop when it stops being interesting.
  • Deciding to do this is a great way to gain a broad understanding of a field while providing some value (via curating).  As you find more and more sources of links, videos, articles and audio content, you’ll gain a sense of what is happening.  Even if you don’t painstakingly read every article, you’ll still get a sense.
  • Speaking of sources, Google Alerts is your friend.  I get emailed alerts on a variety of searches, and about 25% of the results are worth posting.  Facebook and Twitter are additional great sources of links.
  • An RSS reader can help you if you are really diving in.
  • Giving someone notice that you’ve referenced their article via an ‘@’ mention will get you their attention.
  • Queuing up posts on Tumblr is a life saver.  This lets you stack up posts and portion them out one per day.  I typically have between 15 and 30 posts in my queue.  This makes timely posts more difficult, but frees me up to forget about the link blog for weeks at a time.
  • A link blog like this is a great use of your in between time, especially if you have a smartphone.  In five minutes I can scan and post two or three links, where five minutes is barely enough time to think of a regular blog post.  The Tumblr app is very good.
  • A linkblog is a great resource for other content generation.  I have a newsletter about local food as well, and a key section of that is interesting links.  Those are almost entirely drawn from the Tumblr.

The linkblog approach is very similar to Twitter, but differs in a few crucial ways:

These attributes make a linkblog a fine complement to Twitter.

There are some problems with this model, however.

  • Limited interaction with followers, either on Tumblr, Facebook or Twitter.
  • I’ve found that engaging on Twitter and Facebook directly is far more effective if you want content to be viewed or links to be clicked.
  • A linkblog like this is not truly building my tribe.

So, if you have limited time, want to gain insight into a particular area of interest, and are OK with the drawbacks, create a linkblog.



© Moore Consulting, 2003-2017