Ultimate Static Site Setup

There are some small details that I didn't cover. It's always a good idea to RTFM.
https://www.getlektor.com/docs/
http://jinja.pocoo.org/docs/2.10/templates/

Feel free to ask any questions and I'll answer the best I can.

Also, I'm obviously not worried about aesthetics right now. This is sometimes a big stumbling block for less technical people, so please don't worry about how it looks. If you are a technical person, you'll understand that we can use any HTML, CSS and JS that we want, but the priority right now is getting things set up and managing the data.

You can see all of the code here: http://shop.recruiterbros.com/github

The `master` branch of the repo will have the latest code. If you want to see the code evolve from the beginning, browse through the repo's commit history.
 
This is the project structure now:

0HrF3q5.png


Even though we created the job posts using the admin panel, they are saved as text files in sub-directories of the "content" directory. So we could just as well create the content with a plain text editor, or, like I'm going to do right now, write a program that adds the content to the site.

The contents.lr files look like this:
Code:
id: 1
---
description: Job description goes here.
---
title: A Job Title

The directory names are the post IDs, but they could just as well have been the post titles; I went with IDs so I don't have to worry about duplicate titles. There's no requirement to have IDs for any of our content at all, it's just a choice I made.

Here's some simple code that takes the bullshit job posts out of the JSON file and turns them into site content:
Python:
import json
import os

from jinja2 import Environment, BaseLoader

JOB_DATA = "./job_data.json"
JOB_CONTENT_PATH = "./content/jobs/"

JOB_TEMPLATE = """id: {{ id }}
---
description: {{ description }}
---
title: {{ title }}
"""

with open(JOB_DATA, "r") as fh:
    job_data = json.load(fh)

env = Environment(loader=BaseLoader())

# Each job gets its own numbered directory under content/jobs/, containing a contents.lr file.
for index, job in enumerate(job_data):
    job_id = str(index + 1)
    job_path = os.path.join(JOB_CONTENT_PATH, job_id)

    try:
        os.mkdir(job_path)
        job_template = env.from_string(JOB_TEMPLATE)
        doc = job_template.render(
            {"id": job_id, "description": job["description"], "title": job["title"]}
        )
        with open(os.path.join(job_path, "contents.lr"), "w") as fh:
            fh.write(doc)

    except FileExistsError:
        # The directory already exists, so this job was added on a previous run; skip it.
        pass
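
For reference, the script assumes job_data.json is just a list of objects that each have at least a "title" and a "description" key, something along these lines (the values are placeholders):
JSON:
[
    {
        "title": "A Job Title",
        "description": "Job description goes here."
    },
    {
        "title": "Another Job Title",
        "description": "Another job description goes here."
    }
]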

And now the site is full of posts:

Dvfodgx.png


Hopefully this starts giving you some ideas about how static sites can be used in the same way you'd use a traditional database-backed CMS. There's all kinds of room to get imaginative and create scripts, cron jobs, etc.

Next, I'll get started deploying to AWS, then after it's deployed, we'll come back to rounding out the site content and styling, and we'll add more pages and features.
 
This thread gives me a boner.

Never tried Lektor, but will now. I've played with static site generators, but most of the ones I've used have been too focused on blogging, i.e. most of my sites use data in a structured form, similar to your job posts.
 
Very nice!

Have you experimented with using JS to filter a JSON object for these sites? I could keep all my posts, with (some of their) metadata, in a JSON file I generate via Python (since I generate the content anyway). I guess it would be similar to the search discussion earlier, but I would have some simple sliders and checkboxes to filter with, and the listings would then link to the details pages.

I get a lot of data from XML APIs for my niche, but it rarely updates, so I don't really need a db.

So it would just be: get the XML from the API, generate the content files with fields/metadata and content, and at the same time generate a JSON file with all (or just the featured) pages. Still testing Lektor, but liking it so far.

Will test generating content for my test sites.
 
@BCN I think I understand what you're describing. Hopefully I'll get to some of the AWS stuff this weekend or in the next few days, then can dig into adding more advanced features to the site.
 
Awesome!

Yeah so it would be for a travel site where I have

domaintravel.com/destinations/mountain-1234

domaintravel.com/hotels/hotel-3456

and on the root pages, it would load a JSON file into a table or loop, and when the user adjusts the controls, i.e. ticks off family friendly and chooses a price range, it would filter the JSON object to only list matching items

and the JSON would just be generated in parallel with the static pages, in a format like
Code:
[
    {
        "name": "Hotel test",
        "priceLow": 300,
        "priceHigh": 900,
        "season": [1,2,3,4],
        "attributes" : [
            1,
            3,
            7
        ],
        "descriptionShort": "fsgfggfdffgfdgfdfgdfgfdgffdg",
        ....
    }
    ......


]

and then the hotels root page would show all of this, while hotels/winter/ would show the same thing filtered to season=1, etc.

So it would be a nice way to make a "dynamic" static page.


The very same JSON, or at least the raw data behind it, would be used to generate the hotel detail pages.
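
Roughly, a minimal sketch of what I mean, with made-up paths, slugs and field names, writing the detail-page content files and the filterable JSON index in the same pass:
Python:
import json
import os

# Placeholder input: a list of hotel dicts parsed from the XML API.
hotels = [
    {
        "name": "Hotel Test",
        "priceLow": 300,
        "priceHigh": 900,
        "season": [1, 2, 3, 4],
        "attributes": [1, 3, 7],
        "descriptionShort": "Short teaser text.",
    },
]

CONTENT_PATH = "./content/hotels/"   # Lektor content directory for the detail pages
INDEX_PATH = "./assets/hotels.json"  # JSON index served to the browser as a static asset

index = []

for i, hotel in enumerate(hotels, start=1):
    slug = "hotel-{}".format(i)
    page_dir = os.path.join(CONTENT_PATH, slug)
    os.makedirs(page_dir, exist_ok=True)

    # Write the Lektor content file for the detail page.
    with open(os.path.join(page_dir, "contents.lr"), "w") as fh:
        fh.write("name: {}\n---\ndescription: {}\n".format(
            hotel["name"], hotel["descriptionShort"]))

    # Collect the metadata the front-end filter needs, plus the URL of the detail page.
    entry = dict(hotel)
    entry["url"] = "/hotels/{}/".format(slug)
    index.append(entry)

os.makedirs(os.path.dirname(INDEX_PATH), exist_ok=True)
with open(INDEX_PATH, "w") as fh:
    json.dump(index, fh, indent=2)

The JS on the root pages would then just fetch that one JSON file and filter it client-side whenever the sliders/checkboxes change.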
 
Okay! Time to get the site hosted on AWS. This will include: hosting files on S3, delivering files via CloudFront (CDN), getting an SSL cert from AWS, setting up basic redirects (naked to www, HTTP to HTTPS) with S3 and creating DNS records.

The whole process is pretty simple, but there are some details you need to get right. Hopefully documenting how to do it here will take away the fear/mystery you might have if you're not sure about AWS.

Setting Up An S3 Bucket To Host Your Site

From the list of AWS services choose "S3", and then in the S3 management console click the button to create a new bucket. Name your bucket using the sub-domain (www) and domain of your URL. If you're more advanced, you don't have to follow this naming convention because you can use any source you want for your CloudFront distribution, but that is beyond what we're gonna get into here, so save yourself some trouble and name your bucket www.mysite.com.

create-bucket.png


In step 3, uncheck the boxes that prevent you from making the S3 bucket public.

uncheck-these-boxes.png


Once the bucket is created, go to the permissions tab and create a bucket policy that allows read-only access to the public.

bucket-policy.png


You can copy/paste/edit this bucket policy for your static site:


JSON:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<YOUR-SITE-HERE>/*"
        }
    ]
}

Now that the bucket can be read by the public, go to the bucket's "Properties" tab, find the "Static website hosting" section, and choose the "Use this bucket to host a website" option. Make sure you enter a filename for your index and error documents.

static-hosting.png


Now click the "Upload" button and upload a placeholder index.html file to make sure everything is working.

index-uploaded.png


You'll now see your file listed in the S3 bucket overview. If you go back to static site hosting inside the properties tab, you'll find the AWS URL for your static site, which should show your newly uploaded index.html when you navigate to it.

index-is-live.png
 
Uploading your site to S3

You can upload files using the S3 web interface and you can also use the AWS command-line interface to easily sync directories on your machine with an S3 bucket, but even easier is to use a Lektor plugin that deploys your site to S3 for you.
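
For reference, the CLI route is basically a one-liner (the bucket name and build directory here are placeholders):
Code:
# push the local build output to the bucket, removing remote files that no longer exist locally
aws s3 sync ./build s3://www.mysite.com --delete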

Here is the Lektor S3 plugin: https://github.com/spenczar/lektor-s3

It is easy to install and use; all of the instructions are in the README on the GitHub page. I'll walk through how I do it, but I'm not going to duplicate anything that's already in the README, so RTFM.

After following the installation instructions, this is what my lektorproject file looks like:

lektor-project-s3.png
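
If the screenshot is hard to read: the relevant section of the .lektorproject file follows the pattern from the plugin's README, roughly like this (with your own bucket name):
Code:
[servers.s3]
name = S3
enabled = yes
target = s3://www.recruiterbros.com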


Probably the trickiest part now is getting your AWS access keys (credentials) to work. You can find your AWS creds by logging into the AWS web console and going to "My Security Credentials". But you have to get the Lektor plugin to read your keys.

I use AWS every day for work and my preferred method for managing AWS keys is to include them in my environment, in other words, export the values from a .env file and source that file before running the Lektor server. I put the .env file for each project in the project directory, but you must make sure to add .env to your .gitignore file, because you don't want to check your credentials into version control.

dot-env.png
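
The file just needs to export the two standard AWS environment variables that boto picks up (real values redacted, obviously):
Code:
export AWS_ACCESS_KEY_ID=<YOUR-ACCESS-KEY-ID>
export AWS_SECRET_ACCESS_KEY=<YOUR-SECRET-ACCESS-KEY>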


Now your creds are set up, so source the file and run Lektor:

source-dot-env.png


Now you can deploy your site from the Lektor web admin:

deploy-button.png

Clicking the "Deploy" button brings up a dialog where you can select a deploy target (we only have one right now, our S3 bucket) and publish the site changes.

publish-button.png


After clicking "Publish" your site is now live on S3.

site-is-live.png


My chosen domain isn't setup yet, but you can now view the site using the S3 static site URL: http://www.recruiterbros.com.s3-website-us-east-1.amazonaws.com/
 
^^ awesome post.

Really learning a lot in this. Stuff I could have Googled and then wasted my time on figuring out when I did it wrong... so glad you have it spelled out here correctly.
 
Yea, AWS has a huge number of services, and each service has gone through multiple design iterations, so they have different features that accomplish similar but sometimes subtly different things, deprecated features, new features that don't actually work as advertised, and then all of the services can communicate with each other in different ways... so there are a million different ways to do things on AWS and it's often difficult to find a good, recommended solution in the AWS docs.

I wrestle with stuff like IAM roles and CloudFormation configs on a daily basis; a lot of it is completely useless and I don't care, but it's yak shaving I need to get through to do my actual work. I'm happy to be able to share some of the more worthwhile knowledge I've gained. As much of a PITA as AWS often is, it's actually really powerful, cheap, and easy when you do it correctly.
 
No @algospider, I haven't checked it out yet. Currently, I'm using niche-specific spinner-like software, where you can dump in a few word lists and then generate content. Rant gives you a more general solution for doing the same kind of thing; I am very interested in reading their code more closely. Most likely I will create my own software to do something similar (let users easily set up complex, high-quality article spinning). If you're interested in that kind of thing, PM me and maybe we can share ideas about what a good solution would look like. And when I do try generating articles with Rant, I will let you know for sure.
 
I saw this a few days ago and thought of your project:
New AI fake text generator may be too dangerous to release, say creators

Feed it the opening line of George Orwell’s Nineteen Eighty-Four – “It was a bright cold day in April, and the clocks were striking thirteen” – and the system recognises the vaguely futuristic tone and the novelistic style, and continues with:

“I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run. I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science.”

The math is a little off, but wow. Or this:
One such, completely artificial, paragraph reads: “Asked to clarify the reports, a spokesman for May said: ‘The PM has made it absolutely clear her intention is to leave the EU as quickly as is possible and that will be under her negotiating mandate as confirmed in the Queen’s speech last week.’”

Sorry if this belongs in STDDIOT, but I thought it was semi-relevant here.
 
Had to do some travelling and focus on actual work for a little while, but I have a minute now to wrap up the first phase of development on this static site.

Right now the site is available via S3 at: http://www.recruiterbros.com.s3-website-us-east-1.amazonaws.com/

But we need the site to be available via CloudFront (CDN) at: https://www.recruiterbros.com

First thing I'll do is get a free SSL cert from Amazon. In the AWS console, open up Certificate Manager:

aws-menu-cert-manager.png


Click through "Request a Certificate" until it asks you for the domain name for the cert. I'm going to use the * subdomain so my certificate will work for all hosts on recruiterbros.com.

cert-domain-name.png


We've already set up DNS with AWS Route 53, so choose "DNS Validation" and AWS will be able to create all the DNS records for you:

dns-validation.png

Just expand the domain row in step 4 and choose to let Amazon Certificate Manager create the DNS records:

create-valdation-records.png

Once the cert is validated, use the AWS Services search to find CloudFront. In the CloudFront console, choose "Create Distribution" -> "Web" -> "Get Started".

The first thing you see, and the thing that I've screwed up the most times, is "Origin Domain Name". When you click that field, AWS helpfully populates a drop-down with all of your S3 bucket endpoints. You don't want to select any of these. Under certain constraints when deploying a single-page application, you might want to use an S3 endpoint directly, but DO NOT do that here (the plain S3 bucket endpoints don't do index-document resolution or website redirects the way the website endpoint does, which breaks a static site's pretty URLs). Instead, find the S3 static site URL for your bucket, in our case it's: http://www.recruiterbros.com.s3-website-us-east-1.amazonaws.com/

Go ahead and redirect HTTP to HTTPS as well.

create-distro-1.png

IF YOU WANT CACHING TO WORK, PAY ATTENTION HERE.

In the screenshot below, I have selected "Customize" caching and set all of the time-to-live (TTL) values to zero. This is because I want to see changes immediately when I'm developing/designing the site and re-deploying. Once your site is out of the initial development phase, you probably want to use higher TTL values so CloudFront will actually cache your site and deliver it from edge locations; otherwise you're not getting the full value of using the CDN. Using the origin cache headers is a good choice, because you can then set the TTL values per object in S3, so most of your site can be aggressively cached while files that change more often get shorter TTLs (there's a rough example after the screenshot).

create-distro-2.png
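
As a rough example of the origin-headers approach (bucket name and paths are placeholders), you can set Cache-Control per object when copying files up with the AWS CLI:
Code:
# long TTL for assets that rarely change
aws s3 cp ./build/static/ s3://www.mysite.com/static/ --recursive --cache-control "max-age=31536000"

# short TTL for HTML so content updates show up quickly
aws s3 cp ./build/index.html s3://www.mysite.com/index.html --cache-control "max-age=300"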

Now scroll down to "Alternate Domain Names" and enter the URL for your site (www.recruiterbros.com). Select "Custom SSL" and when you click the field it will show a list of your certs; pick the cert you just created for your domain.

create-distro-3.png

Further down there is a field for "Default Root Object"; make sure you give it the value "index.html".

Click "Create Distribution" and then you wait awhile for AWS to get your site out to all of the edge locations. When the distribution is deployed, we're ready for the final piece, pointing the DNS records to the CDN.
 
Once the CDN distribution is deployed, it's time to actually hook up the URL and redirect the naked domain to the WWW subdomain.

Inside Route 53, create a new record set and choose your CloudFront distribution as an alias target:

record-set.png


After this step, the site is now live at: https://www.recruiterbros.com/

One last problem: if someone types only 'recruiterbros.com' into the address bar or follows a link to it, nothing loads. We'll use an S3 bucket to redirect from the naked domain to www. Create a bucket that matches the naked domain (recruiterbros.com), then enable static site hosting and set up the redirect:

www-redirect.png


Then create a DNS record that uses the S3 static site as an alias target:

www-redirect-dns.png


Done! The site is live and ready for the next iteration of development, design and content.

When I have a little more time, I will post a quick review and index to everything that was covered so far, and then I'll preview what topics might be worth looking at next. I think there is a ton of stuff about static sites to explore and discuss, some interesting technical challenges have already been brought up in this thread.

Hopefully the step-by-step instructions through the Amazon Web Services bits help some people that might otherwise be hesitant to host their site on a serverless AWS setup. AWS has a ton of moving parts and it can be a bitch to configure everything correctly, but ultimately it saves a lot of time, trouble and cost to host sites this way. I tried to highlight all of the dumb pitfalls that are difficult to spot in the official documentation.

If anyone uses this guide to actually setup a static site with Lektor or on AWS, please let me know!
 
Just wanna say I followed this guide today when I had to quickly set up a site, and looking at the screenshots made everything a lot easier than having to actually think.

So if this hasn't helped anyone else, at least I'm happy with myself :wink:
 