22nd September 2017 in Web Development

Environment Driven Robots.txt with Hugo Static Site Generator


By Louise Towler


You know those times when you put a website live, everything goes well, you leave the office, and you think to yourself: today was a good day. Then you remember you forgot to uncheck the “block from search engines” checkbox… That’s happened to you too, right?

Just use environment variables. Block everything unless you’re in production, and you’ll never again forget to check or uncheck that dreaded checkbox.

First, make sure robots.txt generation is enabled in your config.toml:

enableRobotsTXT = true

Then create a robots file at /layouts/robots.txt with the following contents:

User-agent: *
Disallow: {{ if ne (getenv "HUGO_ENV") "production" }}/{{ end }}
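To see what that conditional does, here is roughly what the template renders in each case (a sketch, assuming HUGO_ENV is unset everywhere except your production build):

# HUGO_ENV is not "production" – block everything
User-agent: *
Disallow: /

# HUGO_ENV is "production" – Disallow is empty, so crawlers are allowed
User-agent: *
Disallow: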

Make sure you set HUGO_ENV to production when you build your site, which you can do by prefixing the CLI command like so: HUGO_ENV=production hugo.
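For example, assuming a typical local workflow, the two builds might look like this (exact commands depend on your setup):

# Local development – HUGO_ENV is unset, so robots.txt blocks crawlers
hugo server

# Production build – the variable is set for this command only
HUGO_ENV=production hugo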

Because you’re probably using Netlify (why wouldn’t you?), you can create a netlify.toml file to assign the correct environment variable depending on which Git branch triggered the build. This is useful because Netlify can also build your pull requests, which you don’t want showing up in Google!

[build]
    command = "hugo"
    publish = "public"

[context.production]
    [context.production.environment]
        HUGO_ENV = "production"
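Deploy previews are covered by default here, since only the production context sets HUGO_ENV. If you want to be explicit, Netlify also supports a deploy-preview context; a sketch (the "preview" value is just an illustrative non-production string, anything other than "production" keeps crawlers blocked):

[context.deploy-preview]
    [context.deploy-preview.environment]
        HUGO_ENV = "preview"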
