
Environment-Driven Robots.txt with the Hugo Static Site Generator

You know those times when you put a website live, everything goes well, you leave the office and you think to yourself, today was a good day. Then you remember: you forgot to uncheck the “block from search engines” checkbox… That’s totally happened to you too, right?

The fix is to use environment variables: block everything unless you’re in production, and you’ll never again forget to check or uncheck that dreaded checkbox.

First, make sure robots.txt generation is enabled in your config.toml:

enableRobotsTXT = true

Then create a robots.txt template at /layouts/robots.txt with the following contents:

User-agent: *
Disallow: {{ if ne (getenv "HUGO_ENV") "production" }}/{{ end }}
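With this template, a non-production build renders a robots.txt that blocks all crawlers, while a production build leaves Disallow empty, allowing everything:

```
# HUGO_ENV is anything other than "production"
User-agent: *
Disallow: /

# HUGO_ENV = "production"
User-agent: *
Disallow:
```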

Make sure you set HUGO_ENV to production when you build your site, which you can do by prefixing the CLI command like so: HUGO_ENV=production hugo.
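To see the branching at a glance, here is a minimal shell sketch of the same logic the template performs (this is just an illustration, not part of Hugo): crawlers are blocked unless HUGO_ENV is exactly "production".

```shell
# Sketch of the robots.txt template's logic: block unless in production.
robots_txt() {
  disallow=""
  # Mirrors the template's `ne (getenv "HUGO_ENV") "production"` check.
  if [ "${HUGO_ENV:-}" != "production" ]; then
    disallow="/"
  fi
  printf 'User-agent: *\nDisallow: %s\n' "$disallow"
}

robots_txt                      # HUGO_ENV unset: emits "Disallow: /"
HUGO_ENV=production robots_txt  # production: emits an empty Disallow
```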

Because you’re probably using Netlify (why wouldn’t you?), you can create a netlify.toml file to assign the correct environment variable dependent on which Git branch triggered the build. This is useful because Netlify can build your pull requests, which you don’t want showing up in Google!

[build]
  command = "hugo"
  publish = "public"

[context.production.environment]
  HUGO_ENV = "production"
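Branch deploys and deploy previews simply never set HUGO_ENV, so the template’s check keeps them blocked by default. If you’d rather be explicit, a sketch using Netlify’s deploy-preview context (the "development" value is arbitrary; anything other than "production" works):

```
[context.deploy-preview.environment]
  HUGO_ENV = "development"
```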

