Preferred method to block SEO for branch subdomains

If I turn on branch deploy subdomains (say for something like, will these be crawled and indexed for search? If so, what is the preferred method for ensuring this doesn’t happen? I haven’t done much work with SEO, but I know I don’t want a subdomain used for testing accidentally showing up in search results. I’m using Gatsby to build my site.

Hi @muffintheman! I’d suggest using a pattern like the one described here:

…which shows how to do “something conditional per branch during build”. You could put a restrictive robots.txt in place ONLY on branch deploys, or set a password as mentioned there; either approach is effective for keeping branch deploys out of search indexes. You could also simply not link those branches anywhere Google would find them :wink:

We DO automatically do this for deploy preview URLs (how we build your PRs) for you: we send an HTTP response header X-Robots-Tag: noindex to prevent THOSE from being indexed directly (these have URLs like https://somelonghash–
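If you’d rather replicate that header behavior on branch deploys yourself, a Netlify `_headers` file (generated only during branch builds, e.g. by the same conditional build step described above) could send the same signal. A minimal sketch of such a file:

```
/*
  X-Robots-Tag: noindex
```

This tells compliant crawlers not to index any page served from that deploy, without touching your production configuration.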