Prevent specific domains from being indexed

When we have a Netlify site in development (or in production), the same instance is also available under a subdomain on our staging server, e.g.:

https://thing.com
https://thing.oursite.com

Both are the same instance, but we’d like the subdomain version not to be indexed by Google. I’m aware you can password-protect specific branches, etc., but this is slightly different from that.

Is this possible with Netlify, or do we need to set this up differently?

I’d suggest using robots.txt - that should be able to support what you want. Since crawlers fetch /robots.txt separately for each hostname, you can disallow crawling on the staging subdomain without affecting the primary domain, and “good” search engines like Google will respect it.
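
As a minimal sketch (it assumes you have some way to serve this file only on the staging subdomain, since both hostnames serve the same deploy), a disallow-all robots.txt would look like:

User-agent: *
Disallow: /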

Or, you can potentially use our custom headers feature to set a canonical link location, similar to what we do automatically for sitename.netlify.app requests (see: Improved SEO with canonical link headers), which by my understanding is also SEO-friendly.
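
For instance, in a _headers file you could add something like the following for a given page (a sketch; the /pricing path and target URL are just illustrative, and as far as I know header values are static, so you’d need an entry per path rather than a single wildcard rule):

/pricing
  Link: <https://thing.com/pricing>; rel="canonical"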

Or, the least convenient but most reliable option: you could add explicit redirects (in your _redirects file or netlify.toml) for the “wrong” hostnames:

https://thing.oursite.com/* https://thing.com/:splat 301!

…to make sure crawlers index the correct location.
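
If you prefer netlify.toml over a _redirects file, the equivalent rule would look roughly like this (a sketch using the domains from your example):

[[redirects]]
  from = "https://thing.oursite.com/*"
  to = "https://thing.com/:splat"
  status = 301
  force = true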