It seems Googlebot is blocked by robots.txt when I have split tests enabled. This happens when Googlebot is routed to a branch deploy whose robots.txt disallows everything. It makes sense to disallow indexing of branch deploys, but Googlebot should never be routed to anything but the master branch. Split tests are really useful, but this behaviour is a major flaw!
We never use robots.txt, so this sounds like something your code is creating. We DO intend to disallow robots on deploy previews (that is, PR builds), but we use an
X-Robots-Tag: noindex header for that. Could you let me know a branch deploy that shows this behavior so I can confirm my assertion?
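For context, a noindex signal like the one described above can also be set from the site itself. A minimal sketch using Netlify's `_headers` file convention (the file name and syntax are Netlify's documented format, not something confirmed in this thread):

```
# _headers — ask crawlers not to index any page served from this deploy
/*
  X-Robots-Tag: noindex
```

Unlike a disallow rule in robots.txt, this header lets crawlers fetch the page but tells them not to index it, which avoids the "blocked by robots.txt" report.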
Hi Chris, thanks for looking into this. Rather embarrassing: it was my own robots configuration that caused the block. BUT I kind of expected that Googlebot would always be routed to the master branch, which is obviously not the case, and I think that adds a bit of confusion about the consequences of running split tests with branch deploys.
Is this intended behaviour for branch deploys/split tests? I’d like to request that bots are always routed to the master branch deploy when running split tests.
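Until something like that exists, one workaround on the site's side is to generate robots.txt at build time based on which branch is being built. A hedged sketch, assuming Netlify's documented `BRANCH` build environment variable and an output directory named `public/` (substitute your CI's branch variable and publish directory):

```shell
#!/bin/sh
# Emit robots.txt per branch: permissive on master,
# disallow-all on every other branch, so a split test
# serving a non-master branch still blocks crawlers explicitly.
# BRANCH defaults to "master" here when unset, purely for local runs.
mkdir -p public
if [ "${BRANCH:-master}" = "master" ]; then
  printf 'User-agent: *\nAllow: /\n' > public/robots.txt
else
  printf 'User-agent: *\nDisallow: /\n' > public/robots.txt
fi
```

Note this only controls what each branch serves; it does not change how visitors (or bots) are bucketed by the split test itself.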
If you see a way to improve the documentation around split tests to make this more obvious, please let us know and I'll make sure the documentation team sees your suggestions.
That is how the feature is designed: we are literally serving different branches, so it is working the way we built it. We will not be changing that feature, since people are intended to be able to set anything they want in each branch; otherwise the feature would not be very flexible.