Cannot access /sitemap.xml or /robots.txt directly on my SPA

Hi, I have a single-page application deployed on Netlify and, so far, everything is great. I’ve deployed it with a netlify.toml file containing some redirects, one of them being:

[[redirects]]
from = "/*"
to = "/index.html"
status = 200

The thing is, when I try to submit the /sitemap.xml URL to Google Search Console, Google says it cannot find the file, and when I open the file URL in my browser I get redirected to /index.html.

So the question is: how can I serve my single-page application while keeping direct access to the static files that already exist in the root folder? I have the same problem with /robots.txt, for example.

It would be nice to redirect to /index.html only if the requested static file is not found.

Hint: if I request /sitemap.xml with curl it is served correctly, but not in the browser.
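
For reference, the check I’m running is roughly the following (the site URL here is just a placeholder, not my real one):

curl -i https://my-site.netlify.app/sitemap.xml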

Hey @brielov,

I’ve just tested this and, as we would expect, existing files are not overridden by redirect rules like this one: since the rule doesn’t set force = true, files that actually exist in the deploy are served in preference to the rewrite.

Are you sure that your deployed site actually includes the sitemap.xml and robots.txt files?

Could you provide your site name and/or API ID (both of which are safe to share publicly) so I can take a further look?

If push comes to shove, since rules in netlify.toml are processed in order, it’s possible to add a more specific rule, like the sketch below, which would be matched before the overarching ‘all-to-index’ rule of a single-page site.
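
As a minimal sketch, assuming the sitemap sits at the publish root, something along these lines placed above the catch-all rule should do it:

[[redirects]]
from = "/sitemap.xml"
to = "/sitemap.xml"
status = 200

The same pattern would apply to /robots.txt. That said, it shouldn’t normally be necessary, since unforced rules don’t shadow files that exist in the deploy.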