Cache not clearing on specific files

We’ve added a robots.txt file to our repo and deployed it to Netlify.

We did this because we noticed that Netlify was serving the robots.txt file from our CMS (which is hosted on a subdomain), since it couldn’t find the file on Netlify.

But since we added this file, it hasn’t updated; it still shows the contents of the one from our CMS subdomain.

How long will this take to expire/clear?

Weirdly, when I update other files on the server they update instantly, but this robots.txt file doesn’t. Why would that be?

I can also see that it’s correct on one of the deploy previews, so I’m a bit confused.

Hmm, is there a possibility you have a .gitignore in place that is excluding that file? If so, that would explain why it isn’t getting updated.
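
If you want to check quickly, here’s a minimal sketch that asks git which ignore rule (if any) is excluding the file; it assumes you run it from the repo root and that the file is named robots.txt:

```python
import subprocess

# `git check-ignore -v` prints the matching ignore source and pattern for a path.
# Exit code 0 means the path IS ignored; exit code 1 means no rule matches it.
result = subprocess.run(
    ["git", "check-ignore", "-v", "robots.txt"],
    capture_output=True,
    text=True,
)

if result.returncode == 0:
    # Output format: <source>:<line>:<pattern>\t<path>
    print("robots.txt is ignored by:", result.stdout.strip())
else:
    print("robots.txt is not ignored")
```

Running `git check-ignore -v robots.txt` straight from a shell works just as well; the wrapper only spells out the exit-code convention.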

We’re only deploying flat HTML files to it (after they’ve been cached elsewhere), so there’s nothing like that in the repo.

But the good news is that it’s fixed itself now.

Is there a cache delay on Netlify that I should be aware of? Sometimes files update instantly, but sometimes they take hours.

Well, I’m glad it’s fixed! :tada:

As far as delays go - there isn’t really a way to pinpoint exactly what happened here without something like an x-nf-request-id (more below), because we serve so much traffic that even knowing an approximate time window when the issue occurred barely narrows it down.
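
If you want to capture one for a future report, here’s a quick sketch that prints the x-nf-request-id along with the standard caching headers from a response. The domain is a placeholder, so swap in your own site’s URL:

```python
import urllib.request

# Placeholder URL: replace with your own Netlify site's domain.
url = "https://example.netlify.app/robots.txt"

with urllib.request.urlopen(url) as response:
    # x-nf-request-id uniquely identifies this request, which lets support
    # trace exactly which response served the file.
    print("x-nf-request-id:", response.headers.get("x-nf-request-id"))
    # The standard caching headers can hint at why a stale copy was served.
    for name in ("cache-control", "etag", "age"):
        print(f"{name}:", response.headers.get(name))
```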

It also really depends on exactly how your site is set up. You can learn a little more about how best to take advantage of caching here:
