[Question] Fetching large amounts of data during compilation

Hello! I’m currently building a static website that fetches roughly 3 MB to 5 MB of JSON data from an external API during compilation. The site will be rebuilt every 30 minutes at most, and the compilation itself is very fast since it uses a Rust-based generator.
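For context, the fetch step looks roughly like the minimal sketch below. This assumes the generator can run a small pre-build binary and that `reqwest` (with the `blocking` feature) is listed in Cargo.toml; the API URL and output paths are placeholders, not real endpoints.

```rust
use std::fs;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Fetch the JSON payload (roughly 3–5 MB per build in this scenario).
    // The URL is a placeholder for the external API.
    let body = reqwest::blocking::get("https://api.example.com/data.json")?
        .error_for_status()?
        .text()?;

    // Write it where the static site generator expects its data files
    // (the "data/" directory is an assumed convention here).
    fs::create_dir_all("data")?;
    fs::write("data/api.json", body)?;
    Ok(())
}
```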

Would it be possible to use Netlify for this? Am I allowed to pull in that amount of data from an external API during compilation? It would amount to at most 7.2 GB per month. I could probably shave off a couple of gigabytes by compressing the responses, but those are the worst-case numbers.
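(For reference, the worst-case figure assumes the full 5 MB payload on every build: 5 MB × 2 builds/hour × 24 hours × 30 days = 7,200 MB ≈ 7.2 GB per month.)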

Thanks!

Hi there,

Compared to the bandwidth used by a single Node.js release download or a single dependency installation, that is not much. There’s no problem doing that (we don’t block outgoing network connections or responses in our build network).

Thanks so much for checking in!
