Deploy only specific files

Hi guys, I am creating a SaaS which will allow users to deploy their sites to Netlify.

I am just wondering whether it is possible to deploy only specific files. We already have an incremental build system on our side, so we would like to push only the files we know need to change. We want to avoid sending the complete set of file hashes and waiting for Netlify to tell us what to send, because we already have that information in our system and there could be a lot of files.

Also, how would this kind of partial push affect build minutes? Does deploying via the API and sending files count toward build minutes?

Hiya @demil,

First off, please make sure you aren’t in violation of our Terms of Service - if you resell Netlify, you need to have a talk with our sales team before you charge anything for it. Let me know if you’d like an introduction!

Second off, that is already how our deploys work. Unless you throw a zipfile at us, all of our automate-able deployment methods (API, CLI, and builds from git) work like that: we don’t re-upload (or, if you use the API path, ask you to upload) files whose checksums we’ve already got in storage on any site. That’s how we don’t host 11 million copies of the same jQuery release, for instance :wink:
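
If it helps to see the shape of it, here’s a rough sketch of that flow in TypeScript. The endpoint paths and fields follow our deploy API docs, but please verify the exact names there; the function and helper names are just illustrative. The idea: POST the digest of every file, read back the list of required checksums, and upload only those files.

```typescript
// Minimal sketch of a file-digest deploy, assuming Node 18+ (global fetch).
// Token/site-ID handling and error checking are omitted for brevity.
import { readFile } from "node:fs/promises";
import { join } from "node:path";

const API = "https://api.netlify.com/api/v1";

async function deployWithDigest(
  token: string,
  siteId: string,
  publishDir: string,                      // local folder holding the built site
  digest: Record<string, string>,          // e.g. { "/index.html": "b5bb9d80..." } (SHA1 per file)
): Promise<void> {
  // 1. Create the deploy by POSTing the digest of *all* files in the deploy.
  const createRes = await fetch(`${API}/sites/${siteId}/deploys`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ files: digest }),
  });
  const deploy = (await createRes.json()) as { id: string; required: string[] };

  // 2. Upload only the files whose checksum came back in `required`;
  //    everything else is already in our content store and is skipped.
  const required = new Set(deploy.required);
  for (const [path, sha1] of Object.entries(digest)) {
    if (!required.has(sha1)) continue;
    await fetch(`${API}/deploys/${deploy.id}/files${encodeURI(path)}`, {
      method: "PUT",
      headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/octet-stream" },
      body: await readFile(join(publishDir, path)),
    });
  }
}
```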

Build minutes are only used when you build in our CI, from git. If you deploy via API, CLI, or drag and drop, we don’t build your site and thus you don’t use build minutes.

Hi @fool, thank you for your reply.

First of all, I would never violate any Terms of Service, and I don’t think our project violates anything. We are more of a static site builder that can deploy to Netlify; users need to go and open an account themselves, and we are just a tool to upload their site to Netlify. Of course, if a user needs more than a basic account, we are happy to arrange some affiliate agreement with Netlify.

Also, I don’t think you understood my question. I know that we can send a POST request with the checksums of all files and you would return a list of the files we need to upload. My question is: since we are already tracking our incremental builds on our side, can we just upload the changed files without sending all the checksums over again? If we have a site with, for example, 10,000 static pages, that can get really annoying, because your response will always contain information we already know.

Is it clearer now what my question was?

And of course, I would like an introduction to the sales team about reselling or an affiliate arrangement with Netlify.

Thank you for your time and best regards.

Yes, I am saying that the digest list will not request files that we already have, so you do not ever need to upload a second copy of the same file unless you deploy via zipfile or drag and drop.

If you are not seeing this behavior, that means you are changing the files with every build. You can tell if this is happening by following the advice in this article and seeing how many files we really upload.

@demil, I wanted to chime in here because I think I might perceive a question being asked but not yet answered. I’m going to rephrase the question below. Would you please let me know if this is what you are asking?

  • Is it possible to send the file digest information for only the changed files instead of for all files?

If that is the question above then the answer is “no”. The checksum information for all files in the deploy must be sent for all deploys, even for files that have not changed since the last deploy.

What @fool said still applies as well. Once our API receives the file digest for all files, only files which have not been uploaded before will need to have their file contents uploaded.
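
To make that concrete, here is a rough sketch (TypeScript with Node’s built-in crypto; the directory walking and function name are purely illustrative, not anything Netlify-specific) of building that digest: every file under the publish directory gets a path-to-SHA1 entry, whether or not it changed since the last deploy.

```typescript
// Illustrative sketch: build the path -> SHA1 map that the deploy POST expects.
import { createHash } from "node:crypto";
import { readdir, readFile } from "node:fs/promises";
import { join } from "node:path";

async function buildDigest(publishDir: string, prefix = ""): Promise<Record<string, string>> {
  const digest: Record<string, string> = {};
  for (const entry of await readdir(publishDir, { withFileTypes: true })) {
    const deployPath = `${prefix}/${entry.name}`;
    const localPath = join(publishDir, entry.name);
    if (entry.isDirectory()) {
      Object.assign(digest, await buildDigest(localPath, deployPath));
    } else {
      // Every file is listed, even unchanged ones; only new content gets uploaded later.
      digest[deployPath] = createHash("sha1").update(await readFile(localPath)).digest("hex");
    }
  }
  return digest;
}
```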

I hope this answers the question you were asking. However, if I have misunderstood or if there are other questions please reply anytime.

Yes, that was the question. I am not sure what will happen with sites with 10k+ files, though; will they have a really big POST request? Would it be a POST request of around 300-400 MB for changing/updating just one file?

Hi, @demil, for the files which do not change, you don’t need to send the files themselves. I’m guessing the files object would average less than 100 bytes per unchanged file (this is a rough guess).

If that guess is close, then when only one 10 MB file changes, the other 9,999 files will take only about 1 MB to send their checksums and paths. The changed file would take its own size plus the other details.
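
To put rough numbers on that guess (the figures here are purely illustrative):

```typescript
// Back-of-the-envelope size of the digest payload for ~10k unchanged files.
const files = 10_000;
const bytesPerEntry = 50 /* avg path */ + 40 /* SHA1 hex */ + 10 /* JSON punctuation */;
console.log(`~${(files * bytesPerEntry) / 1_000_000} MB of digest`); // ~1 MB
```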

Now, if all 10k files change, then yes, this could result in very large POSTs.

If you run into timeout issues for very large POSTs, the recommended workaround is to break up the deploy into several smaller deploys because files sent in a previous deploy will not need to be uploaded again (only their checksums and paths need to be resent).

Luke, I have a website with more than 200k files and I use manual deploys to deploy it. Each time it breaks due to a timeout, and I keep doing it over and over again until all the files are deployed.
What is the proper way to break up the deploy into several smaller deploys?

And what is the reason for sending checksum information for all files in the deploy even for files that have not changed since the last deploy?

Best regards,

Hiya @noreff and welcome to our community!

Uploading 200k new files for one deploy will be a challenge, particularly if you have anything other than the best possible internet connection. I’d recommend “building” in our CI, which does have the best possible internet connection - you could commit your finished build to a repo, and have us “build” from there (really: just upload your files; set no build command for our CI to run and we’ll just upload).

If you don’t want to do that, try uploading in batches of 50k. Something like:

“Make a draft deploy of ONLY the first 50k files. Make a draft deploy of ONLY the second 50k files. Make a draft deploy of ONLY the third 50k files. Finally, make a production deploy of all 200k files.”

(So for the first three, you remove the already-uploaded and not-yet-uploading files from the “deploy”. The deploy will be partial and likely unbrowseable, but that’s OK - you aren’t publishing at the production URL with the draft deploys; you’re just getting the files into our system so you don’t have to reupload.)
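
Here’s a rough sketch of that recipe, reusing the deployWithDigest sketch from earlier in this thread (batch size and names are illustrative; in practice you’d want the partial passes created as draft deploys so they never publish to your production URL; the create-deploy call accepts a draft option, but please double-check the current API docs):

```typescript
// Illustrative batching: several partial deploys to seed our content store,
// then one final deploy of the full digest that needs few or no uploads.
async function deployInBatches(
  token: string,
  siteId: string,
  publishDir: string,
  fullDigest: Record<string, string>,
  batchSize = 50_000,
): Promise<void> {
  const entries = Object.entries(fullDigest);

  // Partial passes: each one only exists to upload that batch's file contents.
  // These would be created as draft deploys so the partial site never goes live.
  for (let i = 0; i < entries.length; i += batchSize) {
    const partial = Object.fromEntries(entries.slice(i, i + batchSize));
    await deployWithDigest(token, siteId, publishDir, partial);
  }

  // Final pass with the full digest: the content is already stored, so this
  // deploy mostly just sends checksums and paths.
  await deployWithDigest(token, siteId, publishDir, fullDigest);
}
```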

Hey @fool, thanks for your response.
The number of pages I have is due to 10 languages being supported, each on its own subdomain, so I ended up splitting it into 10 smaller sites and writing a small bash script to do the deploys in parallel. That works just fine for me.

Thank you for the follow-up and for sharing your workaround, @noreff. If there are other questions please reply here or create a new topic (for new or unrelated questions) anytime.