Hosted PDF files loading slowly

Hello, I have recently moved my website from GitHub Pages to Netlify. The whole prebuilt website lives in a GitHub repository and is automatically synced to Netlify. Overall, the site loads faster than it did on GitHub Pages. However, I host several PDF files (several MB in size, rarely exceeding 10 MB), and they load noticeably slower than they did when served via GitHub Pages. Is there anything I can do to optimize the PDFs? Thank you.

Hi there,

Our system really is optimized to serve small files to our customers as quickly as possible. With modern static sites, assets are generally no more than a few hundred KB, and it is true that anything larger than that will see slower download speeds, since larger files are served in a different way.

One thing you might investigate is using #netlify-large-media-nlm to serve those files - that is a different way of storing files that may provide you with the download speeds you are looking for :raised_hands:
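For reference, here is a rough sketch of what moving PDFs under Git LFS tracking (which Large Media builds on) can look like. The `netlify lm:setup` step, the branch name `main`, and the commit message are assumptions for illustration; check the Large Media docs for the exact setup on your plan:

```sh
# Sketch: track existing PDFs with Git LFS / Large Media (details assumed).
git lfs install                  # one-time Git LFS setup on this machine
netlify lm:setup                 # assumed Netlify CLI step to link Large Media
git lfs track "*.pdf"            # write a tracking rule to .gitattributes
git add .gitattributes
git add --renormalize .          # re-run filters so existing PDFs become LFS pointers
git commit -m "Move PDFs to Git LFS / Large Media"
git push origin main             # pointers go to Git; file contents go to LFS storage
```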

Are the Large Media assets stored on a different grade of storage? I understand that for image files, Netlify can process them and serve an appropriate size/resolution. Does something similar happen with PDFs or other binary files? Thank you.

Actually, using Large Media won’t cause the images to be sent any faster. It does enable image transformation on the deployed site’s images (meaning transformations happen at browse time instead of during the site build and deploy).

The main benefit of Git LFS (not Large Media specifically, but Git LFS itself) is the reduced overhead of transferring large-file change data between systems working with the repo.

Git stores a complete copy of every file it tracks. When the data changes, a whole new copy of the file is saved, not just the differences. All the diffs (the differences) you see are generated on demand from the actual file data.

What does this mean for large files? If you have a 10 MB file and you change a single byte, Git will store two copies of the file. One 10 MB file plus a single-byte change to it now takes 20 MB to store in Git.

If you save ten such single-byte versions of that file as individual commits, you now use 100 MB of storage for that one file in Git. When cloning a repo without Git LFS, all of these copies must be sent to the system cloning the repo. With Git LFS, only the text pointers are sent, plus the most recent version of the file (depending on how Git and your repo are configured). This makes cloning repos with many large, frequently changed files much faster with Git LFS than without.
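If you want to see this for yourself, here is a small local experiment (a sketch only; the file name is made up, and exact sizes will vary because Git zlib-compresses loose objects):

```sh
# Sketch: watch a repo grow as full copies of a binary accumulate.
git init lfs-demo && cd lfs-demo
dd if=/dev/urandom of=big.pdf bs=1M count=10    # a 10 MB file of incompressible data
git add big.pdf && git commit -m "v1"
printf 'x' | dd of=big.pdf bs=1 seek=0 conv=notrunc   # change a single byte
git add big.pdf && git commit -m "v2"
git count-objects -vH                           # object store now holds ~20 MB

# With Git LFS, each version would instead be committed as a small text pointer:
git lfs pointer --file=big.pdf                  # prints the pointer LFS would store
```

The pointer is just a few lines of text (a spec version, a SHA-256 oid, and a size), which is why clones stay fast no matter how often the file changes.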

Large Media is a Git LFS service at Netlify that also includes post-deploy image transformations. However, the images are still sent from the same CDN nodes, so moving to Large Media won’t make them download any faster or slower.

Regarding whether we do transformations for PDF files: we do not. There are no transformations for most binary file types, and we only perform transformations when they are requested by GET data in the URL itself. Only the JPEG, PNG, and GIF image types support image transformations at this time.
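To illustrate what “requested by GET data in the URL” means: a transformation is triggered by query parameters on the asset URL. The domain and file paths below are hypothetical, and the `nf_resize`/`w`/`h` parameters are the ones described in the Large Media image transformation docs:

```sh
# Sketch: requesting a resized image from a Large Media-enabled site (hypothetical URLs).
curl -o thumb.jpg "https://example.netlify.app/images/photo.jpg?nf_resize=fit&w=320&h=240"

# PDFs have no transformation support, so they are always returned unchanged:
curl -O "https://example.netlify.app/docs/guide.pdf"
```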

If there are other questions about this, please let us know.