Failed to upload functions file - function too large

I'm getting this error during deployment. I'm using Node v10.16.3, and I've tried deploying locally as well.

10:01:08 AM: 1 new functions to upload

10:03:12 AM: Failed to upload file: &{Name:get-allergy-info Sum:57182972fa83281f7aa8fbef6a6a684ba4b3cf56424034c621222a9e0454e1d8 Runtime:js Size:<nil> Path: Buffer:0xc0003c0000}

10:03:12 AM: failed during stage 'deploying site': Failed to execute deploy: [PUT /deploys/{deploy_id}/functions/{name}][500] uploadDeployFunction default &{Code:0 Message:}

10:03:12 AM: Failing build: Failed to deploy site

It might be because I'm using Puppeteer, which installs Chromium alongside it; that bundle might be too big for the function. Thoughts on whether that's the issue?

Hi @itwasmattgregg,

It’s certainly possible to use Netlify functions with Headless Chrome, and there’s an example of this at https://github.com/netlify-labs/netlify-functions-headless-chrome. I recommend you check out that repo. If you continue to have trouble, please provide us with a link to the deploy that’s failing. Thanks.
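For reference, here's a minimal sketch of that approach (not the repo's exact code), assuming you've installed puppeteer-core and chrome-aws-lambda. chrome-aws-lambda ships a Lambda-compatible Chromium binary, and puppeteer-core skips the full browser download that makes regular puppeteer so large:

```js
// A minimal sketch, assuming puppeteer-core + chrome-aws-lambda are installed.
// chrome-aws-lambda bundles a Lambda-compatible Chromium; puppeteer-core omits
// the browser download that bloats the regular puppeteer package.
const chromium = require('chrome-aws-lambda');

exports.handler = async () => {
  const browser = await chromium.puppeteer.launch({
    args: chromium.args,
    defaultViewport: chromium.defaultViewport,
    executablePath: await chromium.executablePath,
    headless: chromium.headless,
  });
  try {
    const page = await browser.newPage();
    await page.goto('https://example.com'); // placeholder URL
    const title = await page.title();
    return { statusCode: 200, body: JSON.stringify({ title }) };
  } finally {
    await browser.close(); // always release the browser process
  }
};
```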

I copied that repo into a function and hit a couple of issues: I couldn't get it to run locally with netlify dev, and I got a timeout error after 10.1 seconds when I tried to hit the deployed version of the function. I took it down, but if you want I can put it back up and send you the URL.

Our functions run for only 10 seconds by default, so it's no surprise you'd get a timeout error at that point. The URL of the failing deploy may still shed light on your earlier failure. No need to put it back up; just link to the logs for that specific failed deploy so we can check our additional internal logs for obvious problems. There are some things, like multiple.dots.in.the.function.filename.js or too-large environment variable settings, that can cause function deployment to fail and that are easier for us to spot from the inside.

@itwasmattgregg were you able to get this to work? I'm also using Puppeteer and am running into this issue. The strange thing is that my first Puppeteer function uploaded fine, but my new function, which adds Lighthouse alongside Puppeteer, fails.

Hi there! Lambda functions have a 50 MB deploy size limit, and Puppeteer by itself is already quite large. You should check whether adding Lighthouse takes you over that limit.
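If it helps, here's a rough, hypothetical helper (not an official Netlify tool) that totals your function's node_modules so you can gauge how close you are. The zipped artifact will be smaller than this number, but it's a useful early warning:

```js
// check-size.js — sums the unzipped size of node_modules. AWS limits are
// 50 MB zipped / 250 MB unzipped, so a large number here is a red flag.
const fs = require('fs');
const path = require('path');

function dirSize(dir) {
  let total = 0;
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) total += dirSize(full);
    else if (entry.isFile()) total += fs.statSync(full).size;
  }
  return total;
}

const bytes = dirSize(path.join(__dirname, 'node_modules'));
console.log(`node_modules: ${(bytes / 1024 / 1024).toFixed(1)} MB unzipped`);
```

Run it from the function's directory with `node check-size.js`.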

Hi @karatechops and sorry to be slow to get back to you here! I believe this covers all reasons you might see this error:

  • Function size too large. AWS puts a limit on function size: 50 MB zipped or 250 MB unzipped (you can see it here: https://docs.aws.amazon.com/lambda/latest/dg/limits.html). This is almost certainly the problem with Puppeteer. Puppeteer alone fits (see the example here: https://functions.netlify.com/examples/?search=puppeteer), but in our experience additional dependencies are likely to push you over that limit.
  • Environment variables too large. The total must be under 4 KB of text when concatenated in this format: VAR1=val1,VAR2=val2,... (that limit is mentioned in the same AWS docs I linked above). There's a quick sketch after this list for checking it.
  • Functions with multiple dots in their filename, such as charge.run.js. That's a Netlify limitation, not an AWS one.
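
For the environment variable limit, here's a quick sketch that measures your variables the same way AWS does. Note it counts everything in the shell's environment, so run it in a clean shell with only the variables you've configured for your site:

```js
// env-size.js — concatenates variables as VAR1=val1,VAR2=val2,... and reports
// the total, which AWS requires to stay under 4 KB.
const total = Object.entries(process.env)
  .map(([key, value]) => `${key}=${value}`)
  .join(',').length;
console.log(`Concatenated environment size: ${total} bytes (limit: 4096)`);
```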

@fool @futuregerald thanks for the help. After some trials I found the function was indeed too large. Is there a roadmap item to surface a clearer error message when a file can't be uploaded? Additionally, is support for something like Lambda Layers planned for the future? I've moved my project directly to AWS so that I can add my large dependencies as a function Layer.