Stream content to function response body

Hello,

I was wondering if it’s possible to stream files to the response body.
I don't see any way to do this with a Netlify function:

const fs = require('fs');
const http = require('http');

const server = http.createServer((req, res) => {
  const src = fs.createReadStream('./big.file');
  src.pipe(res);
});

Thanks!

Hi,

Thanks for writing in. Right now you can’t stream data to a function. You have to send the entire payload with the request.

Hello,

Do you have any plans to support this, or is there a workaround?

hi @armaldio - can you outline your use case a little bit more - how this would be advantageous, why you need this etc? The more information we have, the easier it is to make a case for this :slight_smile:

Sure,

I’m trying to stream a big file so I can use progress events as the file is sent from a Lambda to the browser; I’d like to show a progress bar.

Right now, we don’t support streaming from a lambda function and we don’t have enough information to file a feature request. That said, how did you plan to implement what you describe?

I have a use case where I’m implementing a limited reverse proxy for a particular service, to work around the fact that one site enforces CORS and the service doesn’t support HTTPS. The images are small enough, and I expect low enough traffic, that I can get away with collecting the chunks in memory and concatenating them with Buffer.concat once the response finishes, but this is definitely suboptimal and it’d be easier to just return the stream. Streaming would also scale better and use less memory.

My code currently looks something like this:

"use strict"

const http = require("http")

exports.handler = async (event, context) => {
    // `url` is the proxied endpoint, defined elsewhere.
    const res = await once(http.get(url), "response")
    const chunks = []
    let size = 0

    function receive(buf) {
        size += buf.length
        chunks.push(buf)
    }

    res.on("data", receive)

    try {
        await once(res, "end")
    } finally {
        res.removeListener("data", receive)
    }
    
    return {
        // headers, status code, etc.
        isBase64Encoded: true,
        body: Buffer.concat(chunks, size).toString("base64"),
    }
}

function once(emitter, event) {
    return new Promise((resolve, reject) => {
        function pass(arg) {
            emitter.removeListener("error", fail)
            resolve(arg)
        }
        function fail(arg) {
            emitter.removeListener(event, pass)
            reject(arg)
        }
        emitter.once(event, pass)
        emitter.once("error", fail)
    })
}

Ideally, I’d use a rewrite rule and instruct Netlify to explicitly not cache the returned result, since the backend server in question returns a random image that changes on each request, with headers explicitly stating not to cache the response. But since this functionality doesn’t exist, it’s much easier to just write a function than to file a very specific feature request for that. And unlike OP, I do at least have the option of keeping the full response in memory for a short period of time.

If I could return a stream as the body, knowing it’d get implicitly .piped to the response, I’d do this instead, removing about 50% of the code in my function’s handler:

exports.handler = async (event, context) => {
    const res = await once(http.get(url), "response")

    return {
        // headers, status code, etc.
        body: res,
    }
}

Thanks for chiming in here @isiahmeadows! We have an open request to support streaming from functions in our proxy. We’ll let you know once it’s been added. Thanks.
