Description
Apologies if this is not the correct place to ask. I am working on a project to capture audio/video in C++ and then use that data in JS to send it over WebRTC. I have a proof of concept working: populating Buffers with video and audio frames, then drawing the video to a canvas (using putImageData) or feeding the audio data into a script processor. From the canvas I can get a video stream to send, and from the script processor's destination node I can get an audio stream. This is all great.
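For reference, the native side of my proof of concept looks roughly like this (a trimmed sketch using node-addon-api; `CaptureVideoFrame` and the 640x480 RGBA format are placeholders for my real capture code):

```cpp
#include <napi.h>

#include <cstdint>
#include <vector>

// Placeholder for the real capture pipeline: produce one RGBA frame.
static std::vector<uint8_t> CaptureVideoFrame() {
  return std::vector<uint8_t>(640 * 480 * 4, 0);  // assumed 640x480 RGBA
}

// Copy the latest frame into a fresh Node Buffer; on the JS side it gets
// wrapped in a Uint8ClampedArray/ImageData and painted with putImageData.
Napi::Value GetVideoFrame(const Napi::CallbackInfo& info) {
  Napi::Env env = info.Env();
  std::vector<uint8_t> frame = CaptureVideoFrame();
  return Napi::Buffer<uint8_t>::Copy(env, frame.data(), frame.size());
}

Napi::Object Init(Napi::Env env, Napi::Object exports) {
  exports.Set("getVideoFrame", Napi::Function::New(env, GetVideoFrame));
  return exports;
}

NODE_API_MODULE(capture, Init)
```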
What would be even better is if I could render the canvas off-screen and use an AudioWorklet to get my drawing and audio processing off the main thread. It seems the only viable way to do this is to write into a SharedArrayBuffer that those workers can read, but there is no API (that I know of) for doing so from a node module.
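To make the goal concrete, here is the kind of thing I want to be able to write. This untested sketch drops below N-API to the raw V8 API (one possible workaround I can see): JS allocates the SharedArrayBuffer, passes it to the addon, and the addon writes frames straight into the shared backing store, which a worker holding the same SharedArrayBuffer can then read without any postMessage copies:

```cpp
#include <node.h>
#include <v8.h>

#include <cstring>
#include <memory>

// Expects a SharedArrayBuffer allocated on the JS side as its argument.
void WriteFrame(const v8::FunctionCallbackInfo<v8::Value>& args) {
  v8::Isolate* isolate = args.GetIsolate();

  if (args.Length() < 1 || !args[0]->IsSharedArrayBuffer()) {
    isolate->ThrowException(v8::Exception::TypeError(
        v8::String::NewFromUtf8Literal(isolate,
                                       "expected a SharedArrayBuffer")));
    return;
  }

  auto sab = args[0].As<v8::SharedArrayBuffer>();
  std::shared_ptr<v8::BackingStore> store = sab->GetBackingStore();

  // My real capture code would write a frame here; zeroing the shared
  // memory just marks where that write happens in this sketch.
  std::memset(store->Data(), 0, store->ByteLength());
}

// Context-aware initialization, per the Node.js addon docs.
NODE_MODULE_INIT(/* exports, module, context */) {
  NODE_SET_METHOD(exports, "writeFrame", WriteFrame);
}
```

The obvious downside is that this compiles against raw V8, so it is tied to specific Node versions rather than the stable ABI, which is why I'd much rather have a supported API for this.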
I found a very old issue and associated PR here: nodejs/node#23276, but it hasn't been touched in about three years. Is this just totally non-viable? Are there any alternatives I can look into? At this point, is it worth revisiting that old PR and seeing if there is anything salvageable?
Thanks