Integration with another library

As we know, react-native-webrtc can’t take photos at full camera resolution, and it seems this will never be implemented.

On the other hand, there is a good library: react-native-vision-camera. It can do many things (even focus the camera on a specific point!), take photos, and record videos, but only locally. In other words, it can’t stream video, it can only record to a file.
However, it provides Frame Processors, which the documentation describes as follows:
«The Frame Processor gets called with a Frame object, which is a JSI HostObject. It holds a reference to the native (C++) Frame Image Buffer (~10 MB in size) and exposes properties such as width, height, bytesPerRow and more to JavaScript so you can synchronously access them. The Frame object can be passed around in JS, as well as returned from- and passed to a native Frame Processor Plugin.»
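As a rough illustration of the shape described above (using a hand-written stand-in type rather than the library’s real exported types), a frame processor is simply a function that is invoked with each Frame and can synchronously read its properties:

```typescript
// Hypothetical stand-in for vision-camera's Frame HostObject;
// it models only the documented properties (width, height, bytesPerRow),
// not the underlying native (C++) image buffer.
interface Frame {
  width: number;
  height: number;
  bytesPerRow: number;
}

// A frame processor is a function called once per camera frame.
// Here we just summarize the frame's dimensions synchronously.
function describeFrame(frame: Frame): string {
  return `${frame.width}x${frame.height}, ${frame.bytesPerRow} bytes/row`;
}
```

In the actual library this function would be wrapped with the `useFrameProcessor` hook and run as a worklet, but the key point is the same: the Frame arrives as a lightweight JS-visible handle onto a large native buffer.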

So, how can we pass frames from react-native-vision-camera to react-native-webrtc in order to stream them over WebRTC?