Live stream not working while using react-native-camera along with webRTC

Hi, I'm trying to do face detection on a live stream. Using react-native-webrtc's <RTCView /> I have live streaming working, and it works fine on its own. But when I integrate react-native-camera for face detection, the live stream stops working, although face detection works perfectly. Please help me with this, guys, I've spent almost 7 days on it. I posted the same question 5 days ago, and today is my last day to submit this project :pray::pray::sleepy::sleepy:

Alternatively, if face detection can be done using react-native-webrtc together with Firebase ML Kit, that would also be fine. Please suggest an approach.

Packages used in my project:
"react": "16.8.3",
"react-native": "0.59.4",
"react-native-camera": "^2.11.1",
"react-native-webrtc": "1.67.1",
"": "2.2.0"

I'm afraid you're going to need to hack react-native-webrtc to get hold of the actual video frames to do the detection. See VideoTrackAdapter as an example: it receives each video frame, and we use it for detecting remote video muting.
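To make the adapter idea concrete: in react-native-webrtc the actual frame tap lives in the native layer (VideoTrackAdapter on Android attaches a sink to the video track and gets called once per frame). The TypeScript below is only a self-contained mock of that sink pattern, not the real library API; all names (MockVideoTrack, FaceDetectionSink, deliverFrame) are illustrative stand-ins for what you would implement natively.

```typescript
// Illustrative sketch of the VideoTrackAdapter-style sink pattern.
// In the real project this lives in react-native-webrtc's native code;
// every type and class here is a mock, not the library's API.
interface VideoFrame {
  width: number;
  height: number;
  data: Uint8Array; // raw pixel buffer, format depends on the capturer
}

interface VideoSink {
  onFrame(frame: VideoFrame): void;
}

class MockVideoTrack {
  private sinks: VideoSink[] = [];

  addSink(sink: VideoSink): void {
    this.sinks.push(sink);
  }

  removeSink(sink: VideoSink): void {
    this.sinks = this.sinks.filter((s) => s !== sink);
  }

  // Stands in for the WebRTC capturer delivering a frame to all sinks.
  deliverFrame(frame: VideoFrame): void {
    for (const sink of this.sinks) {
      sink.onFrame(frame);
    }
  }
}

class FaceDetectionSink implements VideoSink {
  framesSeen = 0;

  onFrame(frame: VideoFrame): void {
    this.framesSeen++;
    // Here you would hand the frame buffer to a detector
    // (e.g. ML Kit on the native side) instead of just counting.
  }
}
```

The key point the sketch shows: the WebRTC capturer stays the single owner of the camera, and detection piggybacks on the frames it already produces, which avoids the camera contention discussed below.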

Good luck!

Is there any example you could point me to, saghul, to help me understand this?

Thank you, saghul. I will go through this, and if I run into any doubts I will ping here. Can you help with this?

No, I can't, sorry. I don't have experience with what you're trying to do, but more importantly, I don't have the time.


Thank you. Do you have any idea why react-native-webrtc doesn't work alongside react-native-camera?

Probably because both try to open the camera and only one of them will be able to do so.

Yup, the same thing is shown in the Xcode console. Is there any option to do both at the same time? Or should I go with VideoTrackAdapter?

No, you will need to go the adapter route.


Please let me know if you were able to stream the react-native-camera feed through react-native-webrtc, as we are in the same situation as you.