I'm using react-native-webrtc on iOS for testing.
So far I've been able to open the front camera and render its stream on screen, BUT I don't understand why the video is so slow. It takes roughly 0.5 s to render the current image, so the video lags behind and drops frames.
Can someone help me understand what's going on?
Here is the code:
```javascript
const [isFront, setIsFront] = useState(false);
const [videoSourceId, setVideoSourceId] = useState(0);
const [stream, setStream] = useState(null);

const mediaStreamConstraints = {
  audio: true,
  video: {
    facingMode: !isFront ? 'user' : 'environment',
    optional: videoSourceId ? [{ sourceId: videoSourceId }] : [],
    mandatory: {
      minFrameRate: 30,
    },
  },
};

mediaDevices.enumerateDevices(sources => {
  for (let i = 0; i < sources.length; i++) {
    const source = sources[i];
    if (source.kind === 'videoinput' && source.facing === (isFront ? 'front' : 'environment')) {
      setVideoSourceId(source.deviceId);
    }
  }
});

mediaDevices.getUserMedia(mediaStreamConstraints)
  .then(mediaStream => {
    console.log('mediaStream: ', mediaStream);
    setStream(mediaStream.toURL());
  })
  .catch(error => {
    console.log('getUserMedia error: ', error);
  });

return (
  <RTCView
    style={styles.view}
    mirror={!isFront}
    streamURL={stream}
  />
);
```
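One thing I'm unsure about: since these calls sit directly in the component body, `enumerateDevices` and `getUserMedia` run again on every render, and each `setState` triggers another render. Would moving the capture into a `useEffect` that runs once on mount be the right approach? A rough sketch of what I mean (untested, component name and file layout are just placeholders):

```javascript
import React, { useEffect, useState } from 'react';
import { mediaDevices, RTCView } from 'react-native-webrtc';

// Hypothetical component name, for illustration only.
export default function CameraView() {
  const [streamURL, setStreamURL] = useState(null);

  useEffect(() => {
    let active = true; // guard against setState after unmount

    mediaDevices
      .getUserMedia({ audio: true, video: { facingMode: 'user' } })
      .then(mediaStream => {
        if (active) {
          setStreamURL(mediaStream.toURL());
        }
      })
      .catch(error => console.log('getUserMedia error: ', error));

    return () => {
      active = false;
    };
  }, []); // empty deps: run once on mount, not on every render

  return streamURL ? <RTCView streamURL={streamURL} /> : null;
}
```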
Thanks for helping me understand.