Hi. I have a react-native application that establishes WebRTC communication using the react-native-webrtc library on Android. I have also built the same application (from the same code, using react-native-web and react-native-webrtc-web-shim) for the web, running in a browser. With Android devices everything works fine, but when I use the application from a browser, I don't hear any sound at all on the browser side only (on the Android device the sound is transmitted and played).
Could anyone help me figure out why no sound is output on the web side? What could I try and check?
My app code snippet:
import React, { useEffect, useRef, useState } from 'react';
import { Button, SafeAreaView, View } from 'react-native';
import {
  mediaDevices,
  MediaStream,
  RTCIceCandidate,
  RTCPeerConnection,
  RTCSessionDescription,
  registerGlobals,
} from 'react-native-webrtc';
import { webSocket } from './socket';
function App(): React.JSX.Element {
  const [remoteStream, setRemoteStream] = useState<MediaStream | null>(null); // maybe it is not needed
  const localStreamRef = useRef<MediaStream | null>(null);
  const peerConnection = useRef<RTCPeerConnection | null>(null);
  useEffect(() => {
    registerGlobals();
    startLocalStream(); // note: not awaited, so localStreamRef may still be null when the first message arrives

    webSocket.onmessage = async (event: WebSocketMessageEvent) => {
      const message = JSON.parse(event.data);
      switch (message.type) {
        case 'candidate':
          await handleCandidate(message);
          break;
        case 'offer':
          await handleOffer(message);
          break;
        case 'answer':
          await handleAnswer(message);
          break;
        case 'icecandidate':
          await handleICECandidate(message);
          break;
      }
    };
    const handleCandidate = async (candidate: any) => {
      // 'candidate' signal: this side creates the offer
      const pc = createPeerConnection();
      peerConnection.current = pc;
      // Add local stream
      localStreamRef.current?.getTracks().forEach(track =>
        pc.addTrack(track, localStreamRef.current!)
      );
      const options = { iceRestart: false }; // can be omitted
      const offer = await pc.createOffer(options);
      await pc.setLocalDescription(offer);
      // Send 'offer' via websocket ...
    };
    const handleOffer = async (offer: any) => {
      const pc = createPeerConnection();
      peerConnection.current = pc;
      // Add local stream
      localStreamRef.current?.getTracks().forEach(track =>
        pc.addTrack(track, localStreamRef.current!)
      );
      const rtcOffer = offer.data.description;
      await pc.setRemoteDescription(new RTCSessionDescription(rtcOffer));
      const answer = await pc.createAnswer();
      await pc.setLocalDescription(answer);
      // Send 'answer' via websocket ...
    };
    const handleAnswer = async (answer: any) => {
      const rtcAnswer = answer.data.description;
      await peerConnection.current?.setRemoteDescription(new RTCSessionDescription(rtcAnswer));
    };

    const handleICECandidate = async (candidate: any) => {
      const iceCandidate = candidate.data.candidate as RTCIceCandidate;
      await peerConnection.current?.addIceCandidate(iceCandidate);
    };
    return () => {
      // Cleanup
      webSocket.onmessage = null;
      peerConnection.current?.close();
      peerConnection.current = null;
    };
  }, []);
  const createPeerConnection = (): RTCPeerConnection => {
    // Note: the standard field is 'urls', not 'url'
    const pc = new RTCPeerConnection({ iceServers: [{ urls: '...' }] });
    // Setup ICE handling
    pc.addEventListener('icecandidate', (e: any) => {
      if (e.candidate) {
        // Resend to the other user
        webSocket.send(JSON.stringify({
          type: 'icecandidate', data: { candidate: e.candidate.toJSON() },
        }));
      }
    });
    pc.addEventListener('track', (e) => setRemoteStream(e.streams[0])); // maybe an extra line
    return pc;
  };
  const startLocalStream = async () => {
    const _localStream = await mediaDevices.getUserMedia({ audio: true });
    localStreamRef.current = _localStream;
  };
  const toggleCall = async () => {
    // Send online/offline signals via the socket to the server ...
  };
  return (
    <SafeAreaView>
      <View>
        <Button title="Call" onPress={toggleCall} />
      </View>
    </SafeAreaView>
  );
}
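For reference, the signaling messages that go over the websocket have roughly these shapes (inferred from the handlers above, not a formal spec):

// { type: 'offer',        data: { description: <RTCSessionDescription init> } }
// { type: 'answer',       data: { description: <RTCSessionDescription init> } }
// { type: 'icecandidate', data: { candidate: <RTCIceCandidate init> } }
// { type: 'candidate' }   // my "start a call" trigger; no payload is read from it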
In the code above, the 'react-native-webrtc' import is replaced with 'react-native-webrtc-web-shim' by webpack while bundling for the web.
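The relevant webpack piece looks roughly like this (a minimal sketch; the exact entries in my real config may differ):

// webpack.config.js (sketch)
module.exports = {
  // ...
  resolve: {
    alias: {
      'react-native$': 'react-native-web',
      'react-native-webrtc': 'react-native-webrtc-web-shim',
    },
  },
};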
As you can see, I don't use RTCView, because I use audio only, but I don't know whether it is required in this case (I think not, because everything works on Android devices).
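If it turns out that a browser only plays remote WebRTC audio once the stream is attached to a media element (which could explain Android working while the web build stays silent), I suppose I could try something like this in the web build (a web-only sketch using the plain DOM API; the cast and the cleanup are my guesses, not code I already have):

// Web-only sketch: attach the remote stream to an HTMLAudioElement.
useEffect(() => {
  if (!remoteStream) return;
  const audioEl = new Audio();
  audioEl.srcObject = remoteStream as any; // the shim's stream should be a DOM MediaStream on web
  audioEl.play().catch(err => console.warn('Autoplay blocked?', err)); // autoplay policy may require a user gesture
  return () => {
    audioEl.srcObject = null;
  };
}, [remoteStream]);

Is something like this required, or should the shim handle audio output by itself?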