Issues with iOS Screenshare - Red Screen

Hello,
I am trying to create an app that streams the users’ Screen via WebRTC. For the iOS setup, I’ve followed the Jitsi tutorial ([iOS SDK · Jitsi Meet Handbook](iOS SDK | Jitsi Handbook)).

It seems to be working fine and I don't get any errors in Xcode or the Metro server. But when executing this code [1], I only get a red screen. That red screen, however, is in the correct resolution and even rotates when my phone rotates.

I've looked through both the GitHub issues and this forum and haven't found anything regarding this error. Help would be much appreciated!

Thanks in advance!
Niklas

[1]

import React, {useEffect, useState} from 'react';
import {mediaDevices, MediaStream, RTCView} from 'react-native-webrtc';

const Home = () => {
  const [stream, setStream] = useState<MediaStream | null>(null);

  useEffect(() => {
    // Request a capture of the device screen
    mediaDevices
      .getDisplayMedia({
        video: true,
      })
      .then(displayStream => {
        console.log(displayStream);
        setStream(displayStream);
      })
      .catch(error => {
        // Log any failure to start the screen capture
        console.log(error);
      });
  }, []);

  return (
    <RTCView
      streamURL={stream ? stream.toURL() : undefined}
      style={tailwind('h-64 w-full')}
    />
  );
};

export default Home;