Hi,
I’ve been trying for a while now to connect to a web server that exposes a livestream as a fragmented MP4 file, and to display it in the app as a constant camera feed for the user to view.
Initially I was using react-native-video to display the livestream. However, on iOS I get an error that essentially says it does not support videos with an indefinite Content-Range. I tried hacking around this by having the server report a Content-Range of 99999999999, but that often just ends the stream prematurely.
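For reference, my react-native-video usage is essentially just pointing a Video component at the stream URL, roughly like this (simplified, and with a placeholder URL standing in for my real endpoint):

import React from 'react';
import { StyleSheet } from 'react-native';
import Video from 'react-native-video';

// Simplified version of my react-native-video attempt; the URL below is a
// placeholder for my actual fragmented-MP4 livestream endpoint.
const LivePlayer = () => (
  <Video
    source={{ uri: 'https://example.com/live/stream.mp4' }}
    style={StyleSheet.absoluteFill}
    resizeMode="contain"
    onError={error => console.log('playback error', error)}
  />
);

export default LivePlayer;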
Judging by this behaviour, and by the lack of any information I could find online about livestreaming with react-native-video, I assumed it simply doesn’t support livestreams like this.
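The server-side hack I mentioned was nothing more than advertising a huge total size in the Content-Range header. My actual server code is different, but expressed as a hypothetical Express sketch (with the stream source as a placeholder), it was roughly:

const express = require('express');
const fs = require('fs');

const app = express();

// Hypothetical sketch of the Content-Range hack: respond with a partial
// content status and an enormous fake total size, because the live stream
// has no real length. The file read below is a placeholder for however the
// fragmented MP4 is actually produced.
app.get('/live.mp4', (req, res) => {
  res.status(206).set({
    'Content-Type': 'video/mp4',
    'Accept-Ranges': 'bytes',
    'Content-Range': 'bytes 0-99999999998/99999999999',
  });
  fs.createReadStream('/path/to/live-placeholder.mp4').pipe(res);
});

app.listen(3000);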
I did find a suggestion in a GitHub issue on react-native-video recommending react-native-webrtc for handling livestreams. I’ve now tried that, but I’ve been unable to connect to any sample livestreams that expose video content as an m3u8, and after reading the “getting started” page it looks like the library is intended solely for peer-to-peer connections, which, I think, is not what I want here.
If you require code, this is what I’ve been trying:
/**
* Sample React Native App
* https://github.com/facebook/react-native
*
* @format
* @flow strict-local
*/
import React, { useState } from 'react';
import {
  Button,
  SafeAreaView,
  StyleSheet,
  ScrollView,
  View,
  Text,
  StatusBar,
} from 'react-native';
import { Colors } from 'react-native/Libraries/NewAppScreen';
import { mediaDevices, RTCView } from 'react-native-webrtc';
const App = () => {
  return (
    <>
      <StatusBar barStyle="dark-content" />
      <SafeAreaView style={styles.body}>
        {/* Trying to point RTCView straight at a sample HLS (m3u8) stream; no video ever shows up */}
        <RTCView
          streamURL={"https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8"}
          style={styles.stream}
        />
      </SafeAreaView>
    </>
  );
};
const styles = StyleSheet.create({
  body: {
    backgroundColor: Colors.white,
    ...StyleSheet.absoluteFill,
  },
  stream: {
    flex: 1,
  },
  footer: {
    backgroundColor: Colors.lighter,
    position: 'absolute',
    bottom: 0,
    left: 0,
    right: 0,
  },
});
export default App;
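For comparison, every react-native-webrtc example I’ve found (including the getting-started page) renders a MediaStream obtained locally or from a peer connection rather than a URL. My simplified reading of that pattern is something like this:

import React, { useEffect, useState } from 'react';
import { mediaDevices, RTCView } from 'react-native-webrtc';

// Sketch of the pattern the react-native-webrtc docs seem to expect:
// get a MediaStream (here from the local camera, in the docs also from an
// RTCPeerConnection) and hand its URL to RTCView, rather than giving
// RTCView a remote m3u8/mp4 URL directly.
const LocalPreview = () => {
  const [streamURL, setStreamURL] = useState(null);

  useEffect(() => {
    mediaDevices
      .getUserMedia({ audio: true, video: true })
      .then(stream => setStreamURL(stream.toURL()))
      .catch(err => console.log('getUserMedia failed', err));
  }, []);

  return streamURL ? <RTCView streamURL={streamURL} style={{ flex: 1 }} /> : null;
};

export default LocalPreview;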
So, basically, what I am wondering is this:
Am I able to use React-Native-WebRTC to view livestreams as above?
If so, how?
If not, do you have any suggestions as to what libraries I should use for this purpose? Or how I might get react-native-video to support livestreams?
I am very new to media handling at this level and would appreciate any and all advice,
Oli