Livestream Quickstart
In this tutorial we'll quickly build a low-latency in-app livestreaming experience. The livestream is broadcast using Stream's edge network of servers around the world.
This tutorial is structured into two parts:
Part 1: Building a Livestreaming React Native App
- Creating a livestream on the Stream dashboard
- Setting up RTMP input with OBS software
- Viewing the livestream on a React Native app
- Building custom livestream viewer UI
Part 2: Creating an Interactive Livestreaming App
- Publishing a livestream from a mobile device with WebRTC
- Implementing backstage and go live functionality
Let's get started! If you have any questions or feedback, please let us know via the feedback button.
Step 1 - Create a livestream in the dashboard
First, let's create our livestream using the dashboard. To do this, open the dashboard and select "Video & Audio" -> "Overview".
In that screen, you will see three buttons that allow you to create different types of calls, as shown in the image below.
Click on the third one, the "Create Livestream" option. After you do this, you will be shown the following screen, which contains information about the livestream:
You will need the RTMP URL and RTMP Stream Key from this page, which are needed to set up the livestream in the OBS software.
Copy these values for now, and we will get back to the dashboard a bit later.
Step 2 - Set up the livestream in OBS
OBS is one of the most popular livestreaming software packages, and we'll use it to explain how to publish video with RTMP.
After you download and install the software using the instructions provided on the link, you should set up the capturing device and the livestream data.
First, let's set up the capturing device, which can be found in the "Sources" section:
Select the "Video Capture Device" option to stream from your computer's camera. Alternatively, you can choose other options, such as "macOS Screen Capture," to stream your screen.
Next, we need to provide the livestream credentials from our dashboard to OBS. To do this, click on the "Settings" button located in the "Controls" section at the bottom right corner of OBS.
This will open a popup. Select the second option, "Stream". For the "Service" option, choose "Custom". In the "Server" and "Stream Key" fields, enter the values you copied from the dashboard in Step 1.
With that, our livestream setup is complete. Before returning to the dashboard, press the "Start Streaming" button in the "Controls" section.
Now, let's go back to the dashboard. If everything is set up correctly, you should see the OBS livestream in the dashboard, as shown in this screenshot:
Note that by default, the dashboard will start the livestream immediately.
For your own livestreams, you can configure this behavior in the dashboard by enabling or disabling the backstage for the livestream call type.
Step 3 - Show the livestream in a React Native app
Now that the livestream has started, let's see how we can watch it from a React Native app. First, set up the project and install the required dependencies.
Step 3.1 - Create a new React Native app
To create a new React Native app, you'll first need to set up your environment. Once you're set up, continue with the steps below to create an application and start developing.
You can use the React Native Community CLI to generate a new project. Let's create a new React Native project called "LivestreamExample":
```bash
npx @react-native-community/cli@latest init LivestreamExample
cd LivestreamExample
```
If you are having trouble with iOS, try to reinstall the dependencies by running:
- `cd ios` to navigate to the `ios` folder
- `bundle install` to install Bundler
- `bundle exec pod install` to install the iOS dependencies managed by CocoaPods
Step 3.2 - Install the SDK and its dependencies
To install the Stream Video React Native SDK, run the following command in your terminal of choice:
```bash
yarn add @stream-io/video-react-native-sdk @stream-io/react-native-webrtc
```
The SDK requires installing some peer dependencies. You can run the following command to install them:
```bash
yarn add react-native-incall-manager
yarn add react-native-svg
yarn add @react-native-community/netinfo
yarn add @notifee/react-native

# Install pods for iOS
npx pod-install
```
Android specific: update the buildscript with the required SDK versions. In `android/build.gradle`, add the following inside the `buildscript` section:
```groovy
buildscript {
  ext {
    ...
    minSdkVersion = 24
  }
  ...
}
```
Step 3.3 - View a livestream on a React Native app
The following code shows you how to create a livestream viewer with React Native that will play the stream we created above.
Let's open `App.tsx` and replace its contents with the following code:
```tsx
import {
  LivestreamPlayer,
  StreamVideo,
  StreamVideoClient,
  User,
} from "@stream-io/video-react-native-sdk";
import { useEffect } from 'react';
import IncallManager from 'react-native-incall-manager';

const apiKey = "REPLACE_WITH_API_KEY";
const token = "REPLACE_WITH_TOKEN";
const callId = "REPLACE_WITH_CALL_ID";

const user: User = { type: "anonymous" };
const client = new StreamVideoClient({ apiKey, user, token });

export default function App() {
  // Automatically route audio to speaker devices as relevant for watching videos.
  // Please read more about `media` options at
  // https://github.com/react-native-webrtc/react-native-incall-manager#usage
  useEffect(() => {
    IncallManager.start({ media: 'video' });
    return () => IncallManager.stop();
  }, []);

  return (
    <StreamVideo client={client}>
      <LivestreamPlayer callType="livestream" callId={callId} />
    </StreamVideo>
  );
}
```
Before running the app, you should replace the placeholders with values from the dashboard:
- For the `apiKey`, use the API Key value from your livestream page in the dashboard
- Replace the `token` value with the Viewer Token value in the dashboard
- Replace the `callId` with the Livestream ID
That is everything that's needed to play a livestream with our React Native SDK.
The `LivestreamPlayer` component allows you to play a livestream easily, by just specifying the call ID and call type.
If you now run the app, you will see the livestream published from the OBS software. To run the app, execute the following command:
```bash
# run iOS app
yarn ios

# run Android app
yarn android
```
You can find more details about the `LivestreamPlayer` on the following page.
Step 3.4 - Customizing the UI
Based on your app's requirements, you might want to have a different user interface for your livestream. In those cases, you can build your custom UI, while reusing some of the SDK components and the state layer.
State & Participants
If you want to build more advanced user interfaces that include filtering of the participants by various criteria or different sorting, you can access the call state via `call.state` or through one of our Call State Hooks.
One example is filtering of the participants. You can get all the participants with the role `host` with the following code:
```tsx
import { useCallStateHooks } from '@stream-io/video-react-native-sdk';

const { useParticipants } = useCallStateHooks();
const participants = useParticipants();
const hosts = participants.filter((p) => p.roles.includes('host'));
```
The participant state docs show all the available fields.
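For instance, here's a minimal sketch that reads a few commonly used participant fields; the exact field names (`userId`, `sessionId`, `isSpeaking`, `roles`) should be verified against the participant state docs for your SDK version:

```tsx
import { useCallStateHooks } from '@stream-io/video-react-native-sdk';

// Sketch: log a few commonly used participant fields.
// Verify the field names against the participant state docs for your SDK version.
const ParticipantLogger = () => {
  const { useParticipants } = useCallStateHooks();
  const participants = useParticipants();

  participants.forEach((p) => {
    console.log(p.userId, p.sessionId, p.isSpeaking, p.roles);
  });

  return null; // purely for logging; renders nothing
};
```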
For sorting, you can build your own comparators, and sort participants based on your own criteria. The React Native Video SDK provides a set of comparators that you can use as building blocks, or create your own ones as needed.
Here's an example of a possible livestream-related sorting comparator:
```tsx
import {
  combineComparators,
  role,
  dominantSpeaker,
  speaking,
  publishingAudio,
  publishingVideo,
} from '@stream-io/video-react-native-sdk';

const livestreamComparator = combineComparators(
  role('host', 'speaker'),
  dominantSpeaker(),
  speaking(),
  publishingVideo(),
  publishingAudio(),
);
```
These comparators prioritize users who are hosts, then participants who are speaking, and then those who are publishing video and audio.
To apply the sorting, you can use the following code:
```tsx
import { useCallStateHooks } from '@stream-io/video-react-native-sdk';

const { useParticipants } = useCallStateHooks();
const sortedParticipants = useParticipants({ sortBy: livestreamComparator });

// alternatively, you can apply the comparator on the whole call:
call.setSortParticipantsBy(livestreamComparator);
```
To read more about participant sorting, check the participant sorting docs.
Now, let's build a custom player that displays the livestream and the number of viewers. To achieve this, create a new component named `CustomLivestreamPlayer.tsx` and add the following code:
```tsx
import React, { useEffect, useState } from 'react';
import { View, Text, StyleSheet, SafeAreaView } from 'react-native';
import {
  Call,
  StreamCall,
  useCallStateHooks,
  useStreamVideoClient,
  VideoRenderer,
} from "@stream-io/video-react-native-sdk";
import IncallManager from 'react-native-incall-manager';

export const CustomLivestreamPlayer = (props: {
  callType: string;
  callId: string;
}) => {
  const { callType, callId } = props;
  const client = useStreamVideoClient();
  const [call, setCall] = useState<Call>();

  useEffect(() => {
    if (!client) return;
    const myCall = client.call(callType, callId);
    setCall(myCall);
    myCall.join().catch((e) => {
      console.error('Failed to join call', e);
    });
    return () => {
      myCall.leave().catch((e) => {
        console.error('Failed to leave call', e);
      });
      setCall(undefined);
    };
  }, [callId, callType, client]);

  if (!call) {
    return null;
  }

  return (
    <StreamCall call={call}>
      <CustomLivestreamLayout />
    </StreamCall>
  );
};

const CustomLivestreamLayout = () => {
  const { useParticipants, useParticipantCount } = useCallStateHooks();
  const participantCount = useParticipantCount();
  const [firstParticipant] = useParticipants();

  // Automatically route audio to speaker devices as relevant for watching videos.
  useEffect(() => {
    IncallManager.start({ media: 'video' });
    return () => IncallManager.stop();
  }, []);

  return (
    <SafeAreaView style={styles.flexed}>
      <Text style={styles.text}>Live: {participantCount}</Text>
      <View style={styles.flexed}>
        {firstParticipant ? (
          <VideoRenderer participant={firstParticipant} />
        ) : (
          <Text style={styles.text}>The host hasn't joined yet</Text>
        )}
      </View>
    </SafeAreaView>
  );
};

const styles = StyleSheet.create({
  flexed: {
    flex: 1,
    backgroundColor: 'white',
  },
  text: {
    alignSelf: 'center',
    color: 'white',
    backgroundColor: 'blue',
    padding: 6,
    margin: 4,
  },
});
```
In the code above, we check whether there is a participant in our call state. If so, we use the `VideoRenderer` component from the SDK to display the participant's stream. If no participant is available, we simply show a text view with a message that the livestream is not available.
We are also adding a label in the top corner that displays the total participant count. This information is available via the `useParticipantCount()` call state hook.
Finally, for simplicity, we are using the `useEffect` lifecycle to join and leave the livestream. Depending on the logic of your app, you may instead invoke these actions on user input, for example, with buttons for joining and leaving a livestream, as in the sketch below.
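Here's a hedged sketch of that approach (the `JoinOnDemand` component and its layout are hypothetical):

```tsx
import { useState } from 'react';
import { Button } from 'react-native';
import {
  Call,
  StreamCall,
  useStreamVideoClient,
} from '@stream-io/video-react-native-sdk';

// Hypothetical sketch: join and leave driven by user input instead of useEffect.
const JoinOnDemand = (props: { callType: string; callId: string }) => {
  const client = useStreamVideoClient();
  const [call, setCall] = useState<Call>();

  const join = async () => {
    if (!client) return;
    const myCall = client.call(props.callType, props.callId);
    await myCall.join();
    setCall(myCall);
  };

  const leave = async () => {
    await call?.leave();
    setCall(undefined);
  };

  return call ? (
    <StreamCall call={call}>
      {/* render your livestream layout here */}
      <Button title="Leave" onPress={leave} />
    </StreamCall>
  ) : (
    <Button title="Join livestream" onPress={join} />
  );
};
```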
To test the new implementation, replace the SDK-provided `LivestreamPlayer` with the new `CustomLivestreamPlayer` in the `App` component.
```tsx
// ... the rest of the code

export default function App() {
  return (
    <StreamVideo client={client}>
      <CustomLivestreamPlayer callType="livestream" callId={callId} />
    </StreamVideo>
  );
}
```
Part 2 - Build your own YouTube Live
In the first part of this tutorial, we built a simple livestream app, where we published a livestream using RTMP. The authentication was done using the dashboard. In a real application, you'll want to generate tokens programmatically using a server-side SDK.
The second part of this tutorial expands our app to include interactive functionality such as streaming from end-user devices.
Step 4 - Live streaming from a React Native app
We are going to send video from a React Native app directly using WebRTC and use the backstage functionality. Note that for this part of the tutorial, you will need a real Android or iOS device.
Step 4.1 - Permissions setup
Publishing a livestream requires camera and microphone access, so you need to request permission to use them in your app. To do this, first declare the required permissions in your app.

In `AndroidManifest.xml`, add the following permissions before the `application` section:
```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <uses-feature android:name="android.hardware.audio.output" />
  <uses-feature android:name="android.hardware.microphone" />

  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.INTERNET" />

  ...

  <application>
    ...
  </application>
</manifest>
```
For iOS, add the following keys and values to the `Info.plist` file, under the `dict` tag:
```xml
<plist version="1.0">
<dict>
  ...
  <key>CFBundleName</key>
  <string>$(PRODUCT_NAME)</string>
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) needs camera access for broadcasting</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) requires microphone access in order to capture and transmit audio</string>
  ...
</dict>
</plist>
```
Step 4.2 - Broadcasting a livestream
Replace all the existing code we had for the viewer experience in `App.tsx` (or create a new project by following the same steps as above) with the following code:
```tsx
import React from 'react';
import {
  StreamVideoClient,
  StreamVideo,
  User,
  StreamCall,
} from "@stream-io/video-react-native-sdk";
import { SafeAreaView, Text } from 'react-native';

const apiKey = "REPLACE_WITH_API_KEY";
const token = "REPLACE_WITH_TOKEN";
const userId = "REPLACE_WITH_USER_ID";
const callId = "REPLACE_WITH_CALL_ID";

const user: User = { id: userId, name: "Tutorial" };
const client = new StreamVideoClient({ apiKey, user, token });
const call = client.call("livestream", callId);
call.join({ create: true });

export default function App() {
  return (
    <StreamVideo client={client} language='en'>
      <StreamCall call={call}>
        <SafeAreaView style={{ flex: 1, backgroundColor: 'white' }}>
          <LivestreamView />
        </SafeAreaView>
      </StreamCall>
    </StreamVideo>
  );
}

const LivestreamView = () => (
  <Text style={{ fontSize: 30, color: 'black' }}>TODO: render video</Text>
);
```
When you run the app now, you'll see a text message saying: "TODO: render video". Before we get around to rendering the video, let's review the code above.
In the first step, we set up the user:
```tsx
import type { User } from "@stream-io/video-react-native-sdk";

const user: User = { id: userId, name: "Tutorial" };
```
Next, we initialize the client:
```tsx
import { StreamVideoClient } from "@stream-io/video-react-native-sdk";

const client = new StreamVideoClient({ apiKey, user, token });
```
You'll see the `token` variable. Your backend typically generates the user token on signup or login.
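For reference, here's a minimal server-side sketch, assuming the `@stream-io/node-sdk` package and its `generateUserToken` method; check the server-side token docs for the exact API in your version:

```ts
// Server-side only: never ship your API secret in the app bundle.
// Assumes the Node server SDK (@stream-io/node-sdk); verify the method
// name and signature against the server-side token docs.
import { StreamClient } from '@stream-io/node-sdk';

const serverClient = new StreamClient('REPLACE_WITH_API_KEY', 'REPLACE_WITH_API_SECRET');

// Generate a token for a user, e.g. on signup or login:
const token = serverClient.generateUserToken({
  user_id: 'REPLACE_WITH_USER_ID',
  validity_in_seconds: 60 * 60, // optional: token expiry
});
```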
The most important step to review is how we create the call.
Our SDK uses the same `Call` object for livestreaming, audio rooms, and video calling.
Have a look at the code snippet below:
```tsx
const call = client.call("livestream", callId);
call.join({ create: true });
```
To create the call object, specify the call type as livestream and provide a callId. The livestream call type comes with default settings that are usually suitable for livestreams, but you can customize features, permissions, and settings in the dashboard.
Additionally, the dashboard allows you to create new call types as required.
Finally, `call.join({ create: true })` will create the call object on our servers and also initiate the real-time transport for audio and video. This allows for seamless and immediate engagement in the livestream.
Note that you can also add members to a call and assign them different roles. For more information, see the call creation docs.
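As a hedged sketch (the `members` payload shape follows the call creation docs; verify the field names for your SDK version), creating a call with members and their roles could look like this:

```tsx
// Sketch: create the call with members and their roles up front.
// Verify the `members` payload shape against the call creation docs.
const callWithMembers = client.call('livestream', callId);
callWithMembers.getOrCreate({
  data: {
    members: [
      { user_id: 'alice', role: 'host' },
      { user_id: 'bob', role: 'speaker' },
    ],
  },
});
```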
Step 4.3 - Rendering the video
In this step, we're going to build a UI for showing your local video with a button to start the livestream.
In `App.tsx`, replace the `LivestreamView` component implementation with the following one:
```tsx
import { useEffect } from 'react';
import { useCallStateHooks, VideoRenderer } from '@stream-io/video-react-native-sdk';
import { View, Button, Text, StyleSheet } from 'react-native';
import IncallManager from 'react-native-incall-manager';

// Note: `call` refers to the call object created at module scope in App.tsx.
const LivestreamView = () => {
  const { useParticipantCount, useLocalParticipant, useIsCallLive } = useCallStateHooks();
  const totalParticipants = useParticipantCount();
  const localParticipant = useLocalParticipant();
  const isCallLive = useIsCallLive();

  // Automatically route audio to speaker devices as relevant for watching videos.
  // Please read more about `media` and `auto` options in the documentation of
  // react-native-incall-manager:
  // https://github.com/react-native-webrtc/react-native-incall-manager#usage
  useEffect(() => {
    IncallManager.start({ media: 'video' });
    return () => IncallManager.stop();
  }, []);

  return (
    <View style={styles.flexed}>
      <Text style={styles.text}>Live: {totalParticipants}</Text>
      <View style={styles.flexed}>
        {localParticipant && (
          <VideoRenderer participant={localParticipant} trackType='videoTrack' />
        )}
      </View>
      <View style={styles.bottomBar}>
        {isCallLive ? (
          <Button onPress={() => call?.stopLive()} title='Stop Live' />
        ) : (
          <Button onPress={() => call?.goLive()} title='Go Live' />
        )}
      </View>
    </View>
  );
};

const styles = StyleSheet.create({
  flexed: {
    flex: 1,
  },
  text: {
    alignSelf: 'center',
    color: 'white',
    backgroundColor: 'blue',
    padding: 6,
    margin: 4,
  },
  bottomBar: {
    alignSelf: 'center',
    margin: 4,
  },
});
```
Step 5 - Backstage and GoLive
The backstage functionality makes it easy to build a flow where you and your co-hosts can set up your camera and equipment before going live.
Only after you call `call.goLive()` will regular users be allowed to join the livestream.
This is convenient for many livestreaming and audio-room use cases.
If you want calls to start immediately when you join them, that's also possible.
Go to the Stream dashboard, find the `livestream` call type, and disable the backstage mode.
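If you keep backstage enabled, going live can also optionally start related processes in the same request; here's a hedged sketch, assuming the `start_hls` and `start_recording` options described in the goLive docs:

```tsx
// Sketch: optionally kick off HLS broadcasting and recording when going live.
// Verify the `start_hls` / `start_recording` options against the goLive docs
// for your SDK version.
call.goLive({ start_hls: true, start_recording: true });
```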
Step 6 - Preview the livestream
To run the React Native app on a device, please follow the steps in the official React Native documentation. Now, press the "Go Live" button in the React Native app. Upon going live, you will be greeted with an interface that looks like this:
You can also click the link below to watch the video in your browser (as a viewer).
Advanced Features
This tutorial covered the steps required to watch a livestream using RTMP-in and OBS software, as well as how to publish a livestream from a React Native app using WebRTC.
There are several advanced features that can improve the livestreaming experience:
- Co-hosts: You can add members to your livestream with elevated permissions, so you can have co-hosts, moderators, etc. You can see how to render multiple video tracks in our video calling tutorial.
- Permissions and Moderation: You can set up different types of permissions for different types of users and a request-based approach for granting additional access.
- Custom events: You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case (see the sketch after this list).
- Reactions & Chat: Users can react to the livestream, and you can add chat. This makes for a more engaging experience.
- Notifications: You can notify users via push notifications when the livestream starts.
- Recording: The call recording functionality allows you to record the call with various options and layouts.
- HLS: Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 second delay, while the WebRTC approach is realtime. The benefit that HLS offers is better buffering under poor network conditions.
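Here's the custom events sketch referenced above, assuming the `sendCustomEvent` method and the `custom` event from the JS client; verify both against the events docs for your SDK version:

```tsx
// Hedged sketch: share realtime data (e.g. a game score) over the call.
// Verify `sendCustomEvent` and the 'custom' event against the events docs.
call.sendCustomEvent({ type: 'score_update', home: 2, away: 1 });

// Listen for custom events sent by other participants:
const unsubscribe = call.on('custom', (event) => {
  console.log('Received custom event:', event.custom);
});
// Call unsubscribe() when you no longer need the listener.
```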
Recap
It was fun to see just how quickly you can build in-app, low-latency livestreaming. Please do let us know if you run into any issues. Our team is also happy to review your UI designs and offer recommendations on how to achieve them with Stream Video SDKs.
To recap what we've learned:
- WebRTC is optimal for latency, HLS is slower but buffers better for users with poor connections
- You set up a call with `const call = client.call("livestream", callId)`
- The call type `livestream` controls which features are enabled and how permissions are set up
- When you join a call, realtime communication is set up for audio & video: `call.join()`
- Call State Hooks make it easy to build your own UI
- You can easily publish your own video and audio from a React Native app on a mobile device
Calls run on Stream's global edge network of video servers. Being closer to your users improves the latency and reliability of calls.
The SDKs enable you to build livestreaming, audio rooms and video calling in days.
We hope you've enjoyed this tutorial and please do feel free to reach out if you have any suggestions or questions.
Final Thoughts
In this video app tutorial we built a fully functioning React Native livestreaming app with our React Native SDK component library. We also showed how easy it is to customize the behavior and the style of the React Native video app components with minimal code changes.
Both the video SDK for React Native and the API have plenty more features available to support more advanced use-cases.