Rendering Host and Audience Views - iOS
In a live stream setup, only hosts (participants in SEND_AND_RECV mode) can broadcast their audio and video. Audience members (in RECV_ONLY mode) are passive viewers who do not share their audio/video.
To ensure optimal performance and a clean user experience, your app should:
- Render video only for hosts (i.e., participants in SEND_AND_RECV mode).
- Display the total audience count to give context on viewership without rendering individual audience tiles (a sketch of deriving this count follows below).
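As a minimal sketch, assuming your controller exposes the meeting's full participant list (as the LiveStreamViewController used later in this guide does), the audience count can be derived by counting participants in RECV_ONLY mode:

// Hypothetical helper on LiveStreamViewController: counts passive viewers.
// Assumes `participants` holds the full participant list from the meeting.
var audienceCount: Int {
    participants.filter { $0.mode == .RECV_ONLY }.count
}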
Filtering and Rendering Hosts
The steps involved in rendering the audio and video of hosts are as follows.
- Filtering Hosts and Checking their Mic/Webcam Status
- Rendering Video Streams of Hosts
- Handling Audience Views
1. Filtering Hosts and Checking their Mic/Webcam Status
In a live stream, only participants in SEND_AND_RECV mode (i.e., hosts) actively share their audio and video. To render their streams, begin by accessing all participants through the Meeting class, then filter out only those in SEND_AND_RECV mode.
For each of these participants, use the Participant class, which provides real-time information such as displayName, micOn, and webcamOn. Display their name along with the current status of their microphone and webcam. If the webcam is off, show a simple placeholder with their name; if it's on, render their video feed. This ensures only hosts are visible to the audience, keeping the experience clean and intentional.
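For instance, here is a small illustrative helper that reads those Participant properties directly to build a status line. (The views later in this guide track the same state through the controller's stream events instead, which stays accurate as streams start and stop.)

// Illustrative helper: summarizes a host's device status from the
// Participant properties described above (displayName, micOn, webcamOn).
private func statusLine(for participant: Participant) -> String {
    let mic = participant.micOn ? "mic on" : "mic off"
    let cam = participant.webcamOn ? "webcam on" : "webcam off"
    return "\(participant.displayName): \(mic), \(cam)"
}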
Start by filtering the participant list down to those in SEND_AND_RECV mode:
// Function to get visible participants (hosts only)
private func getVisibleParticipants() -> [Participant] {
    // Only show participants who are in SEND_AND_RECV mode
    return liveStreamViewController.participants.filter { participant in
        participant.mode == .SEND_AND_RECV
    }
}
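As a usage sketch, the filtered list can drive the host layout directly. One hedged way to lay out the tiles, using the ParticipantContainerView defined later in this guide (spacing and frame values are illustrative):

// Illustrative layout: render one tile per host returned by the filter.
var hostGrid: some View {
    ScrollView {
        LazyVStack(spacing: 12) {
            ForEach(getVisibleParticipants(), id: \.id) { participant in
                ParticipantContainerView(
                    participant: participant,
                    liveStreamViewController: liveStreamViewController
                )
                .frame(height: 220)
            }
        }
        .padding()
    }
}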
In the LiveStreamViewController class, track each participant's microphone and camera status through the ParticipantEventListener stream events:
// Inside LiveStreamViewController's ParticipantEventListener implementation
func onStreamEnabled(_ stream: MediaStream, forParticipant participant: Participant) {
    DispatchQueue.main.async {
        // Only handle streams for SEND_AND_RECV participants
        if participant.mode == .SEND_AND_RECV {
            if let track = stream.track as? RTCVideoTrack {
                if case .state(let mediaKind) = stream.kind, mediaKind == .video {
                    self.participantVideoTracks[participant.id] = track
                    self.participantCameraStatus[participant.id] = true
                }
            }
            if case .state(let mediaKind) = stream.kind, mediaKind == .audio {
                self.participantMicStatus[participant.id] = true
            }
        } else {
            // For RECV_ONLY participants, ensure their tracks are removed
            self.participantVideoTracks.removeValue(forKey: participant.id)
            self.participantCameraStatus[participant.id] = false
            self.participantMicStatus[participant.id] = false
        }
    }
}

func onStreamDisabled(_ stream: MediaStream, forParticipant participant: Participant) {
    DispatchQueue.main.async {
        if case .state(let mediaKind) = stream.kind {
            switch mediaKind {
            case .video:
                self.participantVideoTracks.removeValue(forKey: participant.id)
                self.participantCameraStatus[participant.id] = false
            case .audio:
                self.participantMicStatus[participant.id] = false
            default:
                // Ignore other stream kinds (e.g., screen share)
                break
            }
        }
    }
}
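These handlers assume the controller publishes the tracked state so the SwiftUI views below can observe it. A minimal sketch of those declarations, using the property names from the handlers (your exact types may differ):

// Published state on LiveStreamViewController, observed by the views below.
// Keys are participant ids; values are the latest known track or status.
@Published var participantVideoTracks: [String: RTCVideoTrack] = [:]
@Published var participantCameraStatus: [String: Bool] = [:]
@Published var participantMicStatus: [String: Bool] = [:]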
Then create a view for displaying each participant's video along with their name and microphone status:
struct ParticipantContainerView: View {
    let participant: Participant
    @ObservedObject var liveStreamViewController: LiveStreamViewController

    // Name and mic status overlay
    private var nameAndMicOverlay: some View {
        VStack {
            Spacer()
            HStack {
                Text(participant.displayName)
                    .foregroundColor(.white)
                    .padding(.horizontal, 8)
                    .padding(.vertical, 4)
                    .background(Color.black.opacity(0.5))
                    .cornerRadius(4)
                Image(systemName: liveStreamViewController.participantMicStatus[participant.id] ?? false ? "mic.fill" : "mic.slash.fill")
                    .foregroundColor(liveStreamViewController.participantMicStatus[participant.id] ?? false ? .green : .red)
                    .padding(4)
                    .background(Color.black.opacity(0.5))
                    .clipShape(Circle())
                Spacer()
            }
            .padding(8)
        }
    }

    var body: some View {
        // Only render if participant is in SEND_AND_RECV mode
        if participant.mode == .SEND_AND_RECV {
            ZStack {
                participantView(participant: participant, liveStreamViewController: liveStreamViewController)
                nameAndMicOverlay
            }
            .background(Color.black.opacity(0.9))
            .cornerRadius(10)
            .shadow(color: Color.black.opacity(0.7), radius: 10, x: 0, y: 5)
        }
    }

    private func participantView(participant: Participant, liveStreamViewController: LiveStreamViewController) -> some View {
        ZStack {
            ParticipantView(participant: participant, liveStreamViewController: liveStreamViewController)
        }
    }
}
2. Rendering Video Streams of Hosts
Once you've filtered for participants in SEND_AND_RECV mode (i.e., hosts), you can use the Participant class to access their real-time data, including their webcamStream, webcamOn status, and whether they are the local participant.
To render the video stream of a participant, we need to create a view that handles the WebRTC video track. First, we define a VideoView class:
class VideoView: UIView {
    let videoView: RTCMTLVideoView = {
        let view = RTCMTLVideoView()
        view.videoContentMode = .scaleAspectFill
        view.backgroundColor = UIColor.black
        view.clipsToBounds = true
        return view
    }()

    init(track: RTCVideoTrack?, frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .clear
        // Set videoView frame to match parent view
        videoView.frame = bounds
        DispatchQueue.main.async {
            self.addSubview(self.videoView)
            self.bringSubviewToFront(self.videoView)
            track?.add(self.videoView)
        }
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Update videoView frame when parent view size changes
        videoView.frame = bounds
    }
}
Then create a SwiftUI wrapper to use this UIKit view:
struct VideoStreamView: UIViewRepresentable {
    let track: RTCVideoTrack

    func makeUIView(context: Context) -> VideoView {
        return VideoView(track: track, frame: .zero)
    }

    func updateUIView(_ uiView: VideoView, context: Context) {
        // Re-attach the renderer so the view stays bound to the current track
        track.add(uiView.videoView)
    }
}
Now create a participant view to display either the video or a placeholder:
struct ParticipantView: View {
    let participant: Participant
    @ObservedObject var liveStreamViewController: LiveStreamViewController

    var body: some View {
        ZStack {
            if participant.mode == .SEND_AND_RECV,
               let track = liveStreamViewController.participantVideoTracks[participant.id] {
                VideoStreamView(track: track)
            } else {
                Color.black.opacity(1.0)
                VStack {
                    if participant.mode == .RECV_ONLY {
                        Text("Viewer")
                            .foregroundColor(.white)
                        Text(participant.displayName)
                            .foregroundColor(.gray)
                            .font(.caption)
                    } else {
                        Text("No media")
                            .foregroundColor(.white)
                    }
                }
            }
        }
    }
}
3. Handling Audience Views
For audience members, we provide a different set of controls and manage their view state. The main distinction is determining whether the current participant is in audience mode:
private var isAudienceMode: Bool {
    // Derive audience mode from the current participant's mode
    if let localParticipant = liveStreamViewController.participants.first(where: { $0.isLocal }) {
        return localParticipant.mode == .RECV_ONLY
    }
    return currentMode == .RECV_ONLY
}
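As a hedged sketch of how this flag might drive the UI, the bar below hides host controls for audience members and surfaces the viewer count. Here, toggleMic() and toggleWebcam() are assumed helpers on your controller rather than SDK methods, and audienceCount is the property sketched earlier in this guide:

// Illustrative control bar: audience members see the viewer count,
// while mic/webcam toggles are shown only to hosts.
var controlBar: some View {
    HStack(spacing: 16) {
        if isAudienceMode {
            Label("\(liveStreamViewController.audienceCount) watching", systemImage: "eye.fill")
                .foregroundColor(.white)
        } else {
            // Hypothetical helpers wrapping the SDK's mic/webcam controls
            Button("Toggle Mic") { liveStreamViewController.toggleMic() }
            Button("Toggle Webcam") { liveStreamViewController.toggleWebcam() }
        }
    }
    .padding()
    .background(Color.black.opacity(0.6))
}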
For a complete implementation of interactive live streaming features, including participant management and UI controls, refer to the videosdk-ils-iOS-sdk-example directory in the official VideoSDK iOS example repository.
API Reference
The API references for all the methods and events utilized in this guide are provided below.