Meeting Class Methods - iOS
join()
- It is used to join a meeting.
- After meeting initialization by initMeeting(), it returns a new instance of Meeting. However, by default it will not automatically join the meeting. Hence, to join the meeting you should call join().
Events associated with join():
- Local Participant will receive an onMeetingJoined event when successfully joined.
- Remote Participants will receive an addParticipant event with the newly joined Participant object from the event callback.
Returns
void
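Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.join()
```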
leave()
- It is used to leave the current meeting.
Events associated with leave()
:
- Local participant will receive an onMeetingLeft event.
- All remote participants will receive an onParticipantLeft event with the participantId.
Returns
void
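Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.leave()
```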
end()
- It is used to end the current running session.
- By calling end(), all joined participants, including the localParticipant of that session, will leave the meeting.
Events associated with end():
- All participants, including the localParticipant, will receive a closeRoom event.
Returns
void
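Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.end()
```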
enableWebcam()
- It is used to enable the self camera.
- An onStreamEnabled event will be emitted with the stream object from the event callback, inside that participant object.
Returns
void
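Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.enableWebcam()
```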
disableWebcam()
- It is used to disable the self camera.
- An onStreamDisabled event will be emitted with the stream object from the event callback, inside that participant object.
Returns
void
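Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.disableWebcam()
```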
switchWebcam()
- It is used to switch between front and back camera.
Returns
void
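Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.switchWebcam()
```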
unmuteMic()
- It is used to enable the self microphone.
- An onStreamEnabled event will be emitted with the stream object from the event callback, inside that participant object.
Returns
void
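Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.unmuteMic()
```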
muteMic()
- It is used to disable the self microphone.
- An onStreamDisabled event will be emitted with the stream object from the event callback, inside that participant object.
Returns
void
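Example

A minimal usage sketch, assuming `meeting` is the Meeting instance returned by initMeeting():

```swift
meeting.muteMic()
```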
startRecording()
- It is used to start meeting recording.
- All participants and the localParticipant will receive an onRecordingStarted event.
- webhookUrl will be triggered when the recording is completed and stored on the server. Read more about webhooks here.
Parameters
- webhookUrl: String
- awsDirPath: String?
- config:
  - layout:
    - type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
    - priority: "SPEAKER" | "PIN"
    - gridSize: Number (max 4)
  - theme: "DARK" | "LIGHT" | "DEFAULT"
  - mode: "video-and-audio" | "audio"
  - quality: "low" | "med" | "high"
  - orientation: "landscape" | "portrait"
- transcription: PostTranscriptionConfig?
  - PostTranscriptionConfig.enabled: Bool
  - PostTranscriptionConfig.summary: SummaryConfig?
    - SummaryConfig.enabled: Bool
    - SummaryConfig.prompt: String
Returns
void
Example
let webhookUrl = "https://webhook.your-api-server.com"
let awsDirPath = "/meeting-recordings/"

let config: RecordingConfig = RecordingConfig(
    layout: ConfigLayout(
        type: .GRID,
        priority: .PIN,
        gridSize: 4
    ),
    theme: .DARK,
    mode: .video_and_audio,
    quality: .med,
    orientation: .landscape
)

let transcription: PostTranscriptionConfig = PostTranscriptionConfig(
    enabled: true,
    summary: SummaryConfig(
        enabled: true,
        prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary"
    )
)

startRecording(webhookUrl: webhookUrl, awsDirPath: awsDirPath, config: config, transcription: transcription)
stopRecording()
- It is used to stop meeting recording.
- All participants and the localParticipant will receive an onRecordingStopped event.
Returns
void
Example
stopRecording();
startLivestream()
- It is used to start meeting livestreaming.
- You will be able to live stream meetings to other platforms such as YouTube, Facebook, etc. that support RTMP streaming.
- All participants and the localParticipant will receive an onLivestreamStarted event.
Parameters
- outputs:
[LivestreamOutput]
Returns
void
Example
// Assuming LivestreamOutput is constructed from an RTMP url and stream key.
let outputs: [LivestreamOutput] = [
    LivestreamOutput(url: "<RTMP_URL>", streamKey: "<RTMP_STREAM_KEY>")
]
startLivestream(outputs: outputs)
stopLivestream()
- It is used to stop meeting livestreaming.
- All participants and the localParticipant will receive an onLivestreamStopped event.
Returns
void
Example
stopLivestream();
startHLS()
- startHLS() will start HLS streaming of your meeting.
- You will be able to start HLS and watch the live stream of the meeting over HLS.
- mode is used to start HLS streaming with both video and audio, or with audio only. By default it is video-and-audio.
- quality is only applicable to the video-and-audio mode.
Parameters
- config:
  - layout:
    - type: .GRID | .SPOTLIGHT | .SIDEBAR
    - priority: .SPEAKER | .PIN
    - gridSize: Number (max 25)
  - theme: .DARK | .LIGHT | .DEFAULT
  - mode: .video_and_audio | .audio
  - quality: .low | .med | .high
- transcription: PostTranscriptionConfig?
  - PostTranscriptionConfig.enabled: Bool
  - PostTranscriptionConfig.summary: SummaryConfig?
    - SummaryConfig.enabled: Bool
    - SummaryConfig.prompt: String
Returns
void
Example
let config: HLSConfig = HLSConfig(
    layout: HLSLayout(
        type: .GRID,
        priority: .SPEAKER,
        gridSize: 4
    ),
    theme: .DARK,
    mode: .video_and_audio,
    quality: .high,
    orientation: .portrait
)

let transcription: PostTranscriptionConfig = PostTranscriptionConfig(
    enabled: true,
    summary: SummaryConfig(
        enabled: true,
        prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary"
    )
)

startHLS(config: config, transcription: transcription)
stopHLS()
stopHLS() is used to stop the HLS streaming.
Returns
void
Example
stopHLS();
startTranscription()
- It is used to start realtime transcription.
- All participants and the localParticipant will receive a TRANSCRIPTION_STARTING event state and an onTranscriptionText event.
- TranscriptionConfig.webhookUrl will be triggered when the state of the realtime transcription changes. Read more about webhooks here.
- TranscriptionConfig.summary: SummaryConfig sets the summary config for the transcription. SummaryConfig.enabled indicates whether to generate a summary when the meeting ends. SummaryConfig.prompt is the specific prompt around which the summary is to be written.
Parameters
- config: TranscriptionConfig?
- TranscriptionConfig.webhookUrl: String?
- TranscriptionConfig.summary: SummaryConfig?
- SummaryConfig.enabled: Bool
- SummaryConfig.prompt: String
Returns
void
Example
let config = TranscriptionConfig(
    webHookUrl: "YOUR_WEBHOOK_URL",
    summary: SummaryConfig(
        enabled: true,
        prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary"
    )
)
startTranscription(config: config)
stopTranscription()
- It is used to stop realtime transcription.
- All participants and the localParticipant will receive a TRANSCRIPTION_STOPPING event state.
Returns
void
Example
stopTranscription();
changeMode()
changeMode() is used to change the mode of a participant.
Parameters
- mode: Mode
  - All available participant modes are SEND_AND_RECV, SIGNALLING_ONLY, and RECV_ONLY.
Important Changes in iOS SDK Version v2.2.0
- The following modes have been deprecated:
  - CONFERENCE has been replaced by SEND_AND_RECV
  - VIEWER has been replaced by SIGNALLING_ONLY
Please update your implementation to use the new modes.
⚠️ Compatibility Notice:
To ensure a seamless meeting experience, all participants must use the same SDK version. Do not mix version v2.2.0+ with older versions, as it may cause significant conflicts.
Events associated with changeMode()
:
- Every participant will receive a callback of state change of mode
onParticipantModeChanged
Returns
void
Example
changeMode(.SIGNALLING_ONLY) // or .SEND_AND_RECV / .RECV_ONLY
enableScreenShare()
enableScreenShare() is used to share your local screen with the remote participants in the meeting.
Returns
void
Example
Task {
await meeting.enableScreenShare()
}
disableScreenShare()
disableScreenShare() is used to stop sharing your local screen.
Returns
void
Example
Task {
await meeting.disableScreenShare()
}
getMics()
getMics method will return an array of tuples, where each tuple consists of two elements: deviceName and deviceType.
Returns
[(deviceName: String, deviceType: String)]
Example
meeting.getMics();
// sample returned array
// [("Speaker","Speaker"), ("iPhone Microphone", "Receiver")]
changeMic()
changeMic method will switch the active audio device to the specified device.
Parameters
- selectedDevice: String
  - Here selectedDevice should be the exact string of the deviceName that you get after calling the getMics method.
Returns
void
Example
meeting.changeMic(selectedDevice: "Speaker")
uploadBase64File()
uploadBase64File method uploads a file to VideoSDK's temporary storage and returns the corresponding URL string to retrieve the file from the storage system.
Parameters
- base64Data: String
- The Base64 encoded string of the file to be uploaded.
- token: String
- The authentication token required to access VideoSDK's storage system.
- fileName: String
- The name under which the file will be stored.
- completion: @escaping (String?) -> Void
- A closure that will be called with the URL string of the uploaded file or
nil
if the upload fails.
- A closure that will be called with the URL string of the uploaded file or
Returns
void
Example
let base64Data: String = "<Your-base64-File>"
let token: String = "<Your-Token>"
let fileName: String = "name.jpeg" // Provide file name with extension
meeting.uploadBase64File(base64Data: base64Data, token: token, fileName: fileName) { url in
    print("url: ", url ?? "upload failed")
}
fetchBase64File()
fetchBase64File method retrieves a file from VideoSDK's temporary storage and returns it as a Base64 string.
Parameters
- url: String
- The URL string of the file to be retrieved.
- token: String
- The authentication token required to access VideoSDK's storage system.
- completion: @escaping (String?) -> Void
- A closure that will be called with the Base64 string of the retrieved file or
nil
if the retrieval fails.
- A closure that will be called with the Base64 string of the retrieved file or
Returns
void
Example
let url: String = "<Your FileUrl>" // Provide fileUrl which is returned by uploadBase64File()
let token: String = "<Your-Token>"
meeting.fetchBase64File(url: url, token: token) { base64Data in
    print("base64: ", base64Data ?? "fetch failed")
}
setVideoProcessor()
setVideoProcessor method sets the processor that will be used in processing the local video frames.
Parameters
- processor: VideoSDKVideoProcessor?
  - An instance of a processor class that conforms to the VideoSDKVideoProcessor protocol.
Returns
void
Example
let backgroundSource = URL(string: "https://cdn.videosdk.live/virtual-background/cloud.jpeg")!
let processor = VideoSDKBackgroundProcessor(backgroundSource: backgroundSource)
meeting.setVideoProcessor(processor: processor)
startWhiteboard()
- startWhiteboard method starts the whiteboard session.
- All participants and the localParticipant will receive an onWhiteboardStarted event along with the URL of the whiteboard.
Returns
void
Example
meeting.startWhiteboard();
stopWhiteboard()
- stopWhiteboard method stops the whiteboard session.
- All participants and the localParticipant will receive an onWhiteboardStopped event.
Returns
void
Example
meeting.stopWhiteboard();
pauseAllStreams()
- The pauseAllStreams method pauses active media streams within the meeting.
Parameters
- kind: String?
  - Specifies the type of media stream to be paused. If this parameter is provided, only that particular type of media stream will be paused. Otherwise, by default, all media streams (audio, video, and screen share) will be paused.
Returns
void
Example
meeting.pauseAllStreams(kind: "video");
resumeAllStreams()
- The resumeAllStreams method resumes media streams that have been paused.
Parameters
- kind: String?
  - Specifies the type of media stream to be resumed. If this parameter is provided, only that particular type of media stream will be resumed. Otherwise, by default, all media streams (audio, video, and screen share) will be resumed.
Returns
void
Example
meeting.resumeAllStreams(kind: "video");