Version: 2.0.x

Meeting Class Methods - iOS

join()

  • It is used to join the meeting.
  • initMeeting() returns a new Meeting instance, but by default it does not automatically join the meeting. To join, call join() on that instance.

Events associated with join():

  • The local participant will receive an onMeetingJoined event when they have successfully joined.
  • Remote participants will receive an onParticipantJoined event with the newly joined Participant object in the event callback.

Returns

  • void
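A minimal join flow, combining initialization and join(). This is a sketch: the initMeeting() parameter names shown here are assumptions; verify the exact signature against your SDK version.

// Initialize the meeting; this does NOT join it yet.
// Parameter names below are illustrative, not authoritative.
let meeting = VideoSDK.initMeeting(
    meetingId: "<MEETING_ID>",
    participantName: "Alice",
    micEnabled: true,
    webcamEnabled: true
)

// Explicitly join; onMeetingJoined fires on success.
meeting.join()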

leave()

  • It is used to leave the current meeting.

Events associated with leave():

  • The local participant will receive an onMeetingLeft event.
  • Remote participants will receive an onParticipantLeft event with the Participant object of the participant who left.

Returns

  • void

end()

  • It is used to end the current running session.
  • When end() is called, all joined participants, including the localParticipant, leave the meeting.

Returns

  • void
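The difference between the two teardown calls: leave() removes only the local participant, while end() terminates the session for everyone. A sketch, assuming `meeting` is a joined Meeting instance:

// Leave the meeting yourself; remote participants stay connected.
meeting.leave()

// Or end the session for everyone, including remote participants.
meeting.end()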

enableWebcam()

  • It is used to enable the local participant's webcam in the meeting.

Returns

  • void

disableWebcam()

  • It is used to disable the local participant's webcam in the meeting.

Returns

  • void

switchWebcam()

  • It is used to switch between front and back camera.

Returns

  • void

unmuteMic()

  • It is used to unmute the local participant's microphone.

Returns

  • void

muteMic()

  • It is used to mute the local participant's microphone.

Returns

  • void
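The webcam and mic controls above take no arguments, so they can be wired directly to UI actions. A sketch, assuming `meeting` is a joined Meeting instance:

// Toggle local media devices on the joined meeting.
meeting.disableWebcam()   // stop sending video
meeting.enableWebcam()    // resume sending video
meeting.switchWebcam()    // flip between front and back camera

meeting.muteMic()         // stop sending audio
meeting.unmuteMic()       // resume sending audio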

startRecording()

  • It is used to start recording the meeting.

Parameters

  • webhookUrl: String

  • awsDirPath: String?

  • config:

    • layout:
      • type: .GRID | .SPOTLIGHT | .SIDEBAR
      • priority: .SPEAKER | .PIN
      • gridSize: Number (max 4)
    • theme: .DARK | .LIGHT | .DEFAULT
    • mode: .video_and_audio | .audio
    • quality: .low | .med | .high
    • orientation: .landscape | .portrait
  • transcription: PostTranscriptionConfig?

    • PostTranscriptionConfig.enabled: Bool
    • PostTranscriptionConfig.summary: SummaryConfig?
      • SummaryConfig.enabled: Bool
      • SummaryConfig.prompt: String

Returns

  • void

Example

let webhookUrl = "https://webhook.your-api-server.com"
let awsDirPath = "/meeting-recordings/"

let config: RecordingConfig = RecordingConfig(
    layout: ConfigLayout(
        type: .GRID,
        priority: .PIN,
        gridSize: 4
    ),
    theme: .DARK,
    mode: .video_and_audio,
    quality: .med,
    orientation: .landscape
)

let transcription: PostTranscriptionConfig = PostTranscriptionConfig(
    enabled: true,
    summary: SummaryConfig(
        enabled: true,
        prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary"
    )
)

startRecording(webhookUrl: webhookUrl, awsDirPath: awsDirPath, config: config, transcription: transcription)

stopRecording()

  • It is used to stop the recording of the meeting.

Returns

  • void

Example

stopRecording()

startLivestream()

  • It is used to start livestreaming the meeting.
  • You can livestream the meeting to platforms that support RTMP streaming, such as YouTube, Facebook, etc.
  • All participants, including the localParticipant, will receive an onLivestreamStarted event.

Parameters

  • outputs: [LivestreamOutput]

Returns

  • void

Example


// Define one LivestreamOutput per RTMP target (URL + stream key)
let outputs: [LivestreamOutput] = [
    LivestreamOutput(url: "<RTMP_URL>", streamKey: "<STREAM_KEY>")
]

startLivestream(outputs: outputs)

stopLivestream()

  • It is used to stop the meeting livestream.

Returns

  • void

Example

stopLivestream()

startHLS()

  • startHLS() will start HLS streaming of your meeting.

  • Once started, you can watch the live stream of the meeting over HLS.

  • mode controls whether the HLS stream contains both video and audio or audio only; it defaults to video-and-audio.

  • quality is only applicable when mode is video-and-audio.

Parameters

  • config:

    • layout:
      • type: .GRID | .SPOTLIGHT | .SIDEBAR
      • priority: .SPEAKER | .PIN
      • gridSize: Number (max 25)
    • theme: .DARK | .LIGHT | .DEFAULT
    • mode: .video_and_audio | .audio
    • quality: .low | .med | .high
  • transcription: PostTranscriptionConfig?

    • PostTranscriptionConfig.enabled: Bool
    • PostTranscriptionConfig.summary: SummaryConfig?
      • SummaryConfig.enabled: Bool
      • SummaryConfig.prompt: String

Returns

  • void

Example

let config: HLSConfig = HLSConfig(
    layout: HLSLayout(
        type: .GRID,
        priority: .SPEAKER,
        gridSize: 4
    ),
    theme: .DARK,
    mode: .video_and_audio,
    quality: .high,
    orientation: .portrait
)

let transcription: PostTranscriptionConfig = PostTranscriptionConfig(
    enabled: true,
    summary: SummaryConfig(
        enabled: true,
        prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary"
    )
)

startHLS(config: config, transcription: transcription)

stopHLS()

  • stopHLS() is used to stop the HLS streaming.

Returns

  • void

Example

stopHLS()

startTranscription()

  • It is used to start realtime transcription of the meeting.

  • All participants, including the localParticipant, will receive an onTranscriptionStateChanged event (with the TRANSCRIPTION_STARTING state) and onTranscriptionText events.

  • TranscriptionConfig.webhookUrl: a webhook that is triggered whenever the state of the realtime transcription changes. Read more about webhooks here.

  • TranscriptionConfig.summary: SummaryConfig sets the summary configuration for the transcription.

    • SummaryConfig.enabled indicates whether to generate a summary when the meeting ends.
    • SummaryConfig.prompt: the specific prompt around which the summary is to be written.

Parameters

  • config: TranscriptionConfig?
    • TranscriptionConfig.webhookUrl: String?
    • TranscriptionConfig.summary: SummaryConfig?
      • SummaryConfig.enabled: Bool
      • SummaryConfig.prompt: String

Returns

  • void

Example

let config = TranscriptionConfig(
    webhookUrl: "YOUR_WEBHOOK_URL",
    summary: SummaryConfig(
        enabled: true,
        prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary"
    )
)

startTranscription(config)

stopTranscription()

  • It is used to stop realtime transcription.

Returns

  • void

Example

stopTranscription()

changeMode()

  • changeMode() is used to change the local participant's mode.

Parameters

  • mode: Mode
    • All available participant modes are CONFERENCE and VIEWER.

Returns

  • void

Example

changeMode(.VIEWER) // or changeMode(.CONFERENCE)

enableScreenShare()

  • enableScreenShare() is used to share your local screen to the remote participants in the meeting.

Returns

  • void

Example

Task {
    await meeting.enableScreenShare()
}

disableScreenShare()

  • disableScreenShare() is used to stop sharing your local screen.

Returns

  • void

Example

Task {
    await meeting.disableScreenShare()
}

getMics()

  • The getMics method returns an array of tuples, where each tuple consists of two elements: deviceName and deviceType.

Returns

  • [(deviceName: String, deviceType: String)]

Example

meeting.getMics()
// sample returned array
// [("Speaker", "Speaker"), ("iPhone Microphone", "Receiver")]

changeMic()

  • The changeMic method switches the active audio device to the selected device.

Parameters

  • selectedDevice: String
    • Here selectedDevice should be the exact string of the deviceName that you get after calling the getMics method.

Returns

  • void

Example

meeting.changeMic(selectedDevice: "Speaker")

uploadBase64File()

  • uploadBase64File method uploads a file to VideoSDK's temporary storage and returns the corresponding URL string to retrieve the file from the storage system.

Parameters

  • base64Data: String
    • The Base64 encoded string of the file to be uploaded.
  • token: String
    • The authentication token required to access VideoSDK's storage system.
  • fileName: String
    • The name under which the file will be stored.
  • completion: @escaping (String?) -> Void
    • A closure that will be called with the URL string of the uploaded file or nil if the upload fails.

Returns

  • void

Example

let base64Data: String = "<Your-base64-File>"
let token: String = "<Your-Token>"
let fileName: String = "name.jpeg" // Provide the file name with its extension
meeting.uploadBase64File(base64Data: base64Data, token: token, fileName: fileName) { url in
    print("url: ", url)
}

fetchBase64File()

  • fetchBase64File method retrieves a file from VideoSDK's temporary storage and returns it in the form of a Base64 String.

Parameters

  • url: String
    • The URL string of the file to be retrieved.
  • token: String
    • The authentication token required to access VideoSDK's storage system.
  • completion: @escaping (String?) -> Void
    • A closure that will be called with the Base64 string of the retrieved file or nil if the retrieval fails.

Returns

  • void

Example

let url: String = "<Your FileUrl>" // The fileUrl returned by uploadBase64File()
let token: String = "<Your-Token>"
meeting.fetchBase64File(url: url, token: token) { base64Data in
    print("base64: ", base64Data)
}

setVideoProcessor()

  • setVideoProcessor method sets the processor that will be used in processing the local video frames.

Parameters

  • processor: VideoSDKVideoProcessor?
    • An instance of a processor class that conforms to the VideoSDKVideoProcessor protocol.

Returns

  • void

Example

  let backgroundSource = URL(string: "https://cdn.videosdk.live/virtual-background/cloud.jpeg")!
let processor = VideoSDKBackgroundProcessor(backgroundSource: backgroundSource)
meeting.setVideoProcessor(processor: processor)
