Version: Next

Room Class Methods - Flutter

join()

  • After creating a VideoSDK Room instance, you can join the room by calling the join() method.

Events associated with join():

  • The local participant will receive a roomJoined event once the room is successfully joined.
  • Remote participants will receive a participantJoined event carrying the newly joined Participant object.

Participant having ask_join permission inside token

  • If the token contains the ask_join permission, the participant will not join the room directly after calling join(). Instead, an entryRequested event will be emitted to the participants holding the allow_join permission.

  • Once a remote participant responds, an entryResponded event will be emitted to the requesting participant. This event will contain the decision made by the remote participant.

Participant having allow_join permission inside token

  • If the token contains the allow_join permission, the participant will join the room directly after calling join().

Returns

  • void
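
Putting join() together with its events, a minimal sketch (assuming a Room instance named room, as in the other examples on this page):

```dart
// Register the listener before joining so the event is not missed
room.on(Events.roomJoined, () {
  print("Room joined");
});

room.join();
```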

leave()

  • It is used to leave the currently running room session.

Events associated with leave():

  • Local participant will receive a roomLeft event.
  • All remote participants will receive a participantLeft event with the participantId of the participant who left.

Returns

  • void
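
A minimal sketch of leaving a session (assuming the roomLeft event above is exposed on the Events enum as Events.roomLeft):

```dart
// Local participant receives roomLeft once the session is left
room.on(Events.roomLeft, () {
  print("Left the room");
});

room.leave();
```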

end()

  • It is used to end the currently running room session.
  • By calling end(), all joined participants, including the localParticipant, will leave the room.

Events associated with end():

  • All participants, including the localParticipant, will receive a roomLeft event.

Returns

  • void
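
Calling end() needs no arguments; a one-line sketch:

```dart
room.end(); // ends the session for all joined participants
```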

enableCam()

  • It is used to enable the camera device.

Returns

  • void

disableCam()

  • It is used to disable the camera device.

Returns

  • void
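
A sketch of toggling the camera with the two methods above (the isCamOn flag is hypothetical app state, not part of the SDK):

```dart
bool isCamOn = true; // hypothetical flag maintained by your app

void toggleCam() {
  isCamOn ? room.disableCam() : room.enableCam();
  isCamOn = !isCamOn;
}
```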

unmuteMic()

  • It is used to enable the microphone device.
  • A streamEnabled event, carrying the stream object, will be emitted on that participant object.

Returns

  • void

muteMic()

  • It is used to disable the microphone device.

Returns

  • void
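
A minimal sketch of the microphone controls:

```dart
room.unmuteMic(); // start sending microphone audio
room.muteMic();   // stop sending microphone audio
```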

getScreenShareSources()

  • It returns all available screens and open windows.
  • This method is only supported in desktop apps.

Returns

  • Future<List<DesktopCapturerSource>>

Example

meeting.getScreenShareSources().then((value) => print("Sources : $value"));

enableScreenShare()

Parameters

  • source:
    • type : DesktopCapturerSource
    • required : false
    • It specifies the selected screen-share source for desktop apps.

Returns

  • void

disableScreenShare()

Returns

  • void
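
Putting the screen-share methods together for a desktop app (a sketch; passing the source as a named parameter is assumed from the Parameters list above, and the first source is chosen only for brevity):

```dart
Future<void> shareScreen() async {
  final sources = await room.getScreenShareSources();
  if (sources.isNotEmpty) {
    room.enableScreenShare(source: sources.first);
  }
}

// ...later, to stop sharing:
// room.disableScreenShare();
```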

changeMode()

  • It is used to change the mode of the participant from CONFERENCE to VIEWER and vice versa.
  • A participantModeChange event will be emitted with the participantId and mode of the participant.

Parameters

  • requestedMode: Mode.CONFERENCE | Mode.VIEWER

Returns

  • void

Example

room.changeMode(Mode.CONFERENCE);

startRecording()

Parameters

  • webhookUrl: String

  • awsDirPath: String

  • config:

    • layout:
      • type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
      • priority: "SPEAKER" | "PIN"
      • gridSize: Number `max 4`
    • theme: "DARK" | "LIGHT" | "DEFAULT"
    • mode: "video-and-audio" | "audio"
    • quality: "low" | "med" | "high"
    • orientation: "landscape" | "portrait"
  • transcription:

    • enabled: Boolean
    • summary:
      • enabled: Boolean
      • prompt: String

Returns

  • void

Example

const webhookUrl = "https://webhook.your-api-server.com";

Map<String, dynamic> config = {
  'layout': {
    'type': 'GRID',
    'priority': 'SPEAKER',
    'gridSize': 4,
  },
  'theme': "LIGHT",
  "mode": "video-and-audio"
};

Map<String, dynamic> transcription = {
  "enabled": true,
  "summary": {
    "enabled": true,
    "prompt":
        "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
  }
};

room.startRecording(webhookUrl: webhookUrl, config: config, transcription: transcription);

stopRecording()

Returns

  • void

Example

room.stopRecording();

startLivestream()

  • It is used to start livestreaming the room.
  • You can livestream the room to other platforms such as YouTube, Facebook, etc. that support RTMP streaming.
  • All participants, including the localParticipant, will receive a liveStreamStarted event.

Parameters

  • outputs: List<Map<String, String>> [{ url: String, streamKey: String }]

  • config:

    • layout:
      • type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
      • priority: "SPEAKER" | "PIN"
      • gridSize: Number `max 4`
    • theme: "DARK" | "LIGHT" | "DEFAULT"

Returns

  • void

Example

var outputs = [
  {
    'url': "rtmp://a.rtmp.youtube.com/live2",
    'streamKey': "<STREAM_KEY>",
  },
  {
    'url': "rtmps://",
    'streamKey': "<STREAM_KEY>",
  },
];
var liveStreamConfig = {
  'layout': {
    'type': 'GRID',
    'priority': 'SPEAKER',
    'gridSize': 4,
  },
  'theme': "LIGHT",
};
room.startLivestream(outputs, config: liveStreamConfig);

stopLivestream()

Returns

  • void

Example

room.stopLivestream();

startHls()

  • It is used to start HLS.
  • All participants, including the localParticipant, will receive an hlsStarted event with the playbackHlsUrl and livestreamUrl of the HLS feed.
    • playbackHlsUrl - Live HLS with playback support
    • livestreamUrl - Live HLS without playback support
note

downstreamUrl is now deprecated. Use playbackHlsUrl or livestreamUrl in place of downstreamUrl.

Parameters

  • config:

    • layout:
      • type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
      • priority: "SPEAKER" | "PIN"
      • gridSize: Number `max 25`
    • theme: "DARK" | "LIGHT" | "DEFAULT"
    • mode: "video-and-audio" | "audio"
    • quality: "low" | "med" | "high"
    • orientation: "landscape" | "portrait"
  • transcription:

    • enabled: Boolean
    • summary:
      • enabled: Boolean
      • prompt: String

Returns

  • void

Example

Map<String, dynamic> config = {
  'layout': {
    'type': 'GRID',
    'priority': 'SPEAKER',
    'gridSize': 4,
  },
  'theme': "LIGHT",
  "mode": "video-and-audio"
};

Map<String, dynamic> transcription = {
  "enabled": true,
  "summary": {
    "enabled": true,
    "prompt":
        "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
  }
};

room.startHls(config: config, transcription: transcription);

stopHls()

Returns

  • void

Example

room.stopHls();

startTranscription()

Parameters

config

  • type : Map<String, dynamic>

  • required : false

  • This specifies the configuration for realtime transcription. You can specify the following properties:

    • webhookUrl: Webhooks will be triggered when the state of the realtime transcription changes. Read more about webhooks here
    • summary:
      • enabled: Indicates whether summary generation for the realtime transcription is enabled. The summary will be available after the realtime transcription is stopped. Default: false
      • prompt: Provides guidelines or instructions for generating a custom summary based on the realtime transcription content.

Returns

  • void

Example

Map<String, dynamic> config = {
  "webhookUrl": "https://webhook.your-api-server.com",
  "summary": {
    "enabled": true,
    "prompt":
        "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
  }
};
room.startTranscription(config: config);

stopTranscription()

Returns

  • void

Example

room.stopTranscription();

getCameras()

  • It will return all connected camera devices.

Returns

  • List<MediaDeviceInfo>

Example

List<MediaDeviceInfo> cams = room.getCameras();
print(cams);

changeCam()

  • It is used to change the camera device.
  • If multiple camera devices are connected, changeCam() lets you switch between them using the camera device id.
  • You can get the list of connected video devices using VideoSDK.getVideoDevices()

Parameters

  • deviceId: String

Returns

  • void

Example

room.changeCam(deviceId);

getAudioOutputDevices()

  • It will return all the available audio output devices.
note

For iOS devices:

  • EARPIECE is not supported whenever WIRED_HEADSET or BLUETOOTH device is connected.
  • WIRED_HEADSET and BLUETOOTH devices are not supported simultaneously. Priority is given to the most recently connected device.

Returns

  • List<MediaDeviceInfo>

Example

List<MediaDeviceInfo> audioOutputDevices = room.getAudioOutputDevices();
print(audioOutputDevices);

switchAudioDevice()

  • It is used to change the audio output device.
  • You can get the list of connected audio output devices using getAudioOutputDevices

Parameters

  • device: MediaDeviceInfo

Returns

  • void

Example

room.switchAudioDevice(device);

getMics()

  • It will return all the available audio input devices.

Returns

  • List<MediaDeviceInfo>

Example

List<MediaDeviceInfo> audioInputDevices = room.getMics();
print(audioInputDevices);

changeMic()

  • It is used to change the audio input device.
  • You can get the list of connected audio input devices using getMics

Parameters

  • device: MediaDeviceInfo

Returns

  • void

Example

room.changeMic(device);

on()

  • It is used to listen for Room-related events.

Parameters

  • event

    • type: Events
    • This specifies the event to be listened for.
  • eventHandler

    • type: Function
    • This will be invoked whenever the specified event occurs.

Returns

  • void

Example

room.on(Events.roomJoined, () {
// do something
});
