Version: 3.x.x

Room Class Methods - Flutter

join()

  • After creating an instance of the VideoSDK Room, you can join the VideoSDK Room by calling the join() method.

Events associated with join():

  • The local participant will receive a roomJoined event when they have successfully joined.
  • Remote participants will receive a participantJoined event with the newly joined Participant object from the event handler.

Participant having ask_join permission inside token

  • If the token contains the ask_join permission, the participant will not join the room directly after calling join(). Instead, an entryRequested event will be emitted to the participants who hold the allow_join permission.

  • After a remote participant responds to the request, an entryResponded event will be emitted to the requesting participant. This event contains the decision made by the remote participant.

Participant having allow_join permission inside token

  • If the token contains the allow_join permission, the participant will join the room directly after calling join().

Returns

  • void
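
Example

A minimal usage sketch, assuming room is the Room instance created earlier:

room.join();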

leave()

  • It is used to leave the currently running room session.

Events associated with leave():

  • Local participant will receive a roomLeft event.
  • All remote participants will receive a participantLeft event with participantId string from the event handler.

Returns

  • void
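
Example

A minimal usage sketch, assuming room is the Room instance the local participant has joined:

room.leave();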

end()

  • It is used to end the currently running room session.
  • Calling end() makes all joined participants, including the localParticipant, leave the room.

Events associated with end():

Returns

  • void
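
Example

A minimal usage sketch, assuming room is the Room instance the local participant has joined:

room.end();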

enableCam()

  • It is used to enable the camera device.

Returns

  • void
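
Example

A minimal usage sketch, assuming room is the joined Room instance:

room.enableCam();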

disableCam()

  • It is used to disable the camera device.

Returns

  • void
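
Example

A minimal usage sketch, assuming room is the joined Room instance:

room.disableCam();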

unmuteMic()

  • It is used to enable the microphone device.
  • A streamEnabled event will be emitted with the stream object from the event handler of the corresponding participant object.

Returns

  • void
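
Example

A minimal usage sketch, assuming room is the joined Room instance:

room.unmuteMic();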

muteMic()

  • It is used to disable the microphone device.

Returns

  • void
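
Example

A minimal usage sketch, assuming room is the joined Room instance:

room.muteMic();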

getScreenShareSources()

  • It returns all available screens and open windows.
  • This method is supported only in desktop apps.

Returns

  • Future<List<DesktopCapturerSource>>

Example

meeting.getScreenShareSources().then((value) => print("Sources : $value"));

enableScreenShare()

Parameters

  • source:

    • type : DesktopCapturerSource
    • required : false
    • It specifies the selected screen-share source for desktop apps.
  • enableAudio:

    • type : Bool
    • required : false
    • Enables screen sharing audio on Android devices.

Returns

  • void
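
Example

A minimal usage sketch, assuming room is the joined Room instance; on desktop apps, a DesktopCapturerSource obtained from getScreenShareSources() can additionally be supplied through the source parameter (how that parameter is passed is not shown here):

// Start screen sharing with the default source.
room.enableScreenShare();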

disableScreenShare()

Returns

  • void
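
Example

A minimal usage sketch, assuming room is the joined Room instance:

room.disableScreenShare();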

changeMode()

  • It is used to change the mode of the local participant. The following modes are available:

  • SEND_AND_RECV: Both audio and video streams will be produced and consumed.

  • SIGNALLING_ONLY: Audio and video streams will not be produced or consumed. It is used solely for signaling.

  • RECV_ONLY: Only audio and video streams will be consumed without producing any.

info

Important changes in Flutter SDK version v1.3.0

  • The following modes have been deprecated:
    • CONFERENCE has been replaced by SEND_AND_RECV
    • VIEWER has been replaced by SIGNALLING_ONLY

Please update your implementation to use the new modes.

⚠️ Compatibility Notice:
To ensure a seamless meeting experience, all participants must use the same SDK version.
Do not mix version v1.3.0+ with older versions, as it may cause significant conflicts.

Parameters

  • requestedMode: Mode.SEND_AND_RECV | Mode.SIGNALLING_ONLY | Mode.RECV_ONLY

Returns

  • void

Example

room.changeMode(Mode.SEND_AND_RECV);

startRecording()

Parameters

  • webhookUrl: String

  • awsDirPath: String

  • config:

    • layout:
      • type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
      • priority: "SPEAKER" | "PIN"
      • gridSize: Number `max 4`
    • theme: "DARK" | "LIGHT" | "DEFAULT"
    • mode: "video-and-audio" | "audio"
    • quality: "low" | "med" | "high"
    • orientation: "landscape" | "portrait"
  • postTranscriptionConfig:

    • PostTranscriptionConfig.enabled: Boolean
    • PostTranscriptionConfig.summaryConfig:
      • SummaryConfig.enabled: Boolean
      • SummaryConfig.prompt: String
    • PostTranscriptionConfig.modelId: String

Returns

  • void

Example

const String webhookUrl = "https://webhook.your-api-server.com";

Map<String, dynamic> config = {
  'layout': {
    'type': 'GRID',
    'priority': 'SPEAKER',
    'gridSize': 4,
  },
  'theme': "LIGHT",
  "mode": "video-and-audio",
};

PostTranscriptionConfig postTranscriptionConfig = PostTranscriptionConfig(
  enabled: true,
  modelId: "raman_v1",
  summaryConfig: SummaryConfig(
    enabled: true,
    prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
  ),
);

room.startRecording(
  webhookUrl: webhookUrl,
  config: config,
  postTranscriptionConfig: postTranscriptionConfig,
);

stopRecording()

Returns

  • void

Example

room.stopRecording();

startLivestream()

  • It is used to start room livestreaming.
  • You can livestream the room to platforms that support RTMP streaming, such as YouTube and Facebook.
  • All participants, including the localParticipant, will receive the liveStreamStarted event.

Parameters

  • outputs: List<Map<String, String>> [{ url: String, streamKey: String }]

  • config:

    • layout:
      • type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
      • priority: "SPEAKER" | "PIN"
      • gridSize: Number `max 4`
    • theme: "DARK" | "LIGHT" | "DEFAULT"
  • postTranscriptionConfig:

    • PostTranscriptionConfig.enabled: Boolean
    • PostTranscriptionConfig.summaryConfig:
      • SummaryConfig.enabled: Boolean
      • SummaryConfig.prompt: String
    • PostTranscriptionConfig.modelId: String

Returns

  • void

Example

var outputs = [
  {
    "url": "rtmp://a.rtmp.youtube.com/live2",
    "streamKey": "<STREAM_KEY>",
  },
  {
    "url": "rtmps://",
    "streamKey": "<STREAM_KEY>",
  },
];

var livestreamConfig = {
  'layout': {
    'type': 'GRID',
    'priority': 'SPEAKER',
    'gridSize': 4,
  },
  'theme': "LIGHT",
};

PostTranscriptionConfig postTranscriptionConfig = PostTranscriptionConfig(
  enabled: true,
  modelId: "raman_v1",
  summaryConfig: SummaryConfig(
    enabled: true,
    prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
  ),
);

room.startLivestream(outputs, config: livestreamConfig, postTranscriptionConfig: postTranscriptionConfig);

stopLivestream()

Returns

  • void

Example

room.stopLivestream();

startHls()

  • It is used to start HLS.
  • All participants, including the localParticipant, will receive the hlsStarted event with the playbackHlsUrl and livestreamUrl of the HLS feed.
    • playbackHlsUrl - Live HLS with playback support
    • livestreamUrl - Live HLS without playback support
note

downstreamUrl is now deprecated. Use playbackHlsUrl or livestreamUrl in place of downstreamUrl.

Parameters

  • config:

    • layout:
      • type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
      • priority: "SPEAKER" | "PIN"
      • gridSize: Number `max 25`
    • theme: "DARK" | "LIGHT" | "DEFAULT"
    • mode: "video-and-audio" | "audio"
    • quality: "low" | "med" | "high"
    • orientation: "landscape" | "portrait"
  • postTranscriptionConfig:

    • PostTranscriptionConfig.enabled: Boolean
    • PostTranscriptionConfig.summaryConfig:
      • SummaryConfig.enabled: Boolean
      • SummaryConfig.prompt: String
    • PostTranscriptionConfig.modelId: String

Returns

  • void

Example


Map<String, dynamic> config = {
  'layout': {
    'type': 'GRID',
    'priority': 'SPEAKER',
    'gridSize': 4,
  },
  'theme': "LIGHT",
  "mode": "video-and-audio",
};

PostTranscriptionConfig postTranscriptionConfig = PostTranscriptionConfig(
  enabled: true,
  modelId: "raman_v1",
  summaryConfig: SummaryConfig(
    enabled: true,
    prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
  ),
);

room.startHls(config: config, postTranscriptionConfig: postTranscriptionConfig);

stopHls()

Returns

  • void

Example

room.stopHls();

startTranscription()

Parameters

1. transcriptionConfig

  • type : TranscriptionConfig

  • required : false

  • This specifies the configuration for realtime transcription. You can specify the following properties.

    • TranscriptionConfig.webhookUrl: Webhooks will be triggered when the state of realtime transcription changes. Read more about webhooks here
    • TranscriptionConfig.summaryConfig:
      • SummaryConfig.enabled: Indicates whether realtime transcription summary generation is enabled. The summary will be available after realtime transcription is stopped. Default: false
      • SummaryConfig.prompt: Provides guidelines or instructions for generating a custom summary based on the realtime transcription content.

Returns

  • void

Example

TranscriptionConfig transcriptionConfig = TranscriptionConfig(
  webhookUrl: "https://webhook.your-api-server.com",
  summaryConfig: SummaryConfig(
    enabled: true,
    prompt: "Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
  ),
);

room.startTranscription(transcriptionConfig: transcriptionConfig);

stopTranscription()

Returns

  • void

Example

room.stopTranscription();

startWhiteboard()

  • It is used to initialize a whiteboard session.

Returns

  • void

Example

room.startWhiteboard();

stopWhiteboard()

  • It is used to end a whiteboard session.

Returns

  • void

Example

room.stopWhiteboard();

danger

The getCameras() method has been completely removed starting from version 1.2.0. Please use VideoSDK.getVideoDevices() instead.

getCameras()

  • It will return all connected camera devices.

Returns

  • List<MediaDeviceInfo>

Example

List<MediaDeviceInfo> cams = room.getCameras();
print(cams);

changeCam()

  • It is used to change the camera device.
  • If multiple camera devices are connected, you can use changeCam() to switch to a specific camera device.
  • You can get the list of connected video devices using VideoSDK.getVideoDevices().

Parameters

  • device: VideoDeviceInfo

Returns

  • void

Example

room.changeCam(device);

uploadBase64File()

  • It is used to upload a file to VideoSDK's temporary storage.

  • base64Data: Convert your file to base64 and pass it here.

  • token: Pass your VideoSDK token. Read more about token here

  • fileName: Provide your file name with its extension.

  • The method returns the corresponding fileUrl, which can be used to retrieve the file from VideoSDK's storage system.

Parameters

  • base64Data: String
  • token: String
  • fileName: String

Returns

  • fileUrl - It can be used to retrieve the file from VideoSDK's storage system.

Example

String? fileUrl = await room.uploadBase64File(
  base64Data: "<Your File's base64>", // Convert your file to base64 and pass here
  token: "<VIDEOSDK_TOKEN>",
  fileName: "myImage.jpeg", // Provide the name with extension here
);

print("fileUrl $fileUrl");

fetchBase64File()

  • It is used to retrieve a file from VideoSDK's temporary storage.

  • url: Pass the fileUrl returned by uploadBase64File().

  • token: Pass your VideoSDK token. Read more about token here

  • The method returns the file in the form of a base64 string.

Parameters

  • url: String
  • token: String

Returns

  • base64 - The file in the form of a base64 string.

Example

String? base64 = await room.fetchBase64File(
  url: "<Your FileUrl>", // Provide the fileUrl returned by uploadBase64File()
  token: "<VIDEOSDK_TOKEN>",
);

print("base64 $base64");

danger

The getAudioOutputDevices() method has been completely removed starting from version 1.2.0. Please use VideoSDK.getAudioDevices() instead.

getAudioOutputDevices()

  • It will return all the available audio output devices.
note

For iOS devices:

  • EARPIECE is not supported whenever WIRED_HEADSET or BLUETOOTH device is connected.
  • WIRED_HEADSET and BLUETOOTH devices are not supported simultaneously. Priority is given to the most recently connected device.

Returns

  • List<MediaDeviceInfo>

Example

List<MediaDeviceInfo> audioOutputDevices = room.getAudioOutputDevices();
print(audioOutputDevices);

switchAudioDevice()

  • It is used to change the audio output device.
  • You can get the list of connected audio output devices using VideoSDK.getAudioDevices().

Parameters

  • device: AudioDeviceInfo

Returns

  • void

Example

room.switchAudioDevice(device);

caution

On the web, the microphone will now be muted when the input audio device is disconnected. Users will need to unmute it to continue.

danger

The getMics() method has been completely removed starting from version 1.2.0. Please use VideoSDK.getAudioDevices() instead.

getMics()

  • It will return all the available audio input devices.

Returns

  • List<MediaDeviceInfo>

Example

List<MediaDeviceInfo> audioInputDevices = room.getMics();
print(audioInputDevices);

changeMic()

  • It is used to change the audio input device.

Parameters

  • device: AudioDeviceInfo

Returns

  • void
info
  • The changeMic() method is not supported on Mobile Applications.

Example

room.changeMic(device);

pauseAllStreams()

This method pauses active media streams within the meeting.

Parameters

  • kind: Specifies the type of media stream to be paused. If this parameter is omitted, all media streams (audio, video, and screen share) will be paused.

    • Type: String

    • Optional: Yes

    • Possible values:
      • "audio": Pauses audio streams.
      • "video": Pauses video streams.
      • "share": Pauses screen-sharing video streams.

Returns

  • void

Example

meeting.pauseAllStreams();
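
A sketch of pausing only a specific stream type, assuming the kind value is passed as a positional String argument (not confirmed by this page):

// Pause only the video streams; omit the argument to pause all streams.
meeting.pauseAllStreams("video");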

resumeAllStreams()

This method resumes media streams that have been paused.

Parameters

  • kind: Specifies the type of media stream to be resumed. If this parameter is omitted, all media streams (audio, video, and screen share) will be resumed.

    • Type: String

    • Optional: Yes

    • Possible values:
      • "audio": Resumes audio streams.
      • "video": Resumes video streams.
      • "share": Resumes screen-sharing video streams.

Returns

  • void

Example

meeting.resumeAllStreams();
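
A sketch of resuming only a specific stream type, assuming the kind value is passed as a positional String argument (not confirmed by this page):

// Resume only the video streams; omit the argument to resume all streams.
meeting.resumeAllStreams("video");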

requestMediaRelay

This method starts relaying selected media streams (like camera video, microphone audio, screen share, etc.) from the current meeting to a specified destination meeting.

Parameters

  • meetingId (String) – Required: ID of the target meeting where media should be relayed.

  • token (String) – Optional: Authentication token for the destination meeting.

  • kinds (List of Strings) – Optional: Array of media types to relay.

    • Possible values:
      • "audio": Relays audio streams.
      • "video": Relays video streams.
      • "share": Relays screen-sharing video streams.
      • "share_audio": Relays screen-sharing audio streams.

Returns

  • void

Example

room.requestMediaRelay("<Destination-meetingId>", "<token>", ["audio", "video"]);

stopMediaRelay

This method stops the ongoing media relay to a specific destination meeting.

Parameters

  • meetingId (String) – Required: ID of the destination meeting where the media relay should be stopped.

Returns

  • void

Example

room.stopMediaRelay("<Destination-meetingId>");

switchTo

This method enables a seamless transition from the current meeting to another, without needing to disconnect and reconnect manually.

Parameters

  • meetingId (String) – Required: ID of the new meeting to switch to.

  • token (String) – Optional: Authentication token for the new meeting.

  • localParticipantId (String) - Required: Local participantId for switching the meeting.

Returns

  • void

Example

room.switchTo("<meetingId>", "<token>", "<localParticipantId>");

on()

  • It is used to listen to Room-related events.

Parameters

  • event

    • type: Events
    • This specifies the event to be listened for.
  • eventHandler

    • type: Function
    • This will be invoked whenever the specified event occurs.

Returns

  • void

Example

room.on(Events.roomJoined, () {
  // do something
});
