Room Class Methods - Flutter
join()
- After creating an instance of the VideoSDK Room, you can join the VideoSDK Room by calling the join() method.
Events associated with join():
- Local Participant will receive a roomJoined event when successfully joined.
- Remote Participants will receive a participantJoined event with the newly joined Participant object from the event handler.
Participant having ask_join permission inside token
- If a token contains the permission ask_join, then the participant will not join the room directly after calling join(). Instead, an entryRequested event will be emitted to the participants having the allow_join permission.
- After the decision from the remote participant, an entryResponded event will be emitted to the participant. This event will contain the decision made by the remote participant.
Participant having allow_join permission inside token
- If a token contains the permission allow_join, then the participant will join the room directly after calling join().
Returns
void
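Example
A minimal sketch of joining a room, assuming room is the Room instance created earlier; the Events.roomJoined listener registration shown here is an assumption based on the event described above, so check your SDK version for the exact listener API.
room.on(Events.roomJoined, () {
  // Assumed listener: fires on the local participant's side once the join succeeds.
  print("Room joined");
});
room.join();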
leave()
- It is used to leave the current running room session.
Events associated with leave():
- Local participant will receive a roomLeft event.
- All remote participants will receive a participantLeft event with the participantId string from the event handler.
Returns
void
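Example
Leaving the current session is a single call on the same room instance used in the other examples on this page.
room.leave();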
end()
- It is used to end the current running room session.
- By calling end(), all joined participants, including the localParticipant of that session, will leave the room.
Events associated with end():
- All participants and localParticipant will receive the roomLeft event.
Returns
void
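Example
Ending the session for everyone, following the same call style as the other examples on this page.
room.end();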
enableCam()
- It is used to enable the camera device.
- A streamEnabled event will be emitted with the stream object from the event handler, inside that participant object.
Returns
void
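Example
Enabling the local participant's camera; assumes room is the joined Room instance.
room.enableCam();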
disableCam()
- It is used to disable the camera device.
- A streamDisabled event will be emitted with the stream object from the event handler, inside that participant object.
Returns
void
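Example
Disabling the local participant's camera.
room.disableCam();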
unmuteMic()
- It is used to enable the microphone device.
- A streamEnabled event will be emitted with the stream object from the event handler, inside that participant object.
Returns
void
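Example
Unmuting (enabling) the local participant's microphone.
room.unmuteMic();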
muteMic()
- It is used to disable the microphone device.
- A streamDisabled event will be emitted with the stream object from the event handler, inside that participant object.
Returns
void
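Example
Muting (disabling) the local participant's microphone.
room.muteMic();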
getScreenShareSources()
- It will return all available screens and opened windows.
- This method only supports desktop apps.
Returns
Future<List<DesktopCapturerSource>>
Example
meeting.getScreenShareSources().then((value) => print("Sources : $value"));
enableScreenShare()
- It is used to enable screen-sharing.
- A streamEnabled event will be emitted with the stream object from the event handler, inside that participant object.
- presenterChanged will also receive a callback with the presenterId.
Parameters
- source:
  - type: DesktopCapturerSource
  - required: false
  - It will specify the selected screenShare source for desktop apps.
Returns
void
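Example
A sketch that pairs getScreenShareSources() with the optional source parameter on desktop; passing the source positionally is an assumption based on the parameter list above, and on mobile you can call enableScreenShare() with no arguments.
// Desktop: let the user pick a screen or window, then start sharing it.
room.getScreenShareSources().then((sources) {
  room.enableScreenShare(sources.first); // source argument is assumed positional
});

// Mobile: share the device screen directly.
// room.enableScreenShare();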
disableScreenShare()
- It is used to disable screen-sharing.
- A streamDisabled event will be emitted with the stream object from the event handler, inside that participant object.
- presenterChanged will also receive a callback with null.
Returns
void
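Example
Stopping an active screen share.
room.disableScreenShare();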
changeMode()
- It is used to change the mode of the participant from CONFERENCE to VIEWER and vice-versa.
- A participantModeChange event will be emitted with the participantId and mode of the participant.
Parameters
- requestedMode: Mode.CONFERENCE | Mode.VIEWER
Returns
void
Example
room.changeMode(Mode.CONFERENCE);
startRecording()
- It is used to start room recording.
- All participants and localParticipant will receive a recordingStarted event.
- webhookUrl will be triggered when the recording is completed and stored on the server. Read more about webhooks here.
Parameters
- webhookUrl: String
- awsDirPath: String
- config:
  - layout:
    - type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
    - priority: "SPEAKER" | "PIN"
    - gridSize: Number `max 4`
  - theme: "DARK" | "LIGHT" | "DEFAULT"
  - mode: "video-and-audio" | "audio"
  - quality: "low" | "med" | "high"
  - orientation: "landscape" | "portrait"
- transcription:
  - enabled: Boolean
  - summary:
    - enabled: Boolean
    - prompt: String
Returns
void
Example
const webhookUrl = "https://webhook.your-api-server.com";
Map<String, dynamic> config = {
'layout': {
'type': 'GRID',
'priority': 'SPEAKER',
'gridSize': 4,
},
'theme': "LIGHT",
"mode": "video-and-audio"
};
Map<String, dynamic> transcription = {
"enabled": true,
"summary": {
"enabled": true,
"prompt":
"Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
}
};
room.startRecording(webhookUrl: webhookUrl, config: config, transcription: transcription);
stopRecording()
- It is used to stop room recording.
- All participants and localParticipant will receive a recordingStopped event.
Returns
void
Example
room.stopRecording();
startLivestream()
- It is used to start room livestreaming.
- You will be able to livestream the room to other platforms such as YouTube, Facebook, etc., that support rtmp streaming.
- All participants and localParticipant will receive a liveStreamStarted event.
Parameters
- outputs: List<Map<String, String>> [{ url: String, streamKey: String }]
- config:
  - layout:
    - type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
    - priority: "SPEAKER" | "PIN"
    - gridSize: Number `max 4`
  - theme: "DARK" | "LIGHT" | "DEFAULT"
Returns
void
Example
var outputs = [
  {
    "url": "rtmp://a.rtmp.youtube.com/live2",
    "streamKey": "<STREAM_KEY>",
  },
  {
    "url": "rtmps://",
    "streamKey": "<STREAM_KEY>",
  },
];
var liveStreamConfig = {
'layout': {
'type': 'GRID',
'priority': 'SPEAKER',
'gridSize': 4,
},
'theme': "LIGHT",
};
room.startLivestream(outputs, config: liveStreamConfig);
stopLivestream()
- It is used to stop room livestreaming.
- All participants and localParticipant will receive a livestreamStopped event.
Returns
void
Example
room.stopLivestream();
startHls()
- It is used to start HLS.
- All participants and localParticipant will receive the hlsStarted event with the playbackHlsUrl and livestreamUrl of the HLS feed.
  - playbackHlsUrl - Live HLS with playback support
  - livestreamUrl - Live HLS without playback support

downstreamUrl is now deprecated. Use playbackHlsUrl or livestreamUrl in place of downstreamUrl.
Parameters
- config:
  - layout:
    - type: "GRID" | "SPOTLIGHT" | "SIDEBAR"
    - priority: "SPEAKER" | "PIN"
    - gridSize: Number `max 25`
  - theme: "DARK" | "LIGHT" | "DEFAULT"
  - mode: "video-and-audio" | "audio"
  - quality: "low" | "med" | "high"
  - orientation: "landscape" | "portrait"
- transcription:
  - enabled: Boolean
  - summary:
    - enabled: Boolean
    - prompt: String
Returns
void
Example
Map<String, dynamic> config = {
'layout': {
'type': 'GRID',
'priority': 'SPEAKER',
'gridSize': 4,
},
'theme': "LIGHT",
"mode": "video-and-audio"
};
Map<String, dynamic> transcription = {
"enabled": true,
"summary": {
"enabled": true,
"prompt":
"Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
}
};
room.startHls(config: config, transcription: transcription);
stopHls()
- It is used to stop HLS.
- All participants and localParticipant will receive an hlsStopped event.
Returns
void
Example
room.stopHls();
startTranscription()
- It is used to start realtime transcription.
- All participants and localParticipant will receive the TRANSCRIPTION_STARTING event state and the transcription-text event.
Parameters
config
- type: Map<String, dynamic>
- required: false
- This specifies the configuration for realtime transcription. You can specify the following properties:
  - webhookUrl: Webhooks will be triggered when the state of realtime transcription is changed. Read more about webhooks here.
  - summary:
    - enabled: Indicates whether realtime transcription summary generation is enabled. The summary will be available after the realtime transcription has stopped. Default: false
    - prompt: Provides guidelines or instructions for generating a custom summary based on the realtime transcription content.
Returns
void
Example
Map<String, dynamic> config = {
"webhookUrl": "https://webhook.your-api-server.com",
"summary": {
"enabled": true,
"prompt":
"Write summary in sections like Title, Agenda, Speakers, Action Items, Outlines, Notes and Summary",
}
};
room.startTranscription(config: config);