Version: 1.0.x

Participant Class Methods

enableCam()

  • enableCam() is used to enable the participant's camera.

Events associated with enableCam():

  • First, the participant will receive a cameraRequested callback; once the participant accepts the request, the camera device will be enabled.
  • Every participant will receive a streamEnabled event with the stream object.

Returns

  • void

Example

participant.enableCam();
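
The events above can be observed with the on() listener documented later on this page. A minimal sketch (the handler body is illustrative):

```dart
// Listen before enabling so the streamEnabled event is not missed.
participant.on(Events.streamEnabled, (Stream stream) {
  // The participant's camera stream is now available.
});

// Ask the participant to enable their camera; they will first receive
// a cameraRequested callback and must accept it.
participant.enableCam();
```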

disableCam()

  • disableCam() is used to disable the participant's camera.

Events associated with disableCam():

  • Every participant will receive a streamDisabled event with the stream object.

Returns

  • void

Example

participant.disableCam();

unmuteMic()

  • unmuteMic() is used to enable the participant's microphone.

Events associated with unmuteMic():

  • First, the participant will receive a micRequested callback; once the participant accepts the request, the microphone will be enabled.
  • Every participant will receive a streamEnabled event with the stream object.

Returns

  • void

Example

participant.unmuteMic();

muteMic()

  • muteMic() is used to disable the participant's microphone.

Events associated with muteMic():

  • Every participant will receive a streamDisabled event with the stream object.

Returns

  • void

Example

participant.muteMic();
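
Together, muteMic() and unmuteMic() can back a simple microphone toggle. A sketch under stated assumptions (the micOn flag and toggleMic helper are illustrative, not part of the SDK):

```dart
bool micOn = true; // illustrative local state, not tracked by the SDK

void toggleMic(Participant participant) {
  if (micOn) {
    participant.muteMic();
  } else {
    // The participant will first receive a micRequested callback
    // and must accept it before the microphone is enabled.
    participant.unmuteMic();
  }
  micOn = !micOn;
}
```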

setQuality()

  • setQuality() is used to set the quality of the participant's video stream.

Parameters

  • quality: "low" | "med" | "high"

Returns

  • void

Example

participant.setQuality("high");

setViewPort()

  • setViewPort() is used to set the quality of the participant's video stream based on the viewport height and width.

Parameters

  • width: int
  • height: int

Returns

  • void

Example

participant.setViewPort(480, 360);

remove()

  • remove() is used to remove the participant from the room.

Returns

  • void

Example

participant.remove();

on()

  • on() is used to set a listener for the specified event on the participant.

Parameters

  • event

    • type: Events
    • This specifies the event to be listened for.
  • eventHandler

    • type: Function
    • This will be invoked whenever the specified event occurs.

Returns

  • void

Example

participant.on(Events.streamEnabled, (Stream stream) {
  // do something
});

getVideoStats()

  • getVideoStats() will return a List<dynamic>? containing details about the participant's critical video metrics, such as jitter and packet loss.

Returns

  • List
    • jitter: It represents the distortion in the stream.
    • bitrate: It represents the bitrate of the stream being transmitted.
    • totalPackets: It represents the total packet count transmitted for that particular stream.
    • packetsLost: It represents the total packets lost during transmission of the stream.
    • rtt: It represents the round-trip time of the stream between the server and the client, in milliseconds (ms).
    • codec: It represents the codec used for the stream.
    • network: It represents the network used to transmit the stream.
    • size: It is an object containing the height, width, and frame rate of the stream.
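
A minimal sketch of reading these fields (the exact shape of each list entry is an assumption; the key names follow the Returns list above):

```dart
final stats = participant.getVideoStats();
if (stats != null && stats.isNotEmpty) {
  // Assumed: each entry is a map keyed by the documented field names.
  final s = stats.first;
  print('jitter: ${s['jitter']}, rtt: ${s['rtt']} ms');
}
```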
note

getVideoStats() will return the metrics for the participant at that given point in time, not averaged over the complete meeting.

To view the metrics for the complete meeting, use the stats API documented here.

info

If you are getting an rtt greater than 300ms, try using a different region that is nearer to your users. To know more about changing the region, visit here.

If you are getting high packet loss, try using setViewPort() for a better experience. To know more about setViewPort(), visit here.


getAudioStats()

  • getAudioStats() will return a List<dynamic>? containing details about the participant's critical audio metrics, such as jitter and packet loss.

Returns

  • List
    • jitter: It represents the distortion in the stream.
    • bitrate: It represents the bitrate of the stream being transmitted.
    • totalPackets: It represents the total packet count transmitted for that particular stream.
    • packetsLost: It represents the total packets lost during transmission of the stream.
    • rtt: It represents the round-trip time of the stream between the server and the client, in milliseconds (ms).
    • codec: It represents the codec used for the stream.
    • network: It represents the network used to transmit the stream.
note

getAudioStats() will return the metrics for the participant at that given point in time, not averaged over the complete meeting.

To view the metrics for the complete meeting, use the stats API documented here.

info

If you are getting an rtt greater than 300ms, try using a different region that is nearer to your users. To know more about changing the region, visit here.

getShareStats()

  • getShareStats() will return a List<dynamic>? containing details about the participant's critical screen-share metrics, such as jitter and packet loss.

Returns

  • List
    • jitter: It represents the distortion in the stream.
    • bitrate: It represents the bitrate of the stream being transmitted.
    • totalPackets: It represents the total packet count transmitted for that particular stream.
    • packetsLost: It represents the total packets lost during transmission of the stream.
    • rtt: It represents the round-trip time of the stream between the server and the client, in milliseconds (ms).
    • codec: It represents the codec used for the stream.
    • network: It represents the network used to transmit the stream.
    • size: It is an object containing the height, width, and frame rate of the stream.
note

getShareStats() will return the metrics for the participant at that given point in time, not averaged over the complete meeting.

To view the metrics for the complete meeting, use the stats API documented here.

info

If you are getting an rtt greater than 300ms, try using a different region that is nearer to your users. To know more about changing the region, visit here.