Participant Class Methods - JavaScript
enableWebcam()
enableWebcam() is used to enable the participant's camera.
Events associated with enableWebcam():
- First, the participant will receive a webcam-requested callback; once the participant accepts the request, the webcam will be enabled.
- Every participant will receive a stream-enabled event with the stream object.
Returns
void
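Example
A minimal sketch of the request flow. It assumes meeting is the joined meeting instance and participant is a remote participant object; the webcam-requested payload shown here (accept, reject, participantId) and the stream.kind check are assumptions to verify against the SDK reference.
// On the participant being asked: respond to the incoming webcam request.
meeting.on("webcam-requested", ({ accept, reject, participantId }) => {
  accept(); // or call reject() to decline
});
// On every participant: react once the camera stream is live.
participant.on("stream-enabled", (stream) => {
  if (stream.kind === "video") {
    console.log("Webcam enabled for", participant.id);
  }
});
// Request the remote participant to turn their camera on.
participant.enableWebcam();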
disableWebcam()
disableWebcam() is used to disable the participant's camera.
Events associated with disableWebcam():
- Every participant will receive a stream-disabled event with the stream object.
Returns
void
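Example
A minimal sketch, assuming participant is a remote participant object; the stream.kind check and participant.id are assumptions to verify against the SDK reference.
// Listen for the camera stream being turned off.
participant.on("stream-disabled", (stream) => {
  if (stream.kind === "video") {
    console.log("Webcam disabled for", participant.id);
  }
});
participant.disableWebcam();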
enableMic()
enableMic() is used to enable the participant's microphone.
Events associated with enableMic():
- First, the participant will receive a mic-requested callback; once the participant accepts the request, the mic will be enabled.
- Every participant will receive a stream-enabled event with the stream object.
Returns
void
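Example
This mirrors the webcam flow above; the mic-requested payload (accept, reject, participantId) is an assumption to verify against the SDK reference.
// On the participant being asked: respond to the incoming mic request.
meeting.on("mic-requested", ({ accept, reject, participantId }) => {
  accept(); // or call reject() to decline
});
// Request the remote participant to unmute.
participant.enableMic();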
disableMic()
disableMic() is used to disable the participant's microphone.
Events associated with disableMic():
- Every participant will receive a stream-disabled event with the stream object.
Returns
void
setQuality()
setQuality() is used to set the quality of the participant's video stream.
Parameters
- quality: low | med | high
Returns
void
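Example
A minimal sketch; the quality value must be one of the strings listed above.
participant.setQuality("high");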
pin()
- It is used to set the pin state of the participant. You can use it to pin the participant's screen share, camera, or both. It accepts an optional parameter of type String. Default is SHARE_AND_CAM.
Parameters
- pinType: SHARE_AND_CAM | CAM | SHARE
Returns
void
Example
participant.pin("CAM");
unpin()
- It is used to unpin the participant. You can use it to unpin the participant's screen share, camera, or both. It accepts an optional parameter of type String. Default is SHARE_AND_CAM.
Parameters
- pinType: SHARE_AND_CAM | CAM | SHARE
Returns
void
Example
participant.unpin("CAM");
remove()
- It is used to remove a participant from the meeting.
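Example
A minimal sketch, assuming participant refers to the participant to be removed from the meeting.
participant.remove();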
captureImage()
- It is used to capture an image of the participant's current video stream.
- It will return the image as a base64 string.
Parameters
- height: number
- width: number
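Example
A minimal sketch, assuming it is called inside an async function; whether height and width are passed as an options object (as shown) or positionally is an assumption, so verify the SDK signature.
const base64Image = await participant.captureImage({ height: 400, width: 400 });
console.log("Captured image (base64):", base64Image);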
setViewPort()
setViewPort() is used to set the quality of the participant's video stream based on the viewport height and width.
Parameters
- width: int
- height: int
Returns
void
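Example
A minimal sketch, passing width and height in the order documented above.
// Tell the SDK the size at which this participant's video is rendered,
// so an appropriate stream quality can be selected.
participant.setViewPort(640, 360);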
getVideoStats()
getVideoStats() will return an object containing the participant's critical video metrics, such as jitter, packet loss, and quality score.
Returns
object
- jitter: It represents the distortion in the stream.
- bitrate: It represents the bitrate of the stream being transmitted.
- totalPackets: It represents the total number of packets transmitted for that particular stream.
- packetsLost: It represents the total packets lost during transmission of the stream.
- rtt: It represents the time taken for the stream to reach the client from the server, in milliseconds (ms).
- codec: It represents the codec used for the stream.
- network: It represents the network used to transmit the stream.
- size: It is an object containing the height, width, and frame rate of the stream.
getVideoStats() returns the participant's metrics at that given point in time, not an average over the complete meeting.
To view the metrics for the complete meeting, use the stats API documented here.
If you are getting an rtt greater than 300 ms, try using a different region that is nearest to your user. To know more about changing the region, visit here.
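Example
A minimal sketch: it assumes the call is made inside an async function and may be asynchronous, so the result is awaited, and the field access follows the object shape documented above. The same pattern applies to getAudioStats(), getShareStats(), and getShareAudioStats().
const stats = await participant.getVideoStats();
// Log a few of the critical metrics listed above.
console.log("jitter:", stats.jitter);
console.log("bitrate:", stats.bitrate);
console.log("packets lost:", stats.packetsLost, "of", stats.totalPackets);
console.log("rtt (ms):", stats.rtt);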
getAudioStats()
getAudioStats() will return an object containing the participant's critical audio metrics, such as jitter, packet loss, and quality score.
Returns
object
- jitter: It represents the distortion in the stream.
- bitrate: It represents the bitrate of the stream being transmitted.
- totalPackets: It represents the total number of packets transmitted for that particular stream.
- packetsLost: It represents the total packets lost during transmission of the stream.
- rtt: It represents the time taken for the stream to reach the client from the server, in milliseconds (ms).
- codec: It represents the codec used for the stream.
- network: It represents the network used to transmit the stream.
getAudioStats() returns the participant's metrics at that given point in time, not an average over the complete meeting.
To view the metrics for the complete meeting, use the stats API documented here.
If you are getting an rtt greater than 300 ms, try using a different region that is nearest to your user. To know more about changing the region, visit here.
getShareStats()
getShareStats() will return an object containing the participant's critical screen share video metrics, such as jitter, packet loss, and quality score.
Returns
object
- jitter: It represents the distortion in the stream.
- bitrate: It represents the bitrate of the stream being transmitted.
- totalPackets: It represents the total number of packets transmitted for that particular stream.
- packetsLost: It represents the total packets lost during transmission of the stream.
- rtt: It represents the time taken for the stream to reach the client from the server, in milliseconds (ms).
- codec: It represents the codec used for the stream.
- network: It represents the network used to transmit the stream.
- size: It is an object containing the height, width, and frame rate of the stream.
getShareStats() returns the participant's metrics at that given point in time, not an average over the complete meeting.
To view the metrics for the complete meeting, use the stats API documented here.
If you are getting an rtt greater than 300 ms, try using a different region that is nearest to your user. To know more about changing the region, visit here.
getShareAudioStats()
getShareAudioStats() will return an object containing the participant's critical screen share audio metrics, such as jitter, packet loss, and quality score.
Returns
object
- jitter: It represents the distortion in the stream.
- bitrate: It represents the bitrate of the stream being transmitted.
- totalPackets: It represents the total number of packets transmitted for that particular stream.
- packetsLost: It represents the total packets lost during transmission of the stream.
- rtt: It represents the time taken for the stream to reach the client from the server, in milliseconds (ms).
- codec: It represents the codec used for the stream.
- network: It represents the network used to transmit the stream.
getShareAudioStats() returns the participant's metrics at that given point in time, not an average over the complete meeting.
To view the metrics for the complete meeting, use the stats API documented here.
If you are getting an rtt greater than 300 ms, try using a different region that is nearest to your user. To know more about changing the region, visit here.