
Live Captioning for Livestreams - Flutter

Live captioning enhances your livestreams by converting hosts' speech into text in real-time, boosting accessibility and engagement. Using the Room class, you can enable or disable captions on the fly, and display captions dynamically in your UI for all viewers.

Video SDK offers flexible configuration and event-driven updates to help you integrate captions seamlessly into your broadcast layout.

startTranscription()

The startTranscription() method, accessible from the Room class, is used to initiate live captions in a livestream. It accepts the following two parameters (a minimal call is sketched after the list):

  1. webhookUrl (optional): This is the webhook URL where you can listen to events related to the transcription, such as when the transcription starts and stops. You can learn more about webhooks here.

  2. summary (optional):

    • enabled: Indicates whether realtime transcription summary generation is enabled. The summary will be available after the realtime transcription is stopped. Default: false.
    • prompt: Provides guidelines or instructions for generating a custom summary based on the realtime transcription content.
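
For reference, here is a minimal sketch of a startTranscription() call with both parameters configured. The SummaryConfig, TranscriptionConfig, and startTranscription() usage is taken from the example later in this guide; the webhook URL is a placeholder and the prompt text is only illustrative.

// Minimal sketch: start live captions with an optional summary.
// The webhook URL and prompt below are placeholders.
final summaryConfig = SummaryConfig(
  enabled: true,
  prompt: 'Summarize the livestream as short bullet points', // illustrative prompt
);

final transcriptionConfig = TranscriptionConfig(
  webhookUrl: 'https://your-webhook-url.com', // placeholder webhook endpoint
  summaryConfig: summaryConfig,
);

// Start realtime transcription on the joined Room instance.
_room.startTranscription(transcriptionConfig: transcriptionConfig);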

stopTranscription()

The stopTranscription() method, accessible from the Room class, is used to stop the live captions in a livestream.
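
Stopping is a single call on the same Room instance. A minimal sketch, assuming _room is the joined Room used in the example below:

// Minimal sketch: stop live captions.
// Events.transcriptionStateChanged will report TRANSCRIPTION_STOPPING
// and then TRANSCRIPTION_STOPPED (see the events listed below).
_room.stopTranscription();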

Events associated with live captioning

  • Events.transcriptionStateChanged: Triggered whenever the realtime transcription state changes. The status can be TRANSCRIPTION_STARTING, TRANSCRIPTION_STARTED, TRANSCRIPTION_STOPPING, or TRANSCRIPTION_STOPPED.
  • Events.transcriptionText: Triggered whenever new caption text is available, along with the participant's name and a timestamp.

Example

import 'dart:developer';

import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';

class LiveStreamView extends StatefulWidget {
  ...
}

class _LiveStreamViewState extends State<LiveStreamView> {
  // Room object
  late Room _room;

  @override
  void initState() {
    super.initState();
    setupLivestreamEventListener();
    ...
  }

  @override
  Widget build(BuildContext context) {
    return Column(children: [
      ElevatedButton(
        onPressed: () {
          // Configuration for realtime transcription
          final summaryConfig = SummaryConfig(
              enabled: true, prompt: 'Initial prompt for summary');
          final transcriptionConfig = TranscriptionConfig(
            webhookUrl: "https://your-webhook-url.com",
            summaryConfig: summaryConfig,
          );
          // Start realtime transcription
          _room.startTranscription(transcriptionConfig: transcriptionConfig);
        },
        child: const Text("Start Realtime Transcription"),
      ),
      ElevatedButton(
        onPressed: () {
          // Stop realtime transcription
          _room.stopTranscription();
        },
        child: const Text("Stop Realtime Transcription"),
      ),
    ]);
  }

  void setupLivestreamEventListener() {
    // Listen for transcription state changes:
    // TRANSCRIPTION_STARTING, TRANSCRIPTION_STARTED,
    // TRANSCRIPTION_STOPPING, TRANSCRIPTION_STOPPED
    _room.on(Events.transcriptionStateChanged, (Map<String, dynamic> data) {
      log("Livestream transcription status : ${data['status']}");
    });

    // Listen for transcription text events (live caption chunks)
    _room.on(Events.transcriptionText, (Map<String, dynamic> data) {
      log("${data['participantName']}: ${data['text']} ${data['timestamp']}");
    });
  }
}
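
The example above only logs incoming captions. To display them dynamically in your UI, as described in the introduction, one option is to keep the latest caption in widget state and render it over or below the livestream view. The following is a minimal sketch, not part of the SDK: the _CaptionOverlay widget name and the single-line display are illustrative choices, and the Events.transcriptionText payload fields are those used in the example above.

import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';

// Minimal sketch (not part of the SDK): render the latest caption in the UI.
// Assumes `room` is the joined Room instance from the example above.
class _CaptionOverlay extends StatefulWidget {
  const _CaptionOverlay({required this.room});
  final Room room;

  @override
  State<_CaptionOverlay> createState() => _CaptionOverlayState();
}

class _CaptionOverlayState extends State<_CaptionOverlay> {
  String _latestCaption = "";

  @override
  void initState() {
    super.initState();
    // Update the displayed caption whenever new transcription text arrives.
    widget.room.on(Events.transcriptionText, (Map<String, dynamic> data) {
      if (!mounted) return;
      setState(() {
        _latestCaption = "${data['participantName']}: ${data['text']}";
      });
    });
  }

  @override
  Widget build(BuildContext context) {
    // Show the most recent caption line; style it to fit your layout.
    return Text(
      _latestCaption,
      maxLines: 2,
      overflow: TextOverflow.ellipsis,
    );
  }
}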

API Reference

The API references for all the methods utilized in this guide are provided below.

  • startTranscription()
  • stopTranscription()
