How do Interprefy Captions work?

Frequently asked questions about Interprefy's automatic captioning services for meetings and events

Captions are on the rise: more than 80% of Netflix members use closed captions or subtitles at least once a month. Interprefy gives meeting and event organisers a powerful tool to deliver a live captioning experience in multiple languages during their online or hybrid events.

What we offer:

  • Automatic captions: Automatic closed captions, rendering the speaker audio into text in real-time.
    • Glossary function: Enhance captions accuracy by pre-loading the system with specific names, brand names or product names.
  • Translated captions: Real-time translated captions, rendering the content into a different language automatically.

Interprefy Captions are generated on the Interprefy platform using AI, from the audio of each speaker (and interpreter, if active).

The AI produces text directly from the words being spoken. Just like interpretation, the captions follow as a live transcription slightly after the speaker has delivered their words.

Additionally, we can use AI to deliver captions in a language different from the one the speaker is using.

How it works


Frequently Asked Questions


1. What is the difference between automatic captions and translated captions? 

Automatic captions are AI-powered captions that transcribe speech into text in real time.

Translated captions display what the speaker is saying in real time, in a different language.


2. What are the differences between automatic captions of what the interpreter is saying and translated captions? 

This comes down to the key differences between AI-powered translation and human interpretation. Conference interpreters will always strive to convey the message of the speaker, and may paraphrase, while AI translation aims for completeness of translation of the sentences spoken.

Automatic captions of the interpreting audio are used in conferences with simultaneous interpretation and stay in sync with the interpreted audio.

Because the captions are based on a professional live audio translation from a vetted and subject-savvy conference interpreter, the speech is translated by taking cultural aspects, context, and tone of voice into consideration. 

Translated captions powered by AI provide a complete translation of the sentences spoken.


3. Are Interprefy Captions available for events or meetings without simultaneous interpretation? 

Yes, Interprefy Captions, both automatic and translated, can be used for events without interpretation. We can also deliver simultaneous interpretation (spoken or signed) alongside automatic and translated captions at the same event.


4. Can a user select to listen to the floor audio language and read captions in a different language? 

Yes. Attendees can listen to the floor audio while reading translated captions in a different language.

5. Why do I need live captions? 

Captions are especially useful for delegates and attendees who are unable to hear what is being said, who prefer to read rather than listen, or who benefit from visual reinforcement. There are a number of benefits for event organisers and content creators, which we outline in detail in this blog article.

Example users include: 

  • The deaf and hard-of-hearing, who can follow the dialogue in written form with the aid of captioning. 
  • People who wish to follow the discussion but are in a location where another dialogue is taking place. 
  • Individuals in a noisy environment, such as a café, who wish to follow the event even when listening conditions are poor. 
  • Those who wish to have a readable feed to back up their understanding of what is being said. For instance, at a chemistry conference, when complex formulas are being read aloud, it is sometimes useful to have a readable text feed alongside the spoken words. 
  • Those attending (but not contributing) in areas of poor network connection where audio feeds may be unreliable. 


6. How do I make sure the captions are accurately reflecting the words of the speaker? 

The words and terms spoken by the speaker or interpreter are automatically recognised on the Interprefy platform by AI technology. Good source audio quality is essential for the system to recognise the speech reliably.

Things that might impact accuracy include:

  • Background noise
  • Volume and clarity of the speakers' voice
  • Specialist vocabulary and heavy accents

As with any multilingual meeting, we recommend educating speakers about the importance of high audio quality and clear, precise, and paced speech.  

Populating the glossary before the event further improves transcription accuracy and is essential to help key terms, acronyms and names be spelled correctly. 


7. What is the delay for captions to appear on screen? 

Interprefy Captions can be enabled in two different modes. By default, the text will appear within 4 seconds of the speaker having completed a sentence. If 'instant mode' is activated, text will appear in real-time with instant auto-correction. 


8. Are Interprefy Captions available in the Interprefy mobile app? 

Yes, the Interprefy mobile app also supports captions. This is particularly useful for audiences at a venue accessing live captions on their mobile. 


9. Are Interprefy Captions available in events using the Interprefy Select widget on a third-party platform? 

Captions are available on Interprefy Connect, Interprefy Connect Pro and selected third-party platforms. Please connect with an Interprefy representative to discuss availability in your preferred platform. 


10. Which languages are Interprefy Captions available in? 

Click here to see the latest list of available captioning languages.


11. What is the pricing model for adding Interprefy Captions to my meeting or event? 

Interprefy Captions are available as a cost option. Pricing depends on two factors: the number of languages required and the duration of the event.


12. Can I have both the audio and text appear at the same time? 

Yes, of course. Many users like to have written reinforcement of what’s being said whereas others prefer to read the content only. Users can turn on/off captions anytime and adjust text size and colour. 


13. Are captions available as transcripts after the event? 

Yes, transcripts of the captions can be made available after the event. 


14. Which translation engine does Interprefy use? 

We don’t use a single translation engine, but hand-select engines for each language pair. Our AI Delivery team continuously tests and compares leading translation engines to ensure that for each language combination, we pick the best-performing engine. 


15. Can I see an example of live captions in a real event?

Here is a video recording of Interprefy translated captions at a recent Autodesk event.

The layout of the captions and the way they are accessed will depend on the platform your event runs on and the device your users use to follow the event.

To get more information or schedule a live demo, contact us.