At a recent Accessibility Fair, I saw Microsoft Translator in action during a session. The speaker was not a native "American" English speaker and mumbled at times, yet the Translator kept up with him and provided near-perfect, live English transcription that I could view on my cell phone or computer in any of more than 60 languages.

Is there a way for Echo360 to incorporate Microsoft Translator into Live Streams? I don't know exactly what this would look like in the live-stream environment, but the idea is that it could serve not only as a translation service (11 supported speech languages that translate into more than 60 languages) but also as a live closed-captioning method.

The Microsoft Translator Hub is where developers would find the API and other coding information: https://hub.microsofttranslator.com/

Just my thoughts; I'm interested in yours.
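P.S. To give a rough sense of what the caption/translation side might look like, here is a minimal sketch using Microsoft's Speech SDK for Python, which is one current route to the Translator speech-translation service. The subscription key, region, and language choices are placeholders, the microphone stands in for a live-stream audio feed, and I'm not suggesting this is how Echo360 would actually wire it up:

    import time
    import azure.cognitiveservices.speech as speechsdk

    # Placeholder credentials -- a real integration would use its own
    # Azure subscription key and service region.
    translation_config = speechsdk.translation.SpeechTranslationConfig(
        subscription="YOUR_SPEECH_KEY", region="YOUR_SERVICE_REGION")

    # The lecturer's spoken language, plus the translation targets a
    # viewer might select for captions.
    translation_config.speech_recognition_language = "en-US"
    translation_config.add_target_language("es")
    translation_config.add_target_language("fr")

    # For a live stream the audio source would be the stream's audio feed;
    # the default microphone stands in for it here.
    audio_config = speechsdk.audio.AudioConfig(use_default_microphone=True)
    recognizer = speechsdk.translation.TranslationRecognizer(
        translation_config=translation_config, audio_config=audio_config)

    def show_captions(evt):
        # Interim ("recognizing") results arrive continuously, which is
        # what makes live closed captioning feel real-time.
        print("EN caption:", evt.result.text)
        for lang, text in evt.result.translations.items():
            print(f"  [{lang}] {text}")

    recognizer.recognizing.connect(show_captions)

    recognizer.start_continuous_recognition()
    time.sleep(60)  # caption for one minute, then stop
    recognizer.stop_continuous_recognition()

In a live-stream player, those interim results would be pushed to the viewer's chosen caption track instead of printed to the console.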