Since the Speech service itself has this ability: https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-automatic-language-detection?pivots=programming-language-csharp
Thanks
@xieofxie Thanks for the question. We will update you shortly.
@xieofxie As you may have seen, a new Bot Framework channel named "Direct Line Speech" is in preview. This channel is optimized for voice-first conversational experiences, where you want users to talk to the bot and have it speak back. The channel serves as an orchestration service: it calls Cognitive Services speech-to-text, then your bot, then text-to-speech, managing multi-turn voice dialogs. One benefit is lower user-perceived latency compared to having the client application orchestrate these multiple web calls itself.
Client applications use the Speech SDK to connect to the Direct Line Speech channel and their bot. See "Custom voice-first virtual assistants" for more info.
We will check with the product team for automatic language detection.
@ram-msft Thanks. My question here is whether the Direct Line Speech channel will support detecting the language automatically.
Currently, when using this channel, we must set the language explicitly, as described here: https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/tutorial-voice-enable-your-bot-speech-sdk#optional-change-the-language-and-bot-voice.
Since the Speech service already has language detection, I am wondering whether this feature will be enabled internally in the Direct Line Speech channel. Then, when using the channel, we would not need to set the language explicitly; or we could specify a set of candidate languages and the channel would determine the user's language automatically.
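For reference, the linked tutorial's approach today looks roughly like the sketch below: the client sets a single recognition language up front on the Direct Line Speech configuration. This is a minimal illustration, not a full client; the key, region, and language value are placeholders, and it assumes the `BotFrameworkConfig`/`DialogServiceConnector` API from the C# Speech SDK.

```csharp
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech.Audio;
using Microsoft.CognitiveServices.Speech.Dialog;

class Program
{
    static async Task Main()
    {
        // Placeholders: substitute your Speech resource key and region.
        var config = BotFrameworkConfig.FromSubscription("<your-speech-key>", "<your-region>");

        // Today the language must be chosen explicitly, in advance.
        config.Language = "de-DE";

        using var connector = new DialogServiceConnector(
            config, AudioConfig.FromDefaultMicrophoneInput());

        await connector.ConnectAsync();
        await connector.ListenOnceAsync(); // one turn of speech input to the bot
    }
}
```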
For speech-to-text in the Speech service, automatic language detection is limited to two candidate languages at a time, and these must be specified in advance, although overall language support is much broader - https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-automatic-language-detection?pivots=programming-language-csharp#automatic-language-detection-with-the-speech-sdk
The Speech SDK offers automatic language detection, but it currently has a service-side limit of two candidate languages per detection. Languages currently supported - https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/language-support
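To make the two-language limit concrete, here is a sketch of what the linked how-to describes for direct Speech SDK use (outside the channel): the candidate languages are declared up front via `AutoDetectSourceLanguageConfig`, and the detected language is read back from the result. The key and region are placeholders.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Audio;

class Program
{
    static async Task Main()
    {
        // Placeholders: substitute your Speech resource key and region.
        var speechConfig = SpeechConfig.FromSubscription("<your-speech-key>", "<your-region>");

        // Current service-side limit: at most two candidate languages,
        // specified in advance.
        var autoDetectConfig = AutoDetectSourceLanguageConfig.FromLanguages(
            new[] { "en-US", "de-DE" });

        using var recognizer = new SpeechRecognizer(
            speechConfig, autoDetectConfig, AudioConfig.FromDefaultMicrophoneInput());

        var result = await recognizer.RecognizeOnceAsync();
        var detected = AutoDetectSourceLanguageResult.FromResult(result);
        Console.WriteLine($"Detected {detected.Language}: {result.Text}");
    }
}
```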
@trevorbye Could you please share any roadmap for supporting the Speech service's automatic language identification in the Direct Line Speech channel?
@mswellsi Could you please share a roadmap for supporting the Speech service's automatic language identification in the Direct Line Speech channel?
The capability is not available automatically today, so this is a new request, and I am in discussion with the relevant teams about it.
Hi @xieofxie, thanks for reaching out and for the feedback. We're currently tracking to enable the language detection feature in Direct Line Speech by the end of this calendar year.
@mswellsi @ovishesh Thanks for the details provided and for sharing the roadmap.
@xieofxie We will now proceed to close this thread. If there are further questions regarding this matter, please tag me in your reply. We will gladly continue the discussion and we will reopen the issue.