Azure Speech API language

Published 2019-08-16 00:26

I have implemented a chat on a web page with the option to use speech-to-text via the Azure Speech API. It works fine, but I don't understand where I can set the language the API should recognize. I want it to understand French, but when I speak French, it transcribes English words with a similar sound. How and where can I set the language? Note that I'm not the one who set up the service on the Azure dashboard.

2 Answers
戒情不戒烟
#2 · 2019-08-16 01:20

The new Speech SDK supports recognition in different languages; please check the samples here.
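To sketch the idea behind this answer: the recognition language is just a BCP-47 tag (such as `fr-FR`) carried in the recognizer's configuration. The helper below is hypothetical, not the actual SDK API; in the real Speech SDK the corresponding setting is the `speechRecognitionLanguage` property on the speech config.

```typescript
// Hypothetical sketch of how a recognition language is attached to a
// speech configuration; in the real Speech SDK this corresponds to
// the speechRecognitionLanguage property.
interface SpeechConfigSketch {
  subscriptionKey: string;
  region: string;
  speechRecognitionLanguage: string;
}

function createSpeechConfig(
  subscriptionKey: string,
  region: string,
  language?: string
): SpeechConfigSketch {
  return {
    subscriptionKey,
    region,
    // Without an explicit language the service assumes US English,
    // which is why French speech comes back as similar-sounding
    // English words.
    speechRecognitionLanguage: language || "en-US",
  };
}

const config = createSpeechConfig("<your-key>", "westeurope", "fr-FR");
console.log(config.speechRecognitionLanguage); // "fr-FR"
```

The point is that the language is set on the client when the recognizer is created, not on the Azure dashboard, so you can change it even if you didn't provision the service yourself.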

Thanks,

smile是对你的礼貌
#3 · 2019-08-16 01:23

There is an optional locale parameter that you can set, declared in the recognizer properties like this:

export interface ICognitiveServicesSpeechRecognizerProperties {
    locale?: string,
    subscriptionKey?: string,
    fetchCallback?: (authFetchEventId: string) => Promise<string>,
    fetchOnExpiryCallback?: (authFetchEventId: string) => Promise<string>
}

If you don't provide a value, the following default is used:

const locale = properties.locale || 'en-US';

You can find the possible values for those parameters here.
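To make the fallback concrete, here is a minimal runnable sketch (the `resolveLocale` helper is hypothetical) showing that passing `locale: 'fr-FR'` overrides the `en-US` default:

```typescript
// Subset of the interface above, enough to demonstrate the fallback.
interface ICognitiveServicesSpeechRecognizerProperties {
  locale?: string;
  subscriptionKey?: string;
}

// Hypothetical helper mirroring `properties.locale || 'en-US'`.
function resolveLocale(
  properties: ICognitiveServicesSpeechRecognizerProperties
): string {
  return properties.locale || "en-US";
}

console.log(resolveLocale({ subscriptionKey: "<key>" })); // "en-US"
console.log(resolveLocale({ locale: "fr-FR" }));          // "fr-FR"
```

So for the question above, passing `locale: 'fr-FR'` in the recognizer properties should make the service transcribe French instead of falling back to English.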
