Is it possible to use this cool voice-activation feature of Google Now in your own application?
What I want is that the user does not have to trigger the activation by pressing a button or anything like that.
I'd rather have automatic speech recognition that is activated by a keyword.
For example, when Google Now is open you only have to say "Google"; after that command the system listens for the actual input.
Is this possible with the Android API? Or is there an open source library that provides this behavior?
I know that this is possible with OpenEars, but unfortunately OpenEars is not available for Android.
You have to run the speech recognition as a service instead of as an activity.
Check out this GitHub repository for sample code on how to do this:
https://github.com/gast-lib/gast-lib
I would suggest using CMU Sphinx, or simply restarting your recognizer in every onResults() and onError() callback.
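
Combining this with the previous answer's suggestion to run the recognition in a service, a rough sketch could look like the code below. It uses Android's built-in android.speech.SpeechRecognizer and just calls startListening() again from onResults()/onError(). The class name, the "google" keyword check, and the restart policy are my own illustration, not part of any library mentioned here; it also still requires Google's recognition service on the device plus the RECORD_AUDIO permission, so it is not a true offline hotword detector like CMUSphinx.

import android.app.Service;
import android.content.Intent;
import android.os.Bundle;
import android.os.IBinder;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

public class VoiceTriggerService extends Service implements RecognitionListener {

    private SpeechRecognizer recognizer;
    private Intent recognizerIntent;

    @Override
    public void onCreate() {
        super.onCreate();
        recognizer = SpeechRecognizer.createSpeechRecognizer(this);
        recognizer.setRecognitionListener(this);
        recognizerIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        recognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        recognizerIntent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, getPackageName());
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        recognizer.startListening(recognizerIntent);  // start the first listening session
        return START_STICKY;                          // keep the service alive
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        if (matches != null && !matches.isEmpty()
                && matches.get(0).toLowerCase().contains("google")) {
            // Keyword heard: hand off to your "real" recognition or UI here.
        }
        recognizer.startListening(recognizerIntent);  // listen again for the next utterance
    }

    @Override
    public void onError(int error) {
        // ERROR_NO_MATCH, ERROR_SPEECH_TIMEOUT, etc.: just start over
        recognizer.startListening(recognizerIntent);
    }

    @Override
    public void onDestroy() {
        recognizer.destroy();
        super.onDestroy();
    }

    @Override public IBinder onBind(Intent intent) { return null; }

    // The remaining RecognitionListener callbacks are not needed for this sketch.
    @Override public void onReadyForSpeech(Bundle params) {}
    @Override public void onBeginningOfSpeech() {}
    @Override public void onRmsChanged(float rmsdB) {}
    @Override public void onBufferReceived(byte[] buffer) {}
    @Override public void onEndOfSpeech() {}
    @Override public void onPartialResults(Bundle partialResults) {}
    @Override public void onEvent(int eventType, Bundle params) {}
}

Be aware that continuously restarting the online recognizer like this drains the battery and plays the recognition beep on some devices, which is why an offline keyword spotter such as CMUSphinx is usually the better fit for a wake word.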
Use the CMUSphinx (pocketsphinx-android) library: it works offline, and instead of a button you define a keyphrase ("wake word") that triggers the recognition module. You can find the full source code in the link below.
1) It works in offline mode
2) You can choose your own wake word
3) It starts listening when you say that wake word
The snippet below shows the relevant parts of an activity based on the pocketsphinx-android demo (the activity implements edu.cmu.pocketsphinx.RecognitionListener):
/* Named searches: one that listens for the wake word, one for the actual commands */
private static final String KWS_SEARCH = "wakeup";
private static final String MENU_SEARCH = "menu";

/* The keyphrase ("wake word") that activates recognition */
private static final String KEYPHRASE = "ok computer";

private static final int PERMISSIONS_REQUEST_RECORD_AUDIO = 1;

private SpeechRecognizer recognizer;
private HashMap<String, Integer> captions;
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
            WindowManager.LayoutParams.FLAG_FULLSCREEN);

    // Captions shown while each search is active
    captions = new HashMap<String, Integer>();
    captions.put(KWS_SEARCH, R.string.kws_caption);
    captions.put(MENU_SEARCH, R.string.menu_caption);
    setContentView(R.layout.activity_main);

    // Recording audio needs a runtime permission on Android 6.0+;
    // the result is delivered to onRequestPermissionsResult() below
    int permissionCheck = ContextCompat.checkSelfPermission(
            getApplicationContext(), Manifest.permission.RECORD_AUDIO);
    if (permissionCheck != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.RECORD_AUDIO},
                PERMISSIONS_REQUEST_RECORD_AUDIO);
        return;
    }
    runRecognizerSetup();
}
private void runRecognizerSetup() {
    // Recognizer initialization is time-consuming and involves disk IO,
    // so we run it in an AsyncTask off the UI thread
    new AsyncTask<Void, Void, Exception>() {
        @Override
        protected Exception doInBackground(Void... params) {
            try {
                // Copy the acoustic model and dictionary from the APK assets to storage
                Assets assets = new Assets(MainActivity.this);
                File assetDir = assets.syncAssets();
                setupRecognizer(assetDir);
            } catch (IOException e) {
                return e;
            }
            return null;
        }

        @Override
        protected void onPostExecute(Exception result) {
            if (result != null) {
                ((TextView) findViewById(R.id.caption_text))
                        .setText("Failed to init recognizer " + result);
            } else {
                // Start with the wake-word search
                switchSearch(KWS_SEARCH);
            }
        }
    }.execute();
}
@Override
public void onRequestPermissionsResult(int requestCode,
                                        String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);

    if (requestCode == PERMISSIONS_REQUEST_RECORD_AUDIO) {
        if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            runRecognizerSetup();
        } else {
            // Without the microphone permission the recognizer is useless
            finish();
        }
    }
}
@Override
public void onResult(Hypothesis hypothesis) {
    // Final result of the current search: show it as a toast
    ((TextView) findViewById(R.id.result_text)).setText("");
    if (hypothesis != null) {
        String text = hypothesis.getHypstr();
        Toast.makeText(getApplicationContext(), text, Toast.LENGTH_SHORT).show();
    }
}
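
The snippet above also needs the pieces it references but does not show: setupRecognizer(), switchSearch(), and the onPartialResult() callback where the keyphrase is actually detected. Here is a sketch of those, closely following the pocketsphinx-android demo ("en-us-ptm", "cmudict-en-us.dict" and "menu.gram" are the demo's bundled model assets, so adjust the names to whatever you ship with your app):

private void setupRecognizer(File assetsDir) throws IOException {
    recognizer = SpeechRecognizerSetup.defaultSetup()
            .setAcousticModel(new File(assetsDir, "en-us-ptm"))
            .setDictionary(new File(assetsDir, "cmudict-en-us.dict"))
            .getRecognizer();
    recognizer.addListener(this);

    // Wake-word search: onPartialResult fires as soon as KEYPHRASE is heard
    recognizer.addKeyphraseSearch(KWS_SEARCH, KEYPHRASE);
    // Grammar search used for the actual commands after the wake word
    recognizer.addGrammarSearch(MENU_SEARCH, new File(assetsDir, "menu.gram"));
}

private void switchSearch(String searchName) {
    recognizer.stop();
    if (searchName.equals(KWS_SEARCH))
        recognizer.startListening(searchName);          // listen indefinitely for the wake word
    else
        recognizer.startListening(searchName, 10000);   // stop after 10 s of silence

    String caption = getResources().getString(captions.get(searchName));
    ((TextView) findViewById(R.id.caption_text)).setText(caption);
}

@Override
public void onPartialResult(Hypothesis hypothesis) {
    if (hypothesis == null)
        return;
    String text = hypothesis.getHypstr();
    if (text.equals(KEYPHRASE))
        switchSearch(MENU_SEARCH);   // wake word heard: switch to the command search
    else
        ((TextView) findViewById(R.id.result_text)).setText(text);
}

@Override
public void onEndOfSpeech() {
    // When a command utterance ends, go back to listening for the wake word
    if (!recognizer.getSearchName().equals(KWS_SEARCH))
        switchSearch(KWS_SEARCH);
}

The remaining edu.cmu.pocketsphinx.RecognitionListener callbacks (onBeginningOfSpeech, onError, onTimeout) can stay essentially empty, apart from switching back to KWS_SEARCH in onTimeout, and you should call recognizer.cancel() and recognizer.shutdown() in onDestroy().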