Keyword Activated Speech Recognition on Android

Can I use the Google Now voice-activation feature in my own app?

I don't want the user to trigger activation by pressing a button.

I would like automatic speech recognition to be activated by a keyword. For example, when Google Now is running, you only have to say "google"; after this command, the system listens for the actual input.

Is this possible using the Android API? Or is there an open source library that provides this behavior?

I know that this is possible with OpenEars, but unfortunately OpenEars is not available for Android.

3 answers

You should run speech recognition as a Service, not as an Activity.

Check out this repository for sample code showing how to do this: https://github.com/gast-lib/gast-lib
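As a rough sketch of the idea (the class name, keyword, and structure here are my own, not taken from the gast-lib repo), a Service can host Android's `SpeechRecognizer` and restart it after every result so it stays "always on":

```java
import android.app.Service;
import android.content.Intent;
import android.os.Bundle;
import android.os.IBinder;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

// Sketch only: a Service that keeps SpeechRecognizer listening for a keyword.
// Requires the RECORD_AUDIO permission and a device with a recognition
// service installed (e.g. the Google app).
public class KeywordSpotterService extends Service implements RecognitionListener {
    private static final String KEYWORD = "google"; // hypothetical trigger word
    private SpeechRecognizer recognizer;

    @Override
    public void onCreate() {
        super.onCreate();
        recognizer = SpeechRecognizer.createSpeechRecognizer(this);
        recognizer.setRecognitionListener(this);
        listen();
    }

    private void listen() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        recognizer.startListening(intent);
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        if (matches != null && matches.contains(KEYWORD)) {
            // Keyword heard: hand off to the "real" recognition flow here.
        }
        listen(); // restart so the service keeps listening
    }

    @Override
    public void onError(int error) {
        listen(); // timeouts and no-match errors are routine; just restart
    }

    // Remaining RecognitionListener callbacks are not needed for this sketch.
    @Override public void onReadyForSpeech(Bundle params) { }
    @Override public void onBeginningOfSpeech() { }
    @Override public void onRmsChanged(float rmsdB) { }
    @Override public void onBufferReceived(byte[] buffer) { }
    @Override public void onEndOfSpeech() { }
    @Override public void onPartialResults(Bundle partialResults) { }
    @Override public void onEvent(int eventType, Bundle params) { }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }

    @Override
    public void onDestroy() {
        recognizer.destroy();
        super.onDestroy();
    }
}
```

Be aware that continuous listening like this drains the battery, and the stock recognizer may play a notification sound on each restart; that is one reason the offline CMUSphinx approach below is often preferred for wake-word detection.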


I would suggest using CMUSphinx, or simply restarting the recognizer in the onResults and onError callbacks.
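A minimal sketch of the restart idea (the `recognizer` and `recognizerIntent` fields are illustrative names, assumed to be initialized elsewhere in your Activity): in whatever `RecognitionListener` you register, kick off a new listening session from both callbacks:

```java
// Sketch: restart android.speech.SpeechRecognizer from its callbacks so it
// keeps listening continuously.
@Override
public void onResults(Bundle results) {
    // ...handle the recognized text here, then listen again...
    recognizer.startListening(recognizerIntent);
}

@Override
public void onError(int error) {
    // ERROR_NO_MATCH and ERROR_SPEECH_TIMEOUT occur constantly when
    // listening continuously, so treat errors as "try again".
    recognizer.startListening(recognizerIntent);
}
```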


Use the CMUSphinx library; it works offline. No button press is needed: you register a keyphrase with the recognition module, and saying that phrase activates recognition. In the link below you can find the full source code.

1) It works offline. 2) You can assign it a keyphrase. 3) It starts listening when you say that keyphrase.

// Search names: KWS_SEARCH is the keyword-spotting search, MENU_SEARCH a
// grammar search (values as in the PocketSphinx Android demo).
private static final String KWS_SEARCH = "wakeup";
private static final String MENU_SEARCH = "menu";
private static final String KEYPHRASE = "ok computer";
private static final int PERMISSIONS_REQUEST_RECORD_AUDIO = 1;

private SpeechRecognizer recognizer;
private HashMap<String, Integer> captions;

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
            WindowManager.LayoutParams.FLAG_FULLSCREEN);

    captions = new HashMap<String, Integer>();
    captions.put(KWS_SEARCH, R.string.kws_caption);
    captions.put(MENU_SEARCH, R.string.menu_caption);
    setContentView(R.layout.activity_main);
}

private void runRecognizerSetup() {
    // Recognizer initialization is time-consuming and involves IO,
    // so we execute it in an async task.
    new AsyncTask<Void, Void, Exception>() {
        @Override
        protected Exception doInBackground(Void... params) {
            try {
                Assets assets = new Assets(MainActivity.this);
                File assetDir = assets.syncAssets();
                setupRecognizer(assetDir);
            } catch (IOException e) {
                return e;
            }
            return null;
        }

        @Override
        protected void onPostExecute(Exception result) {
            if (result != null) {
                ((TextView) findViewById(R.id.caption_text))
                        .setText("Failed to init recognizer " + result);
            } else {
                switchSearch(KWS_SEARCH);
            }
        }
    }.execute();
}

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions,
        int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == PERMISSIONS_REQUEST_RECORD_AUDIO) {
        if (grantResults.length > 0
                && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            runRecognizerSetup();
        } else {
            finish();
        }
    }
}

// Callback from edu.cmu.pocketsphinx.RecognitionListener
@Override
public void onResult(Hypothesis hypothesis) {
    ((TextView) findViewById(R.id.result_text)).setText("");
    if (hypothesis != null) {
        String text = hypothesis.getHypstr();
        Toast.makeText(getApplicationContext(), text, Toast.LENGTH_SHORT).show();
    }
}
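The snippet above calls `setupRecognizer()` and `switchSearch()`, which are not shown. In the pocketsphinx-android demo they look roughly like this (the acoustic model and dictionary file names are the demo's defaults and may differ in your synced assets):

```java
// Sketch based on the pocketsphinx-android demo: build the recognizer from
// the synced asset directory and register the keyphrase search.
private void setupRecognizer(File assetsDir) throws IOException {
    recognizer = SpeechRecognizerSetup.defaultSetup()
            .setAcousticModel(new File(assetsDir, "en-us-ptm"))
            .setDictionary(new File(assetsDir, "cmudict-en-us.dict"))
            .getRecognizer();
    recognizer.addListener(this);

    // Keyword-spotting search: fires when KEYPHRASE ("ok computer") is heard.
    recognizer.addKeyphraseSearch(KWS_SEARCH, KEYPHRASE);
}

private void switchSearch(String searchName) {
    recognizer.stop();
    if (searchName.equals(KWS_SEARCH)) {
        // Keyword spotting runs indefinitely until the phrase is detected.
        recognizer.startListening(searchName);
    } else {
        // Other searches stop after 10 seconds of input.
        recognizer.startListening(searchName, 10000);
    }
}
```

In the demo, the `onPartialResult` callback checks whether the hypothesis equals `KEYPHRASE` and, if so, calls `switchSearch()` with another search name; that is the moment the app stops waiting for the wake word and starts listening for the actual command.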

Source: https://habr.com/ru/post/945196/

