Background
I am working on an application that can respond to certain kinds of requests (phone numbers, and possibly others).
Google introduced a new feature on Android 6 called "Google Now On Tap" (AKA "Assist API"), which lets the user request information about what is currently shown on the screen (by long-pressing the home button or by voice command), without having to type anything.
Google provided a tutorial for developers here.
Problem
I can't find any code snippet showing how to prepare an application for this feature.
The only thing I've noticed is that I can extend the Application class and register an OnProvideAssistDataListener inside it (see the sketch below).
But this raises many questions about how to do it.
Unfortunately, because this topic is so new, I can hardly find anything about it, so I'd like to ask my questions here.
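Here is a minimal sketch of what I mean, assuming a custom Application subclass (the class name, comments, and the bundle key are my own placeholders, not from the docs):

    import android.app.Activity;
    import android.app.Application;
    import android.os.Bundle;

    public class MyApp extends Application {
        @Override
        public void onCreate() {
            super.onCreate();
            // Called whenever the user triggers the assistant while one of
            // this app's activities is in the foreground:
            registerOnProvideAssistDataListener(new OnProvideAssistDataListener() {
                @Override
                public void onProvideAssistData(Activity activity, Bundle data) {
                    // Presumably extra context for the assistant goes into
                    // this bundle; the key and value are placeholders.
                    data.putString("placeholder_key", "placeholder value");
                }
            });
        }
    }

Is that the right place to start, and what is the bundle supposed to contain?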
Questions
1) Is there any sample, or at least a more detailed tutorial, for this new feature?
2) The documents say:
In most cases, implementing accessibility support will enable assistants to obtain the information they need. This includes providing android:contentDescription attributes, populating AccessibilityNodeInfo for custom views, making sure custom ViewGroups correctly expose their children, and following the best practices described in Making Applications Accessible.
Why and how does this work with the app's accessibility features? What does it have to do with exposing child views (or views at all)? How can this even relate to views if the app isn't running yet (since the feature can be triggered from any app, anywhere)?
I guess this only applies when the foreground app is my own, but if so, how can I offer to handle queries that appear in all other apps, depending on the content?
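To make question 2 concrete, here is how I currently read the accessibility advice, as a sketch (the custom view and its content are invented by me):

    import android.content.Context;
    import android.view.View;
    import android.view.accessibility.AccessibilityNodeInfo;

    // A custom view that draws a phone number itself rather than using a
    // TextView, so the text is invisible to an assistant unless it is
    // exposed through the accessibility APIs:
    public class PhoneNumberView extends View {
        private final String number = "+1-555-0100";

        public PhoneNumberView(Context context) {
            super(context);
            // Plain-text description that accessibility services can read:
            setContentDescription("Phone number: " + number);
        }

        @Override
        public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
            super.onInitializeAccessibilityNodeInfo(info);
            // Expose the drawn text, since this is not a TextView:
            info.setText(number);
        }
    }

Is this the kind of accessibility support the docs mean, and is it what Google-Now-On-Tap actually reads?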
3) Is the class that extends Application supposed to implement OnProvideAssistDataListener? If so, why does it need to register it? If not, how does Google-Now-On-Tap work with it? It can't just open every app that has such a class and check whether it's registered...
4) There is a sample snippet in the docs that I don't understand:
    @Override
    public void onProvideAssistContent(AssistContent assistContent) {
        super.onProvideAssistContent(assistContent);
        try {
            String structuredJson = new JSONObject()
                    .put("@type", "MusicRecording")
                    .put("@id", "https://example.com/music/recording")
                    .put("name", "Album Title")
                    .toString();
            assistContent.setStructuredData(structuredJson);
        } catch (JSONException e) {
            // JSONObject.put() declares JSONException, so the snippet
            // needs this (or a rethrow) to compile.
        }
    }
What does this overridden function do with each key? Is it consumed by the app or by Google-Now-On-Tap? What are my options here? Is this where I determine whether my application can handle the content that this feature offers me? Or is AssistContent what I should examine to decide whether my app can handle it or ignore it?
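As far as I can tell, the "@type"/"@id"/"name" keys come from the schema.org vocabulary expressed as JSON-LD, but I don't know what the assistant actually does with them. For completeness, the only other related call I could find on AssistContent is setWebUri(); here is a sketch of how I imagine using it alongside setStructuredData() (the activity name and URL are placeholders, and I'm not sure this is the intended usage):

    import android.app.Activity;
    import android.app.assist.AssistContent;
    import android.net.Uri;

    public class MusicActivity extends Activity {
        @Override
        public void onProvideAssistContent(AssistContent assistContent) {
            super.onProvideAssistContent(assistContent);
            // Supposedly gives the assistant a public deep link for the
            // content currently on screen:
            assistContent.setWebUri(Uri.parse("https://example.com/music/recording"));
        }
    }

Is setting a web URI and/or structured data all that "preparing an app for Now On Tap" amounts to?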