How to provide content to be shown on Google Now On Tap?

Background

I work on an app that can answer certain queries (phone-number queries, and maybe others).

Google introduced a new feature in Android 6 called "Google Now On Tap" (AKA the "Assist API"), which lets the user query things shown on the screen (triggered by long-pressing the home button or by saying something) without having to type anything.

Google provided a developer tutorial for it, here

The problem

I can't find any code snippet showing how to prepare an app for it.

The only thing I've noticed is that I can extend the Application class, add an OnProvideAssistDataListener inside it, and register it, as in the sketch below.
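
This is a minimal sketch of what I mean (MyApplication and the "com.example.EXTRA_NOTE" key are placeholder names that I made up):

import android.app.Activity;
import android.app.Application;
import android.os.Bundle;

public class MyApplication extends Application {

  @Override
  public void onCreate() {
    super.onCreate();
    // The listener gets called whenever the system collects assist data
    // (e.g. when the user long-presses the home button) while one of this
    // app's activities is in the foreground.
    registerOnProvideAssistDataListener(new OnProvideAssistDataListener() {
      @Override
      public void onProvideAssistData(Activity activity, Bundle data) {
        // "data" is the bundle that gets handed to the assistant; anything
        // put here supplements the data the system collects on its own.
        data.putString("com.example.EXTRA_NOTE", "extra context for the assistant");
      }
    });
  }
}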

But this opens a lot of questions about how to do it.

Sadly, because this topic is so new, I can find almost nothing about it, so I'd like to ask my questions here.

The questions

1) Is there any sample, or at least a more detailed tutorial, for this new feature?

2) The docs say:

In most cases, implementing accessibility support will enable the assistant to obtain the information it needs. This includes providing android:contentDescription attributes, populating AccessibilityNodeInfo for custom views, making sure custom ViewGroups correctly expose their children, and following the best practices described in “Making Applications Accessible”.
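
If I understand correctly, "populating AccessibilityNodeInfo for custom views" means something like the following sketch (PhoneNumberView is a made-up custom view from my app's domain):

import android.content.Context;
import android.util.AttributeSet;
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;

// A fully custom view that draws its text itself, so it has to expose that
// text explicitly to accessibility consumers (which, per the docs, is the
// same channel the assistant reads from).
public class PhoneNumberView extends View {

  private String phoneNumber = "+1-555-0100";

  public PhoneNumberView(Context context, AttributeSet attrs) {
    super(context, attrs);
  }

  @Override
  public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
    super.onInitializeAccessibilityNodeInfo(info);
    // Without this, custom-drawn text is invisible to accessibility
    // services and, presumably, to Now-On-Tap as well.
    info.setText(phoneNumber);
  }
}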

Why and how does this work with the app's accessibility features? What does it have to do with exposing child views (or views at all)? How can it even be about views if my app isn't running yet (since the feature can be triggered on any app, anywhere)?

My guess is that this is called only when my app is the foreground app, but if that's the case, how can I offer queries that appear for all apps, depending on the input?

3) Is the class that extends Application supposed to implement OnProvideAssistDataListener? If so, why does it also need to register it? If not, how can Google-Now-On-Tap work with it? It can't just open every app that has such a class and check whether it registers...

4) The docs have a sample snippet that I don't understand:

@Override
public void onProvideAssistContent(AssistContent assistContent) {
  super.onProvideAssistContent(assistContent);

  String structuredJson = new JSONObject()
       .put("@type", "MusicRecording")
       .put("@id", "https://example.com/music/recording")
       .put("name", "Album Title")
       .toString();

  assistContent.setStructuredData(structuredJson);
}

What does the new feature do with each key? Is it used by the app or by Google-Now-On-Tap? What are my options here? Is this where I define whether my app can handle the content the feature suggests to me? Is AssistContent supposed to be the input that I look at to decide whether my app can handle it or should ignore it?
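
For reference, as far as I can tell the snippet doesn't even compile as-is, because JSONObject.put() declares the checked JSONException. This is the compilable version I'm experimenting with (CallerIdActivity and the "Person"/"telephone" keys are just my guesses for my phone-number use case):

import android.app.Activity;
import android.app.assist.AssistContent;

import org.json.JSONException;
import org.json.JSONObject;

public class CallerIdActivity extends Activity {

  @Override
  public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    try {
      // Describe the entity currently on screen using schema.org vocabulary.
      String structuredJson = new JSONObject()
          .put("@type", "Person")
          .put("name", "John Doe")
          .put("telephone", "+1-555-0100")
          .toString();
      assistContent.setStructuredData(structuredJson);
    } catch (JSONException e) {
      // put() declares JSONException, so it must be handled, even though
      // building a constant object like this can't realistically fail.
    }
  }
}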