Simulate Touch Controls Through Code

Posted 2019-02-15 04:38

I'm trying to make it possible to navigate through my Google Glass application by using head gestures. I'm able to recognize head gestures like looking to the right, left, and up. Each gesture has its own method for what to do when it is recognized.

Now I need to simulate the corresponding touch gestures inside each method, so the system will think I'm swiping to the left or right, which will allow me to navigate through the cards with head gestures.

Does anyone have any idea on how to actually achieve this?


Edit

I created a quick hello world application to play with. I added my head gesture code and started trying to get the keys working.

I added the following to my onCreate()

Instrumentation instr = new Instrumentation();

Then I added the following lines to each respective head gesture method:

  • Head gesture upwards should correspond with tapping the touchpad: instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_CENTER);
  • Head gesture to the left should correspond with swiping left on the touchpad: instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_LEFT);
  • Head gesture to the right should correspond with swiping right on the touchpad: instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_RIGHT);

They are responding accordingly now; however, I'm getting an exception saying:

 java.lang.RuntimeException: This method can not be called from the main application thread

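For reference, here is a minimal sketch of how these pieces fit together (the activity and callback names are illustrative, not from the Glass GDK; the Instrumentation instance is held in a field so the gesture callbacks can reach it):

import android.app.Activity;
import android.app.Instrumentation;
import android.os.Bundle;
import android.view.KeyEvent;

public class HeadGestureActivity extends Activity {

    private Instrumentation instr;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        instr = new Instrumentation();
    }

    // Illustrative callback names; the real ones depend on the gesture detector used.
    private void onHeadGestureUp() {
        // Calling this directly from a UI callback runs on the main thread,
        // which is what triggers the RuntimeException above.
        instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_CENTER);
    }

    private void onHeadGestureLeft() {
        instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_LEFT);
    }

    private void onHeadGestureRight() {
        instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_RIGHT);
    }
}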
2 Answers

女痞 · 2019-02-15 04:43

Using the Instrumentation class will work if you call the sendKeyDownUpSync method from a separate thread.

This can be done using the following steps:

  1. Create and start a thread from your activity
  2. In the run method, use the Looper class and create a Handler as explained here
  3. Every time you want to call sendKeyDownUpSync, post a Runnable instance to the Handler, which calls sendKeyDownUpSync in its run method.

A similar code sample (not from me) is available here; a minimal sketch of the same idea follows below.
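This is one way the steps could look in code, using a HandlerThread (which bundles the thread and Looper from steps 1 and 2); the class and method names are mine, not from the original answer:

import android.app.Instrumentation;
import android.os.Handler;
import android.os.HandlerThread;

public class KeyInjector {

    private final Instrumentation instrumentation = new Instrumentation();
    private final Handler handler;

    public KeyInjector() {
        // HandlerThread creates and starts a worker thread with its own Looper (steps 1 and 2).
        HandlerThread thread = new HandlerThread("key-injector");
        thread.start();
        handler = new Handler(thread.getLooper());
    }

    // Step 3: post a Runnable that performs the key press off the main thread.
    public void sendKey(final int keyCode) {
        handler.post(new Runnable() {
            @Override
            public void run() {
                instrumentation.sendKeyDownUpSync(keyCode);
            }
        });
    }
}

Each head gesture method can then call, for example, sendKey(KeyEvent.KEYCODE_DPAD_LEFT), and the key press is injected without touching the main thread.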

太酷不给撩 · 2019-02-15 05:00

The Solution

In the end I went a different direction than the one I mentioned in my edit above.

I found out that it is possible to inject these touch controls from the shell by using

adb shell input keyevent <keycode here>

I then found a way to use this from within Android. I have the following class, named issueKey:

import android.util.Log;

public class issueKey {
    // Runs the "input keyevent <code>" shell command to inject the key press.
    public void issueKey(int keyCode) {
        try {
            Process p = Runtime.getRuntime().exec("input keyevent " + Integer.toString(keyCode) + "\n");
        } catch (Exception e) {
            Log.wtf("IssueKeyError", e.getMessage());
        }
    }
}

Then I simply call the method and pass the keycode for the corresponding gesture:

mIssueKey.issueKey(4); // functions as swipe down

Here is the list of keycodes that I tested, for anyone who is interested (the equivalent KeyEvent constants are sketched after the list).

Keys for each respective button/gesture

  • 4: Swipe Down
  • 21: Swipe Left
  • 22: Swipe Right
  • 23: Tap
  • 24: Volume Up
  • 25: Volume Down
  • 26: Lock/Unlock Screen
  • 27: Camera Button

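For readability, the same calls can be written with the named KeyEvent constants instead of raw numbers (these are the standard Android keycode values; mIssueKey is the issueKey instance from above):

mIssueKey.issueKey(KeyEvent.KEYCODE_BACK);         // 4  - swipe down
mIssueKey.issueKey(KeyEvent.KEYCODE_DPAD_LEFT);    // 21 - swipe left
mIssueKey.issueKey(KeyEvent.KEYCODE_DPAD_RIGHT);   // 22 - swipe right
mIssueKey.issueKey(KeyEvent.KEYCODE_DPAD_CENTER);  // 23 - tap
mIssueKey.issueKey(KeyEvent.KEYCODE_VOLUME_UP);    // 24 - volume up
mIssueKey.issueKey(KeyEvent.KEYCODE_VOLUME_DOWN);  // 25 - volume down
mIssueKey.issueKey(KeyEvent.KEYCODE_POWER);        // 26 - lock/unlock screen
mIssueKey.issueKey(KeyEvent.KEYCODE_CAMERA);       // 27 - camera button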
However, what I'm wondering now is: which would be better practice, getting the solution I mentioned in my edit to work by using an AsyncTask, or the solution I'm currently using?
