Sunday 30 March 2014

Android Wear Samples In Initial Preview



Before reading this post, prepare the SDK as shown in Android Wear 101 so that you can run the samples on your own.

These samples run on a phone or tablet only. The emulated wearable device receives the notifications and lets us perform some actions. Future versions are expected to do more, as explained in Android Wear - Beyond Initial Preview.

As mentioned in the 101 post, apart from the usual files (a license and a dummy readme), the preview SDK has the wearable-preview-support.jar and a few samples:
  • ElizaChat
  • RecipeAssistant
  • WearableNotificationsSample
  • prebuilt-libs/wearable-preview-support.jar (again within samples folder)


In the Eclipse-based IDE you downloaded as part of the Android SDK/ADT bundle (not the Wear SDK), choose 
File > Import > Android > Existing Android Code into Workspace


Now choose a sample application you unzipped from the Wear SDK. Change the name for each project as shown in the screenshot.
Tip: Also remember to tick 'Copy projects into workspace'. Otherwise, any edits you make will modify the original sample. 
In the current preview, the sources are in a "java" folder and not under the default "src" folder expected by Eclipse. Rectify this by dragging the contents of the java folder into src, or by right-clicking on java and choosing Build Path > Use as Source Folder.

The project will still show errors, as we need to manually bring in two libraries.

Create a folder called libs in the project at top level (parallel to src). 

Copy wearable-preview-support.jar from the Android Wear SDK into libs.
Copy android-support-v4.jar from the Android SDK's extras/android/support/v4/ folder into libs.

Now right-click on these jars one after the other and choose Build Path > Add to Build Path.
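If all went well, the project's .classpath file should end up with entries roughly like this sketch (exact contents vary with your ADT version; shown only to illustrate what the two steps do):

    <?xml version="1.0" encoding="UTF-8"?>
    <classpath>
        <!-- the folder marked via Use as Source Folder (or src, if you dragged files there) -->
        <classpathentry kind="src" path="src"/>
        <classpathentry kind="src" path="gen"/>
        <!-- the two jars we copied into libs and added to the build path -->
        <classpathentry kind="lib" path="libs/wearable-preview-support.jar"/>
        <classpathentry kind="lib" path="libs/android-support-v4.jar"/>
        <classpathentry kind="con" path="com.android.ide.eclipse.adt.ANDROID_FRAMEWORK"/>
        <output path="bin/classes"/>
    </classpath>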

The source is ready now, but before running, ensure that 

  • the Android Wear emulator instance is running as explained in 101
  • the device running Android Wear Preview application is running as explained in 101
  • the two are connected using adb tcp port forwarding as explained in 101 (see the command below)
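For reference, the port forwarding command documented for the preview looks like this (run it each time you reconnect the device; 5601 is the port the Wear emulator pairs on):

    adb -d forward tcp:5601 tcp:5601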

Now right-click on any one of the projects we just imported and run it as an Android application. The resulting dialog will ask for a device/emulator instance to run it on. Choose to run the application on the phone/tablet, NOT the wearable or the wearable's emulator instance.

Here we show the ElizaChat sample application.

Tip: Disable screen lock on your phone/tablet, or at least change the timeout to several minutes. The ElizaChat application sends notifications to the wearable when the activity is visible (resumed state) and cancels them when the screen is locked (paused state).

On launching the application on the phone, it triggers a notification on the wearable's home screen. 
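A minimal sketch of how such an activity-scoped notification can be posted and cancelled, using only the standard support-v4 builder (the real sample also routes through the preview jar's wearable classes; the ID and strings here are illustrative):

    import android.app.Activity;
    import android.app.NotificationManager;
    import android.content.Context;
    import android.support.v4.app.NotificationCompat;

    public class ChatActivity extends Activity {
        private static final int NOTIFICATION_ID = 1; // illustrative

        @Override
        protected void onResume() {
            super.onResume();
            // Post the notification while the activity is visible...
            NotificationCompat.Builder builder = new NotificationCompat.Builder(this)
                    .setSmallIcon(android.R.drawable.ic_dialog_email)
                    .setContentTitle("Eliza")
                    .setContentText("HEY THERE, HOW CAN I HELP YOU?");
            notificationManager().notify(NOTIFICATION_ID, builder.build());
        }

        @Override
        protected void onPause() {
            super.onPause();
            // ...and cancel it when the activity is paused (e.g. screen lock)
            notificationManager().cancel(NOTIFICATION_ID);
        }

        private NotificationManager notificationManager() {
            return (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);
        }
    }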

On the wearable, 

  • if not on the home screen already, click on top to navigate to the home screen
  • drag the notification up to activate it 
  • hope you read Eliza's offer, "HEY THERE, HOW CAN I HELP YOU?"
  • drag the notification to the right to see actions 
Observe that the navigation cue dots or dashes at the bottom indicate the position and number of screens within each notification.

In ElizaChat, the only action is reply. Choose to reply. 
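Going by the preview documentation, such a reply action is attached via a RemoteInput from wearable-preview-support.jar. Here is a sketch of the shape of that code; treat the package and method names (android.preview.support.wearable.notifications, addRemoteInputForContentIntent) as preview-era details that may well change in the final SDK:

    import android.app.Notification;
    import android.preview.support.wearable.notifications.RemoteInput;
    import android.preview.support.wearable.notifications.WearableNotifications;
    import android.support.v4.app.NotificationCompat;

    public class ReplyNotifications {
        // Illustrative key under which the reply text comes back to the app
        static final String EXTRA_VOICE_REPLY = "extra_voice_reply";

        static Notification withReply(NotificationCompat.Builder base) {
            // Describe the input we expect (spoken on real hardware, typed in the emulator)
            RemoteInput remoteInput = new RemoteInput.Builder(EXTRA_VOICE_REPLY)
                    .setLabel("Reply")
                    .build();
            // Wrap the ordinary notification and attach the remote input
            return new WearableNotifications.Builder(base)
                    .addRemoteInputForContentIntent(remoteInput)
                    .build();
        }
    }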


You'll observe in adb -e logcat, or in the DDMS perspective's logcat, that a fake voice recognizer (FakeRecognitionService) has been launched. 

Tidbit: There also seems to be a HotwordRecognizerRunner, which is responsible for identifying the hot word ("OK Google") while on the home screen with no activated notifications. It is paused whenever a notification is chosen/activated. But since voice actions are disabled in this preview, we don't see it in full glory. Moreover, it relies on a hardware component that needs to work day in and day out without draining the battery much. Motorola has already demonstrated this capability in their phones, which actively listen for the hot word all through the day. Google Glass too has this feature. Now it is being promoted for wearables so that they can be used without touching the screen, even for the initial trigger.
Anyway, now FakeRecognitionService is active. It pretends to listen for voice input and eventually shows the respective confirmation action (save/edit). Most importantly, it times out just as a real voice input dialog would. But since it is fake, we need to type (instead of speak) before it times out. 


I typed in "No you can't I am beyond repair"
At the end of this, the chat application on the phone shows our response and Eliza's replies in the history view. 

What we don't get to see on the phone is that the notification on the wearable is also updated with Eliza's new response. Drag to the right to go back to the notification's first screen and you'll be rewarded with the response (which in my case was "DID YOU COME TO ME BECAUSE YOU ARE BEYOND REPAIR").
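Updating works the usual Android way: re-issuing notify() with the same ID replaces the existing notification in place, on the phone and on the bridged wearable. A sketch, reusing the illustrative ID from above:

    import android.app.NotificationManager;
    import android.content.Context;
    import android.support.v4.app.NotificationCompat;

    public class ElizaUpdates {
        // Re-posting with an unchanged ID updates rather than duplicates
        static void showElizaReply(Context context, String elizaReply) {
            NotificationCompat.Builder updated = new NotificationCompat.Builder(context)
                    .setSmallIcon(android.R.drawable.ic_dialog_email)
                    .setContentTitle("Eliza")
                    .setContentText(elizaReply);
            NotificationManager nm = (NotificationManager)
                    context.getSystemService(Context.NOTIFICATION_SERVICE);
            nm.notify(1 /* NOTIFICATION_ID from before */, updated.build());
        }
    }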

This is a highly simplified version of MIT's ELIZA. Don't expect it to replace your therapist. Well actually, nobody needs one. 
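For the curious, ELIZA-style chat is little more than keyword matching with canned reflections. A toy illustration of the idea, not the sample's actual ElizaResponder logic:

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ToyEliza {
        // Keyword -> response template, checked in insertion order
        private static final Map<String, String> RULES = new LinkedHashMap<String, String>();
        static {
            RULES.put("I AM", "DID YOU COME TO ME BECAUSE YOU ARE %s?");
            RULES.put("YOU", "WE WERE DISCUSSING YOU, NOT ME.");
        }

        public static String respond(String input) {
            String upper = input.toUpperCase();
            for (Map.Entry<String, String> rule : RULES.entrySet()) {
                int at = upper.indexOf(rule.getKey());
                if (at >= 0) {
                    // Reflect the rest of the user's sentence back at them
                    String tail = upper.substring(at + rule.getKey().length()).trim();
                    return String.format(rule.getValue(), tail);
                }
            }
            return "PLEASE GO ON.";
        }
    }

    // ToyEliza.respond("No you can't I am beyond repair")
    // -> "DID YOU COME TO ME BECAUSE YOU ARE BEYOND REPAIR?"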

Similarly, try out the other samples and reuse their code in your own Android applications that run on a phone/tablet. 

And of course, there's more coming soon.
