Nov 20, 2014

Introducing Android Network Intents

Android Network Intents is a library that I wrote for Lands of Ruin - a game that two friends and I are developing. To avoid a complicated network setup to play the game against a friend, we needed a way to discover games running on the local network. Android has offered Network Service Discovery (NSD) since API level 16 (Android 4.1), but we kept running into problems using it. This led to writing this library.

What does the library do?
The library allows you to send Intents to listening clients on the local network (WiFi) without knowing who these clients are. Sender and receiver do not need to connect to each other. Therefore the library can be used to write custom discovery protocols.

Sending Intents (Transmitter)
An Intent is sent by using the Transmitter class. A TransmitterException is thrown in case of error.

Sending an Intent
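A minimal sketch of the sending side (the constructor arguments and the action/extra names are our own; the exact signatures may differ from the library):

```java
// Build the Intent to broadcast on the local network
Intent intent = new Intent();
intent.setAction("com.example.game.DISCOVER");   // hypothetical action
intent.putExtra("server", "Living room game");   // hypothetical extra

try {
    Transmitter transmitter = new Transmitter();
    transmitter.transmit(intent);
} catch (TransmitterException e) {
    // Sending failed, e.g. because there is no WiFi connection
}
```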

Receiving Intents (Receiver)
Intents are received using the Discovery class. Once started by calling enable(), the Discovery class will spawn a background thread that waits for incoming Intent objects. A DiscoveryListener instance will be notified about every incoming Intent.

Writing a DiscoveryListener to receive events.
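A sketch of a listener implementation (the callback names are assumptions based on the description above and may differ from the actual interface):

```java
public class GameDiscoveryListener implements DiscoveryListener {
    @Override
    public void onDiscoveryStarted() {
        // The background thread is up and waiting for packets
    }

    @Override
    public void onDiscoveryStopped() {
        // disable() has been called
    }

    @Override
    public void onDiscoveryError(Exception exception) {
        // Something went wrong, e.g. the socket could not be opened
    }

    @Override
    public void onIntentDiscovered(InetAddress address, Intent intent) {
        // Called for every Intent received from the network.
        // Note: this runs on the background thread.
    }
}
```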

Starting and stopping the discovery.
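Roughly, starting and stopping looks like this (GameDiscoveryListener is a hypothetical DiscoveryListener implementation; exact signatures may differ):

```java
Discovery discovery = new Discovery();
discovery.setDiscoveryListener(new GameDiscoveryListener());

try {
    discovery.enable();   // spawns the background thread
} catch (DiscoveryException e) {
    // Could not start listening
}

// ... later, when discovery is no longer needed:
discovery.disable();
```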

Things you should know
The Intents are sent as UDP multicast packets. Unlike TCP, the UDP protocol does not guarantee that a sent packet will be received, and there is no confirmation or retry mechanism. Even though losing a packet rarely happens in a stable WiFi network, the library is not intended to be used as a reliable communication protocol. Instead you can use it to find other clients (by sending an Intent at a periodic interval) and then establish a reliable TCP connection for communication.

On GitHub you can find a chat sample application using the library. While this is a convenient example, it is not a good use of the library for the reasons stated above. You obviously do not want to lose chat messages.

We have been using the library in Lands of Ruin for almost two years and haven't observed any problems. However, the game only runs on tablets so far. In theory the library should run on all Android versions back to API level 3 (Android 1.5), but this has obviously never been tested.

You can find Android Network Intents on GitHub.

Dec 24, 2013

Hello World Immersion - Developing for Google Glass #2

This article describes how to create a simple hello world application for Google Glass using the Glass Development Kit (GDK). As described in the previous article, you have two options for how your Glassware shows up on the device: as a live card that is part of the timeline, or as an immersion that is displayed outside the context of the timeline. This article focuses on how to write an immersion.

What is an immersion?
An immersion is basically an Android activity. The name immersion implies that it is not part of the normal Glass timeline. Instead it takes full control of the device - except for the back gesture (Swipe down). To go back to the timeline you need to leave the immersion.

Once started an immersion takes full control of the screen.

Project setup
Create a normal Android project with the following settings:

  • Set minSdkVersion and targetSdkVersion to 15 (Android 4.0.3)
  • Set compileSdkVersion to "Google Inc.:Glass Development Kit Sneak Peek:15"
  • Do not assign a theme to your application or derive your own theme from Theme.DeviceDefault

Creating the immersion
Let's create a simple activity. The Card class helps us to create a layout that looks like a timeline card.
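A minimal sketch of such an activity using the GDK's Card class (the class name HelloWorldActivity is our own):

```java
import android.app.Activity;
import android.os.Bundle;
import com.google.android.glass.app.Card;

public class HelloWorldActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Card builds a View that looks like a regular timeline card
        Card card = new Card(this);
        card.setText("Hello World");

        setContentView(card.toView());
    }
}
```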

Launching the Glassware - Voice commands
After creating the activity we need a way to start our Glassware. A common way to launch Glassware is to use a voice trigger. Let's add a simple voice trigger to start our hello world activity.

First we need to declare a string resource for our voice command.
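For example, in res/values/strings.xml (the resource name is our own choice):

```xml
<resources>
    <string name="voice_trigger_show_hello_world">show hello world</string>
</resources>
```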


The next step is to create an XML resource file for the voice trigger using the previously created string value.
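For example, res/xml/voice_trigger.xml (the file and resource names are our own choices):

```xml
<?xml version="1.0" encoding="utf-8"?>
<trigger keyword="@string/voice_trigger_show_hello_world" />
```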


Now we can add an intent filter for the VOICE_TRIGGER action to our activity. A meta-data tag links it to the XML file we wrote above.
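In AndroidManifest.xml this looks roughly as follows (activity and resource names are our own choices):

```xml
<activity android:name=".HelloWorldActivity">
    <intent-filter>
        <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
    </intent-filter>
    <meta-data
        android:name="com.google.android.glass.VoiceTrigger"
        android:resource="@xml/voice_trigger" />
</activity>
```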


The developer guide requires you to add an icon for the touch menu to the activity (white on a transparent background, 50x50 pixels). The Glass Asset Studio is a helpful tool to generate these icons.

The final Glassware
Now we can start our Glassware by saying "ok glass, show hello world":

Another option to start our Glassware is to use the touch menu and scroll to the "show hello world" command:

The source code for this Hello World Glassware is available on GitHub.

Dec 22, 2013

Android 2013

It's the end of the year - YouTube and Google Zeitgeist have posted their reviews. Let's have a look at what happened in the Android world in 2013.

2012 is over and the Nexus 4 is the current flagship phone made by Google and LG.

Google+ Sign-In is integrated into the Google Play Services and Google starts accepting applications for the Google Glass Explorer program.

The new Android developer console is out of preview. While Google Play celebrates its first birthday, the market share of Android hits 64%.

The tablet guidelines are updated and the Android developer console starts to show tablet optimization tips. Google pushes a Google Play app update that features a redesigned UI. Samsung releases its new flagship phone - the Samsung Galaxy S4.

The Google I/O takes place for three days from May 15th to 17th. This time there is no new Android release. Instead Google releases new game services and a new location API. At Google I/O a new IDE for Android development is introduced: Android Studio. Since then a new Android Studio update has been pushed to the developer community every couple of weeks.

A new flavor of Android Jelly Bean is released: Android 4.3. OpenGL ES 3.0 and support for low-power Bluetooth Smart devices are some of the new features. Furthermore a new version of the Nexus 7 is released. Together with the new tablet Google releases the Chromecast dongle and the Google Cast SDK preview.

Google releases version 3.2 of the Google Play Services. The update includes several enhancements to the Location Based Services. With the r18 release of the support library Google releases a new backward-compatible Action Bar implementation called ActionBarCompat. Motorola releases the Moto X - its first phone since the company was acquired by Google. The same month Hugo Barra announces that he is leaving Google after 5½ years to join the Xiaomi team in China.

RenderScript is now part of the support library and can be used on platform versions all the way back to Android 2.2 (Froyo). Jean-Baptiste Queru, who worked on the Android Open Source Project at Google, starts a new job at Yahoo. Google launches the Android device manager website to locate, lock and ring misplaced devices.

After a lot of leaks and rumors a new Nexus phone is released on Halloween. Together with the Nexus 5 a new Android version - Android 4.4 KitKat - is published. Full-screen immersive mode, a new transitions framework, a printing framework and a storage access framework are some of the many new features. In addition to that the Google Play Services are updated to version 4.0. With Romain Guy another popular Android team member is leaving - but remaining at Google.

The App Translation Service, announced at Google I/O, is now available for every developer. Motorola releases a second phone - the Moto G. Android hits a new record with 80% market share. The Google Glass team releases a first sneak peek version of the Glass development kit (GDK).

Two small updates for Android KitKat are released: Android 4.4.1 and 4.4.2. The Android device manager is now available as an app.

The Android Design in Action team releases its 2013 Recap:

What has been your Android highlight in 2013 and what are your wishes for 2014?

Dec 16, 2013

Mirror API and GDK - Developing for Google Glass #1

I recently got my hands on Google Glass and decided to write some articles about developing applications for Glass. After all it's Android that is running on Glass.

What is Glass?
It's very complicated to explain Google Glass using text alone. Only wearing and using it will give you that aha moment. However, the following video, made by Google, gives you a good impression of what it feels like.

What is Glass from a developer's point of view?
Google Glass is an Android device running Android 4.0.3. What you see through Glass is basically a customized Launcher / Home screen application (a timeline of cards about current and past events) and a slightly different theme. This makes it really interesting for Android developers to develop for Glass: You can use almost all the familiar Android framework APIs. However wearing Glass feels totally different from using a mobile phone. So there's a big difference in designing applications. But not only the UI is different: you can't just port an existing application to Glass. Use cases have to be designed specifically for Glass. Some features of your app might not make sense on Glass. Some other interesting features might only be possible on Glass. It's almost impossible to get a feeling for that without using Glass for some days.

Back to writing code. Currently we can choose between two ways to develop for Glass: the Mirror API or an early preview of the Glass Development Kit (GDK). Let's have a look at both and see what they are capable of.

The Mirror API
The Mirror API was the first API introduced by the Glass team. It's a server-side API, meaning the applications don't run on Glass itself but on your server, and it's your server that interacts with Glass.

The Mirror API is great for pushing cards to the timeline of Glass and sharing content from Glass with your server application.

Some examples of applications that could use the Mirror API:
  • Twitter client: The server pushes interesting tweets to the timeline of the Glass owner. The user can share photos and messages with the application and they will be posted to the Twitter timeline.
  • Context-aware notifications: Your server subscribes to the user's location. Every now and then your server will receive the latest user location. You use this location to post interesting and related cards to the timeline of the user.

More about the Mirror API:
The Glass Development Kit (GDK)
With the GDK you can build Android applications that run directly on Glass. Think of the GDK as the Android 4.0.3 SDK with some extra APIs for Google Glass. It's worth mentioning that the GDK is currently in an early preview state. The API is not complete and some important parts are missing.

When developing for Glass you have two options for how your application shows up:

Live Cards

How a live card shows up in the Glass timeline.

Your application shows up as a card in the timeline (left of the Glass clock). Again, you have two options for how to render these cards:
  • Low-Frequency Rendering: Your card is rendered using RemoteViews. Think of it as a Home screen widget on Android phones. A background service is responsible for updating these views. You only update the views every now and then.
  • High-Frequency Rendering: Your background service renders directly on the live card's surface. You can draw anything and are not limited to Android views. Furthermore you can update the card many times a second.

Immersions

An immersion is not part of the timeline but "replaces" it.

At its core, an immersion is a regular Android activity. For your activity to look like a timeline card:
  • Don't assign a theme to your activity or use the DeviceDefault theme as base for your customization.
  • Even though you can use the touch pad of Glass almost like a d-pad: Try to avoid most input-related Android widgets. They don't make much sense on Glass because you are not using a touch screen. Instead try to use gestures with the GestureDetector class or voice input.
  • Use the Card class and its toView() method to create a view that looks like a regular Glass card.

More about the GDK

Aug 22, 2013

Read the code: IntentService

In the new category Read the code I’m going to show the internals of the Android framework. Reading the code of the framework can give you a good impression about what’s going on under the hood. In addition to that knowing how the framework developers solved common problems can help you to find the best solutions when facing problems in your own app code.

What is the IntentService class good for?
This article is about the IntentService class of Android. Extending the IntentService class is the best solution for implementing a background service that is going to process something in a queue-like fashion. You can pass data via Intents to the IntentService and it will take care of queuing and processing the Intents on a worker thread one at a time. When writing your IntentService implementation you are required to override the onHandleIntent() method to process the data of the supplied Intents.

Let’s take a look at a simple example: This DownloadService class receives Uris to download data from. It will download only one thing at a time with the other requests waiting in a queue.
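A sketch of such a service (downloadFile() is a hypothetical helper that performs the actual blocking download):

```java
public class DownloadService extends IntentService {

    public DownloadService() {
        // The name is used for the worker thread, mostly for debugging
        super("DownloadService");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        // Called on the worker thread, one Intent at a time.
        // Further Intents wait in the queue until this returns.
        Uri uri = intent.getData();
        downloadFile(uri);
    }

    private void downloadFile(Uri uri) {
        // ... blocking download code ...
    }
}
```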


The components
Before we dip into the source code of the IntentService class, let's first take a look at the different components that we need to know in order to understand the source code.

Handler (documentation) (source code)
You may already have used Handler objects. When a Handler is created on the UI thread, messages can be posted to it and these messages will be processed on the UI thread.

ServiceHandler (source code)
The ServiceHandler inner class is a helper class extending Handler that delegates the Intent wrapped inside a Message object to the IntentService for processing.

The ServiceHandler inner class of IntentService.
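Slightly simplified, it looks like this in the AOSP source:

```java
private final class ServiceHandler extends Handler {
    public ServiceHandler(Looper looper) {
        super(looper);
    }

    @Override
    public void handleMessage(Message msg) {
        // Process the Intent on the worker thread, then stop the
        // service if no further start requests are pending
        onHandleIntent((Intent) msg.obj);
        stopSelf(msg.arg1);
    }
}
```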

Looper (documentation) (source code)
The Looper class has a MessageQueue object attached to it and blocks the current thread until a Message is received. This message will be passed to the assigned Handler. After that the Looper processes the next message in the queue or blocks again until a message is received.

HandlerThread (documentation) (source code)
A HandlerThread is a Thread implementation that does all the Looper setup for you. By creating and starting a HandlerThread instance you will have a running thread with a Looper attached to it waiting for messages to process.

Read the code!

Now we know enough about all the components to understand the IntentService code.
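The onCreate() method, slightly simplified from the AOSP source:

```java
@Override
public void onCreate() {
    super.onCreate();

    // mName is the name passed to the IntentService constructor
    HandlerThread thread = new HandlerThread("IntentService[" + mName + "]");
    thread.start();

    mServiceLooper = thread.getLooper();
    mServiceHandler = new ServiceHandler(mServiceLooper);
}
```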



At first a HandlerThread is created and started. We now have a background thread running that already has a Looper assigned. This Looper is waiting on the background thread for messages to process.

Next a ServiceHandler is created for this Looper. The Handler’s handleMessage() method will be called for every message received by the Looper. The ServiceHandler obtains the Intent object from the Message and passes it to the onHandleIntent() method of the IntentService.
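The onStart() method that posts incoming Intents to the Handler, slightly simplified:

```java
@Override
public void onStart(Intent intent, int startId) {
    // Wrap the Intent in a Message and hand it to the Handler;
    // the Looper will deliver it on the worker thread
    Message msg = mServiceHandler.obtainMessage();
    msg.arg1 = startId;
    msg.obj = intent;
    mServiceHandler.sendMessage(msg);
}
```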



The onStart() method is called every time startService() is called. We wrap the Intent in a Message object and post it to the Handler. The Handler will enqueue it in the message queue of the Looper. The onStart() method is deprecated since API level 5 (Android 2.0). Instead onStartCommand() should be implemented.
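And onStartCommand(), slightly simplified:

```java
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    onStart(intent, startId);
    return mRedelivery ? START_REDELIVER_INTENT : START_NOT_STICKY;
}
```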


In onStartCommand() we call onStart() to enqueue the Intent. We return START_REDELIVER_INTENT or START_NOT_STICKY depending on what the child class has set via setIntentRedelivery(). Depending on this setting, an Intent will either be redelivered to the service if the process dies before onHandleIntent() returns, or it will be lost.


In onDestroy() we just need to stop the Looper.
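The onDestroy() implementation:

```java
@Override
public void onDestroy() {
    // Quit the Looper so the worker thread can terminate
    mServiceLooper.quit();
}
```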


The IntentService code is quite short and simple, yet a powerful pattern. With the Handler, Looper and Thread class you can easily build your own simple processing queues.
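To illustrate the pattern outside of Android, here is a minimal plain-Java analogue (our own sketch, not framework code): one worker thread drains a queue of tasks, one at a time, just like the Looper drains the message queue.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal analogue of the IntentService pattern: one worker thread
// takes tasks from a queue and processes them one at a time.
class WorkQueue {
    private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();
    private final Thread worker;

    WorkQueue() {
        worker = new Thread(() -> {
            try {
                while (true) {
                    // Blocks until a task is available
                    queue.take().run();
                }
            } catch (InterruptedException e) {
                // quit() interrupts the worker to end the loop
            }
        });
        worker.start();
    }

    void post(Runnable task) {
        queue.add(task);
    }

    void quit() {
        worker.interrupt();
    }
}
```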

Oh, and if you are looking for an exercise: the code of the onCreate() method contains a TODO comment that I omitted above:

TODO in onCreate()

May 27, 2013

Sharing the taken picture - Instant Mustache #9

This article is part of a series of articles about the development process of Instant Mustache, a fun camera app that adds mustaches to all faces using face detection. Click here to get a chronological list of all published articles about Instant Mustache.

Up to now our app can take and view pictures. The next step is to share the taken picture with other Android apps. This is done via Intents. The Intent system is one of the most powerful features of Android. It allows us to interact with any app that accepts images with almost no extra effort.

We could create an ActionBar item that launches a share Intent when clicked, but instead we are going to use a ShareActionProvider. The ShareActionProvider adds a share icon to the ActionBar as well as the icon of the app that the user has shared pictures with the most. By clicking this icon the user can share directly with this app. In addition to that, the ShareActionProvider shows a sub menu with more apps that the given picture can be shared with.

A ShareActionProvider with Google+ as default share action.

Sub menu of a ShareActionProvider.

If sharing is a key feature of your activity, you should consider using the ShareActionProvider.

We start by creating an XML menu file for adding the share action. For legacy reasons the ActionBar uses the same approach for creating action items as the menu in Android 2.x.
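The menu file could look like this (the item id and title resource are our own choices):

```xml
<menu xmlns:android="http://schemas.android.com/apk/res/android">
    <item
        android:id="@+id/menu_item_share"
        android:title="@string/menu_item_share"
        android:showAsAction="always"
        android:actionProviderClass="android.widget.ShareActionProvider" />
</menu>
```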


Once we inflated the menu in onCreateOptionsMenu() we need to set the Intent used to share the photo.
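Roughly like this (photoUri is a hypothetical field holding the Uri of the taken picture, and the menu resource ids are our own choices):

```java
@Override
public boolean onCreateOptionsMenu(Menu menu) {
    getMenuInflater().inflate(R.menu.main, menu);

    MenuItem item = menu.findItem(R.id.menu_item_share);
    ShareActionProvider provider =
            (ShareActionProvider) item.getActionProvider();

    // The Intent describes what we want to share and how
    Intent intent = new Intent(Intent.ACTION_SEND);
    intent.setType("image/jpeg");
    intent.putExtra(Intent.EXTRA_STREAM, photoUri);
    provider.setShareIntent(intent);

    return true;
}
```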


Let's take a look at the different components of the Intent:
  • ACTION_SEND: The default action used for “sending” data to another, unspecified activity.
  • MIME type: The MIME type of the data being sent. Other apps can define multiple MIME types they accept. We are sending a JPEG image and therefore we are using the MIME type “image/jpeg”. To learn more about MIME types start with the "Internet media type" Wikipedia article.
  • EXTRA_STREAM: Uri that points to the data that should be sent. In our case the Uri is pointing to the image file on the external storage.

That's it already. For all changes done to the code base, see the repository on GitHub. In the next article we'll polish some aspects of the app before we start implementing the Face detection feature.

Jan 14, 2013

Fixing the rotation - Instant Mustache #8

This article is part of a series of articles about the development process of Instant Mustache, a fun camera app that adds mustaches to all faces using face detection. Click here to get a chronological list of all published articles about Instant Mustache.

Wrong orientation

If you run the current version of Instant Mustache and take some pictures you'll notice something odd: The orientation of the taken pictures is sometimes wrong. This may depend on the device you are using. When using a Galaxy Nexus the picture will be rotated 90° to the left when taking a picture in portrait mode but will be rotated correctly when taking a picture in landscape mode.

Wrong orientation of photo that has been taken in portrait mode

How does this happen? You may remember that we've used Camera.setDisplayOrientation() in one of the previous articles to explicitly set the display rotation. First, this setting only affects the preview picture. The picture passed to the Camera.PictureCallback isn't affected by this setting. And second, we still have to account for how the device is rotated at the moment of taking the picture.

Detecting and remembering the orientation

What we need to do in our code is to register an OrientationEventListener to get notified whenever the orientation changes. We'll remember this orientation and use this to rotate the taken image once the callback returns.

Whenever the orientation changes onOrientationChanged(int) of the listener will be called. The orientation will be passed to the method in degrees, ranging from 0 to 359. We need to normalize this value as we are only interested in 90° steps for rotating the picture.

Another method called rememberOrientation() will be used to save the orientation of the device in the moment of the user pressing the shutter button.
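A sketch of both methods (field and method names are our own choices):

```java
private int currentOrientation;
private int rememberedOrientation;

private final OrientationEventListener orientationListener =
        new OrientationEventListener(this) {
    @Override
    public void onOrientationChanged(int orientation) {
        if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) {
            return;
        }
        // Normalize to the nearest 90° step: 0, 90, 180 or 270
        currentOrientation = ((orientation + 45) / 90 * 90) % 360;
    }
};

private void rememberOrientation() {
    // Called when the user presses the shutter button
    rememberedOrientation = currentOrientation;
}
```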


Rotating the picture

Now we just need to rotate the Bitmap. We do this by creating a new Bitmap object and applying a rotated Matrix to the pixels. The rotation angle is calculated by summing the remembered orientation, the display orientation and the natural rotation of the device.
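A sketch of the rotation using android.graphics.Matrix (the helper name and degree calculation inputs are our own):

```java
private Bitmap rotateBitmap(Bitmap bitmap, int degrees) {
    if (degrees == 0) {
        return bitmap;
    }

    Matrix matrix = new Matrix();
    matrix.postRotate(degrees);

    // Creates a new Bitmap with the rotation applied to all pixels
    return Bitmap.createBitmap(bitmap, 0, 0,
            bitmap.getWidth(), bitmap.getHeight(), matrix, true);
}
```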



Photos rotated correctly in portrait and landscape mode