On The Cutting Edge — Building the Next Generation of Mobile Applications: Context Awareness

Mobile hardware and software producers spend an astounding amount of time considering human and user interface details. Focus groups, man-machine interaction studies and endless prototyping – the list goes on.

Mobile application development engineers sit through interminable user-interaction sessions to get things just right for the smaller screen, the limited keypad, and the sometimes-shallow resource pool, all in an effort to make the interaction simple for the user.

But is the user the only source of information a device can use to make interaction simpler and more effective? Not anymore.

Context awareness is an attempt to use information available from the local environment to better serve the user's needs -- without direct input from the user. Using environmental cues, a simple handset can become a natural extension of self, capable of fostering interactions between people and other people, as well as between people and objects. Once employed, context awareness brings applications closer to a living state – a state that promises more intuitive usage, more predictive and predictable behavior, and more personalized consumer experiences.

Forward-thinking mobile application developers now have another opportunity to blaze a new trail for mobile software. And it's a profitable, game-changing opportunity.

In practice, context awareness is approached from two different directions – the hardware/software side and the human side.

On the hardware side, context awareness describes mobile devices capable of using environmental cues to determine relative location, nearby associations and available resources. These cues can overlap quite a bit.

Relative location could mean your precise position (within about six feet) determined by GPS or cell-tower triangulation, but more subtle changes elude GPS. What if I just stepped out of a darkened conference room and into daylight? Consider other subtle cues, such as the noise level, the availability of an open WiFi network, or even the social situation of the moment. All of these cues can be used to make applications more adaptive and intuitive.
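Combining such cues into an inferred context can be sketched as a simple rule-based classifier. The sensor fields and thresholds below are hypothetical (real devices expose these readings through platform-specific APIs); the point is only that several weak signals, taken together, say more than any one of them alone:

```python
from dataclasses import dataclass

@dataclass
class Cues:
    lux: float        # ambient light level
    noise_db: float   # ambient noise level
    open_wifi: bool   # an open WiFi network is in range
    moving: bool      # accelerometer reports sustained motion

def infer_context(c: Cues) -> str:
    """Very rough rule-based inference from environmental cues.

    Thresholds here are illustrative, not tuned values.
    """
    if c.lux < 50 and c.noise_db < 40 and not c.moving:
        return "darkened room"      # e.g. a presentation or a theater
    if c.moving and not c.open_wifi:
        return "in transit"
    if c.open_wifi and c.noise_db < 55:
        return "office or home"
    return "unknown"

# A dark, quiet, stationary reading suggests that conference room:
print(infer_context(Cues(lux=20, noise_db=35, open_wifi=False, moving=False)))
```

In a real system these rules would be replaced or supplemented by learned models, but even a table of rules like this one is enough to drive the adaptive behaviors described below.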

Nearby associations can be created through wireless networks and Bluetooth. These associations provide the foundation for a much more seamless file synchronization and exchange experience by identifying and classifying the other devices available at any particular location in real time, as they change. Those devices, once associated, form the basis of understanding your social context if you can map each device to an address book entry.
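The device-to-contact mapping can be sketched in a few lines. The MAC addresses and names here are invented for illustration; a real implementation would draw them from the platform's Bluetooth discovery API and contacts store:

```python
# Hypothetical mapping from known Bluetooth device addresses to
# address-book entries, used to infer who is physically nearby.
address_book = {
    "00:1A:7D:DA:71:13": "Priya (colleague)",
    "00:1B:44:11:3A:B7": "Ravi (friend)",
}

def social_context(discovered):
    """Return the known contacts among the devices currently in range."""
    return [address_book[mac] for mac in discovered if mac in address_book]

# A scan in a meeting room might return one colleague and one stranger:
print(social_context(["00:1A:7D:DA:71:13", "F4:5C:89:00:00:01"]))
# → ['Priya (colleague)']
```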

Location and association awareness combined with the resources mentioned above (Bluetooth devices, WiFi networks, GPS data, PIM access) bring a richer, more actionable set of data to represent context than simple location data ever could.

On the human side, input data can be enhanced to include information on the user's patterns and habits, frequency of feature usage, and possibly even biophysiological conditions (harder key presses indicate stress, long periods of static location indicate relaxation).
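The biophysiological idea can be made concrete with a toy heuristic. Everything here is an assumption for illustration — the press-force readings, the baseline, and the thresholds are invented, and real signal processing would be far more careful:

```python
def stress_level(press_forces, baseline=1.0):
    """Hypothetical heuristic: sustained harder-than-baseline key
    presses suggest stress; noticeably softer presses suggest calm.

    press_forces: recent key-press force samples (arbitrary units).
    """
    avg = sum(press_forces) / len(press_forces)
    if avg > 1.5 * baseline:
        return "stressed"
    if avg < 0.8 * baseline:
        return "relaxed"
    return "neutral"

print(stress_level([2.0, 2.1, 1.9]))   # well above baseline
```

Crude as it is, a signal like this could feed the same rule table as the environmental cues, shading the device's behavior rather than driving it outright.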

Though some engaging applications are getting closer to being context aware, most are still a creative combination of hardware features that work only in silos. Extending that creative thinking into multiple application environments and unconventional data sources will bring true context awareness to the market in more meaningful ways – ways that every user can benefit from without ever having to configure anything or spend long hours customizing. For instance:

  • Bring me a mobile phone that knows I'm currently in a meeting, and that I have just begun presenting. Let the phone conclude that I'm busy, and route calls to my voicemail until I'm finished. Now use that kind of thinking and change my ringtones based on other events: Louder, and through my car speakers when I'm driving home; silent and straight to voicemail at bedtime, and a nice quiet ring when I'm working in the office.
  • Show me context-shifting themes. My phone's home screen should show me stats for my favorite cricket team while I'm at the game, switch to a movie theme when I'm in the theater, then to a professional theme when I'm at work.
  • Let me know what's happening in the building right in front of me. How could I possibly know that I'm standing in front of the best Thai restaurant when I'm new in town? My handset could drop me a hint by combining LBS data with user-generated restaurant reviews on the web.
  • How about reading my calendar and serving me directions before I need to leave for events? I'd fall in love with driving directions auto-generated from the information my phone already has. Not only would I no longer need to pull over to dig up directions, I could stop typing in addresses multiple times. Once in the calendar is enough.
  • Auto tag my pictures for me while I'm taking them. Automatically add them to my blog or my social networking account, but be smart enough to know that a picture I take during work hours shouldn't end up on my blog.
  • My calendar observes time as its only context. A more effective context-aware application would notify me of upcoming appointments taking into account the travel time and traffic conditions on the way to my destination. Monitor my travel progress, and if I'm running late, let the other meeting participants know via email. I'm busy driving.
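The first item on this list boils down to a mapping from an inferred context to a ringer profile. A minimal sketch, with context names and profile settings invented for illustration:

```python
# Rule table for the ringtone scenario above: each inferred context
# selects a profile; anything unrecognized falls back to a normal ring.
PROFILES = {
    "presenting": {"ringer": "silent", "route_to_voicemail": True},
    "driving":    {"ringer": "loud", "output": "car speakers"},
    "sleeping":   {"ringer": "silent", "route_to_voicemail": True},
    "office":     {"ringer": "quiet"},
}

def apply_profile(context):
    """Pick the ringer profile for a context, defaulting to normal."""
    return PROFILES.get(context, {"ringer": "normal"})

print(apply_profile("driving"))
```

The interesting engineering problem is not this table — it is producing the `context` string reliably from cues like calendar state, motion, and location, which is exactly what the earlier sections argue for.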

If you're marveling that I have a wild list of demands, take another look. Most of the information I'm talking about already exists on a wealth of handsets in circulation. And yes, it's a wild list, but this and much more is possible. I'm sure of it.

Still, there's no throwing away UI considerations. Quite the opposite is true. Context-aware applications will likely collect experiences and data from several applications, so a consistent user interface will be fundamental to making these applications valuable and useful. Additionally, these applications will need to query the other interfacing applications and verify that they are running and available.

If you view the first wave of application development as the vertical, standalone applications of yesterday, we are right now riding out the second wave of mobile applications that integrate technologies and verticals. I'm looking forward to the third wave – the dawn of true context-aware applications that work horizontally, are tightly integrated with handset features, yet are savvy enough to pick up on subtle human input.

Context-aware applications enable a new generation of adaptive devices that exploit device-related technologies like GPS/LBS and accelerometers, and exhibit predictive behavior based on analysis of user-generated data. It is up to developers like you to innovate on context-aware technologies and create the compelling user experiences and applications of the next generation.

– Asokan Thiyagarajan, Motorola Technology Evangelist

© 2010 AsokanInc. All Rights Reserved.