The Next Wave | Vol. 20 | No. 1 | 2013

The next user interface

Sandia National Laboratories

Not long ago the state-of-the-art telephone was a heavy contraption with a sluggish rotary dial. The top-of-the-line television was a bulky console without a remote control. The world's best cameras were tricky gadgets filled with photosensitive film. All of that has changed. Technology transformed these common products, and it is about to transform another. In the next decade, the keyboard and mouse will be replaced by multimodal "haptic interfaces" that respond to body and eye movements rather than keystrokes and mouse clicks.

The transformation will occur gradually. First the keyboard and mouse will merge. Then the merged interface will transform into a peripheral device that responds to voice, gestures, and eye movements. The process will be enabled by improvements in materials, by a new generation of batteries, and by the growth of wireless computing on ever smaller mobile devices. No longer will people be tied to the stationary workstation. The era of ubiquitous computing is at hand.

The next ten years

To predict the shape of the next-generation user interface, we at Sandia National Laboratories developed a methodology for assessing trends in three areas essential to ubiquitous computing—keyboard technologies, virtual computer controls, and thin batteries. Our findings indicate that the future is mobile and the days of traditional interfaces, such as keyboards, are limited.

In the coming decade, humans will interact with their desktop and mobile devices in entirely new ways. The new ways of interaction will inspire technological changes that lead to further innovations. Much of the change will be driven by the needs of mobile Internet users, who now exceed the number of desktop users [1].

The hardware component most certain to change is the keyboard—a design that was introduced over 130 years ago as part of the mechanical typewriter. Although the keyboard's extraordinarily long life testifies to its usefulness, it is inherently inflexible and inefficient. It is neither portable nor adaptable to the mobile computing devices that will dominate the future.

Desktop users will not disappear, and they will need a more flexible interface—a simpler way to input information. That interface will be a multimodal peripheral device. At first, it will be equipped with feedback mechanisms (e.g., the sounds of keystrokes) to instill the same level of user confidence that the keyboard currently provides. After a while, the interface between human and machine will begin to blur. As it does, more novel control interfaces will appear. Clothing, furniture, print advertisements, packaging, even drug encapsulations might contain computing power and responsive interfaces. Future interfaces will be diverse and diffuse.

Keyboard technologies

Today's keyboard will not disappear overnight. But with the emergence of multitouch mobile devices, electronic paper, and flexible display technologies, the keyboard will transform. Yesterday's typewriter interface will become more virtual.

The needs of wireless users will stimulate disruptive changes in human-computer interactions for the next decade. As the number of mobile Internet users surpasses that of desktop users, new ways to input information will be needed. The dominant force driving change will be the demand for a virtual keyboard for use with mobile devices. The desktop will remain, but a steady transition will occur in the keyboard and mouse interface.

In the near term (i.e., one to three years), pointing tasks will be incorporated into the keyboard interface as the mouse is replaced by multitouch trackpads. Productivity will increase because finger motions will enable faster and more versatile pointing. In the mid-term (i.e., three to five years), the entire keyboard surface will become touch-active, with no boundary between keyboard and trackpad (see figure 1), and a virtual keyboard will be printed or projected on the active surface. In the far term (i.e., five years and beyond), the keyboard will be realized entirely on a flexible surface that allows any key layout a user wants, powered by a thin battery.

FIGURE 1. Within the next five years, the keyboard and mouse interface will be replaced by a keyless, touch-active surface with a multitouch trackpad.

Virtual computer controls

The technologies that will replace today's keyboard and mouse—virtual computer controls—include such devices as touch screens and wireless game controllers. Our assessment focused on the key component of virtual controls—tracking. Tracking systems enable the computer to determine a user's head or limb position or the location of a hand-held device that interacts with a virtual object. In the past, tracking for gesture recognition was implemented via worn devices such as gloves or bodysuits. Recent advances in passive techniques (e.g., cameras and sensors) allow for a more ubiquitous interface. The market for these interfaces has expanded in recent years; in fact, the top three video game console makers have integrated motion detection and tracking into their units.
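At their simplest, the passive techniques mentioned above can locate a moving limb by comparing successive camera frames. The sketch below is a minimal, illustrative frame-differencing tracker; the frames, threshold, and patch position are all synthetic stand-ins, and a production system would use calibrated sensors and far more robust segmentation:

```python
import numpy as np

# Two hypothetical grayscale camera frames; a bright 20x20 patch
# (the "hand") appears between the first frame and the second.
frame_a = np.zeros((120, 160), dtype=np.float64)
frame_b = np.zeros((120, 160), dtype=np.float64)
frame_b[50:70, 80:100] = 1.0  # hand position in the second frame

# Passive tracking by frame differencing: pixels that changed
# substantially between frames are treated as motion.
motion = np.abs(frame_b - frame_a) > 0.5

# The centroid of the moving pixels gives a crude hand position,
# which a real system would smooth and feed to a gesture recognizer.
ys, xs = np.nonzero(motion)
cy, cx = ys.mean(), xs.mean()
print(round(cy, 1), round(cx, 1))  # → 59.5 89.5
```

A real tracker would repeat this over a stream of frames, filtering the centroid trajectory (e.g., with a Kalman filter) before classifying it as a gesture.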

The resulting virtual environments provide users with direct manipulation and interaction with multisensory stimulation. This flexibility accommodates a broader range of users (e.g., of varying ages, skill levels, and languages). This blurring of the user interface, where the keyboard and mouse disappear completely, was once a costly endeavor; three-dimensional depth-mapping systems, for example, cost $10,000 to $15,000. But Microsoft's work on gesture-controlled interfaces, which convert motion into on-screen action and control, has changed that; today its Kinect hardware package for the Xbox 360 costs about $110.

Kinect operates by projecting an infrared laser pattern onto nearby objects. A dedicated infrared sensor identifies the laser position and determines the distance for each pixel. The corresponding information is then mapped to an image from a standard RGB camera (i.e., a camera that uses the red, green, blue additive color model), resulting in an RGB-D image where each pixel has a color and a distance (the D stands for "depth"). One can then use the image to map out body positions, gestures, and motion. Coupling body, face, and motion recognition with a multiarray microphone enables a complete virtual interface like none ever experienced outside of a laboratory.
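The pixel-level fusion described above can be illustrated with a toy example. The arrays and distances below are invented stand-ins for the sensor's two streams, and the calibration step a real Kinect needs to register its infrared and RGB cameras is omitted:

```python
import numpy as np

# Hypothetical inputs standing in for Kinect's two streams: an RGB
# frame and a per-pixel depth map (in millimeters), assumed here to
# be already registered to the same 480x640 grid.
rgb = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint16)
depth = np.full((480, 640), 3000, dtype=np.uint16)  # background ~3 m
depth[100:300, 200:400] = 1200                      # a "user" ~1.2 m away

# Fuse the two streams into an RGB-D image: four channels per pixel,
# color in channels 0-2 and distance in channel 3.
rgbd = np.dstack([rgb, depth])

# A simple use of the depth channel: segment everything nearer than
# 2 m, a first step toward locating a body for gesture tracking.
user_mask = rgbd[:, :, 3] < 2000
print(user_mask.sum())  # pixels attributed to the user
```

From a mask like this, body-position and gesture algorithms can isolate limbs and track them frame to frame.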

Our analysis examined open-source publishing trends in the research and development of new tracking capabilities in recent decades. We discovered that more than 75% of the global research has focused on body- and hand-tracking topics; eye-tracking research accounts for most of the remainder. Gesture recognition will also play a critical role in interface design, since gesture is a key link between conceptualized thought and linguistic expression.

The dominant force driving change in the area of virtual computer controls will be the demand for tracking technologies that convert human motion into computer responses.

For the near term, we can expect to see rapid expansion of Microsoft's Kinect platform. This inexpensive platform will encourage professionals, academic communities, and do-it-yourselfers to develop innovative applications limited only by the imagination. Additionally, transforming the current stationary hardware onto a mobile device will be an active area of development. In the longer term, as the technologies of gesture-, body-, and eye-tracking advance and as the resolution capabilities increase alongside machine learning, we can expect to see signs of affective human-computer interactions. For example, the ability to develop machine-level recognition of facial expressions combined with emotion signatures via audio would enable a progression toward affective interfaces.

Thin batteries

Our final area of interest is thin batteries. Our findings indicate that these batteries will become the power sources for the next generation of user interfaces. Today's batteries function by chemical storage. They are reliable but limited by packaging and energy density constraints; as batteries shrink, their energy density falls. To meet the need for very small batteries to power future interfaces, new methods of thin battery manufacture are starting to appear. These methods fall into two categories: printed and sputtered. Printed batteries can be integrated on paper and flexible surfaces. Sputtered batteries are typically deposited on rigid surfaces and processed at high temperatures (i.e., greater than 500°C), which opens manufacturing integration possibilities such as lamination at significantly reduced thicknesses.
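The packaging penalty behind this shrinking-battery problem can be made concrete with a back-of-the-envelope calculation. All numbers below are illustrative, not measured values for any real cell:

```python
# Why energy density falls as batteries shrink: packaging (casing,
# seals, current collectors) has a roughly fixed thickness, so thinner
# cells devote a smaller fraction of their volume to active material.
PACKAGING_UM = 100       # assumed packaging overhead, micrometers
ACTIVE_WH_PER_L = 500    # assumed energy density of the active stack

for active_um in (2000, 500, 100, 20):
    total_um = active_um + PACKAGING_UM
    effective = ACTIVE_WH_PER_L * active_um / total_um
    print(f"{active_um:>5} um active -> {effective:6.1f} Wh/L effective")
```

Under these assumptions, a 2-mm cell retains about 95% of the active stack's energy density, while a 20-micrometer cell retains only about 17% — which is why packageless, air-stable chemistries are so attractive for thin devices.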

We searched the literature for both printed and thin batteries. Our search identified research in these areas:

  1. Printable lithium and other batteries based on liquid electrolytes,
  2. Microbatteries based on microelectromechanical systems and thin film integration,
  3. Sputtered batteries based on various chemistries, and
  4. Solid electrolyte batteries using established chemistries like lithium phosphate.

A number of commercial suppliers of these batteries already exist, but because of the current economic environment, most are not investing in research and development. Rather, they are supplying products for specialized "gimmicky" markets like greeting cards, toys, cosmetic patches, and other novelties.

The near-term strategy for most manufacturers and researchers in thin batteries will be to improve energy densities and performance incrementally. For the mid-term, efforts will turn to rate engineering—for example, designs supporting very high continuous or pulsed current densities. In the far term, packaging innovations are expected to reduce dimensions. Packageless devices may even be possible with the emergence of anode/cathode chemistries that are stable in air (e.g., lithium titanate and lithium iron phosphate). Studies have shown that about 88% of theoretical capacity is achievable; however, systems to date are limited by low voltage potentials and several processing restrictions. The ability to deposit batteries on demand, regardless of location, is an extremely attractive prospect, but significant hurdles in basic science research and development must be overcome before that can happen.

One recent development is worth noting: Dynamics Inc. has been advertising a new credit card product in which more than 70 components have been mounted on a printed circuit board and embedded in a credit card. The device includes a number of peripheral management circuits that are directed to power management, timing, and control. Accordingly, a number of peripherals (e.g., buttons) may be added to the platform to allow a user to enter information into the card. Displays might allow users to receive information from the card. The device is designed to be thin, flexible, durable, and able to operate for three years on a single battery charge.


Conclusions

To predict the shape of the next-generation user interface, we developed a methodology for assessing trends in keyboard technologies, virtual computer controls, and thin batteries—three areas essential to what is called "ubiquitous computing."

Our findings indicate that the next user interface will be shaped by developments in flexible electronics and displays and in manufacturing technologies that embrace printable electronics. Ubiquitous computing will be enabled by processing chips that bend, flex, and stretch. Commercial manufacturers already have the capacity to print conductors, resistors, capacitors, primitive sensors, and displays onto flexible surfaces. In the next decade, manufacturers expect to develop cost-effective methods for integrating thin devices (e.g., battery, electronics, and display) into a single product. Several commercial battery manufacturers are aligning their capabilities with this vision. For the market to materialize, however, manufacturers must develop diodes, sensors, transistors, displays, and electronic components that are compatible with printable and flexible manufacturing technologies and processes. But the trend is evident. Just as the rotary phone transformed into today's smartphones, so will the keyboard transform into tomorrow's haptic interface.

About the author

Since 1949, Sandia National Laboratories (SNL) has developed science-based technologies that support national security. SNL is a federally funded research and development center and a government-owned, contractor-operated facility. Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, manages SNL as a contractor for the US Department of Energy's National Nuclear Security Administration. SNL works with other government agencies, industry, and academic institutions to support national security in the following strategic areas: nuclear weapons; defense systems and assessment; energy, climate, and infrastructure security; and international, homeland, and nuclear security.



[1] Meeker M. "The mobile Internet." Dec 2009. Morgan Stanley.

Further reading

Department of Defense. "Technology Readiness Assessment (TRA) deskbook." Jul 2009. Office of the Director, Defense Research and Engineering.

Sato K, Tachi S. "Design of electrotactile stimulation to represent distribution of force vectors." In: 2010 IEEE Haptics Symposium; Mar 2010; Waltham, MA. p. 121–128. DOI: 10.1109/HAPTIC.2010.5444666

Wang Y, Mehler B, Reimer B, Lammers V, D'Ambrosio LA, Coughlin JF. "The validity of driving simulation for assessing difference between in-vehicle informational interfaces: A comparison with field testing." Ergonomics. 2010;53(3):404–420. DOI: 10.1080/00140130903464358

Kim S, Sohn M, Pak J, Lee W. "One-key keyboard: A very small QWERTY keyboard supporting text entry for wearable computers." In: Proceedings of the 18th Australia conference on Computer-Human Interaction; 2006; Sydney, Australia. p. 305–308. DOI: 10.1145/1228175.1228229

Foxlin E. "Motion tracking requirements and technologies." In: Stanney KM, editor. Handbook of Virtual Environments: Design, Implementation, and Applications. Mahwah, NJ: Lawrence Erlbaum; 2002. p. 163–210.

Yamamoto M, Sato H, Yoshida K, Nagamatsu T, Watanabe T. "Development of an eye-tracking pen display for analyzing embodied interaction." In: Salvendy G, Smith MJ, editors. Human Interface and the Management of Information. Interacting with Information. Springer, 2011. p. 651–658. (Lecture Notes in Computer Science, 6771).



Date Posted: May 17, 2013 | Last Modified: May 17, 2013 | Last Reviewed: May 17, 2013