By Barry Levine / Mobile Tech Today. Updated December 17, 2012.
Imagine a day when you can feel the fabric of a new coat -- through your phone. That day, and the day when computers commonly have other sense perceptions, is coming within five years, according to new predictions from IBM.
The touch-phone prediction is part of IBM's seventh annual "5 in 5" set of predictions -- five innovations whose pending release within five years is based on the company's assessment of market and societal trends, as well as innovations in its own, world-famous labs. This year's crop explores how mobile devices will be able to mimic human senses, forming the underpinning of what IBM calls the next big phase in computing -- the "era of cognitive systems."
In the touch scenario, the company envisions a smartphone that allows a woman to shop for her wedding dress remotely and feel the satin or silk of the gown on the surface of the screen. IBM said its scientists are currently developing applications for retail, healthcare and other industries that use haptic, infrared and pressure-sensitive technologies to simulate such touch sensations as the texture or weave of a fabric.
Understanding Babies and Dogs
The technologies use vibration capabilities in the phone, emitted millimeters from the screen, to represent a particular touch sensation. IBM notes that such haptic devices as gloves or "rumble packs" have been used in gaming systems for years, but they are in closed environments that don't necessarily correspond to reality. Texture can already be conveyed through vibration, the company said, but what is needed is a "dictionary of textures" that matches the vibrations to the real world.
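One simplified way to picture that "dictionary of textures" is as a lookup table mapping real-world materials to vibration parameters a phone could play back. The fabric names and parameter values below are invented for illustration, not IBM's actual data:

```python
# Toy sketch of a "dictionary of textures": a lookup table mapping
# real-world fabrics to hypothetical vibration parameters a phone
# could replay. All values here are illustrative assumptions.
TEXTURE_DICTIONARY = {
    # fabric: (frequency in Hz, amplitude 0..1, pulse pattern in ms)
    "satin": (250, 0.2, [5, 20]),
    "silk": (300, 0.15, [4, 25]),
    "denim": (80, 0.6, [15, 10]),
    "wool": (120, 0.5, [10, 12]),
}

def vibration_for(fabric: str):
    """Return the vibration parameters that stand in for a fabric's feel."""
    try:
        return TEXTURE_DICTIONARY[fabric.lower()]
    except KeyError:
        raise ValueError(f"no texture entry for {fabric!r}")

print(vibration_for("Satin"))  # → (250, 0.2, [5, 20])
```

The point of the real research is filling in this table so that each entry's vibration pattern genuinely feels like its fabric, which is what distinguishes it from the closed-world rumble packs of gaming.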
In the realm of sight, devices will have a heightened ability to understand the content of images. This has already begun, such as Facebook's ability to match faces between photos or FBI systems that can match faces captured by street cameras to a database of images. IBM envisions the next step, where healthcare systems can help doctors analyze MRIs, CT scans, X-rays or ultrasound images.
Systems will also be able to scan images posted on social networks and use that analysis to determine users' interests, or analyze photos from storm scenes so that needs are immediately conveyed to utility crews and emergency crews.
Superman's highly sensitive hearing is also within reach, IBM said. A distributed system of sensors will be able to detect sound pressure, vibrations and sound waves at various frequencies, predicting when a tree will fall in a forest or when a landslide is about to happen. Such a system could listen to a baby's babbling and crying through a mobile device and, through a kind of "speech recognition," determine whether the baby is hungry, hot, tired or in pain. The same system could learn to interpret the whines, barks and whimpers of your dog.
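At its simplest, that kind of "speech recognition for crying" is a classification problem: measure a few acoustic features of a cry and match them against labeled examples. The features and reference values below are invented placeholders, and a real system would learn far richer patterns from data:

```python
# Minimal sketch of classifying a cry by nearest-neighbor comparison of
# simple acoustic features (pitch in Hz, loudness 0..1, burst length in
# seconds) against labeled reference cries. All values are invented
# for illustration.
REFERENCE_CRIES = {
    "hungry":  (400, 0.7, 1.2),
    "tired":   (300, 0.4, 2.5),
    "in pain": (550, 0.9, 0.6),
}

def classify_cry(pitch: float, loudness: float, burst: float) -> str:
    """Return the reference label whose features are closest to the input."""
    def distance(ref):
        p, l, b = ref
        # Scale each feature so no single one dominates the comparison.
        return (((pitch - p) / 100) ** 2
                + ((loudness - l) / 0.1) ** 2
                + ((burst - b) / 0.5) ** 2)
    return min(REFERENCE_CRIES, key=lambda label: distance(REFERENCE_CRIES[label]))

print(classify_cry(420, 0.75, 1.0))  # → hungry
```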
The Computer Chef
If you thought taste and smell would remain uniquely human, think again. IBM is developing a system that experiences flavor by breaking down ingredients to their molecular level and blending the chemistry of those molecules with what the system knows about human psychology relating to those flavors and smells. The system will be able to play chef, creating new, healthy flavor combinations. The company noted that its Jeopardy-winning Watson computer uses known information to answer a question with a fixed answer, while this system will create delicious dishes that have never been seen or tasted before.
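One simplified reading of "blending the chemistry of those molecules" is the flavor-pairing idea: ingredients that share aroma compounds may combine well. The compound lists below are illustrative stand-ins, not real chemistry data, and IBM's actual system is far more elaborate:

```python
# Toy sketch of pairing ingredients by shared flavor compounds. The
# compound sets are illustrative placeholders, not real chemistry data.
FLAVOR_COMPOUNDS = {
    "chocolate": {"pyrazine", "vanillin", "furaneol"},
    "coffee": {"pyrazine", "furaneol", "guaiacol"},
    "strawberry": {"furaneol", "linalool"},
    "blue cheese": {"methyl ketone"},
}

def pairing_score(a: str, b: str) -> int:
    """Count the flavor compounds two ingredients share; by this
    hypothesis, more overlap suggests a stronger pairing."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

def best_pairing(ingredient: str) -> str:
    """Return the other ingredient with the highest pairing score."""
    others = [i for i in FLAVOR_COMPOUNDS if i != ingredient]
    return max(others, key=lambda o: pairing_score(ingredient, o))

print(best_pairing("chocolate"))  # → coffee (shares pyrazine and furaneol)
```

Generating genuinely novel dishes, as the article describes, means searching combinations like these rather than retrieving a fixed answer the way Watson did on Jeopardy.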
And then there's smell. Using tiny sensors, your smartphone will be able to detect whether you're getting a cold by analyzing the odors, biomarkers and other molecules in your breath. The same technique will be able to diagnose or monitor liver or kidney disorders, asthma, diabetes and epilepsy. Such technology could also be used to "sniff" surfaces in a hospital, making sure they are sufficiently sanitized.
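In its most basic form, breath-based screening is threshold detection: flag any biomarker whose measured concentration exceeds a cutoff. The marker names and cutoffs below are placeholders; real diagnostic thresholds would come from clinical studies:

```python
# Sketch of threshold-based screening on breath biomarkers. Marker
# names and cutoffs are placeholders, not clinical values.
BIOMARKER_THRESHOLDS = {
    # marker: concentration (parts per billion) above which to flag
    "acetone": 1800,      # elevated in diabetes (placeholder cutoff)
    "ammonia": 1500,      # elevated in kidney disorders (placeholder cutoff)
    "nitric oxide": 35,   # elevated in asthma (placeholder cutoff)
}

def screen_breath(sample: dict) -> list:
    """Return the markers in a sample that exceed their flag threshold."""
    return [marker for marker, ppb in sample.items()
            if ppb > BIOMARKER_THRESHOLDS.get(marker, float("inf"))]

print(screen_breath({"acetone": 2100, "ammonia": 900}))  # → ['acetone']
```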
Brad Shimmin, an analyst with Current Analysis, noted that mobile devices in particular have already begun to have tremendous capabilities as front-ends to highly intelligent systems and as self-aware sensors that "know a lot about their environment." He said IBM's predictions are "not far-fetched, but are extensions into the mainstream as new applications of highly expensive implementations that often exist today."
While humans communicate through all of their senses, Shimmin said he expected sensory communication to be used primarily in one-on-one social communication, not one-to-many or many-to-many.