When Dr. Eduardo Torres-Jara talks about “sensitive robotics,” he’s not talking about robots that cry when you yell at them.

He’s referring to sensor technology, and how integrating it can create more powerful, more versatile robots. He explains that the concept ties into the artificial intelligence notion of “embodied intelligence.” For a physical body to be useful in contributing to intelligence, Eduardo explains, the robot needs to be sensitive to actual physical contact.

“Machine vision” and “machine learning” are generating far more buzz right now than “machine tactile sense,” but for Eduardo this is a crucial area of focus, and one that may be getting relatively little attention.

Eduardo explains that in an industrial environment, machine arms are very precise, but very dangerous when they come in contact with objects in their path. “They are heavy – because they need that to be precise – but they are also moving fast… and if they hit an obstacle, they are likely to break it, or to break themselves.” This is, in large part, because the robots are not sensitive to contact.

When it comes to explaining the kind of “tactile sense” that we humans take for granted, Eduardo uses an example we can all relate to. “You and I can easily get up in the middle of the night, in pitch black, find the remote control and point it at the TV and push the ‘on’ button. That’s it… no vision, all with the tactile feedback coming from your hands.”

He explains that the changes literally need to come at all levels – not just hardware – and not just software. In order to detect more sensitive feedback, new and versatile hardware is needed. In order to make “sense” (no pun intended) of that data, new software is needed. And that’s the work he’s undertaking at WPI.
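The software side of that hardware/software pairing can be pictured with a toy sketch: turning raw pressure readings into a contact signal. Everything here (function names, sensor layout, threshold values) is an illustrative assumption, not taken from Eduardo’s actual systems.

```python
# Toy sketch: detecting contact from a tactile sensor array.
# All names and threshold values are illustrative assumptions,
# not drawn from any real tactile-sensing hardware.

def detect_contact(pressure_readings, baseline, threshold=0.05):
    """Return indices of sensor elements whose reading rises
    above their resting baseline by more than `threshold`."""
    return [i for i, (p, b) in enumerate(zip(pressure_readings, baseline))
            if p - b > threshold]

# Simulated 1x8 strip of tactile elements (arbitrary pressure units)
baseline = [0.10] * 8
readings = [0.10, 0.11, 0.30, 0.42, 0.12, 0.10, 0.09, 0.10]

contacts = detect_contact(readings, baseline)
print(contacts)  # elements 2 and 3 register contact: [2, 3]
```

Even this trivial version shows why new software is needed alongside new sensors: the raw numbers only become useful once they are interpreted relative to a baseline and a context.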

He is now taking the idea of “sensitive manipulation” (hands and arms) and applying it to robot feet and bodies, and even to flying robots whose feathers can be used to “feel” the air. With feet, Eduardo explains that we humans rely on our feet far more than we consciously realize in order to remain standing and balanced, because we can feel pressure through the ground.
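The feet example can be made concrete with a hedged sketch: estimating the center of pressure from a few foot-sole sensors, the kind of feedback a balancing robot could act on. The sensor layout and force values below are invented for illustration, not taken from Eduardo’s robots.

```python
# Toy sketch: center of pressure (CoP) from four foot-sole sensors.
# Sensor positions and force values are illustrative assumptions.

def center_of_pressure(forces, positions):
    """Average of sensor positions, weighted by the force each measures."""
    total = sum(forces)
    if total == 0:
        return (0.0, 0.0)  # foot not loaded
    x = sum(f * px for f, (px, _) in zip(forces, positions)) / total
    y = sum(f * py for f, (_, py) in zip(forces, positions)) / total
    return (x, y)

# Four sensors at the corners of a foot (meters, foot-centered frame):
# two near the toes (positive y), two near the heel (negative y).
positions = [(-0.05, 0.10), (0.05, 0.10), (-0.05, -0.10), (0.05, -0.10)]
forces = [10.0, 10.0, 30.0, 30.0]  # more load on the heel sensors

cop = center_of_pressure(forces, positions)
print(cop)  # CoP shifted toward the heel: (0.0, -0.05)
```

A balance controller would treat a CoP drifting toward the edge of the foot as a cue to shift weight or step, which is roughly what our own foot-sole pressure sense lets us do without thinking.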

Where does the inspiration for sensor technology and applications come from? Nature, of course.

In my conversation with Eduardo, I told him that in the world of marketing, it’s usually not the best idea to reinvent the wheel. If you’re starting from scratch with a marketing goal, finding someone who’s doing it well and modeling them is usually a better idea than starting on a “guinea-pig tangent.” Dr. Torres-Jara likens this to what he’s doing in the lab. “I’m always fascinated by what humans and animals can do… and what we take for granted.”

When it comes to replicating a tactile sense, Eduardo models everything down to the ridges of human skin: “The skin has been built with layers… and we went to that level. We built a better sensor which was very functional because it included the ridges that you see on human finger tips… It’s not just the sensor but the structure that allows the biological systems to detect better. On top of that, not only are they sensing, but they provide a good interface that lets them interact with a variety of objects.” The human hand, for example, is neither too hard nor too soft to handle an egg, a stone, a branch, a feather, or a variety of other objects. A very dense surface makes for a difficult interface for handling objects, even if the sensors work perfectly. Finding the balance and intricacy that nature seems to exhibit so easily has been the focus of Eduardo’s work in the lab, and at his startup company, Robot Rebuilt.

Needless to say, keeping all of these considerations in mind and aiming to replicate the complexity of human and animal forms is no easy task, but Eduardo has seen a great deal of progress. Like many of the robotics experts I’ve brought on (from MIT and elsewhere), he attributes a good deal of this progress to cheap and increasingly effective sensors. “Because of this, we can finally put sensors where we should have had sensors long ago. Think about your car – it is two tons of metal that you drive around, but it hardly has any sensors.” Now, many cars use rear cameras to help with reversing, and many more use sonar to detect objects in front of or behind the vehicle during its movement. Eduardo and others see this increased integration of sensors as being a huge part of our transition toward an “intelligent” physical world.
