Stanford Students Learn to Enhance Computers and Robots with Touch


The woman’s eyes were closed, her hands at her sides, as she walked confidently toward a wall. Just before making contact, she jerked to a halt – “Oh!” A small vibration on her forehead had warned her of the obstacle just a few feet ahead.

Image caption: A user tries out Virtual Gear Shift, the brainchild of Tiger Sun, Jonathan Sosa and Brad Immel, who provided live sound effects to go along with the shifter’s haptic feedback. Sun said he was inspired in part because he’d missed the chance to learn how to drive a car with manual transmission. Virtual Gear Shift vibrates slightly when in gear and resists gear shifts until drivers let the clutch out.

Haptic Headband had done its job.

The device’s creators, Sarah Pinto, Bryce Huerta and Elina Thadhani, presented the headband at a Dec. 7 open house for Haptics: Engineering Touch, an introductory seminar for frosh taught by Allison Okamura, a professor of mechanical engineering. Haptics refers to touch-based interactions, which are central to much of Okamura’s research.

The group’s project involves ultrasonic distance sensors and miniature vibrating motors mounted on a headband, plus some prototype electronics stored in a waistpack. The idea, the team said, was to help blind people navigate their surroundings more easily. While Haptic Headband doesn’t yet give users very detailed information about what’s around them, “you can feel large objects coming toward you,” Thadhani said.
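The headband’s basic behavior, translating an ultrasonic distance reading into a vibration that grows stronger as an obstacle gets closer, can be sketched roughly as follows. This is an illustrative example only, not the team’s actual code; the function name, range, and thresholds are all assumptions.

```python
def vibration_strength(distance_cm: float, max_range_cm: float = 200.0) -> float:
    """Map an ultrasonic distance reading to a motor intensity (0.0 to 1.0).

    Closer obstacles produce stronger vibration; readings at or beyond
    max_range_cm (or invalid, non-positive readings) produce none.
    All values here are illustrative, not the team's calibration.
    """
    if distance_cm <= 0 or distance_cm >= max_range_cm:
        return 0.0
    return 1.0 - distance_cm / max_range_cm
```

In a real device, this intensity would typically drive the vibration motor via pulse-width modulation, with the loop reading the sensor many times per second.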

Image captions: The Virtual Gear Shifter created by Brad Immel, Jonathan Sosa and Tiger Sun. Grad student Laura Blumenschein observes Grace Zhao and Goli Emami’s project, which gives haptic feedback while navigating a maze, as Cara Welker, a grad student in bioengineering, tests the device. The inner workings of one of the devices at the freshman haptics class open house.

Haptics to assist, haptics to train

Okamura has taught the class twice before, always to students with little or no exposure to robotics or computer programming. This year, despite no explicit instructions to do so, each of the class’s six teams developed haptic devices to assist users with special tasks or teach them new skills. “The students are very interested in assistive technology and training,” said Okamura, who is also a member of Stanford Bio-X and the Stanford Neurosciences Institute.

In addition to Haptic Headband, three other teams built assistive haptic devices: a vibrating glove designed to give a kind of depth perception to people with blindness, a navigational wristband that vibrates when users are supposed to turn right or left, and a footbed fitted with pressure sensors that relay information to a vibrating armband.

Charlotte Peale, who designed the foot haptic device with Johnny Armenta and Jabreea Johnson, said that during high school she worked at a podiatry clinic, where she encountered many people with diabetes who had lost feeling in their feet. “We wanted to know, what could we do to help them with that?” Peale said.

Two other teams designed and built haptic training systems: a realistic manual gear shift and clutch to help train young drivers and a haptics-enhanced version of the game Operation, complete with forceps that respond to how tightly they’re gripped and shake when players make a mistake, designed to help teach new surgeons.

Okamura wanted to teach a course on haptics because, she said, “it has a unique interdisciplinary nature to it” that touches on engineering, human biology, social interaction and even ethics. By choosing students with little or no background in robotics, she thought the class would be especially open-minded and explore a wide range of ideas.


Copyright © 2018 I-Connect007. All rights reserved.