
MM: Robotic Helping Hands, And Navigation Systems For The Visually Impaired

In this episode, we examine robotic "Metalimbs" out of Japan, and MIT's new system that could help visually impaired individuals navigate their surroundings.

Robotic Helping Hands

Trying, and failing, to hold too many things at once, or juggling different tasks with only our two puny human arms, may soon be a thing of the past thanks to a recent innovation out of Japan.

A team at the Inami Hiyama Laboratory has created Metalimbs. Designed to be used while sitting, the extra robotic limbs are controlled by tracking markers on the user's legs and feet, which let the wearer steer the robotic wrist and elbow joints with their legs. Bending the toes causes the robotic hands to make a fist, letting the wearer shake hands or even hold a soldering iron.
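To make the control scheme concrete, here is a minimal sketch of the kind of mapping this implies: tracked knee and ankle angles drive a robotic elbow and wrist, and curling the toes closes the hand. All names, angle ranges, and thresholds are illustrative assumptions, not the Inami Hiyama Laboratory's actual software.

```python
# Illustrative sketch only: map a tracked leg/foot pose to a robotic arm command.
from dataclasses import dataclass

@dataclass
class FootPose:
    knee_angle_deg: float   # bend of the user's knee, from a tracking marker
    ankle_angle_deg: float  # pitch of the foot
    toe_flexion: float      # 0.0 = relaxed, 1.0 = fully curled

@dataclass
class ArmCommand:
    elbow_deg: float
    wrist_deg: float
    grip_closed: bool

def foot_to_arm(pose: FootPose, toe_threshold: float = 0.6) -> ArmCommand:
    """Map one leg's tracked pose onto one robotic arm (hypothetical mapping)."""
    # Knee bend steers the robotic elbow, ankle pitch steers the wrist;
    # curling the toes past a threshold closes the hand into a fist.
    return ArmCommand(
        elbow_deg=max(0.0, min(135.0, pose.knee_angle_deg)),
        wrist_deg=max(-90.0, min(90.0, pose.ankle_angle_deg)),
        grip_closed=pose.toe_flexion >= toe_threshold,
    )

if __name__ == "__main__":
    print(foot_to_arm(FootPose(knee_angle_deg=70.0, ankle_angle_deg=10.0, toe_flexion=0.8)))
```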

Metalimbs will be displayed at this year's SIGGRAPH Emerging Technologies showcase. Most people aren't as dexterous with their feet as they are with their hands, so some work is still needed before Metalimbs are completely comfortable to use. However, the tech has potential for anyone who's ever wished for an extra pair of hands.

SO, WHAT DO YOU THINK? 
Do you think Metalimbs could revolutionize work for engineers and scientists? In what ways could this technology be utilized in the manufacturing world? Tell us what you think in the comments below.

Navigation Systems For The Visually Impaired

Continued advancements in high-tech materials and gadgets — from robotic limbs to the self-driving car — hold tremendous promise for people with disabilities. But for the visually impaired, even the most sophisticated navigation systems have struggled to match the practicality and reliability of the humble white cane.

But researchers from the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory say that could be about to change.

They developed a new system that could give visually impaired individuals more detailed information about their environments, such as what types of objects are nearby or whether nearby seats are occupied by other people. The system incorporates a 3-D camera worn around the neck, which uses algorithms to quickly identify objects and surfaces.
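One rough sketch of a step such a system might perform is shown below: reducing a depth frame from the neck-worn 3-D camera to the nearest obstacle distance in a few horizontal sectors. This is an assumption about the general approach, not MIT CSAIL's actual algorithm.

```python
# Hypothetical preprocessing step: find the closest obstacle per horizontal sector.
import numpy as np

def nearest_by_sector(depth_m: np.ndarray, sectors: int = 3) -> list[float]:
    """Return the closest valid depth reading (in meters) for each sector.

    depth_m: 2-D array of per-pixel distances; zeros mean 'no reading'.
    """
    h, w = depth_m.shape
    results = []
    for s in range(sectors):
        band = depth_m[:, s * w // sectors:(s + 1) * w // sectors]
        valid = band[band > 0]
        results.append(float(valid.min()) if valid.size else float("inf"))
    return results

if __name__ == "__main__":
    frame = np.full((480, 640), 3.0)   # everything 3 m away...
    frame[200:300, 500:600] = 0.8      # ...except an obstacle on the right
    print(nearest_by_sector(frame))    # [3.0, 3.0, 0.8]
```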

A haptic belt, meanwhile, can alter its vibrations based on the wearer's proximity to objects, and a refreshable Braille display at the wearer's side can describe nearby objects. The device was specifically designed to avoid audio signals or other cues that could interfere with the wearer's remaining senses.
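Here is a hedged sketch of how per-sector obstacle distances (like those from the sketch above) could drive such a belt: closer obstacles produce stronger vibration on the corresponding motor. The distance range and intensity scale are illustrative guesses, not the values used in the MIT prototype.

```python
# Hypothetical distance-to-vibration mapping for a haptic belt.
def vibration_levels(distances_m: list[float],
                     near_m: float = 0.5,
                     far_m: float = 3.0) -> list[float]:
    """Map each sector's nearest-obstacle distance to an intensity in [0, 1]."""
    levels = []
    for d in distances_m:
        if d >= far_m:
            levels.append(0.0)              # nothing close: motor off
        elif d <= near_m:
            levels.append(1.0)              # very close: full strength
        else:
            levels.append((far_m - d) / (far_m - near_m))  # linear ramp in between
    return levels

if __name__ == "__main__":
    print(vibration_levels([3.0, 1.75, 0.8]))  # [0.0, 0.5, 0.88]
```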

In two separate trials, one in which wearers looked for chairs and another in which they tried to avoid hitting passersby with a cane, the MIT system proved more effective than the cane alone.

SO, WHAT DO YOU THINK? 
Could this type of automatic navigation system eventually impact manufacturing or other industries? Tell us your thoughts in the comments below.
