MM: Driving With Dragonfly Vision; Capturing Human Movement Energy

In this Manufacturing Minute episode, an ultra-thin material to capture human movement energy and driving with dragonfly vision.

Capturing Human Movement Energy

Tired of the anxiety a dying cellphone produces in you? Then Vanderbilt University’s research might be of some interest. That’s because they’re developing an ultra-thin material that can generate electricity from everyday motions like sitting down. If it works out, we could one day have clothing that charges our phones.

Made with layers of black phosphorus, each just a few atoms thick, the material creates a small electric charge when it’s bent or squeezed. Thanks to its thin profile, the material can be “impregnated into textiles without affecting the fabric’s look or feel and it can extract energy from movements that are slower than 10 Hz.” Similar systems work best at frequencies of over 100 Hz. The material can produce 40 microwatts of power per square foot, and maintains a constant current through extremely slow movements.
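To put that 40-microwatts-per-square-foot figure in perspective, a quick back-of-envelope calculation shows how much energy a garment might harvest in a day. The fabric area and hours worn below are illustrative assumptions, not figures from the research:

```python
# Back-of-envelope estimate of energy harvested by the material,
# using the 40 uW per square foot figure cited above.

POWER_DENSITY_UW_PER_SQFT = 40  # microwatts per square foot (from the article)

def harvested_energy_mwh(area_sqft: float, hours: float) -> float:
    """Energy harvested in milliwatt-hours for a fabric area worn for `hours`."""
    power_uw = POWER_DENSITY_UW_PER_SQFT * area_sqft
    return power_uw * hours / 1000.0  # microwatt-hours -> milliwatt-hours

# Assume a shirt with roughly 10 sq ft of fabric, worn 16 hours a day
print(harvested_energy_mwh(10, 16))  # 6.4 mWh per day
```

A few milliwatt-hours a day is modest next to a typical phone battery, which is why low-power wearables and trickle charging are the more natural early applications.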

The harvested energy could charge phones, power LCD-embedded clothing, or run wearable virtual- or augmented-reality controllers.


Could an energy-harvesting system like this help power our future? How could the technology be used in a manufacturing setting? Tell us what you think by leaving your comments below.

Driving With Dragonfly Vision

Researchers believe that robots' visual recognition capabilities could soon be dramatically enhanced with insights gained from a common outdoor companion: the dragonfly.

Scientists from the University of Adelaide in Australia and Lund University in Sweden said that despite dramatic differences in the complexity of mammal and insect brains, the dragonfly, like humans, can anticipate where an object in motion is likely to move next.

Their study found that one particular species features brain cells that allow the dragonfly to pursue and catch flying prey by focusing on and tracking a small object in an otherwise complex background.

Neurons in the insect's brain showed increased responses in an area just in front of the moving object.

And if the object disappeared, the focus area spread forward to allow the brain to predict where it would reappear based on previous flight paths.
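The predictive behavior described above can be sketched in code. This is an illustrative constant-velocity extrapolation, not the researchers’ neural model: estimate an object’s velocity from its recent positions, then project where it should be after it briefly disappears.

```python
# Minimal sketch of predictive tracking: extrapolate an object's next
# position from its last two observed positions, assuming constant velocity.
# Illustrative only -- not the dragonfly study's actual neural model.

def predict_position(track, steps_ahead=1):
    """Linearly extrapolate a future position from the last two observations.

    `track` is a list of (x, y) positions sampled at equal time intervals.
    """
    (x1, y1), (x2, y2) = track[-2], track[-1]
    vx, vy = x2 - x1, y2 - y1  # displacement per time step
    return (x2 + vx * steps_ahead, y2 + vy * steps_ahead)

# An object moving right and slightly up vanishes behind an occluder;
# predict where it should reappear three time steps later.
path = [(0, 0), (2, 1), (4, 2)]
print(predict_position(path, steps_ahead=3))  # (10, 5)
```

Real tracking systems use richer motion models, but the core idea the neurons appear to implement, projecting a focus region forward along the object’s prior path, is the same.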

Researchers said the study shows how even single neurons can make advanced predictions — and said it could eventually be applied to artificial control and vision systems, such as self-driving vehicles or bionic vision.


Is this the missing link between current driving systems and fully autonomous driving? Tell us your thoughts in the comments below.
