99p camera behind intuitive prosthetic hand responses

Intuitive bionic hands fitted with a 99p camera are enabling prosthetic wearers to grasp objects as responsively as they would with a real limb, developers have revealed.

Trialling the “hands with eyes” technology for the first time, biomedical engineers from Newcastle University, England, say the cheap camera instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand.

This automatic method bypasses the usual processes that require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb.

A small number of amputees have already trialled the new technology and now the Newcastle University team are working with Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the prosthetic to patients at Newcastle’s Freeman Hospital.

Dr Kianoush Nazarpour, from the university’s Biomedical Engineering department, said: “Prosthetic limbs have changed very little in the past 100 years; the design is much better and the materials are lighter and more durable, but they still work in the same way.

"Using computer vision, we have developed a bionic hand which can respond automatically. In fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.

"Responsiveness has been one of the main barriers to artificial limbs. For many amputees, the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison.

"Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking."

Current prosthetic hands are controlled via myoelectric signals (electrical activity recorded from muscles in the residual limb), so they take practice and time to control properly.
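To illustrate the conventional approach the camera replaces, here is a minimal sketch of threshold-based myoelectric control, assuming a single pre-recorded EMG channel held in a NumPy array; the window length, threshold value and open/close command names are illustrative assumptions, not details from the study.

```python
import numpy as np

# Illustrative parameters; real systems tune these per user.
WINDOW = 200           # samples per decision window (e.g. 100 ms at 2 kHz)
CLOSE_THRESHOLD = 0.15 # normalised EMG envelope level that triggers a grip

def emg_envelope(window: np.ndarray) -> float:
    """Mean absolute value: a common, simple EMG amplitude estimate."""
    return float(np.mean(np.abs(window)))

def myoelectric_commands(emg: np.ndarray) -> list[str]:
    """Map each window of raw EMG to an open/close command."""
    commands = []
    for start in range(0, len(emg) - WINDOW + 1, WINDOW):
        level = emg_envelope(emg[start:start + WINDOW])
        commands.append("close" if level > CLOSE_THRESHOLD else "open")
    return commands
```

In practice the user must learn to produce reliable muscle contractions for each command, which is the training burden the camera-based approach aims to remove.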

For this study, lead author Ghazal Ghazaei, a PhD student in the university’s School of Electrical and Electronic Engineering, used neural networks: she showed a computer numerous images of objects, teaching it to recognise the ‘grip’ needed for different objects seen from various angles and orientations, against different backgrounds and in different light conditions.

"So the computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up,” she said.

“It is this which enables it to accurately assess and pick up an object which it has never seen before — a huge step forward in the development of bionic limbs.”
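The study’s own network is described in the paper cited below; as a rough illustration only, the following is a generic convolutional grasp classifier in PyTorch. The architecture, image size and training data here are stand-in assumptions, not the authors’ design; the four class names match the grasp types listed next.

```python
import torch
import torch.nn as nn

GRASPS = ["palm_wrist_neutral", "palm_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    """Toy CNN mapping a 64x64 RGB object image to one of four grasp classes."""
    def __init__(self, num_classes: int = len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# One illustrative training step on random placeholder data.
model = GraspNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)            # stand-in for photographed objects
labels = torch.randint(0, len(GRASPS), (8,))  # stand-in grasp-type labels
loss = loss_fn(model(images), labels)
optimiser.zero_grad()
loss.backward()
optimiser.step()
```

Training on many angles, backgrounds and lighting conditions, as the article describes, is what lets a classifier like this generalise to objects it has never seen.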

The team programmed the hand to perform four different “grasps”: palm wrist neutral; palm wrist pronated; tripod; and pinch.

Using the camera fitted to the prosthesis, the hand “sees” an object, picks the most appropriate grasp and sends a signal to the hand, all within milliseconds.
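Continuing the sketch above, the run-time loop might look like the following, reusing the GraspNet model and GRASPS list already defined and assuming an OpenCV camera; send_to_hand() is a hypothetical stand-in, since the article does not describe the real signalling path to the prosthesis.

```python
import cv2
import torch

def preprocess(frame) -> torch.Tensor:
    """Resize a BGR camera frame and normalise it for the classifier."""
    frame = cv2.resize(frame, (64, 64))
    tensor = torch.from_numpy(frame).float().permute(2, 0, 1) / 255.0
    return tensor.unsqueeze(0)  # add a batch dimension

def send_to_hand(grasp: str) -> None:
    """Hypothetical stand-in for the prosthesis control interface."""
    print(f"commanding grasp: {grasp}")

camera = cv2.VideoCapture(0)  # the cheap camera mounted on the prosthesis
ok, frame = camera.read()
if ok:
    model.eval()
    with torch.no_grad():
        logits = model(preprocess(frame))
    send_to_hand(GRASPS[int(logits.argmax())])
camera.release()
```

A single forward pass through a small network like this runs in milliseconds on modest hardware, which is consistent with the response time the article describes.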

“The beauty of this system is that it’s much more flexible and the hand is able to pick up novel objects, which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before,” said Dr Nazarpour.

The work is part of a larger research project to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain.

The research was funded by the Engineering and Physical Sciences Research Council (EPSRC).

Ghazaei G, Alameer A, Degenaar P et al. Deep learning-based artificial vision for grasp classification in myoelectric hands. Journal of Neural Engineering. 14(3): 036025, 2017.
