
From Amputee to Cyborg with this AI-Powered Hand 🦾

by Louis Bouchard, April 12th, 2021

Too Long; Didn't Read

Researchers used AI models based on recurrent neural networks (RNNs) to read and accurately decode an amputee's intent to move individual fingers from peripheral nerve activity. The AI models are deployed on an NVIDIA Jetson Nano as a portable, self-contained unit. With this AI-powered nerve interface, the amputee can control a neuroprosthetic hand with life-like dexterity and intuitiveness. I think it is one of the most exciting applications of AI out there, and it can change the lives of many people.

Researchers used AI models based on recurrent neural networks (RNNs) to read and accurately decode an amputee's intent to move individual fingers from peripheral nerve activity. The AI models are deployed on an NVIDIA Jetson Nano as a portable, self-contained unit. With this AI-powered nerve interface, the amputee can control a neuroprosthetic hand with life-like dexterity and intuitiveness.

Watch the video

►Subscribe to my newsletter:

References

[1] Nguyen & Drealan et al. (2021), A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control

[2] Luu & Nguyen et al. (2021), Deep Learning-Based Approaches for Decoding Motor Intent from Peripheral Nerve Signals

[3] Nguyen et al. (2021), Redundant Crossfire: A Technique to Achieve Super-Resolution in Neurostimulator Design by Exploiting Transistor Mismatch: //experts.umn.edu/en/publications/redundant-crossfire-a-technique-to-achieve-super-resolution-in-ne

[4] Nguyen & Xu et al. (2020), A Bioelectric Neural Interface Towards Intuitive Prosthetic Control for Amputees

Video Transcript

In this video, I will talk about a randomly picked application of transformers from the 600 new papers published this week, adding nothing much to the field but improving the accuracy by 0.1 percent on one benchmark by tweaking some parameters. I hope you are not too excited about this introduction, because that was just to mess with transformers' recent popularity. Of course, they are awesome and super useful in many cases, and most researchers are focusing on them, but other things exist in AI that are as exciting, if not more. You can be sure I will cover exciting advancements of the transformer architecture applied to NLP, computer vision, and other fields, as I think it is very promising, but covering these new papers making slight modifications to them is not as interesting to me. Just as an example, here are a couple of papers shared in March applying transformers to image classification, and since they are all quite similar and I already covered one of them, I think it is enough to have an overview of the current state of transformers in computer vision.

Now let's enter the real subject of this video, which is nothing related to transformers or even GANs in this case. No hot words at all, except maybe cyberpunk. And yet, it's one of the coolest applications of AI I've seen in a while. It attacks a real-world problem and can change the lives of many people. Of course, it's less glamorous than changing your face into an anime character or a cartoon, but it's much more useful. I present you the portable, self-contained neuroprosthetic hand with deep learning-based finger control by Nguyen & Drealan et al.

Before diving into it, I just wanted to remind you of the free NVIDIA GTC event happening next week, with much exciting news related to AI, and the Deep Learning Institute giveaway I am running if you subscribe to my newsletter. If you are interested, I talked about this giveaway in much more detail in my previous video. Also, I just wanted to announce that, from now on, all new YouTube members will have a specific role on my Discord channel as a thank you for your support.

Now let's jump right into this unique and amazing paper. This new paper applies deep learning to a neuroprosthetic hand to allow real-time control of individual finger movements, all done directly within the arm itself, with as little as 50 to 120 milliseconds of latency and up to 99% accuracy. An arm amputee who lost his hand 14 years ago can move his cyborg fingers just like a normal hand. This work shows that the deployment of deep neural network applications embedded directly on wearable biomedical devices is not only possible but also extremely powerful. Here, deep learning is used to process and decode nerve data acquired from the amputee to obtain dexterous finger movements.

The problem here is that, in order to be low latency, this deep learning model has to run on a portable device with much lower computational power than our GPUs. Fortunately, there have been recent developments in compact hardware for deep learning that fix this issue. In this case, they use the NVIDIA Jetson Nano module, specifically designed to deploy AI in autonomous applications. It allowed the use of GPUs and powerful libraries like TensorFlow and PyTorch inside the arm itself.
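As a side note, here is a minimal, hypothetical sketch of what packaging a PyTorch model for an embedded device like the Jetson Nano can look like, using TorchScript. The stand-in model, file name, and shapes are placeholders for illustration, not the authors' actual pipeline:

```python
import torch
import torch.nn as nn

# A stand-in model; the real decoder is described later in the transcript.
model = nn.Sequential(nn.Linear(512, 64), nn.ReLU(), nn.Linear(64, 5))
model.eval()

# Trace the forward pass and save a self-contained TorchScript file.
scripted = torch.jit.trace(model, torch.randn(1, 512))
scripted.save("finger_decoder.pt")

# On the embedded device, the file loads without the original Python class.
restored = torch.jit.load("finger_decoder.pt")
print(restored(torch.randn(1, 512)).shape)  # torch.Size([1, 5])
```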
As they state, this offers the most appropriate trade-off among size, power, and performance for their neural decoder implementation, which was the goal of this paper: to address the challenge of efficiently deploying deep learning neural decoders on a portable device for real-life applications, towards long-term clinical use. Of course, there are a lot of technical details that I will not enter into, as I am not an expert, like how the nerve fibers and bioelectronics connect together, the microchip designs that allow simultaneous neural recording and stimulation, or the implementation of software and hardware to support this real-time motor decoding system. You can read a great explanation of these in their papers if you'd like to learn more about it. They are all linked in the description of the video. But let's dive a little more into the deep learning side of this insane creation.

Here, their innovation leaned towards optimizing the deep learning motor decoding to reduce the computational complexity as much as possible on this Jetson Nano platform. This image shows an overview of the data processing flow on the Jetson Nano. At first, the data, in the form of peripheral nerve signals from the amputee's arm, is sent into the platform. Then, it is preprocessed. This step is crucial: it cuts the raw input neural data into trials and extracts their main features in the temporal domain before feeding them to the models. This preprocessed data corresponds to the main features of one second of past neural data from the amputee, cleaned from all noise sources. Then, this processed data is sent into the deep learning model to produce a final output controlling each finger's movement. Note that there are five outputs, one for each finger.

To quickly go over the model they used, as you can see, it starts with a convolutional layer. This is used to identify different representations of the data input. In this case, you can see the 64, meaning that there are 64 convolutions made using different filters, so 64 different representations. These filters are the network parameters learned during training to correctly control the hand when finally deployed. Then, we know that time is very important in this case, since we want fluid movements of the fingers, so they opt for gated recurrent units, or GRUs, to represent this time-dependency aspect when decoding the data. GRUs allow the model to understand what the hand was doing in the past second, which is first encoded, and what it needs to do next, which is decoded, to stay simple. GRUs are just an improved version of recurrent neural networks, or RNNs, solving the computational problems RNNs had with long inputs by adding gates that keep only the relevant information from past inputs in the recurrent process instead of washing it out with every new input. They basically allow the network to decide what information should be passed to the output. As in recurrent neural networks, the one-second data here, in the form of 512 features, is processed iteratively using the repeated GRU blocks. Each GRU block receives the input at the current step and the previous output to produce the following output.
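To make that recurrence concrete, here is a tiny, hypothetical PyTorch sketch of a GRU step loop, where each step takes the current input and the previous output. The sizes here are illustrative, not the paper's:

```python
import torch
import torch.nn as nn

# Each GRU step receives the current input and the previous hidden state;
# its gates decide how much of the past to keep. Sizes are illustrative.
cell = nn.GRUCell(input_size=64, hidden_size=128)

inputs = torch.randn(10, 1, 64)   # ten time steps of hypothetical features
hidden = torch.zeros(1, 128)      # the "previous output", empty at the start
for step in inputs:
    hidden = cell(step, hidden)   # current input + previous output -> new output
print(hidden.shape)               # torch.Size([1, 128])
```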
We can see GRUs as an optimization of the basic recurrent neural network architecture. Finally, this decoded information is sent to linear layers, basically just propagating the information and condensing it into probabilities for each individual finger. They studied many different architectures, as you can read in their paper, but this is the most computationally effective model they could make, yielding an incredible accuracy of over 95 percent for the movement of the fingers.

Now that we have a good idea of how the model works and know that it's accurate, some questions are still left, such as: how does the person using it feel about it? Does it feel real? Does it work? In short, is this similar to a real arm? As the patient himself said: "I feel like once this thing is fine-tuned like the finished products that are out there, it will have more lifelike functions, to be able to do everyday tasks without thinking of what position the hand is in or what mode I have the hand programmed in. It's just like, if I want to reach and pick up something, I just reach and pick up something, knowing that it's just like my able hand for every function. I think we will get there. I really do."

Please just take one more minute of your time to watch this short, touching video where the amputee uses the hand and shares his honest feedback: "Is it pleasurable playing with it?" "Oh yeah. It's just really cool. Like, this is... this is crazy cool."

To me, these are the most incredible applications that we can work on with AI. It directly helps real people improve their quality of life, and there's nothing better than that. I hope you enjoyed watching this video, and don't forget to subscribe to the channel to stay up to date with artificial intelligence news. Thank you for watching, and as he just said in the video, I will say the same about AI in general: this is crazy cool.
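To recap the decoding pipeline described above, here is a minimal, hypothetical PyTorch sketch of a Conv → GRU → Linear decoder with five finger outputs. Beyond the details mentioned in the video (64 convolution filters, 512 input features, five outputs), all dimensions and layer choices are assumptions, not the authors' exact architecture:

```python
import torch
import torch.nn as nn

class FingerDecoder(nn.Module):
    """Hypothetical Conv -> GRU -> Linear decoder; dimensions are illustrative."""
    def __init__(self, n_features=512, n_filters=64, hidden_size=128, n_fingers=5):
        super().__init__()
        # 64 convolution filters: 64 learned representations of the input window
        self.conv = nn.Conv1d(n_features, n_filters, kernel_size=3, padding=1)
        # GRU blocks model the time dependency across the one-second window
        self.gru = nn.GRU(n_filters, hidden_size, batch_first=True)
        # Linear head condenses the last GRU output into one value per finger
        self.head = nn.Linear(hidden_size, n_fingers)

    def forward(self, x):
        # x: (batch, time_steps, n_features) of preprocessed nerve features
        x = self.conv(x.transpose(1, 2)).transpose(1, 2)  # convolve along time
        out, _ = self.gru(x)                              # iterate over time steps
        return torch.sigmoid(self.head(out[:, -1]))      # probability per finger

decoder = FingerDecoder()
window = torch.randn(1, 10, 512)  # fake one-second window: 10 steps, 512 features
print(decoder(window))            # five values, one per finger
```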


