Artificial intelligence that translates sign language

Student develops Artificial Intelligence that translates sign language in real time

By Alexandre Marques
Gabriel Sales, a programmer and statistics student at UFF, aims to revolutionize the way deaf people communicate through Libras, the Brazilian Sign Language. Read on!

In today's technology landscape, no subject draws as much speculation about new innovations as Artificial Intelligence. It is amid this competitive scenario that Gabriel Sales, a statistics student at UFF in Rio de Janeiro, stands out as a visionary by leading an Artificial Intelligence project that translates sign language. His project seeks to break down communication barriers between deaf and hearing people, offering an effective way to transcribe sign language into Portuguese in real time.

According to Gabriel, the main objective of this AI is to revolutionize the way deaf people communicate, providing accessibility, independence, and opportunities for the community. The project, still in development, has already shown that it can interpret the signs deaf people make in videos, translating them into Portuguese instantly.

How does AI for deaf people work?

The project uses three distinct artificial intelligences to achieve its objective. The first is responsible for capturing key points on the body, mapping them across the frames of a video. This data is then sent to the second AI, a classifier that identifies the specific sign being made. The third AI then contextualizes the identified signs, transforming the list of predicted signs into a readable, understandable message in Portuguese.
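In code, the three-stage flow described above can be sketched roughly as follows. Every function name, sign label, and dictionary here is a simplified stand-in (the project's actual models and vocabulary are not public): the key-point extractor, classifier, and translation step are reduced to toy lookups so the pipeline's shape is visible.

```python
# Toy sketch of the three-stage pipeline: key points -> sign labels -> sentence.
# All names, labels, and mappings are illustrative, not from the real project.
from typing import List

def extract_keypoints(frames: List[str]) -> List[List[float]]:
    """Stage 1: map body key points (hands, face, arms) for each video frame.
    Frames are stubbed as strings; key points as dummy coordinate vectors."""
    return [[float(len(frame))] for frame in frames]

def classify_signs(keypoint_seq: List[List[float]]) -> List[str]:
    """Stage 2: a classifier turns key-point sequences into sign labels.
    A real model (e.g. a recurrent network) would replace this lookup."""
    lookup = {3.0: "EU", 5.0: "QUERER", 4.0: "AJUDA"}
    return [lookup.get(kp[0], "?") for kp in keypoint_seq]

def contextualize(labels: List[str]) -> str:
    """Stage 3: assemble the predicted sign gloss into readable Portuguese."""
    gloss_to_pt = {"EU": "eu", "QUERER": "quero", "AJUDA": "ajuda"}
    return " ".join(gloss_to_pt.get(l, l) for l in labels).capitalize() + "."

frames = ["abc", "abcde", "abcd"]  # stand-ins for video frames
sentence = contextualize(classify_signs(extract_keypoints(frames)))
print(sentence)  # "Eu quero ajuda."
```

Each stage only needs the previous stage's output, which is what lets the three models be developed and improved independently.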

The potential of this project is vast. In general, all communication between deaf people and those who struggle to interpret Libras would be simplified into a simultaneous translation, removing the noise from the dialogue. If effective, the technology could solve communication problems affecting roughly 10 million deaf citizens in Brazil, about 5% of the country's population, according to IBGE.

However, Gabriel Sales faces challenges in his project, such as hardware and resource limitations for collecting essential data. Through an online crowdfunding campaign (vakinha), he is seeking contributions to acquire a more powerful computer, high-quality cameras, and resources to advance the project's research and development.

The project, led solely by Gabriel Sales, not only reflects his passion for technology but also highlights the power of artificial intelligence to create innovative solutions with significant social impact. By posting update videos about his AI on Instagram (@projeto_ia_libras), Gabriel has already gained more than 86,100 followers, along with thousands of views on his posts.

Interview with the creator of the AI

To better understand how this AI transcribes dialogues involving sign language in real time, we interviewed the programmer and creator of the project, statistics student Gabriel Sales.

Who is the programmer and UFF statistics student behind the project? And how did you arrive at this idea?

Gabriel Sales: My name is Gabriel, I'm from Rio de Janeiro, and I'm a student passionate about technology. I got my first computer when I was ten and have been into computing ever since. I started studying programming at twelve and progressed until I got into data science. When I discovered the field I thought it was really cool, especially the artificial intelligence part, where machines think in an almost human way; that fascinated me. So I started studying these things. I entered the statistics program at UFF, since statistics is an important foundation for data science, and began my journey in artificial intelligence.

I started to delve deeper into AI and began doing several small projects. Then I had a Libras class at my college, in the first semester of last year, and I was fascinated by the teacher's story, because she was deaf and had managed to earn a doctorate despite all the difficulties she described.

So I started thinking about what I could do that was interesting and inclusive for this community, to help them. And then I started with an idea for an artificial intelligence that could classify the Libras alphabet: letter A, letter B, letter C, and so on, all via video. That was my first idea, my first test to see how difficult and complex this would be. Then I evolved it, adding more robust signs and improving the AI's knowledge.

How do you define your project? What do you intend to change in the current reality, especially for the hearing impaired, with it?

Gabriel Sales: The main objective is to revolutionize communication between deaf and hearing people, because there is a huge barrier between them. You necessarily need to know Libras to be able to communicate with deaf people, so with this artificial intelligence system you can break down that barrier a little. We can apply this in companies, in customer service, in digital accessibility. A deaf client can clearly say what they want, whether they want to buy something or need special assistance, and everything can be done digitally with AI, without relying on a Libras interpreter.

Also in hospitals, in emergency situations where the person needs quick care and sometimes there's no one around who knows Libras, right? Sometimes you need to know the person's blood type, whether they have any illness, things like that, and with this system that would also be possible.

In education too, to teach Libras: the system will be very efficient at testing students' accuracy, whether they are making the signs correctly, whether they are learning well. And it would also be useful for deaf teachers, since some deaf people can develop the ability to speak, but not all of them can. So there are several possible applications.

AI project aims to change the way hearing impaired people communicate. Photo: Freepik.

How do you describe how this AI works?

Gabriel Sales: There are three AIs. One captures the key points of the body: it maps the person's entire body, the face, the shoulders, the elbows, the hands, the fingers, and does this across the frames, because we need to do this on video. So this mapping happens over the course of the frames. After that, it is passed to a classifier, which classifies this key-point data and tells you which sign it is.

And the last AI creates the context of the signs. After the body has been mapped and each sign classified, it takes the list of predicted signs and transforms it into a natural message that is readable and understandable. So this AI assembles the context of the signs already predicted by the second AI.
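The frame-by-frame mapping Gabriel describes in the first stage can be illustrated with a small sketch. The landmark names and coordinates below are invented for illustration; a real system would rely on a pose-estimation model rather than these stubbed values.

```python
# Illustrative sketch of stage 1: per-frame key-point mapping over a video.
# Landmark names and coordinates are hypothetical stand-ins.
from typing import Dict, List, Tuple

Landmarks = Dict[str, Tuple[float, float]]

def map_frame(frame_index: int) -> Landmarks:
    """Stub: return (x, y) positions for a few named body landmarks."""
    x = 0.25 * frame_index  # pretend the right hand drifts as the sign is made
    return {
        "right_wrist": (0.5 + x, 0.4),
        "left_wrist": (0.25, 0.5),
        "nose": (0.5, 0.125),
    }

def map_video(num_frames: int) -> List[Landmarks]:
    """Map key points for every frame, preserving temporal order so the
    classifier can see how the sign moves over time."""
    return [map_frame(i) for i in range(num_frames)]

sequence = map_video(3)
print(len(sequence), sequence[2]["right_wrist"])
```

Keeping the per-frame results in order matters: a sign is defined by motion, so the classifier in the next stage consumes the whole sequence, not individual frames.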

As a whole, is your project unique, or are there similar projects involving AI to assist deaf people in this way?

Gabriel Sales: There must be similar projects, for sure. But none of them went ahead, perhaps due to a lack of investment in Brazil. People end up going abroad, especially because of easier access to technology. But I don't know if the other projects are fully developed; there may be prototypes.

In November 2023, Showmetech reported that Lenovo intends to launch, in 2024, an AI capable of interpreting and translating sign language.

In that case, for you, what is your project's biggest differentiator?

Gabriel Sales: The ability to scale: it can scale very easily with investment and be developed faster... The main differentiator is that we overcome the communication barrier for deaf people, which will help in education, the economy, culture, and politics.

What would you need to make the project scale?

Gabriel Sales: I need data, because the AI needs videos to be trained... It's like teaching a child: we need to show it the sign and repeat it several times until it understands. And I also need equipment to process it all, since we will be building a big data set this way.

Gabriel launched, at the end of 2023, an online crowdfunding campaign (vakinha) so that other people could support the development of the project.

What are your next steps with the project?

Gabriel Sales: I've been looking for a partner company for a while; there's one that has been helping me with my website. And now I'm looking for partnerships with influential people in this market, to maybe open a startup, get investors, and then scale more easily. The idea is to sell to companies that want digital accessibility to offer their customers.

How do you protect your AI from being “plagiarized”?

Gabriel Sales: When it comes to artificial intelligence, it's difficult to plagiarize. You can do something similar, but doing the exact same thing is very hard. It's like Elon Musk making an AI similar to ChatGPT: he can do it, but the two will give different answers. What makes the difference is the available data, and for Libras it is scarce. It needs investment in data collection.

What tests do you run to verify that the AI is correctly interpreting signs in Libras?

Gabriel Sales: I'm running tests in real time, making different signs to check the accuracy. I apply the little Libras I learned in the college course, and I watch videos on the internet to develop the AI's training.


Sources: Milne, Forbes, and Smart Click.

Reviewed by Glaucon Vital on 18/1/24.
