Bumble Open Sources AI To Detect Unsolicited Nudes

There has never been a more important time to protect the safety and security of users online. As technology permeates our everyday lives, we have to keep pace with its advances and put them to good use. That is why it is so encouraging that Bumble recently open-sourced the AI it uses to detect unsolicited nudes on its platform.

Through this move, Bumble is championing digital safety and lending its technical knowledge so that developers worldwide can implement similar systems on their own platforms. Whether used as inspiration or applied directly, the release marks real progress toward safer digital spaces for all users.

In keeping with its pledge to oppose “cyber flashing,” Bumble is making its artificial-intelligence application, which can identify unwelcome explicit images, available to the public.

First introduced in 2019, Private Detector (let’s take a second to appreciate the name) can conceal nudity sent via the Bumble app, leaving the recipient to decide whether or not to open the picture.
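To make that mechanic concrete, here is a minimal sketch of the conceal-until-consent pattern in Python, using Pillow. This is not Bumble’s implementation; the file names, blur radius, and reveal flow are illustrative assumptions.

```python
# A minimal sketch of the "conceal until the recipient opts in" pattern.
# Not Bumble's code: the blur radius and file names are illustrative.
from PIL import Image, ImageFilter

def conceal(src: str, dst: str, radius: int = 40) -> None:
    """Save a heavily blurred copy to display in place of a flagged image."""
    Image.open(src).filter(ImageFilter.GaussianBlur(radius)).save(dst)

def reveal(src: str) -> Image.Image:
    """Return the original image once the recipient chooses to view it."""
    return Image.open(src)

# The app would show the blurred copy first and call reveal() only on tap.
conceal("flagged_photo.jpg", "flagged_photo_blurred.jpg")
```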

In its press release, Bumble notes that even though a mere 0.1% of users send inappropriate images via its apps, its expansive network lets it assemble an unusually strong dataset of both lewd and non-lewd pictures, which the company says is intended to produce the best possible results for the task.

A new and improved version of the AI is now available on GitHub for commercial use, distribution, and modification. Building a model that can identify nude images is no longer cutting-edge work, but a ready-made release could benefit smaller companies that lack the time or resources to develop one themselves.
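For a sense of what adopting the release might look like, here is a minimal inference sketch in Python. It assumes the published checkpoint loads as a standard TensorFlow SavedModel that takes one RGB image and emits a single lewd-probability score; the model path, 480-pixel input size, normalization, and 0.5 threshold are all illustrative assumptions rather than Bumble’s documented interface, so the repository’s README is the authority on the real preprocessing.

```python
# Hedged sketch: screening an image with an open-sourced nudity classifier.
# Assumptions (verify against the GitHub README): the checkpoint is a
# TensorFlow SavedModel, the input is a 480x480 RGB float image, and the
# output is a single lewd probability. Paths and threshold are illustrative.
import tensorflow as tf

MODEL_DIR = "private_detector/saved_model"  # hypothetical local path
THRESHOLD = 0.5                             # illustrative cutoff

def load_image(path: str, size: int = 480) -> tf.Tensor:
    """Decode a JPEG and scale it to the shape the classifier expects."""
    img = tf.image.decode_jpeg(tf.io.read_file(path), channels=3)
    img = tf.image.resize(img, (size, size))
    img = tf.cast(img, tf.float32) / 255.0  # assumed [0, 1] normalization
    return tf.expand_dims(img, axis=0)      # add a batch dimension

model = tf.saved_model.load(MODEL_DIR)

def is_lewd(path: str) -> bool:
    """Return True when the classifier scores the image above THRESHOLD."""
    # Assumes the SavedModel is directly callable; otherwise use
    # model.signatures["serving_default"] instead.
    outputs = model(load_image(path))
    return float(tf.reshape(outputs, [-1])[0]) >= THRESHOLD

if __name__ == "__main__":
    if is_lewd("incoming_photo.jpg"):
        print("Blur the image and let the recipient decide to reveal it.")
    else:
        print("Deliver the image as usual.")
```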

Dating apps, and indeed any product where people can send each other images over the internet, could use this technology to protect users from unsolicited explicit content.

Since releasing Private Detector, Bumble has collaborated with legislators in the United States to introduce legal repercussions for sending unsolicited nude images.

Bumble says:

“There’s a need to address this issue beyond Bumble’s product ecosystem and engage in a larger conversation about how to address the issue of unsolicited lewd photos — also known as cyberflashing — to make the internet a safer and kinder place for everyone.”

Bumble announced that the AI had an accuracy rate of 98% when it was first introduced.

Bumble’s decision to open-source its unsolicited nude detector AI is a step in the right direction for the company. It shows they are willing to share their technology with others to help make the internet safer. This move will also help advance AI research and development, which everyone should welcome.

Source: TechCrunch