The Bumble dating app is rolling out a new artificial intelligence system programmed to detect, censor, and flag nude, lewd, and otherwise sexually explicit photos.
Called the “Private Detector,” the system detects inappropriate photos with a claimed accuracy of 98 percent. The dating app hopes the AI will keep its users from seeing unwanted lewd photos.
When an inappropriate photo is shared, the Private Detector flags the snapshot and blurs it before it is delivered to the recipient.
“From there, the user can decide whether to view or block the image, and if compelled, easily report the image to the moderation team,” a representative of Bumble explained in a statement, according to PC Magazine.
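Bumble has not published the Private Detector's internals, but the flow the company describes, classify, flag, blur, then hand the decision to the recipient, can be sketched in rough form. Everything below is a hypothetical illustration: the stub classifier, the threshold, and all names are assumptions, not Bumble's code.

```python
from dataclasses import dataclass
from enum import Enum, auto


class UserChoice(Enum):
    VIEW = auto()
    BLOCK = auto()
    REPORT = auto()


@dataclass
class Photo:
    pixels: bytes
    blurred: bool = False
    flagged: bool = False


def classify_explicit(photo: Photo) -> float:
    """Stand-in for the image classifier; a real system would run a
    trained model here and return the probability the photo is explicit."""
    return 0.99  # dummy score for illustration only


def blur(photo: Photo) -> Photo:
    """Stand-in for an actual pixel-level blur of the image data."""
    return Photo(pixels=photo.pixels, blurred=True, flagged=photo.flagged)


# Hypothetical decision cutoff; Bumble cites 98 percent accuracy,
# which is a model metric, not a threshold.
EXPLICIT_THRESHOLD = 0.5


def screen_incoming(photo: Photo) -> Photo:
    """Flag and blur a shared photo before it reaches the recipient."""
    if classify_explicit(photo) >= EXPLICIT_THRESHOLD:
        photo.flagged = True
        photo = blur(photo)
    return photo


def handle_choice(photo: Photo, choice: UserChoice) -> str:
    """The recipient decides what to do with a flagged, blurred photo."""
    if choice is UserChoice.VIEW:
        return "photo revealed"
    if choice is UserChoice.BLOCK:
        return "photo hidden"
    return "photo reported to the moderation team"
```

In this sketch the recipient only ever receives the blurred version of a flagged photo, matching the described behavior of deciding to view, block, or report after the fact.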
For those who are unfamiliar with Bumble, CEO Whitney Wolfe Herd founded the dating app after being inspired to create a more woman-first product based on her experience as a co-founder of Tinder.