
Companies using tech to fight rise in sextortion, protect youth

Tech algorithms will automatically alert young users to potential scammers or dangerous images.

BOWLING GREEN, Ohio — Social media companies are working to protect kids from sextortion by sending parents notifications and alerting users about graphic content or nudity.

Computer science experts said this new wave of cybersecurity algorithms is part of a larger plan to protect kids from sextortion and from unwanted and illegal sexual content.

The FBI defines sextortion as a crime in which criminals obtain videos or pictures of a victim and threaten to publish them. Bad actors can then use those threats, including threats of violence, to pressure the victim into producing more images. According to the FBI, cases like this are on the rise.

RELATED: FBI warns of explosive rise of 'sextortion' schemes targeting teen boys

Tech algorithms are now coded with protection features that automatically blur nude images behind a warning screen and remind users of the risks of sharing sensitive content.
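For illustration only, here is a rough Python sketch of what such a blur-and-warn step could look like. The classifier score, threshold value and blur routine below are assumptions made for this example, not any platform's actual code.

# Minimal sketch of a blur-and-warn flow; the score, threshold,
# and blur settings are hypothetical, not a real platform's system.
from PIL import Image, ImageFilter

NUDITY_THRESHOLD = 0.8  # assumed cutoff; real services tune this internally

def screen_image(path: str, nudity_score: float) -> Image.Image:
    """Blur an image and show a warning if a classifier flags it as explicit."""
    img = Image.open(path)
    if nudity_score >= NUDITY_THRESHOLD:
        # Hide the content behind a heavy blur and remind the user of the risk.
        img = img.filter(ImageFilter.GaussianBlur(radius=25))
        print("Warning: this image may contain nudity. "
              "Think before viewing or sharing sensitive content.")
    return img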

Experts said this tech will help users of any age make good, safe choices.

"They are clearly tying it into age and age groups and things like that," said Bowling Green State University computer science professor Rob Green. "Kids need to be aware of it; parents need to be aware of it. And at the least, all of those pop up and pauses will let people take a deep breath, and perhaps start some conversations."

Online platforms like Meta are working to prevent sextortion. Measures include removing messaging options on teen Instagram profiles, adding pop-up warnings and blurring images to alert users to potential risks and scammers.

Green, who has worked in computer science for nearly 20 years, said algorithms are not new and should not be feared. Instead, he advised the public to work with them to promote safety. Green said platforms like Meta have feedback portals, and using them will help protective algorithms become more accurate.

RELATED: How sharing explicit videos and pictures can have legal consequences for kids and their parents

Green said the algorithms work by estimating the likelihood that an image contains explicit content, but that human intervention is still an important part of the process.

"[The algorithms] say, 'that's 70% likely, or 90% likely to be pornographic or sexual in nature,' and that's good; but what if it's 51%? Or 47%? Those edge cases are when things start to get dicey," said Green. "Do you alert people, don't you alert people - what's the proper place? And that's when you really need some type of human feedback that can step in from the outside and provide updated data."

Meta turns these features on by default for teens under 18.

The FBI says young people who are being exploited are victims of a crime and should report it.
