
Bots are taking out paedophiles and keeping children safe from online sexual predators

WHY THIS MATTERS IN BRIEF

There are many ways to use bots: they can spread fake news or, as in this case, be used to catch paedophiles and keep children safe from online sexual predators.

 


Everyone knows it’s getting harder to tell bots from real people on the internet; in a recent US survey, over 60 percent of adults said they could no longer tell the difference. And oddly, as far as this latest piece of news goes, that’s a good thing. A very good thing. Because the same kind of bots that are out there peddling fake news, malware and other nefarious wares are now being used to catch paedophiles.

As many people know, it’s a sad fact, a very sad fact, that paedophiles often hang out in online chatrooms looking to strike up conversations with unsuspecting children, and in the worst cases they arrange face-to-face meetings that result in sexual assault. Now, however, a new Artificial Intelligence (AI) algorithm has been designed to help keep that from happening.

 


 

Known as the Chat Analysis Triage Tool (CATT), the algorithm was created by a team from Indiana’s Purdue University led by assistant professor Kathryn Seigfried-Spellar. It was developed by analysing 4,353 messages across 107 chat sessions involving sex offenders who were subsequently arrested. More specifically, the researchers used a process known as statistical discourse analysis to identify trends in word usage and conversation patterns.
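
CATT itself has not been released, but the general idea of statistical discourse analysis can be sketched in a few lines: tally how often different categories of words show up in a chat, per speaker, and treat those tallies as features. The Python below is a minimal, hypothetical illustration; the word categories, the (sender, message) data format and the function name are all assumptions made for this example, not part of the Purdue tool.

```python
from collections import Counter

# Illustrative word categories; CATT's real lexicon has not been published.
WORD_CATEGORIES = {
    "self_disclosure": {"i", "me", "my", "myself"},
    "trust_building": {"secret", "promise", "trust", "between"},
    "meeting": {"meet", "address", "house", "come"},
}

def word_usage_profile(session):
    """Count category hits per speaker in one chat session.

    `session` is a list of (sender, message) tuples, for example
    [("suspect", "Can I trust you with a secret?"), ("child", "ok")].
    """
    counts = {sender: Counter() for sender, _ in session}
    for sender, message in session:
        tokens = [t.strip("?!.,").lower() for t in message.split()]
        for category, words in WORD_CATEGORIES.items():
            counts[sender][category] += sum(t in words for t in tokens)
    return counts

print(word_usage_profile([("suspect", "Can I trust you with a secret?"),
                          ("child", "ok")]))
```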

Among other things, CATT is able to detect a tactic commonly used by offenders seeking a meeting, in which they first attempt to gain trust by disclosing something about themselves. This usually takes the form of a negative personal story, such as having been the victim of parental abuse.
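
As a rough, hypothetical illustration of how such a tactic might be flagged automatically, the sketch below marks a session when the suspect’s early messages mix first-person language with negative-experience terms; the phrase lists, the five-message window and the "suspect" label are invented for this example and are not claimed to match CATT.

```python
FIRST_PERSON = {"i", "me", "my"}
NEGATIVE_EXPERIENCE = {"abused", "hurt", "hit", "alone", "bullied", "hated"}

def flags_early_self_disclosure(session, early_messages=5):
    """Return True if the suspect shares a negative personal story early on.

    `session` is a list of (sender, message) tuples; only the suspect's
    first `early_messages` messages are examined.
    """
    suspect_msgs = [m for s, m in session if s == "suspect"][:early_messages]
    for message in suspect_msgs:
        tokens = {t.strip("?!.,").lower() for t in message.split()}
        if tokens & FIRST_PERSON and tokens & NEGATIVE_EXPERIENCE:
            return True
    return False
```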

 


 

Additionally, before suggesting a meeting, offenders often chat with the child for weeks or even months, essentially “grooming” them. By contrast, paedophiles who are only interested in chatting typically move quickly from child to child.
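
Those timing and contact patterns are the kind of thing that can be reduced to simple numeric features. The sketch below, again purely illustrative, summarises how long a suspect keeps talking to each child and how many different children they contact, assuming timestamped message logs in a made-up format.

```python
from datetime import datetime

def contact_pattern_features(conversations):
    """Summarise how a suspect spreads attention across children.

    `conversations` maps a child identifier to a list of ISO-format
    timestamps of the suspect's messages to that child. A long span
    with one child suggests grooming toward a meeting; many brief
    contacts suggest a chat-only pattern.
    """
    per_child = {}
    for child, stamps in conversations.items():
        times = sorted(datetime.fromisoformat(s) for s in stamps)
        span_days = (times[-1] - times[0]).days if len(times) > 1 else 0
        per_child[child] = {"messages": len(times), "span_days": span_days}
    return {"distinct_children": len(conversations), "per_child": per_child}
```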

By detecting these and other factors, it is hoped, CATT could sort through the plethora of suspicious conversations in chatrooms and alert police to the ones that might be leading up to a real-world encounter. To that end, the university now plans to turn the tool over to several law enforcement departments for a test run, and it could be in use by the end of the year.

 


 

“If we can identify language differences, then the tool can identify these differences in the chats in order to give a risk assessment and a probability that this person is going to attempt face-to-face contact with the victim,” says Seigfried-Spellar. “That way, officers can begin to prioritize which cases they want to put resources toward to investigate more quickly.”
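
The published description stops at “a risk assessment and a probability,” so the triage step below is only a guess at the general shape: a hypothetical weighted combination of the kinds of signals described above, used to rank open cases so investigators see the highest-risk ones first. The weights, signal names and case data are all invented for this sketch.

```python
# Hypothetical weights; CATT's actual model and coefficients are not public.
WEIGHTS = {"early_self_disclosure": 0.4,
           "long_grooming_span": 0.4,
           "meeting_language": 0.2}

def risk_score(signals):
    """Combine boolean signals into a 0-1 score for triage."""
    return sum(WEIGHTS[name] for name, present in signals.items() if present)

def prioritise(cases):
    """Sort case IDs (id -> signal dict) by descending risk score."""
    return sorted(cases, key=lambda cid: risk_score(cases[cid]), reverse=True)

cases = {
    "case_17": {"early_self_disclosure": True, "long_grooming_span": True,
                "meeting_language": True},
    "case_42": {"early_self_disclosure": True, "long_grooming_span": False,
                "meeting_language": False},
}
print(prioritise(cases))  # ['case_17', 'case_42']
```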

A paper on the research is being published in the journal Child Abuse and Neglect.
