18 November 2022 | Podcasts | Article by Alan Collins

The safety of Snapchat for children: HJ Talks About Abuse


In this week’s podcast, the abuse team discuss Snapchat and the problematic nature of the social media platform, which has allowed predators to exploit minors for sexual gratification with worrying ease.

The app, launched in 2011 and now with 363 million daily active users worldwide, is loved by young children in particular for its funny selfies, filters and automatically deleting messages. But it is these very features that have given rise to increased grooming and trafficking opportunities. Many accuse the app of empowering sexual predators to sexually abuse children with less fear of being caught – in effect encouraging offenders.

Whilst it can for the most part be a source of pure, innocent and harmless fun for children – adding humorous sound effects, flower crowns and clip art over photos and videos – it can also have devastating consequences when it comes to child sexual abuse.


 

What is Snapchat?

The platform is age-rated 13+ and mainly operates as a social media messaging app where you can send photos and videos that disappear after a short period of time and can only be viewed once. The app also features voice notes, private chats, games, news stories, and the Discover and Stories features.
Another popular and unique feature is the location feature ‘Snap Map’, which can show any user’s location in real time, down to the very building they are in. Around 25% of Snapchat users are under 18 years old.

 

What is the issue?

According to The Times, police are investigating three new cases a day of child exploitation linked to the app, while self-destructing messages help groomers conceal their identities.

Many offenders use Snapchat as their tool for initially connecting with and befriending children, misrepresenting who they are: there are no ID requirements when signing up to the app, nor any way to verify who you are speaking with – just a username and ‘Bitmoji’ character. From there they can communicate and begin the grooming process, taking advantage of the app’s location and photo- and video-sending features and convincing their victims that it is safe to send indecent content of themselves because it automatically disappears after a few seconds. In fact, images and videos are not as self-destructing as the app may imply. According to the UK children’s charity the NSPCC, where the recipient is a technologically sophisticated offender – as many paedophiles who seek out the app are – indecent content of children is making its way onto the dark web. Offenders can also simply use another device to photograph or film the indecent content they have received.
Sickeningly, it is not uncommon to read of convicted offenders who used the ‘Snap Map’ feature to track down their victims and sexually assault them. This feature, for obvious reasons, is a huge safety risk.

The Sunday Times investigation has uncovered that since 2014 there have been ‘thousands of reported cases that have involved Snapchat’, including ‘paedophiles using the app to elicit indecent images from children and to groom teenagers’, as well as the app being an enabler for under-18s to be ‘spreading child pornography themselves’.

 

Child Users’ Exposure to Sex Workers on Snapchat

Aside from already normalising sexting among young users, the app also facilitates contact between sex workers – who use it as a kind of marketing tool – and child users, whether intentionally or not. This has the potential to put children at risk, or at least expose them to adult content, raising safety concerns for minors.

In particular, because it is so easy to ‘Quick Add’ random users with a single click of a button, children are often connected with unknown users, most of the time without even realising.

This can mean that children inadvertently and unwittingly put themselves at risk of exposure to adult content from sex workers on Snapchat, who are increasingly using the app as a platform to send photos and videos of themselves to other users.

 

Discover feature on Snapchat

This allows users to see content from various media outlets, which can include inappropriate, adult and sex-orientated material, posing a risk of exposing children to sexualised content. A lawsuit in the US in 2016 cited some offensive Snapchat Discover content, including titles such as “people share their secret rules for sex” and “10 things he thinks when he can’t make you orgasm.” Is it appropriate for users as young as 13 years old to be viewing this sort of content, and what are the implications?

 

So, what are Snapchat doing to help protect children on the app?

According to Forbes, Snapchat told them “We care deeply about protecting our community and are sickened by any behaviour which involves the abuse of a minor. We work hard to detect, prevent and stop abuse on our platform and encourage everyone – young people, parents and caregivers – to have open conversations about what they’re doing online. We will continue to proactively work with governments, law enforcement and other safety organizations to ensure that Snapchat continues to be a positive and safe environment.”

Snapchat does have various safety features in place, but how far do these actually protect children against paedophiles who take advantage of the app, hidden behind a screen using false account details?

Family Centre

Snapchat’s new Family Centre gives you an overview of your child’s activity on the app. By linking your account to your child’s, you can see a list of their friends and who they have contacted in the last seven days – but not the content of those messages. Family Centre also gives you access to a confidential reporting service that allows you to report any concerns directly to Snapchat’s Trust and Safety team.

Ghost mode

Enabling this will stop other users from seeing your child’s location. To edit location settings, go to the cog button in the right-hand corner of the screen, then enable ‘Ghost Mode’ and select ‘Until turned off’ to make sure it stays enabled.

Limit contact from adult users

Snapchat has introduced restrictions to help limit unwanted contact from adults. Adults will not be allowed to add young people who are 17 and under unless they have a certain number of friends in common. This won’t stop all contact from adults, but it will help to limit it.

Privacy settings

There are different privacy settings available that will help to limit who can see your child’s account and contact them.

Who can contact me – this lets you manage who can contact your child.

Who can view my story – here you can block specific people from viewing their story.

To explore the different privacy settings available, select the cog in the right-hand corner of the screen and select ‘Privacy’.

Default chat functions

By default, you can’t chat to someone on Snapchat unless you are friends. Make sure to speak to your child about who they accept friend requests from.

Reporting

To report another user, press and hold on their Snapchat ID, then select ‘More’ and ‘Report’. For more support, see the NSPCC’s advice on reporting online safety concerns or contact the NSPCC Helpline.

Aside from these features, however, it seems there are no specific safeguarding measures in place to protect children against sexual abuse.

Given the disappearing nature of the messaging, it is hard to monitor communication; in particular, parents cannot see the actual content of their child’s messages with other users. Even with the Family Centre monitoring tool, you won’t be able to see what was actually sent, as the content is automatically deleted.

If you have been affected by the topics raised in this week’s podcast or would like more information, please get in touch with the Abuse team. You can contact Alan Collins at alan.collins@hughjames.com or Danielle Vincent at danielle.vincent@hughjames.com

Author bio

Alan Collins

Partner

Alan Collins is one of the best known and most experienced solicitors in the field of child abuse litigation and has acted in many high-profile cases, including the Jimmy Savile and Haut de la Garenne abuse scandals. Alan has represented interested parties before public inquiries including the Independent Jersey Care Inquiry and IICSA (the Independent Inquiry into Child Sexual Abuse).

Internationally, Alan works in Australia, South East Asia, Uganda, Kenya, and California representing clients in high profile sexual abuse cases. Alan also spoke at the Third Regional Workshop on Justice for Children in East Asia and the Pacific in Bangkok hosted by Unicef and HCCH (Hague Conference on Private International Law).

Disclaimer: The information on the Hugh James website is for general information only and reflects the position at the date of publication. It does not constitute legal advice and should not be treated as such. If you would like to ensure the commentary reflects current legislation, case law or best practice, please contact the blog author.

 
