The UK data watchdog must introduce age verification for commercial pornography sites or face a high court challenge over any failure to act, children’s safety groups have warned.

The demand, in a letter to the Information Commissioner’s Office (ICO), states that the government’s failure to stop children seeing porn is causing lifelong trauma and putting children at risk of abuse and exploitation. It urges the ICO to use its powers under the recently introduced age appropriate design code (AADC) to introduce rigorous age-checking procedures for publicly accessible porn sites.

However, the ICO has said the code is not intended to impose age assurance measures on porn sites. The author of the letter, John Carr, secretary of the Children’s Charities’ Coalition on Internet Safety, which represents 11 major children’s charities, including the NSPCC, Barnardo’s and the Children’s Society, has written to the information commissioner, Elizabeth Denham, urging her to reconsider.

In the letter, Carr wrote: “You remain the only person in the country with the power to act to protect another generation of children from the distorting impact of open access to pornography, with all the dreadful and now well-documented effects we know that has on society, particularly in respect of violence towards women and girls. I ask you to reconsider your decision not to act against pornography websites.”

It comes after a survey of Barnardo’s frontline workers found high levels of concern among staff about the impact of porn on children, particularly those who are already vulnerable.

Barnardo’s child exploitation experts told the Guardian they are seeing growing numbers of children acting in highly sexualised ways or being vulnerable to abuse because they have been watching porn.

“It is everywhere; on the school bus, in the corridors, in their social media feeds. Children should not be seeing this material, it is a form of trauma and abuse to let them see it,” one case worker told the Guardian.

A number of recent studies have shown the increasingly detrimental ways that porn affects young people, including that children as young as seven are unintentionally stumbling across it online. A PSHE (personal, social, health and economic) study found that when children viewed porn, 70% frequently saw men portrayed as dominant (compared with 17% frequently seeing women as dominant) and 35% frequently saw “consensual” violence towards women (compared with 9% frequently seeing this towards men).

In an interview with the Guardian, Carr said the coalition was prepared to seek a judicial review if the ICO declines to act. “We are asking the information commissioner to act now, using the powers she has under the AADC, to ensure porn companies don’t let children on their sites. If she doesn’t, we will look at a judicial review of her decision not to act,” he said.

Carr added that “this is really about age verification” – a form of age assurance where users are asked to provide formal proof of their identity through passports, credit cards or other forms of identification known as hard identifiers.

The AADC applies to websites and apps likely to be accessed by children, such as TikTok, Facebook and Instagram, and aims to prevent misuse of their data, such as the use of “nudge” techniques that encourage children to give up more of their privacy or spend more time online. In some cases, it recommends age assurance measures – such as age verification – if a service is deemed to pose risks to children. Companies that breach the code face fines of up to 4% of annual global turnover.

The ICO challenge comes as the government prepares to put landmark internet safety legislation through parliament. The online safety bill charges social media and tech companies with a duty of care to protect children from harmful content, but campaigners have warned that in its draft form it does not go far enough to protect minors.

The bill does not make overt references to age assurance – the umbrella term for measures that check an app or website user’s age – but a private member’s bill introduced by crossbench peer Beeban Kidron, the architect of the AADC, could provide a pathway. The age assurance (minimum standards) bill sets out a framework for introducing basic standards of age-checking online and could effectively be included in the online safety bill if, for instance, Ofcom – the regulator charged with implementing the bill – is given powers within the legislation to introduce new standards.

Kidron said age assurance has been promised since 2017, when it was a part of the Digital Economy Act expressly for porn, but was never implemented. “Parents, parliament and the press want a solution to the pornography issue, and in that sense tomorrow is not soon enough. But pornography is only one of many issues children face online. I brought the bill forward to put a stop to the commercial exploitation of kids – it is absolutely urgent. My bill will do what John [Carr] wants and more.”

Plans to introduce an age verification system in the UK for online porn were abandoned in 2019, when then culture secretary Nicky Morgan told parliament that the government would instead focus on including child protection in the online safety bill.

Vanessa Morse, CEO of the Centre to End All Sexual Exploitation, said: “It is unconscionable that the government is delaying age verification until the online safety bill comes into force when it could be brought in now. Make no mistake, porn sites are not neutral or naive. They are actively engaging all users, including children, through data surveillance, SEO and algorithms, to get them to stay on their sites and return more often because this makes them money. This is exploitative and an unlawful misuse of children’s data and it must be stopped. We want the ICO to investigate this.”

Kidron added that age assurance is not just needed to monitor use of commercial porn sites but also to prevent social media sites from profiteering from children under the age of 13 – the minimum age for most platforms. “Let’s face it, if they know a kid is 12 then they have to answer the question, why are they recommending [content about] extreme diets, self-harm or suicide?”

The ICO’s executive director for tech and innovation, Stephen Bonner, said the AADC is intended for sites that are used by children or likely to be used by children – and therefore does not apply to adult content sites. “The goal of the code is to make [relevant] sites child friendly.”

He added: “There are some sites that we do not expect children to use. In those cases, the code does not apply. It would be ridiculous to make adult sites child-friendly.”

The threat of action against the ICO comes after a judge last week gave campaigners permission to bring a judicial review against the government over its failure to enact the Digital Economy Act.

The case, which will be heard in the new year, centres on the argument that children are being harmed and their human rights affected by the failure to introduce age verification. It is brought by two campaigners, 20-year-old Ava Vakil, a student and activist involved in raising the issue of sexual violence among teenagers, and Ioannis Dekas, a father of four who is giving evidence about the impact of porn on his own children, despite his efforts to stop them watching it.

Vakil said: “It is incredibly important that we prevent access to pornography for young children, particularly given that so much of pornographic content online includes sexual violence and the objectification of women. Most children encounter porn online before they talk about sexual consent in PSHE or at home, creating a culture in which sex becomes deviant and pornographic, and not based in shared trust and intimacy.

“It is simply unbelievable that children are prevented from seeing a rated-18 film in the cinema, and yet can access sexually violent and explicit content with a few taps of their fingers.”

In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.

Source: Guardian
