8 in 10 Brits back ban on AI 'nudify' tools that digitally undress women and children
Press release: A ban on AI tools that allow users to digitally undress women and children is supported by almost 8 in 10 Brits, according to a poll carried out ahead of Safer Internet Day 2024.
Whitestone Insight polled 1,203 GB adults on behalf of CARE. Participants were asked whether they agreed with the statement: 'Websites and apps that use AI to simulate sexually explicit content such as undressing women and children should be banned by the government.'
Seven in ten respondents strongly agreed (69%) and one in ten (10%) somewhat agreed. The proportion who strongly agreed rises to almost 8 in 10 (75%) when those who preferred not to answer are discounted. Full data is available below.
Whilst recent changes to the law have the potential to make it an offence to share AI-generated images without consent, the technology that allows people to generate sexual images without a person's permission remains freely available. Campaigners are calling for ‘nudify’ tools to be specifically outlawed.
Louise Davies MBE, CARE's Director of Advocacy and Policy, said:
“The deepfake, AI-generated pictures of Taylor Swift that have swept social media over the last few weeks highlight a disturbing new trend. Technology now exists to take any picture, however innocent, and use artificial intelligence to turn it into a naked, pornographic image.
“With this year’s Safer Internet Day urging people to work together to make the internet safer, it is clear that AI technology is a new and looming threat. If lawmakers do not act swiftly, these apps and sites could generate untold harm. Not just for celebrities, but for women and children everywhere.
“Last year it was estimated that links advertising ‘nudification’ apps and websites increased by 2,400%. The content they create is extremely realistic. As well as still images, some platforms allow users to create new pornographic videos where subjects appear to do whatever the user asks.”
Ms Davies added:
“We are particularly concerned about the impact on children and young people. There is a rise in the use of these apps in schools. Young girls are being dehumanised and treated as mere sex objects. Deepfake images cause serious mental and physical distress to victims.
“It is clear that swift action to ban these apps would make a difference. There can be no reasonable argument against banning them. No cogent argument can be put in favour of technology that creates sexualised images without a person’s consent.”
ENDS
Polling headline findings:
- 69% of respondents strongly agree with the government banning websites and apps that use AI to simulate sexually explicit content such as undressing women and children.
- This rises to 75% strongly agreeing when those who preferred not to answer are discounted.
- 10% somewhat agree, 2% somewhat disagree, 3% strongly disagree, and 9% don't know.
- 57% of 18–24-year-olds strongly agree, vs 73% of 65+.
- 60% of men strongly agree vs 73% of women.
The question:
There are now websites and apps that use artificial intelligence (AI) to simulate sexually explicit content, such as undressing women and children. Do you agree or disagree with this statement? 'Websites and apps that use AI to simulate sexually explicit content such as undressing women and children should be banned by the government.'
| Response | Total | Men | Women | 18-24 | 65+ |
|---|---|---|---|---|---|
| Strongly agree | 69% | 60% | 73% | 57% | 73% |
| Somewhat agree | 10% | 13% | 6% | 13% | 8% |
| Somewhat disagree | 2% | 4% | 1% | 7% | 1% |
| Strongly disagree | 3% | 3% | 3% | 2% | 4% |
| Don't know | 9% | 9% | 8% | 8% | 6% |
| Prefer not to say | 11% | 11% | 10% | 14% | 8% |
Methodology Note:
Whitestone Insight interviewed 1,203 GB adults online from 2nd-5th February 2024. Whitestone Insight is a member of the British Polling Council and abides by its rules.
Data tables:
Full data tables can be accessed here: AI-Explicit-Imagery-Survey-Final.pdf (care.org.uk)
About CARE
Christian Action Research and Education (CARE) is a social policy charity, bringing Christian insight to the policies and laws that affect our lives.
Contact us: press@care.org.uk