Online content subscription platform OnlyFans attracted a good deal of attention at the end of August when it announced that it would be banning sexually explicit content, and then reversed the decision within days. Elena Martellozzo and Paula Bradbury of the Centre for Abuse and Trauma Studies at Middlesex University write about the risks that the platform poses and what more it could do to protect its users.
OnlyFans, a social media platform where users can sell and/or purchase original softcore or X-rated content, has come under scrutiny in recent weeks for deciding to restrict its sexually explicit content and, a few days later, for changing its mind. Created by a London-based company, this controversial content-sharing platform allows creators to share paywalled or subscriber-only content. This model has earned the company billions from its more than 120 million users, through a 20% commission on what creators earn. The arrangement has been financially advantageous to both the company and the creators, with the top 1% of creators earning six-figure sums per year.
Whilst OnlyFans initially took an ‘anything can be uploaded’ approach to user content, on 19 August, in response to concerns of banking partners and payment processors about potentially illegal content, the platform decided to restrict the availability of its sexually explicit content. People would still be able to post nude content on the site, but this would have to be in line with OnlyFans’ policies. However, just a few days later, the decision to restrict the availability of sexually explicit content was overruled following widespread backlash from its users. In a response, OnlyFans defended the decision noting that it “is short-term good news for sex workers reliant on the platform.” However, what does this mean for the platform’s responsibility to protect users?
Exploitation of children
Over the last few years, concerns have been raised about the sexual exploitation of individuals using OnlyFans, after revelations that some under-18s, particularly girls, have been circumventing the age verification measures and setting up their own accounts to upload explicit images of themselves in exchange for money or gifts. Experts working in child protection are concerned that sites such as OnlyFans may be used by adults interested in targeting those who appear significantly younger than other content creators. As we have previously argued, whilst the site might have changed sex work forever by creating a safe environment for sex workers to engage with their clients, it has also opened up a new arena for inexperienced and naive young people who are tempted by the financial rewards, yet not subject to the usual legal protections for under-18s.
It is indeed a dangerous temptation. The law in the UK is very clear: to sell or distribute explicit content, the creator must be 18 or over. Yet there is currently no legal requirement for online platforms to monitor explicit content that might have been generated by underage users. The result is that, while the platform itself faces no such obligation, both the underage person creating the content and the person buying it could face criminal liability.
Adults are not free from risk
There are also a number of harms and risks posed to adults who join the site as content creators. In a recent interview on BBC Radio 5 Live, one female content creator, Camilla L, reported making over a million dollars a year but, as a consequence of her visibility, being targeted by stalkers who send her messages describing their observations of her movements, forcing her to move home and live in constant fear. The platform creates the risk of cyberstalking, yet is doing nothing to address it.
OnlyFans has been used by stalkers – often former intimate partners – to sell images of their victims, a practice known as image-based abuse (IBA). IBA occurs when an intimate image or video is shared without the consent of the person pictured, and stalking advocacy agencies have reported a significant rise in this form of crime. This further highlights how important it is that OnlyFans becomes a true partner to its creators and protects all users from abuse and non-consensual posts.
The debate over ‘explicit’ content
These risks contributed to the pressure on OnlyFans that led to the short-lived ban on explicit content. But it is also critical to reflect on the potential risks of pushing content creators away from this site, into possibly darker and less regulated corners of cyberspace, and therefore important to consider the potential benefits of a platform which provides a safer, more visible and regulated environment for sex workers.
The key is doing more to protect users
OnlyFans’ decision to keep its explicit content, however, appears to be purely financial. This focus also overlooks the importance of doing more to protect its users: more effectively regulating or preventing minors’ access to the site, tracking down and stopping sexual offenders, and protecting all users from abuse and non-consensual posts. This view was supported by Honza Cervenka, an English lawyer representing victims of discrimination, harassment and non-consensual pornography, who told us that OnlyFans has yet to “properly and diligently check that all content on the page is legitimate and consensual. The reason for this is because it is simply too laborious, expensive and eats up into their profit margins”.
While the company claims to evaluate over 300,000 media files a day and employs more than 500 agents in compliance and moderation to flag content, artificial intelligence is not always reliable for detecting all types of harmful material. As Professor Aiken argues, “cybersecurity does not protect what it is to be human”. If OnlyFans trained experts to ensure that moderation is carried out successfully, creators could continue to produce content in a safer environment.
The steps that OnlyFans has taken so far to protect its users have been unsatisfactory, to say the least. In May 2019, the platform introduced a new account verification process whereby creators must provide a ‘selfie’ showing their ID alongside their face to prove their identity. However, this system has proved futile, as underage users have been able to use adults’ IDs to create fake accounts.
More robust action is needed. The UK government indicated that one of the aims of the 2017 Digital Economy Act was precisely “to have robust age verification controls in place to prevent children and young people under 18 from accessing pornographic material”. However, the long-awaited Online Safety Bill has left us somewhat disillusioned: no effective and trusted method of age assurance has yet been mandated to prevent under-18s from accessing sites like OnlyFans, which permit them to sell and access explicit images and put them at risk of exploitation. So why wait any longer to build the digital environment they deserve?
This article represents the views of the authors and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.
Featured image: Photo by Priscilla Du Preez on Unsplash
Dr Elena Martellozzo is an Associate Professor in Criminology at the Centre for Child Abuse and Trauma Studies (CATS) at Middlesex University. Elena has extensive experience of applied research within the criminal justice arena. Her research includes children and young people’s online behaviour, the analysis of sexual grooming, and police practice in the area of child sexual abuse. Elena has emerged as a leading researcher and global voice in the field of child protection, victimology, policing and cybercrime. She is a prolific writer and has participated in highly sensitive research with the police, the IWF, the NSPCC, the OCC, the Home Office and other government departments. Elena has also acted as an advisor on child online protection to governments and practitioners in Italy (since 2004), Bahrain (2016) and Rwanda (2019), helping to develop a national child internet safety policy framework.
Paula Bradbury is a Criminology Lecturer and Doctoral Researcher within the School of Law at Middlesex University, exploring the appropriateness of current policy and practice relating to adolescent sexual offending and sexual behaviour between peers. She is passionate about researching online sexual offending behaviour and child abuse. Paula is an active member of the CATS team, engaging in multiple research pathways to combat child sexual abuse both online and offline as a mixed-methods researcher proficient in quantitative and qualitative analysis and project management. She is also the National Child Sexual Abuse Lead for Victim Support, serving as a project manager developing online support content for adult survivors of child sexual abuse.
© LSE 2021