The Online Safety Act (UK), and how X is failing its UK users
- A D Rowen
If you live in the UK and view adult material online, you will no doubt be familiar with the latest impact of the legislation passed by the UK parliament in the form of the Online Safety Act 2023.
As of 25th July, the Act compels any online platform hosting adult-only material (i.e. pornography) to require users in the UK who want to view that material to provide proof that they are legal adults (over the age of eighteen).
So, if you go to a site like Pornhub or OnlyFans, and you are in the UK, you will have to go through an age verification ‘gate’ before you can visit the site. Previously, you usually just had to click a button confirming you were over eighteen (whether you were or not…) before going on to an adult site, but now UK law requires such sites to employ age-verification software, and threatens huge fines if they are found to still be allowing minors access to adult material.
How this works in practice is that you are asked to provide either a scan of a piece of photo ID confirming your birthdate, or you allow the software to scan your face to determine whether you look the right age (fine for a hoary old wizard like me, less fun if you’re a baby-faced twenty-something).
And, look. There are certainly troubling, vaguely totalitarian concerns about how a government can pass laws preventing certain people from accessing completely legal material online, and how this could be misused in future (especially as the Act allows the British Home Secretary – a position held by a member of parliament – to be the final arbiter as to what is considered material unsafe for children).
And there are genuine concerns people have about having to provide personal data – a scan of their ID – either to the porn sites and social media providers themselves, or to the third-party companies that operate the age verification software. Of course, we are told that the software being used doesn’t retain any images or data once the verification is complete, but accepting this requires us to place a certain amount of trust in tech companies, and it is perhaps understandable that some people are reluctant to do that.
And there are people making good arguments that determined teenagers will manage to find ways around the age restrictions (such as using a VPN), which makes the whole thing largely pointless.
But, ultimately, I think all of us who are users of pornography, and/or creators of NSFW material, would agree that taking steps to try and stop it being so easy for children to access porn on their personal devices is the right thing to do. And while it’s inconvenient to have to do a face scan to have full access to platforms that host adult material (which includes social networks such as Bluesky and Discord), it does seem that the majority of these sites have taken steps to make the process as smooth as possible, and usually only require you to do it once.
No, for me at the moment, the biggest issue I have with the Online Safety Act 2023 and its implementation isn’t the Act itself, or the past or current UK government’s support for it.
Instead, the biggest problem for me, and many other UK adults who use social media to share and access NSFW material – especially for business reasons – is the failure of one of the largest social networks to actually allow UK users to verify their age at all.
If you are a user of Elon Musk’s X (formerly Twitter) and you are in the UK, and you previously followed and/or interacted with adult content creators/providers on X, you may well have noticed that your X feed currently looks somewhat different to how it used to. Certain types of post, and possibly even entire accounts, are now completely invisible to many UK users, and even if you are able to search through X to the point where you can actually get it to display the post (if not the content), you are greeted with the following message:
“Due to local laws, we are temporarily restricting access to this content until X estimates your age.”
Clicking on the “Learn more” link sheds a bit more light on what is currently happening (or at least, what X says is happening).
In short, unlike its social media rivals and the sites that specifically exist to provide adult material, X has opted not (at this stage) to bring in any age verification software that requires the user to take any action themselves.
X does say this is coming later – by all accounts, it is trying to develop its own software rather than pay a third party for verification software that already exists, which is what most other platforms have done – although it is frustratingly vague about when.
But, in the meantime, X is trying to employ passive methods of age verification, taking a largely AI-driven approach to ascertaining which users are over eighteen and which are not, without the users themselves needing to do anything, or even knowing what is taking place.
Some of these steps seem obvious in how they work – such as automatically flagging anyone whose stated birth date (the one they gave when they set up their account) identifies them as under eighteen, and green-lighting anyone who has had their account on X for thirteen years or more, or who is a verified public figure or business and therefore presumably an adult by default.
Others are a bit more esoteric, though: specifically, how X thinks estimating a user’s age from their email address or their social connections is going to work in practice. X hasn’t described this process publicly, and as time goes on and many of us remain stuck waiting for it to estimate our age, it seems more and more likely that this is going to be a difficult trick to pull off.
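To make that concrete, here is a minimal sketch of what a passive, signal-based check built from the criteria X has mentioned might look like. X has not published its actual logic, so every field name, threshold and rule below is my own assumption, written in Python purely for illustration.

```python
# A purely illustrative sketch; X has not published its real logic, and every
# field, threshold and rule here is an assumption based on the signals the
# platform has mentioned (stated birth date, account age, verified status).

from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class Account:
    stated_birth_date: Optional[date]          # birth date given at sign-up, if any
    created: date                              # when the account was opened
    verified_public_figure_or_business: bool   # verified organisation / public figure


def passive_age_check(acct: Account, today: date) -> str:
    """Return 'minor', 'adult' or 'unknown' using only passive signals."""
    # Signal 1: the birth date the user gave flags them as under eighteen.
    if acct.stated_birth_date is not None:
        approx_age = (today - acct.stated_birth_date).days // 365
        if approx_age < 18:
            return "minor"
        # A self-declared adult birth date alone may not be trusted, so fall through.

    # Signal 2: an account at least thirteen years old must now belong to an adult.
    if (today - acct.created).days >= 13 * 365:
        return "adult"

    # Signal 3: verified public figures and businesses are presumed to be adults.
    if acct.verified_public_figure_or_business:
        return "adult"

    # Everything else stays in limbo until some other (unpublished) signal resolves it.
    return "unknown"


# A pen-name account opened in 2016, with no stated birth date and no verification,
# falls straight through to "unknown".
print(passive_age_check(
    Account(stated_birth_date=None,
            created=date(2016, 1, 1),
            verified_public_figure_or_business=False),
    today=date(2025, 8, 5),
))
```

Even if X’s real system is far more sophisticated than this, the underlying problem stands: every one of these signals depends on the account being tied to some real-world indicator of age.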
For example, take my X account, and my email address. As a writer, I use a pen name, and my social media accounts and the email address for anything to do with my writing are in that pen name.
Because I have not given X my real name, and I’ve only been using my pen name since about 2016, any attempt by X to identify my age based on that is going to struggle. X isn’t going to be able to look at my social connections and see who I am married to or what year I graduated school, for example, because this information is not linked anywhere to my X account. As AD Rowen has only existed for a bit less than ten years, X’s AI has no way of knowing that I am any older than the age of my account itself. The passive, in-the-background checks, whatever they are, just aren’t going to work for me.
Which would be annoying but okay, if there were other ways I could verify my age and return to being able to view accounts and posts which include NSFW material. But, at least at time of writing, X is only offering one way for users to actively prove their own age to the platform; if you are an X Premium user, you will have had to provide proof of identity in setting that up, and it is this that X says it will use to verify your age.
There are two problems with that. The first is that Premium accounts are a paid-for service. That means that currently X is saying that if you want to take action to prove your age (rather than waiting for the conclusion of a process that may never come), you must pay to do it – which is hardly providing fair access.
The second is that anecdotal evidence is showing that, even for people who have Premium and have therefore already provided proof of age, sensitive material remains invisible and out of reach – which suggests that X hasn’t activated this feature yet.
Why might X be doing this – dragging its feet over implementing effective age-verification software that will allow millions of users currently shut out of the full site to once again view and engage with content deemed sensitive, NSFW, pornographic, whatever?
One might cynically suggest that it is part of a plan by X, and the company’s owner Elon Musk, to exploit the enforcement of the Online Safety Act by forcing users to sign up for paid X Premium accounts – certainly, Musk would undoubtedly love it if as many users as possible were contributing financially to the platform.
But if that is the case, why is the ID Verification feature offered by Premium (which has always been offered, but has never previously been part of any sort of age verification: it’s more about proving you are a real person and not a bot – or a parody account) being described by UK users who do have Premium accounts as not affecting what they are able to view?
What seems more likely is that X simply wasn’t ready with a fully working system of age verification when the deadline of 25th July imposed by the UK law passed. This is perhaps unsurprising, given the way the company has been run since Musk bought it and installed himself as CEO. Musk fired huge chunks of the company’s staff in a bid to cut costs (and, he seemed to believe, to increase efficiency), and it seems entirely plausible that X simply didn’t have the sort of skilled people in place (or at least, not in the right numbers) to build its own age-verification software and have it debugged and up and running by 25th July.
However, there also seems to be an ideological element at play in the way X is choosing to approach compliance with the Online Safety Act – an ideological element that can, again, possibly be attributed to Musk and his vision for the direction of the company.
It’s not surprising that Musk utterly opposes the Online Safety Act, and other variations of it which are being brought in or proposed in other countries around the world. A self-professed free-speech advocate (although with an extremely narrow and self-serving interpretation of what “free speech” actually means to him), Musk views these types of legislation – and, more generally, attempts by various governments and lawmakers to make social media companies more accountable for the content users post on their platforms – as gross governmental overreach and an unwelcome interference in business.
So, it’s not beyond the realm of imagination to suggest that X’s current status, as the one social media network where UK users are not able to verify their status as adults and access all the content on there, is the result of the company under Musk stamping its feet and refusing to play.
Of course, Musk might privately feel that he could eventually win a battle with the UK government over this legislation, or simply do what he has done in the past and engineer its replacement with someone more friendly to his interests (his preferred UK political party, the hard-right Reform UK, has already said it would repeal the Online Safety Act if elected to government, although the next general election is not due until 2029), but that doesn’t help the users cut out of being able to make the most of his network in the meantime.
But it isn’t just Musk’s anti-regulation ideology that could be holding X back from rolling out functional age verification for UK users. X, and Musk, often seem to operate under the notion that they are the best and smartest, and can easily outperform other tech companies – especially through the employment of Artificial Intelligence over human endeavour. It is perhaps this ideology that has led to X relying on AI reviewing thousands of individual accounts to estimate (and thus verify) the ages of the users, rather than offering users a way of proactively verifying their own ages. And it is perhaps this ideology that has caused X to believe it can develop its own age verification software and integrate it into its platform, rather than incurring the financial cost of purchasing existing software from a third-party company, as smaller networks such as Bluesky and Discord have done.
Whatever the reason behind it, the outcome is clear: at the time of writing (5th August – eleven days after this regulation came into effect), X has yet to verify the ages of a huge number of UK-based users, myself included, and any and all sensitive, NSFW, adult content being uploaded by other users to the network remains invisible to us.
Is that really such a big problem, you might be thinking? If you want to goon, there are plenty of places you can go to find suitably stimulating material (which, although also age-gated, are a darn sight easier to get verified on than X is) – so should X as a company really be that bothered about fixing things so that UK-based users can still look at porn there, when it isn’t specifically a porn provider?
And it’s true that a small part of my frustration does come from the fact that I am unable to currently view photos, videos and artworks uploaded by some of my favourite adult content creators who use X – and it’s also true that I can find at least some of these in other locations around the internet which I am still able to access.
But for me, this is about more than being frustrated at not being able to see the latest lewd cartoon from a particular artist I like, or learn about a particular model’s latest exhibitionist video.
My X account is based around my identity as an author of erotic, nudity-themed fiction. It exists to help me grow the audience for my work, and to make connections with other creators who do similar things. While I write for my own enjoyment (whether short story, blog or novel), the fact that I have chosen to sell my work rather than give it all away for free requires me to do some element of self-promotion, and for me, that is largely done through my social media presence. And the main site I was able to do this on – because I am promoting NSFW things a lot of the time – was X.
X is not a good site. Especially under Musk, where right-wing voices are amplified, sharing dangerous misinformation is permitted, and people can post hate speech with impunity. But it is a site where I have a (small) established audience, and it is a site where I am allowed to post images and content featuring nudity to promote my written work, or share the work of others I like, without worrying that I am breaching the network’s terms of service. While other creators, angry at X’s trajectory and the role Musk played in the 2024 US election, have jumped ship to Bluesky, I have kept up my presence on X (I also have a Bluesky account, but you’ll see I’m much less active there) – and how does X reward me? By not taking any steps that would make it possible for me to continue to operate as I have; by denying me the opportunity to have the full audience I had prior to the UK legislation coming in.
There are blog posts I’m working on that I’m currently holding off finishing because, well, what’s the point in creating something with a load of NSFW content in it if I can’t tell anyone who follows me about it?
And while there is an element of truth in X telling us that this legislation is to blame, it is also true that most other sites and networks have managed to comply with fairly minimal impact on user experience – only X stands out for its complete failure to allow users to self-verify if they wish to do so.