2025-11-07 22:05:11
Surveilling. Not empowering, surveilling:
Empowering young people to use the Internet safely and protecting them from potential harm is not just a priority for the European Commission, but for many countries around the world. It is by working together towards this shared goal with like-minded partners that we can best achieve it.
2025-11-04 05:54:13
Key to the Online Safety Act was a novel idea from UK academics Lorna Woods & William Perrin: that Online Harm is like Physical Harm, so platforms, like physical venues, should have “duties of care” to prevent harm occurring amongst punters. In short: platforms should police user speech for “safety”, and the state should demand that they do.
Surprise: this is fundamentally illegal in some countries. But they knew that.
The whole point of the “duty of care” approach was to knowingly do an end-run around the US First Amendment (and its application to platform content via Section 230), which otherwise electrically grounds the entire Internet with a huge nation, full of huge platforms, full of speech that upsets … many people, especially certain kinds of academic.
Here’s commentary on the First Amendment (via Section 230) from Baroness Kidron’s “5 Rights Foundation” which was instrumental in shaping the UK Online Safety Act. Note especially the framing that this is all a matter of “big tech profits against the little people” — we shall return to that:
With the ongoing debate in the US about Section 230 it is more important than ever that the protection of children online is put above the commercial interests of big tech firms. Section 230 is already controversial and has been criticised for giving tech firms the latitude to ignore the law and the needs of users. In Canada, the free trade agreement between the United States, Canada and Mexico saw the inclusion of Section 230-style protections for tech firms. Canada is the base for Pornhub, the largest pornography site in the world. When Pornhub was found to be monetising child rape and child sexual abuse material, the Canadian Government representative in the Senate, Senator Marc Gold, had to admit that “there are provisions in the North American Free Trade Agreement that make it difficult to deal with a company like Pornhub.”
Encouragingly, both Republicans and Democrats want change, and the US Supreme Court has criticised the way Section 230 lets online services off the hook for promoting illegal content, and for refusing to police their own platforms. While US Congress is likely to consider reform, it is not a given that the new administration will act swiftly or in the UK’s interests. The voice of the tech lobby is powerful, and it is vital that the principle of non-regression is applied to the protections for children contained in the Online Safety Bill.
https://5rightsfoundation.com/wp-content/uploads/2024/10/Ambitions_for_the_Online_Safety_Bill.pdf
Ignoring for the moment the question of whether online harms really are the same as real-world harms — in a warzone perhaps some tweets are as problematic as AK47s — clearly the British thinkers at the 5 Rights Foundation were confident that America could be swung round to their way of thinking; after all, the problem was “the voice of the tech lobby”, right? Not to mention the billionaire robber barons of the tech industry, failing to protect the little users. And everyone hates billionaires.
Unfortunately for the thinkers behind the online safety “duty of care”, the critical issues being raised were not ones of tech-company profits; they were legitimate and well-worn issues of free speech. Hence this posting at ITIF from June 2024, flushing some of them out, including a straight pot-shot at the contentious American Kids Online Safety Act (KOSA; familiar acronym?).
NetChoice has sued Arkansas, Ohio, and Utah over their social media age verification laws, arguing that they violate the First Amendment by requiring users to hand over sensitive personal information in order to access online communication tools. Meanwhile, the Free Speech Coalition, a trade association representing the adult industry, has sued Louisiana, Utah, and Texas over their adult website age verification laws for similar reasons.
…
While age-appropriate design provides a useful set of guiding principles for online services with underaged users, enforcing these standards the way the CAADCA does would cause more harm than good. Requiring companies to act in the best interests of children—or face fines up to $7,500 per affected child—is an incredibly broad and ill-defined standard that is difficult, if not impossible, for online services to perfectly follow. Additionally, as NetChoice outlined in its lawsuit, the CAADCA may also violate the First Amendment by giving the government of California power to dictate online services’ editorial decisions.
…
Finally, KOSA would establish a “duty of care” for any online service that is reasonably likely to be used by a minor. Specifically, these online services would have a duty to ensure their design features prevent and mitigate harm to minors. This provision has caused the most controversy of any part of the bill, with critics arguing that the language is vague and undefined by existing case law, which would complicate compliance. Online services may overcorrect and make it more difficult for minors, or potentially all users, to access helpful content related to mental health, suicide, addiction, eating disorders, sexuality, and more. The duty of care provision may even violate the First Amendment, as the government cannot dictate an online service’s editorial decisions, which could include design features.
https://itif.org/publications/2024/06/03/how-to-address-childrens-online-safety-in-united-states
See also the reference to CAADCA, the contentious Californian attempt to import (again) the 5 Rights Foundation-informed Age-Appropriate Design Code (AADC) thinking. Others in the UK have previously made fine critiques of the AADC, but as with the rest of the Online Safety Act, the entirety of Government, regulation, and policy wonkery has been so high on its own supply of goodliness and anticapitalism that nobody was going to pay attention to a few dissenting experts on technology and global law.
So all of this is a roundabout background to suggest you go read this Verge piece on the current state of KOSA, also archived:
One of the biggest flashpoints for internet regulation, the Kids Online Safety Act, is poised for a revival — but possibly without the central feature that’s kept people fighting over it for the past three years.
…
[…] KOSA could return to the House of Representatives with the duty of care provision removed. The rumored changes could amount to KOSA’s core provision going out with a whimper, even as lawmakers are rumored to be planning a package of several kids safety bills soon after the government reopens from the shutdown.
Meanwhile, for some longtime opponents of KOSA, removing the duty of care could resolve a central concern they have with the bill: that it could incentivize social media companies to remove helpful and potentially lifesaving resources for kids from marginalized communities. But the overall kids safety package could make that a Pyrrhic victory, placing the gutted KOSA alongside bills with potentially similarly troubling implications for online speech.
Yeah, it looks like the lights have finally come on. I strongly suspect that KOSA is mostly an illiberal and misconceived proposal even with Duty of Care removed… but I would welcome seeing a few dramatic story arcs being aired, pointing out what a damnfool idea it was in the first place.
Some of them even got awards for thinking it up.
2025-11-03 19:34:18
I was a Facebook engineer, I had my own instance — gigabytes of PHP & other goo — and when I was there (10+ years ago) the feed code was hacked-on by herds of maths PhDs & considered black magic by 95% of the other engineers, including me.
And now someone whom Ofcom can afford to pay is going to “audit” today’s version.
“…if we don’t start to get the answers that we’re looking for on the algorithmic side of things” — lol, yes, of course that’s how it works.
“Have they changed the way the algorithms work so that children don’t get that material shown to them? That’s a big focus for us,” she said. “[Are there] anti-grooming protections in place so that children’s feeds only accept children’s profiles and can’t be directly messaged by people they don’t know?”
As part of this work, Ofcom can order tech sites to do “an algorithmic audit” over these feeds. “You may see some enforcement action from us in the next few months if we don’t start to get the answers that we’re looking for on the algorithmic side of things,” Dawes added.
2025-11-03 19:13:35
A small clique of UK peers, activists, academics & journalists see “The Internet” as a monster from which they must protect “The People”. Frustrated by disobedience & general lack of progress, they’re attacking each other while demanding harsher whips, blind to the truth: the monster *is* “The People”. They will lose, but it will take ages & cripple the UK.
It’s going to take years to fix this, not least because we’re probably going to have to wait for most of the cohort to become bored, distracted, or dead before some saner perspectives can be established. I wish that I could be more optimistic, but the UK does not have the blunt instrument of positive liberties being enumerated in a strong constitution nor bill of rights.
We are now in the land of the Higgs Ratchet effect:
Higgs aimed to demonstrate that contemporary models to explain the growth of government did not explain why growth historically occurred in spurts, rather than continuously. Higgs formulated the ratchet effect to explain this phenomenon. He theorized that most government growth occurred in response to real or imagined national “crises” and that after the crises, some, but rarely all, of the new interventions ceased. [wp]
So first it was “we must prevent children stumbling across pornography” and then it became “we must prevent children stumbling across anything harmful” and now they are proposing “we must prevent children stumbling across complex technical means to circumvent these controls”. It will be several more years of ratcheted illiberality before we have momentum and mass enough to undo this, but we must still resist in the meanwhile.
We have statements that:
Baroness Liz Lloyd warned there was “limited evidence on children’s use of VPNs”, or virtual private networks, which can help internet users bypass UK internet rules and filters.
…but of course this means that they are going to go look for that evidence and not stop until they can synthesise a case for what they want to do:
We look forward to your response on these matters, in addition to the further information on your roadmap for OSA implementation and methods to detect the use of virtual private networks (VPNs) that you agreed to provide during our session. We will also be interested to see a breakdown on the purposes of use of VPNs by under-18s when this data is available.
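For what it’s worth, the committee’s coveted “methods to detect the use of virtual private networks (VPNs)” usually reduce, at the network level, to something crude: matching connection destinations against a published list of known VPN endpoint addresses. A minimal sketch of that obvious approach, with all addresses and ranges invented for illustration (they are documentation-reserved test networks, not any real provider’s):

```python
# Hypothetical sketch of list-based "VPN detection": flag traffic whose
# destination falls inside a CIDR range attributed to a VPN provider.
# The ranges below are RFC 5737 documentation networks, invented for this
# example; real-world detection (DPI, traffic fingerprinting) is messier
# and far more invasive.
import ipaddress

# Invented example data: CIDR blocks allegedly belonging to VPN providers.
KNOWN_VPN_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3, reserved for docs
    ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2, reserved for docs
]

def looks_like_vpn(dest_ip: str) -> bool:
    """Return True if dest_ip falls inside a listed VPN range."""
    addr = ipaddress.ip_address(dest_ip)
    return any(addr in net for net in KNOWN_VPN_RANGES)

print(looks_like_vpn("203.0.113.42"))   # True: destination is in a listed range
print(looks_like_vpn("192.0.2.7"))      # False: ordinary (example) traffic
```

Note the structural weakness: such lists are perpetually stale, the flagged traffic is encrypted, and a match tells you nothing about who is tunnelling or why; which is rather the point of the objection above.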
Why? Well, basically: they’ve had a small success, and success, like any addictive substance, demands that you keep getting more; the alternative is to drop it, go cold turkey, and walk away, especially before anyone can criticise the consequences and impact of what you did.
The peers, activists, etc. are not yet ready to go cold turkey, so we’re in for several more years of post-parental rage at children for growing up differently to how they did, disguised as Orwellian surveillance and control “for their own good”.
If you think this is hyperbole, go read Baroness Keeley’s public letter to Ofcom:
Discussing the implementation of the OSA more broadly, witnesses described Ofcom’s approach as “gradualist” and “incredibly slow”. We heard that platforms are not “quaking in their boots” with regard to enforcement.
When did it become appropriate for someone in Government (Chair of the Comms & Digital Committee?) to frame desired political outcomes as “platforms [should be] quaking in their boots”? Perhaps junior doctors should “tremble with trepidation”, critical newspapers “shake like leaves”, and underperforming rail franchises “hang their heads in shame”?
This is meant to be a professional discussion in a professional context, talking about big issues of privacy and liberty. But with such tone on public display (“Please note that we intend to place this letter, and your response, in the public domain.”), the only conclusions we can draw are that the people driving the effort are ever more hungry for power, impact, and the bloody scourge of their (own) lash… or else that they’re a bunch of people who failed to control their own children and are now frustrated at failing to control others’, treating everyone in their path in similar fashion.
edit: 2055h: broken link fixed
2025-11-03 18:04:26
Too often, the public debate around encryption and child safety is framed as a zero-sum game. Privacy or protection, never both. This framing misunderstands the technology and also reinforces outdated power structures that position surveillance as the only way to protect children […] We’ll look at how encryption can be aligned with feminist values and explore how digital infrastructures can protect vulnerable communities without surveillance.
2025-11-01 10:59:29
Nice essay on what the fediverse *should* be doing:
[It] doesn’t need to resurface the tired, worn-down tennis courts of the old internet. It needs to create something new. [It] offers an entirely different kind of online experience. It’s smaller, more local, more relational. It’s messy in the best ways. It prioritises conversation over content, sharing over extraction, and community governance over corporate control.
https://jaz.co.uk/2025/10/30/there-is-one-fediverse-there-are-a-million-pickleball-courts/