Dense Discovery
With each issue of DD you’ll receive a hand-picked collection of links spanning a variety of topics – from tech to design and sustainability to urbanism.

DD377 / First, fast, forgotten: the media lifecycle

2026-02-23 12:39:49

By the time you’ve finished reading this sentence, seventeen new big things have happened on the internet. Most of them will be forgotten within the hour – including, probably, by the people who posted them. Spend 10 minutes on any feed and try to recall what you consumed. Speed turns out to be a surprisingly effective substitute for substance.

Veteran tech journalist Om Malik has a nice diagnosis for this feeling. In a recent essay, he argues that the organising principle of our information ecosystem used to be authority: you earned attention by being right, by being credible, by being worth reading. What replaced it is velocity.

“What matters now is how fast something moves through the network: how quickly it is clicked, shared, quoted, replied to, remixed, and replaced. In a system tuned for speed, authority is ornamental. The network rewards motion first and judgment later, if ever. Perhaps that’s why you feel you can’t discern between truths, half-truths, and lies.”

“Networks compress time and space, then quietly train us to live at their speed.”

It’s more of a structural argument than a moral one. In other words, nobody woke up one day deciding to make the internet worse. The platforms built incentive systems that rewarded speed above everything else, and rational people – writers, reviewers, newsrooms – responded accordingly. Malik believes that the algorithm is not some toggle you can flick off; it is the culture. (Worth noting, though: the algorithm has owners. It isn’t a force of nature.)

He uses YouTube tech reviews as a case study. When a phone embargo lifts, dozens of polished reviews drop simultaneously – same talking points, same mood lighting, same conclusions. The reviewer who spent three months actually living with the product? Mostly gone from the feed before anyone finds them.

“The system rewards whoever speaks first, not whoever lives with it long enough to understand it. The ‘review’ at launch outperforms the review written two months later by orders of magnitude. The second, longer, more in-depth, more honest review might as well not exist. It’s not that people are less honest by nature. It’s that the structure pays a premium for compliance and levies a tax on independence. The result is a soft capture where creators don’t have to be told what to say. The incentives do the talking.”

This dynamic extends well beyond tech reviews:

“People do what the network rewards. Writers write for the feed. Photographers shoot for the scroll. Newsrooms frame stories as conflict because conflict travels faster than nuance. Even our emotional lives adapt to latency and refresh cycles. The design of the network becomes the choreography of daily life.”

The result is a culture optimised for first takes, not best takes.

To be fair, the authority-based media of the past wasn’t exactly a golden age of truth-telling – gatekeeping had its own distortions, its own capture, its own blind spots. Malik, to his credit, has no romantic attachment to the old days. What we’ve lost isn’t some pristine past, but a slower metabolism that at least gave an idea time to be wrong before it was replaced by another one.

And now to this week’s discoveries. – Kai

DD376 / Coding: from craft to commodity

2026-02-16 12:41:15

For years, ‘knowing how to code’ was treated like the golden ticket. Even junior software developers were paid absurd amounts of money, and those who couldn’t speak computer watched with a mix of envy and bewilderment. ‘Learn to code’ became the new ‘get a college degree’.

Well, software development is having its assembly line moment. As the machines become more capable, human input is being dramatically devalued. Coding is transforming from craft to manufacture, from bespoke tailoring to fast fashion.

In ‘The rise of industrial software’, Chris Loy argues that AI is turning software from a carefully crafted product into an industrial, disposable commodity.

“In the case of software, the industrialisation of production is giving rise to a new class of software artefact, which we might term disposable software: software created with no durable expectation of ownership, maintenance, or long-term understanding.”

Loy draws comparisons to other industrialised outputs. Just as industrial agriculture gave us both abundance and obesity – cheap food alongside malnutrition – industrial software comes with its own set of unhealthy side effects:

“Industrial systems reliably create economic pressure toward excess, low quality goods. This is not because producers are careless, but because once production is cheap enough, junk is what maximises volume, margin, and reach. The result is not abundance of the best things, but overproduction of the most consumable ones.”

Loy’s comparison of LLMs to steam engines made me pause. The steam engine didn’t just make factories more efficient – it fundamentally restructured civilisation. And software, unlike cheap clothing or ultra-processed food, isn’t just one industry among many. It’s become the substrate of every industry. So it’s easy to see how, for better or worse, the industrialisation of software will have far-reaching consequences.

Of course, industrialisation never completely erases craft. Handmade clothing still exists; so does organic whole food. Loy raises the possibility of an ‘organic software’ movement – the farmers markets of the software industry, if you will. Maybe there’s a future where bespoke code becomes a luxury good, signalling care and quality in a sea of disposable slop.

The bigger question, though, isn’t about craft – it’s about stewardship:

“Previous industrial revolutions externalised their costs onto environments that seemed infinite until they weren’t. Software ecosystems are no different: dependency chains, maintenance burdens, security surfaces that compound as output scales. Technical debt is the pollution of the digital world, invisible until it chokes the systems that depend on it. In an era of mass automation, we may find that the hardest problem is not production, but stewardship. Who maintains the software that no one owns?”

In another essay, Loy argues that developers aren’t being replaced; rather, their role is shifting from writing code to setting practices, from solving problems to architecting systems in which AI writes the software – software that nobody fully understands.

Many developers got into this work because they liked solving puzzles and building things, not because they dreamed of one day becoming middle management for a very fast, very confident intern who occasionally hallucinates. Such is ‘progress’, I guess.

And now to this week’s discoveries. – Kai

DD375 / American acceptionalism

2026-02-09 12:48:17

I grew up thinking the USA was basically one giant action movie with better shopping. Then I heard stories from friends who’d visited, and it started to feel less like a blockbuster and more like a cautionary tale. Medical bankruptcies from routine procedures. School shootings treated like weather events. Elections where the person with fewer votes can still win.

So when someone first mentioned ‘American exceptionalism’ – speaking very little English at the time – what I heard was ‘acceptionalism’, which I interpreted as America’s unique ability to somehow accept these flaws and carry on anyway. Turns out my linguistic confusion might have accidentally described reality better than the actual term.

The notion of American exceptionalism has seen a weird inversion. Amanda Shendruk’s recent viral post is a visual reminder that American exceptionalism is real – just not in ways many Americans think.

Then I came across Adam Bonica’s essay, which is both realistic about America’s dysfunction and still hopeful about its future. He opens with a childhood memory of watching the Berlin Wall fall when he was six (I was eight), transfixed by strangers embracing, hammers in hand.

Now a political scientist, Bonica studies why transformative political moments like that almost never happen, until they do – and he’s convinced America might be approaching its own wall-smashing moment.

Like Shendruk’s piece, he points out that many of America’s dysfunctions are solved problems somewhere else:

“Universal healthcare is not some utopian fantasy. It is Tuesday in Toronto. Affordable higher education is not an impossible dream. It is Wednesday in Berlin. Sensible gun regulation is not a violation of natural law. It is Thursday in London. Paid parental leave is not radical. It is Friday in Tallinn, and Monday in Tokyo, and every day in between.”

“There is another America inside this one, visible in the statistics of nations that made different choices. Call it Latent America: the nation that would exist if our democracy functioned to serve the public rather than protect the already powerful.”

But Bonica believes that the current turmoil might contain the seeds of its own undoing – in large parts because systemic corruption is now in the open:

“Hidden corruption persists because it is difficult to mobilize against. Exposed corruption shifts the axis of politics from left versus right to clean versus corrupt, people versus oligarchs. That’s a fight authoritarians lose.”

Twenty-eight years the Berlin Wall stood. Then it fell in a matter of hours. Some transformations require decades of patient building and arduous organising. Others arrive like a fever breaking, sudden and irreversible. There was a moment when enough people stopped believing in the wall’s inevitability and saw it for what it was: a political choice that could be unmade.

“The wall looks permanent until the day it comes down. So it goes with all institutions. They are not immutable fixtures but human creations, designed to solve the problems of one era and replaceable when they fail the next.”

And, gosh, is America’s wall visible now! Rendered in Shendruk’s damning charts, performed daily by oligarchs courting power without shame. And once the mechanisms are this exposed, the fiction that any of this represents normal democratic function becomes harder to maintain with each passing day.

And now to this week’s discoveries. – Kai

DD374 / AI and the propaganda of inevitability

2026-02-02 12:58:33

Not a day passes without another AI think piece. I’ve mostly trained myself to scroll past them – the prophecies and confident predictions built on speculation. Last week I shared an O’Reilly piece because it offered something rare: a sober assessment grounded in how technology actually evolves, not how we fear it might.

That said, I’m not entirely immune to the more philosophical vision pieces. I try to read them like speculative fiction – thought experiments that provoke rather than pronouncements to believe. They’re useful for the questions they raise, not the answers they claim.

Peter Adam Boeckel’s recent essay falls into this category. A designer and futurist, Boeckel makes plenty of assumptions about AI’s trajectory. His central argument is that the real threat of AI isn’t job loss – it’s the displacement of purpose itself, that psychological scaffolding we’ve hung our sense of self upon.

“Purpose is not lost when a person stops working; it is lost when the work stops needing the person. … We are not defending competence but significance.”

He’s probably right, though work isn’t always our primary source of meaning. Family, community, faith, care work – these have always anchored us, often more deeply than any job.

For me, the essay’s strongest section is on education. Here Boeckel offers a future that feels (sort of) hopeful:

“If automation dismantles the architecture of work, education must become the architecture of meaning. The challenge is no longer how to prepare people for jobs that may soon vanish, but how to prepare them for a life where purpose is not delivered by employment.”

“A system can simulate empathy; a teacher can model it. What future education requires is not less technology, but more intentional humanity. The teacher of tomorrow will not compete with machines on knowledge, but on presence – on the ability to awaken curiosity, to hold silence, to provoke reflection.”

This is what I agree with: as knowledge becomes infinitely accessible, physical presence becomes scarce, a privilege even.

“The live moment, once ordinary, will become a premium product: an education not delivered, but experienced.”

It’s already happening: the return to in-person workshops, social gatherings, live performances – all the things that can’t be streamed or optimised. They resist scaling because presence is the point.

More broadly, what bothers me about essays like this, though, is the constant whiff of technological inevitability. By framing AI’s impact as civilisational and consciousness-altering, these vision pieces make resistance feel futile. Who argues with evolution?

But this isn’t evolution – it’s decisions made by a handful of corporations with extraordinary capital and influence. The future Boeckel describes isn’t arriving on its own; it’s being actively designed by companies with specific incentives that rarely align with the contemplative, wisdom-centred education he describes.

The risk is that these grand philosophical narratives become cover for continued privatisation and corporate control. We get sold the promise of transformation while the actual infrastructure – the algorithms, the data, the compute – remains firmly in the hands of a few.

So, do we need more essays imagining ‘new architectures of meaning’? There’s genuine transformation happening, for sure. But most AI think pieces sidestep the boring, near-term levers that actually give us some agency over how technology unfolds – labour standards, data governance, antitrust enforcement, policy interventions. The question isn’t whether AI will change us, but whether and how we’ll fight for any say in how.

And now to this week’s discoveries. – Kai

DD373 / Offloading risk, distributing fear

2026-01-26 14:49:55

Soon after moving into our new apartment building here in Melbourne, we kept getting hit by burglars who stole bikes and ransacked storage cages. The response was predictable: we spent hours reviewing CCTV footage, filing police reports and reinforcing gates. As anxiety rose, many of us demanded more security measures, even private guards.

Looking back, I can see our default response was to turn to what anthropologists Mark Maguire and Setha Low call ‘security capitalism’. (I featured their latest book Trapped back in DD326, but only really discovered their work through a discussion in a recent podcast.)

Maguire and Low argue that security has morphed from an inalienable right into a commodity hoarded by those who can afford it. The central mechanism is pretty insidious: those with resources create ‘interior worlds’ – gated communities, securitised enclaves, fortified homes – and in doing so, they don’t just protect themselves, they actively make everyone else less safe.

One of my main take-aways: security, by its very nature, is antagonistic to equality. The risk doesn’t disappear – it just gets offloaded onto those without the means to purchase protection.

What fuels this system is an entire gadget- and service-slinging industry ready to profit from our fear:

“Just as middle and upper middle classes, and especially the wealthy, are becoming more risk averse and have the power to pay for that, there’s a giant sector that’s feeding that, that is more than willing to sell you some gadgetry. The more of it there is, the more it becomes ubiquitous. And it also feeds into status anxiety.”

Inside these spaces, security becomes the dominant lens through which to view the world. The more people invest in security, the more threats they begin to identify: workers you let in, teenagers gathering, a person in a hoodie, someone walking too slowly – suddenly there are red flags everywhere.

“The more you securitise your life, the more those walls and gates and guards make your life all about fear rather than less about fear. And so, as the fear grows, then you want more security, you buy more gadgets, you support all kinds of policing initiatives.”

Importantly, this dynamic extends into public space. When a park gets heavily securitised – police presence, cameras, controlled access – it becomes exclusionary:

“That means that young people of colour will probably not go because they don’t hang around where the police are. It means that people who don’t have a place to sleep probably won’t go there either. And suddenly, you have this homogenised space.”

In our time of intense uncertainty, the impulse to buy our way to safety is entirely understandable. But security capitalism offers only the illusion of protection while accelerating the societal breakdown we fear. This creates “a self-fulfilling prophecy of fearful people wanting more security, the state and private sector producing it, only to make the world more fearful for some and poorly protected for others”.

The alternative – rebuilding social connections, investing in genuine public space, fostering mutual aid – sounds almost quaint. What makes it so difficult is that we’re chasing something that doesn’t actually exist: there is no security in nature, only the management of inevitable risk. We know the walls we’re erecting aren’t freedom, but the illusion of safety feels more tangible than the difficult, incremental work of building trust and community.

And now to this week’s discoveries. – Kai

DD372 / Friction-maxxing through 2026?

2026-01-19 23:07:51

Tech companies have spent years perfecting their image as enablers – as tools that promise to amplify our capabilities. The pitch has always been ‘convenience’ and ‘efficiency’. But today, we’re coming to terms with the fact that we’re learning less, thinking less, tolerating less. We increasingly behave like toddlers expecting machines to handle life’s unpleasantness.

Writing in The Cut (free archived version), Kathryn Jezer-Morton argues that tech companies are succeeding in making us think of life itself as inconvenient – something to continuously escape from into digital padded rooms of predictive algorithms and single-tap commands.

“Reading is boring; talking is awkward; moving is tiring; leaving the house is daunting. Thinking is hard.”

She adds an urgent perspective to this discussion – that of a parent watching what these tools of escape are doing to us and, more worryingly, to kids.

“Our love of escaping is one of humanity’s most poetically problematic tendencies, and now it’s being used against us. A friend of mine, a father of two young kids, admitted to me that the high point of each day is sitting on the toilet with his phone. … We’re foie gras ducks being force-fed escapism.”

Once we’ve adopted these habits of escape, the act of returning to unmediated existence feels insufferable.

“We become exactly like toddlers in the five minutes after the iPad is taken away: The dullness and labour of embodied existence is unbearable.”

“Children are the easiest targets for tech companies because they don’t know the difference between suffering and friction – one difference between children and adults is that adults do. Or at least, we’re supposed to.”

To counter this trend, she’s coined a brilliant term to carry us through 2026: friction-maxxing.

“Friction-maxxing is … the process of building up tolerance for ‘inconvenience’ (which is usually not inconvenience at all but just the vagaries of being a person living with other people in spaces that are impossible to completely control) – and then reaching even toward enjoyment. And then, it’s modelling this tolerance, followed by enjoyment and humour, for our kids.”

The notion of technology as an eliminator of friction has appeared again and again in DD. What we’re really talking about is learning to distinguish between friction and suffering – recognising that not all discomfort is bad, that some resistance makes us stronger, sharper, more alive. The tech companies want to collapse that distinction, to sell us a world where nothing is ever awkward or boring or difficult.

I can’t help but think our current political moment can be partly explained this way. We’re living through humanity’s most prosperous period, yet everyone feels disillusioned. We’ve been conditioned to expect frictionless existence and now we’re collectively enraged that it hasn’t delivered happiness. That gap between promised utopia and persistent dissatisfaction – that’s where cynicism breeds, where political rage finds its fuel.

The end game was never convenience but a texture-rich life that challenges and rewards us. Not happiness as a frictionless state, but satisfaction earned through the friction itself.

And now to this week’s discoveries. – Kai