2026-01-21 02:55:47
The hot new theory online is that reading is kaput, and therefore civilization is too. The rise of hyper-addictive digital technologies has shattered our attention spans and extinguished our taste for text. Books are disappearing from our culture, and so are our capacities for complex and rational thought. We are careening toward a post-literate society, where myth, intuition, and emotion replace logic, evidence, and science. Nobody needs to bomb us back to the Stone Age; we have decided to walk there ourselves.
I am skeptical of this thesis. I used to study claims like these for a living, so I know that the mind is primed to believe narratives of decline. We have a much lower standard of evidence for “bad thing go up” than we do for “bad thing go down”.
Unsurprisingly, then, stories about the end of reading tend to leave out some inconvenient data points. For example, book sales were higher in 2025 than they were in 2019, and only a bit below their high point in the pandemic. Independent bookstores are booming, not busting; 422 new indie shops opened last year alone. Even Barnes and Noble is cool again.
The actual data on reading isn’t as apocalyptic as the headlines imply. Gallup surveys suggest that some mega-readers (11+ books per year) have become moderate readers (1-5 books per year), but they don’t find any other major trends over the past three decades:
Other surveys document similarly moderate declines. For instance, data from the National Endowment for the Arts finds a slight decrease in reading over the past decade:
And the American Time Use Survey shows a dip in reading time from 2003 to 2023:

These are declines, no doubt. But if you look closely at the reading time data, you’ll notice that the dip between 2003 and 2011 is about twice the size of the dip between 2011 and 2023. In fact, the only meaningful changes happen in 2009 and 2015. I’d say we have two effects here: a larger internet effect and a smaller smartphone effect, neither of which is huge. If the data is right, the best anti-reading intervention is not a 5G-enabled iPhone circa 2023, but a broadband-enabled iMac circa 2009.
Ultimately, the plausibility of the “death of reading” thesis depends on two judgment calls.
First, do these effects strike you as big or small? Apparently, lots of people see these numbers and perceive an emergency. But we should submit every aspiring crisis to this hypothetical: how would we describe the size of the effect if we were measuring a heartening trend instead of a concerning one?
Imagine that time-use graph measured cigarette-smoking instead of book-reading. Would you say that smoking “collapsed” between 2003 and 2023? If we had been spending a billion dollars a year on a big anti-smoking campaign that whole time, would we say it worked? Kind of, I’d say, but most of the time the line doesn’t budge. I wouldn’t be unfurling any “Mission Accomplished” banners, which is why I am not currently unfurling any “Mission Failed” banners either.1
The second judgment call: do you expect these trends to continue, plateau, or even reverse? The obvious expectation is that technology will get more distracting every year. And the decline in reading seems to be greater among college students, so we should expect the numbers to continue ticking downward as older bookworms are replaced by younger phoneworms. Those are both reasonable predictions, but two facts make me a little more doubtful.
Fact #1: there are signs that the digital invasion of our attention is beginning to stall. We seem to have passed peak social media—time spent on the apps has started to slide. App developers are finding it harder and harder to squeeze more attention out of our eyeballs, and it turns out that having your eyeballs squeezed hurts, so people aren’t sticking around for it. The “draw people in” phase of the internet was unsurprisingly a lot more enticing than the “shake ‘em down” phase—what we now refer to, appropriately, as “enshittification”. The early internet felt like sipping an IPA with friends; the late internet feels like taking furtive shots of Southern Comfort to keep the shakes at bay. So it’s no wonder that, after paying $1000 for a new phone, people will then pay an additional $50 for a device that makes their phone less functional.
Fact #2: reading has already survived several major incursions, which suggests it’s more appealing than we thought. Radio, TV, dial-up, Wi-Fi, TikTok—none of it has been enough to snuff out the human desire to point our pupils at words on paper. Apparently books are what hyper-online people call “Lindy”: they’ve lasted a long time, so we should expect them to last even longer.
It is remarkable, even miraculous, that people who possess the most addictive devices ever invented will occasionally choose to turn those devices off and pick up a book instead. If I were a mad scientist hellbent on stopping people from reading, I’d probably invent something like the iPhone. And after I released my dastardly creation into the world, I’d end up like the Grinch on Christmas morning, dumbfounded that my plan didn’t work: I gave them all the YouTube Shorts they could ever desire and they’re still...reading!!
Perhaps there are frontiers of digital addiction we have yet to reach. Maybe one day we’ll all have Neuralinks that beam Instagram Reels directly into our primary visual cortex, and then reading will really be toast.
Maybe. But it has proven very difficult to artificially satisfy even the most basic human pleasures. Who wants a birthday cake made with aspartame? Who would rather have a tanning bed than a sunny day? Who prefers to watch bots play chess? You can view high-res images of the Mona Lisa anytime you want, and yet people will still pay to fly to Paris and shove through crowds just to get a glimpse of the real thing.
I think there is a deep truth here: human desires are complex and multidimensional, and this makes them both hard to quench and hard to hack. That tinge of discontent that haunts even the happiest people, that bottomless hunger for more even among plenty—those are evolutionary defense mechanisms. If we were easier to please, we wouldn’t have made it this far. We would have gorged ourselves to death as soon as we figured out how to cultivate sugarcane.
That’s why I doubt the core assumption of the “death of reading” hypothesis. The theory heavily implies that people who would once have been avid readers are now glassy-eyed doomscrollers because that is, in fact, what they always wanted to be. They never appreciated the life of the mind. They were just filling time with great works of literature until TikTok came along. The unspoken assumption is that most humans, other than a few rare intellectuals, have a hierarchy of needs that looks like this:
I don’t buy this. Everyone, even people without liberal arts degrees, knows the difference between the cheap pleasures and the deep pleasures. No one pats themselves on the back for spending an hour watching mukbang videos, no one touts their screentime like they’re setting a high score, and no one feels proud that their hand instinctively starts groping for their phone whenever there’s a lull in conversation.2
Finishing a great nonfiction book feels like heaving a barbell off your chest. Finishing a great novel feels like leaving an entire nation behind. There are no replacements for these feelings. Videos can titillate, podcasts can inform, but there’s only one way to get that feeling of your brain folds stretching and your soul expanding, and it is to drag your eyes across text.
That’s actually where I agree with the worrywarts of the written word: all serious intellectual work happens on the page, and we shouldn’t pretend otherwise. If you want to contribute to the world of ideas, if you want to entertain and manipulate complex thoughts, you have to read and write.
According to one theory, that’s why writing originated: to pin facts in place. At first, those facts were things like “Hirin owes Mushin four bushels of wheat”, but once you realize that knowledge can be hardened and preserved by encoding it in little squiggles, you unlock a whole new realm of logic and reasoning.

That’s why there’s no replacement for text, and there never will be. Thoughts that can survive being written into words are on average truer than thoughts that never leave the mind. You know how you can find a leak in a tire by squirting dish soap on it and then looking for where the bubbles form? Writing is like squirting dish soap on an idea: it makes the holes obvious.
That doesn’t mean every piece of prose is wonderful, just that it can be. And when it reaches those heights, it commands a power that nothing else can possess.
I didn’t always believe this. I was persuaded on this point recently when I met an audio editor named Julia Barton, who was writing a book about the history of radio. I thought that was funny—shouldn’t the history of radio be told as a podcast?
No, she said, because in the long run, books are all that matter. Podcasts, films, and TikToks are good at attracting ears and eyes, but in the realm of ideas, they punch below their weight. Thoughts only stick around when you print them out and bind them in cardboard.
I think Barton’s thesis is right. At the center of every long-lived movement, you will always find a book. Every major religion has its holy text, of course, but there is also no communism without the Communist Manifesto, no environmentalism without Silent Spring, no American revolution without Common Sense. This remains true even in our supposed post-literate meltdown—just look at Abundance, which inspired the creation of a Congressional caucus. That happened not because of Abundance the Podcast or Abundance the 7-Part YouTube Series, but because of Abundance the book.
A somewhat diminished readership can somewhat diminish the power of text in culture, but it’s a mistake to think that words only exercise influence over you when you behold those words firsthand. I’m reminded of Meryl Streep’s monologue in The Devil Wears Prada, when Anne Hathaway scoffs at two seemingly identical belts and Streep schools her:
...it’s sort of comical how you think that you’ve made a choice that exempts you from the fashion industry when, in fact, you’re wearing a sweater that was selected for you by the people in this room.3
What’s true in the world of fashion is also true in the world of ideas. Being ignorant of the forces shaping society does not exempt you from their influence—it places you at their mercy. This is easy to miss. It may seem like ignorance is always overpowering knowledge, that the people who kick things down are triumphing over the people who build things up. That’s because kicking down is fast and loud, while building up is slow and quiet. But that is precisely why the builders ultimately prevail. The kickers get bored and wander off, while the builders return and start again.
I have one more gripe against the “death of literacy” hypothesis, and against Walter Ong, the Jesuit priest/English professor whose book Orality and Literacy provides the intellectual backbone for the argument.
Most of the differences between oral and literate cultures are actually differences between non-recorded and recorded cultures. And even if our culture has become slightly less literate, it has become far more recorded.
As Ong points out, in an oral culture, the only way for information to pass from one generation to another is for someone to remember and repeat it.4 This is a bit like trying to maintain a music collection with nothing but a first-generation iPod: you can’t store that much, so you have to make tradeoffs. Oral traditions are chock full of repetition, archetypal characters, and intuitive ideas, because that’s what it takes to make something memorable. Precise facts, on the other hand, are like 10-gigabyte files—they’re going to get compressed, corrupted, or deleted.
Writing is one way of solving the storage problem, but it’s not the only way, and we use those other ways now more than ever. Humans took an estimated 2 trillion photos in 2025, and 20 million videos get uploaded to YouTube every day. No one knows how many spreadsheets, apps, or code files we make. Each one of these formats allows us to retain different kinds of information, and it causes us to think in a different register. What psychology is unlocked by Photoshop, iMovie, and Excel?
There is something unique about text, no doubt, and I’m sure a purely pictographic, videographic, or spreadsheet-graphic culture would be rather odd and probably dysfunctional. But having more methods of storage makes us better at transmitting knowledge, not worse, and it allows us to surpass the cognitive limits that so strongly shape oral culture.
Put another way: hearing a bard recite The Iliad around a campfire is nothing like streaming the song “Golden” on YouTube. That bard is going to add his own flourishes, he’s going to cut out the bits that might offend his audience, he’s probably going to misremember some stanzas, and no one will be able to fact-check him. In contrast, the billionth stream of “Golden” is exactly the same as the first. Even if people spend less time reading, it is impossible to return to a world where every fact that isn’t memorized is simply lost. I don’t believe we are nearly as close to a post-literate society as the critics think, but I also don’t believe that a post-literate society is going to bear much resemblance to a pre-literate society.
I have text on my mind right now for two reasons.
The first is that I’m writing a book, and it’s almost done. So maybe everything I’ve said is just motivated reasoning: “‘Books are very important!’ says man with book”.5 But the deeper I get, the more I read the thoughts that other people have tamed and transmuted into a form that could be fed into the printing press and the inkjet printer, and the more I try to do the same, the more I’m convinced that there is a power here that will persist.
The second reason is that Experimental History just turned four. This is usually the time of year when I try to wax wise on the state of the blogosphere and the internet in general. So here’s my short report: it’s boom times for text.
I know that what we used to call “social media” is now just television you watch on your phone. I know that people want to spend their leisure time watching strangers apply makeup, assemble salads, and repair dishwashers. I know they want to see this guy dancing in his dirty bathroom and they want to watch Mr. Beast bury himself alive. These are their preferences, and woe betide anyone who tries to show them anything else, especially—God forbid—the written word.
But I also know that humans have a hunger that no video can satisfy. Even in the midst of infinite addictive entertainment, some people still want to read. A lot of people, in fact. I serve at their pleasure, and I am happy to, because I think the world ultimately belongs to them. 5,000 years after Sumerians started scratching cuneiform into clay and 600 years after Gutenberg started pressing inky blocks onto paper, text is still king. Long may it reign.
Here are last year’s most-viewed posts:
And here are my favorite MYSTERY POSTS that went out to paid subscribers only:
Thank you to everyone who makes Experimental History possible, including those who support the blog, and those who increase its power by yelling at it on the internet. Godspeed to all of you, and may your 2026 be too good for words.
By the way, here is the actual data on cigarette smoking, via the American Lung Association:
In case you’re wondering, the decline in smoking has not been offset by an increase in vaping. While adults vape a tiny bit more today than they did in 2019, the difference is very small:
It’s curious that the word “phubbing” (a combination of phone and snubbing) never caught on. Maybe that’s because it was coined by fiat: an advertising agency simply decided that’s what we should call it when someone ignores you and looks at their phone instead. If this word had bubbled up naturally from the crush of online discourse, maybe it would have gotten more buy-in, and maybe we’d be more sensitive to the phenomenon.
I know it might seem rich to quote a movie in a post that’s extolling text, but then Streep didn’t deliver her diatribe off the dome. Someone wrote it for her, and then she spoke it aloud. (And that script in turn was based on a novel.) It’s an obvious point, but when we’re decrying the death of literacy, it’s easy to forget that film is mainly a literate art form.
Improv comedy, in contrast, is a purely oral art form. And I can’t help but notice that film actors make millions and become world famous by speaking written words aloud, while improvisational actors never touch text at all and they mainly go into debt.
This isn’t completely true, of course—a parent can fashion a handaxe and hand it down to their child, thus transferring some handaxe-related knowledge. But lots of facts can’t be encoded in axes.
It comes out Spring 2027, so I’ll have more details then.
2026-01-07 00:45:44
Here’s the most replicated finding to come out of my area of psychology in the past decade: most people believe they suffer from a chronic case of awkwardness.
Study after study finds that people expect their conversations to go poorly, when in fact those conversations usually go pretty well. People assign themselves the majority of the blame for any awkward silences that arise, and they believe that they like other people more than other people like them in return. I’ve replicated this effect myself: I once ran a study where participants talked in groups of three, and then they reported/guessed how much each person liked each other person in the conversation. Those participants believed, on average, that they were the least liked person in the trio.
In another study, participants were asked to rate their skills on 20 everyday activities, and they scored themselves as better than average on 19 of them. When it came to cooking, cleaning, shopping, eating, sleeping, reading, etc., participants were like, “Yeah, that’s kinda my thing.” The one exception? “Initiating and sustaining rewarding conversation at a cocktail party, dinner party, or similar social event”.
I find all this heartbreaking, because studies consistently show that the thing that makes humans the happiest is positive relationships with other humans. Awkwardness puts a persistent bit of distance between us and the good life, like being celiac in a world where every dish has a dash of gluten in it.
Even worse, nobody seems to have any solutions, nor any plans for inventing them. If you want to lose weight, buy a house, or take a trip to Tahiti, entire industries are waiting to serve you. If you have diagnosable social anxiety, your insurance might pay for you to take an antidepressant and talk to a therapist. But if you simply want to gain a bit of social grace, you’re pretty much outta luck. It’s as if we all think awkwardness is a kind of moral failing, a choice, or a congenital affliction that suggests you were naughty in a past life—at any rate, unworthy of treatment and undeserving of assistance.
We can do better. And we can start by realizing that, even though we use one word to describe it, awkwardness is not one thing. It’s got layers, like a big, ungainly onion. Three layers, to be exact. So to shrink the onion, you have to peel it from the skin to the pith, because removing each layer requires its own technique.
Before we make our initial incision, I should mention that I’m not the kind of psychologist who treats people. I’m the kind of psychologist who asks people stupid questions and then makes sweeping generalizations about them. You should take everything I say with a heaping teaspoon of salt, which will also come in handy after we’ve sliced the onion and it’s time to sauté it. That disclaimer disclaimed, let’s begin on the outside and work our way in, starting with—
The outermost layer of the awkward onion is the most noticeable one: awkward people do the wrong thing at the wrong time. You try to make people laugh; you make them cringe instead. You try to compliment them; you creep them out. You open up; you scare them off. Let’s call this social clumsiness.
Being socially clumsy is like being in a role-playing game where your charisma stat is chronically too low and you can’t access the correct dialogue options. And if you understand that reference, I understand why you’re reading this post.

Here’s the bad news: I don’t think there’s a cure for clumsiness. Every human trait is normally distributed, so it’s inevitable that some chunk of humanity is going to have a hard time reading emotional cues and predicting the social outcomes of their actions. I’ve seen high-functioning, socially ham-handed people try to memorize interpersonal rules the same way chess grandmasters memorize openings, but it always comes off stilted and strange. You’ll be like, “Hey, how you doing” and they’re like “ROOK TO E4, KNIGHT TO C11, QUEEN TO G6” and you’re like “uhhh cool man me too”.
Here’s the good news, though: even if you can’t cure social clumsiness, there is a way to manage its symptoms. To show you how, let me tell you a story of a stupid thing I did, and what I should have done instead.
Once, in high school, I was in my bedroom when I saw a girl in my class drive up to the intersection outside my house. It was dark outside and I had the light on, and so when she looked up, she caught me in the mortifying act of, I guess, existing inside my home? This felt excruciatingly embarrassing, for some reason, and so I immediately dropped to the floor, as if I was in a platoon of GIs and someone had just shouted “SNIPER!” But breaking line of sight doesn’t cause someone to unsee you, and so from this girl’s point of view, she had just locked eyes with some dude from school through a window and his response had been to duck and cover. She told her friends about this, and they all made fun of me ruthlessly.
I learned an important lesson that day: when it comes to being awkward, the coverup is always worse than the crime. If you just did something embarrassing mere moments ago, it’s unlikely that you have suddenly become socially omnipotent and that all of your subsequent moves are guaranteed to be prudent and effective. It’s more likely that you’re panicking, and so your next action is going to be even stupider than your last.
And that, I think, is the key to mitigating your social clumsiness: give up on the coverups. When you miss a cue or make a faux pas, you just have to own it. Apologize if necessary, make amends, explain yourself, but do not attempt to undo your blunder with another round of blundering. If you knock over a stack of porcelain plates, don’t try to quickly sweep up the shards before anyone notices; you will merely knock over a shelf of water pitchers.
This turns out to be a surprisingly high-status move, because when you readily admit your mistakes, you imply that you don’t expect to be seriously harmed by them, and this makes you seem intimidating and cool. You know how when a toddler topples over, they’ll immediately look at you to gauge how upset they should be? Adults do that too. Whenever someone does something unexpected, we check their reaction—if they look embarrassed, then whatever they did must be embarrassing. When that person panics, they look like a putz. When they shrug and go, “Classic me!”, they come off as a lovable doof, or even, somehow, a chill, confident person.
In fact, the most successful socially clumsy people I know can manage their mistakes before they even happen. They simply own up to their difficulties and ask people to meet them halfway, saying things like:
Thanks for inviting me over to your house. It’s hard for me to tell when people want to stop hanging out with me, so please just tell me when you’d like me to leave. I won’t be mad. If it’s weird to you, I’m sorry about that. I promise it’s not weird to me.
It takes me a while to trust people who attempt this kind of social maneuver—they can’t be serious, can they? But once I’m convinced they’re earnest, knowing someone’s social deficits feels no different than knowing their dietary restrictions (“Arthur can’t eat artichokes; Maya doesn’t understand sarcasm”), and we get along swimmingly. Such a person is always going to seem a bit like a Martian, but that’s fine, because they are a bit of a Martian, and there’s nothing wrong with being from outer space as long as you’re upfront about it.
When we describe someone else as awkward, we’re referring to the things they do. But when we describe ourselves as awkward, we’re also referring to this whole awkward world inside our heads, this constant sensation that you’re not slotted in, that you’re being weird, somehow. It’s that nagging thought of “does my sweater look bad” that blossoms into “oh god, everyone is staring at my horrible sweater” and finally arrives at “I need to throw this sweater into a dumpster immediately, preferably with me wearing it”.
This is the second layer of the awkward onion, one that we can call excessive self-awareness. Whether you’re socially clumsy or not, you can certainly worry that you are, and you can try to prevent any gaffes from happening by paying extremely close attention to yourself at all times. This strategy always backfires because it causes a syndrome that athletes call “choking” or “the yips”—that stilted, clunky movement you get when you pay too much attention to something that’s supposed to be done without thinking. As the old poem goes:
A centipede was happy – quite!
Until a toad in fun
Said, “Pray, which leg moves after which?”
This raised her doubts to such a pitch,
She fell exhausted in the ditch
Not knowing how to run.
The solution to excessive self-awareness is to turn your attention outward instead of inward. You cannot out-shout your inner critic; you have to drown it out with another voice entirely. Luckily, there are other voices around you all the time, emanating from other humans. The more you pay attention to what they’re doing and saying, the less attention you have left to lavish on yourself.
You can call this mindfulness if that makes it more useful to you, but I don’t mean it as a sort of droopy-eyed, slack-jawed, I-am-one-with-the-universe state of enlightenment. What I mean is: look around you! Human beings are the most entertaining organisms on the planet. See their strange activities and their odd proclivities, their opinions and their words and their what-have-you. This one is riding a unicycle! That one is picking their nose and hoping no one notices! You’re telling me that you’d rather think about yourself all the time?
Getting out of your own head and into someone else’s can be surprisingly rewarding for all involved. It’s hard to maintain both an internal and an external dialogue simultaneously, and so when your self-focus is going full-blast, your conversations degenerate into a series of false starts (“So...how many cousins do you have?” “Seven.” “Ah, a prime number.”) Meanwhile, the other person stays buttoned up because, well, why would you disrobe for someone who isn’t even looking? Paying attention to a human, on the other hand, is like watering a plant: it makes them bloom. People love it when you listen and respond to them, just like babies love it when they turn a crank and Elmo pops out of a box—oh! The joy of having an effect on the world!

Of course, you might not like everyone that you attend to. When people start blooming in your presence, you’ll discover that some of them make you sneeze, and some of them smell like that kind of plant that gives off the stench of rotten eggs. But this is still progress, because in the Great Hierarchy of Subjective Experiences, annoyance is better than awkwardness—you can walk away from an annoyance, but awkwardness comes with you wherever you go.
It can be helpful to develop a distaste for your own excessive self-focus, and one way to do that is to relabel it as “narcissism”. We usually picture narcissists as people with an inflated sense of self worth, and of course many narcissists are like that. But I contend that there is a negative form of narcissism, one where you pay yourself an extravagant amount of attention that just happens to come in the form of scorn. Ultimately, self-love and self-hate are both forms of self-obsession.
So if you find yourself fixated on your own flaws, perhaps it’s worth asking: what makes you so worthy of your own attention, even if it’s mainly disapproving? Why should you be the protagonist of every social encounter? If you’re really as bad as you say, why not stop thinking about yourself so much and give someone else a turn?
Social clumsiness is the thing that we fear doing, and excessive self-focus is the strategy we use to prevent that fear from becoming real, but neither of them is the fear itself, the fear of being left out, called out, ridiculed, or rejected. “Social anxiety” is already taken, so let’s refer to this center of the awkward onion as people-phobia.
People-phobia is both different from and worse than all other phobias, because the thing that scares the bajeezus out of you is also the thing you love the most. Arachnophobes don’t have to work for, ride buses full of, or go on first dates with spiders. But people-phobes must find a way to survive in a world that’s chockablock with homo sapiens, and so they yo-yo between the torment of trying to approach other people and the agony of trying to avoid them.
At the heart of people-phobia are two big truths and one big lie. The two big truths: our social connections do matter a lot, and social ruptures do cause a lot of pain. Individual humans cannot survive long on their own, and so evolution endowed us with a healthy fear of getting voted off the island. That’s why it hurts so bad to get bullied, dumped, pantsed, and demoted, even though none of those things cause actual tissue damage.1
But here’s the big lie: people-phobes implicitly believe that hurt can never be healed, so it must be avoided at all costs. This fear is misguided because the mind can, in fact, mend itself. Just like we have a physical immune system that repairs injuries to the body, we also have a psychological immune system that repairs injuries to the ego. Black eyes, stubbed toes, and twisted ankles tend to heal themselves on their own, and so do slip-ups, mishaps, and faux pas.
That means you can cure people-phobia the same way you cure any fear—by facing it, feeling it, and forgetting it. That’s the logic behind exposure and response prevention: you sit in the presence of the scary thing without deploying your usual coping mechanisms (scrolling on your phone, fleeing, etc.) and you do this until you get tired of being scared. If you’re an arachnophobe, for instance, you peer at a spider from a safe distance, you wait until your heart rate returns to normal, you take one step closer, and you repeat until you’re so close to the spider that it agrees to officiate your wedding.2
Unfortunately, people-phobia is harder to treat than arachnophobia because people, unlike spiders, cannot be placed in a terrarium and kept safely on the other side of the room. There is no zero-risk social interaction—anyone, at any time, can decide that they don’t like you. That’s why your people-phobia does not go into spontaneous remission from continued contact with humanity: if you don’t confront your fear in a way that ultimately renders it dull, you’re simply stoking the phobia rather than extinguishing it.3
Exposure only works for people-phobia, then, if you’re able to do two things: notch some pleasant interactions and reflect on them afterward. The notching might sound harder than the reflecting, but the evidence suggests it’s actually the other way around. Most people have mostly good interactions most of the time. They just don’t notice.
In every study I’ve ever read and every study I’ve ever conducted myself, when you ask people to report on their conversation right after the fact, they go, “Oh, it was pretty good!” In one study, I put strangers in an empty room and told them to talk about whatever they wanted for as long as they wanted, which sounds like the social equivalent of being told to go walk on hot coals or stick needles in your eyes. And yet, surprisingly, most of those participants reported having a perfectly enjoyable, not-very-awkward time. When I asked another group of participants to think back to their most recent conversation (these were overwhelmingly with friends and family, rather than strangers), I found the same pattern of results4:

But when you ask people to predict their next conversation, they suddenly change their tune. I had another group of participants guess how this whole “meet a stranger in the lab, have an open-ended conversation” thing would go, and they were not optimistic. Participants estimated that only 50% of conversations would make it past five minutes (actually, 87% did), and that only 15% of conversations would go all the way to the time limit of 45 minutes (actually, 31% did). So when people meet someone new, they go, “that was pretty good!”, but when they imagine meeting someone new, they go, “that will be pretty bad!”
A first-line remedy for people-phobia, then, is to rub your nose in the pleasantness of your everyday interactions. If you’re afraid that your goof-ups will doom you to a lifetime of solitude and then that just...doesn’t happen, perhaps it’s worth reflecting on that fact until your expectations update to match your experiences. Do that enough, and maybe your worries will start to appear not only false, but also tedious. However, if reflecting on the contents of your conversations makes you feel like that guy in Indiana Jones who gets his face melted off when he looks directly at the Ark of the Covenant, then I’m afraid you’re going to need bigger guns than can fit into a blog post.

Obviously, I don’t think you can instantly de-awkward yourself by reading the right words in the right order. We’re trying to override automatic responses and perform laser removal on burned-in fears—this stuff takes time.
In the meantime, though, there’s something all of us can do right away: we can disarm. The greatest delusion of the awkward person is that they can never harm someone else; they can only be harmed. But every social hangup we have was hung there by someone else, probably by someone who didn’t realize they were hanging it, maybe by someone who didn’t even realize they were capable of such a thing. When Todd Posner told me in college that I have a big nose, did he realize he was giving me a lifelong complex? No, he probably went right back to thinking about his own embarrassingly girthy neck, which, combined with his penchant for wearing suits, caused people to refer to him behind his back as “Business Frog” (a fact I kept nobly to myself).
So even if you can’t rid yourself of your own awkward onion, you can at least refrain from fertilizing anyone else’s. This requires some virtuous sacrifice, because the most tempting way to cope with awkwardness is to pass it on—if you’re pointing and laughing at someone else, it’s hard for anyone to point and laugh at you. But every time you accept the opportunity to be cruel, you increase the ambient level of cruelty in the world, which makes all of us more likely to end up on the wrong end of a pointed finger.
All of that is to say: if you happen to stop at an intersection and you look up and see someone you know just standing there inside his house and he immediately ducks out of sight, you can think to yourself, “There are many reasonable explanations for such behavior—perhaps he just saw a dime on the floor and bent down to get it!” and you can forget about the whole ordeal and, most importantly, keep your damn eyes on the road.
PS: This post pairs well with Good Conversations Have Lots of Doorknobs.
Psychologists who study social exclusion love to use this horrible experimental procedure called “Cyberball”, where you play a game of virtual catch with two other participants. Everything goes normally at first, but then the other participants inexplicably start throwing the ball only to each other, excluding you entirely. (In reality, there are no other participants; this is all pre-programmed.) When you do this to someone who’s in an fMRI scanner, you can see that getting ignored in Cyberball lights up the same part of the brain that processes physical pain. But you don’t need a big magnet to find this effect: just watching the little avatars ignore you while tossing the ball back and forth between them will immediately make you feel awful.
My PhD cohort included some clinical psychologists who interned at an OCD treatment center as part of their training. Some patients there had extreme fears about wanting to harm other people—they didn’t actually want to hurt anybody, but they were afraid that they did. So part of their treatment was being given the opportunity to cause harm, and to realize that they weren’t really going to do it. At the final stage of this treatment, patients are given a knife and told to hold it to their therapist’s throat, and the therapist says, “See? Nothing bad is happening.” Apparently this procedure is super effective and no one at the clinic has ever been harmed doing it, but please do not try this at home.
As this Reddit thread so poetically puts it, “you have to do exposure therapy right otherwise you’re not doing exposure therapy, you’re doing trauma.”
You might notice that while awkwardness ratings are higher when people talk to strangers vs. loved ones, enjoyment ratings are higher too. What gives? One possibility is that people are “on” when they meet someone new, and that’s a surprisingly enjoyable state to be in. That’s consistent with this study from 2010, which found that:
Participants actually had a better time talking to a stranger than they did talking to their romantic partner.
When they were told to “try to make a good impression” while talking to their romantic partner (“Don’t role-play, or pretend you are somewhere where you are not, but simply try to put your best face forward”), they had a better time than when they were given no such instructions.
Participants failed to predict both of these effects.
Like most psychology studies published around this time, the sample sizes and effects are not huge, so I wouldn’t be surprised if you re-ran this study and found no effect. But even if people enjoyed talking to strangers as much as they enjoy talking to their boyfriends and girlfriends, that would still be pretty surprising.
2025-12-10 01:46:17
All of us, whether we realize it or not, are asking ourselves one question over and over for our whole lives: how much should I suffer?
Should I take the job that pays more but also sucks more? Should I stick with the guy who sometimes drives me insane? Should I drag myself through an organic chemistry class if it means I have a shot at becoming a surgeon?
It’s impossible to answer these questions if you haven’t figured out your Acceptable Suffering Ratio. I don’t know how one does that in general. I only know how I found mine: by taking a dangerous, but legal, drug.
I’ve always had bad skin. I was that kinda pimply kid in school—you know him, that kid with a face that invites mild cruelty from his fellow teens.1 I did all sorts of scrubs and washes, to little avail. In grad school, I started getting nasty cysts in my earlobes that would fill with pus and weep for days and I decided: enough. Enough! It’s the 21st century. Surely science can do something for me, right?
And science could do something for me: it could offer me a drug called Accutane. Well, it could offer me isotretinoin, which used to be sold as Accutane, but the manufacturers stopped selling it because too many kids killed themselves.
See, Accutane has a lot of potential side effects, including hearing loss, pancreatitis, “loosening of the skin”, “increased pressure around the brain”, and, most famous of all, depression, anxiety, and thoughts of suicide and self-injury. In 2000, a Congressman’s son shot himself while he was on Accutane, which naturally made everyone very nervous. My doctor was so worried that she offered to “balance me out” with an antidepressant before I even started on the acne meds. I turned her down, figuring I was too young to try the “put a bunch of drugs inside you and let ‘em duke it out” school of pharmacology.

But her concerns were reasonable: Accutane did indeed make me less happy. Like an Instagram filter for the mind, it deepened the blacks and washed out the whites: sad things felt a little sharper and happy things felt a little blunted. I was a bit quicker to get angry, and I was more often visited by the thought that nothing in my life was going well—although, as a grad student, I was visited fairly often by that thought even in normal times. It wasn’t the kind of thing I noticed every day. But occasionally, when I was, say, struggling to get some code to work, I’d feel an extra, unfamiliar tang of despair, and I’d go, “Accutane, is that you?”
The drugs also made my skin like 95% better. It was night and day—we’re talking going from Diary of a Pimply Kid to uh, Diary of a Normal Kid. That wonderful facial glow-up that most people experience as they exit puberty, I got to experience that same thing in my mid-20s. Ten years later, I’m still basically zit-free.
I didn’t realize it at first, but I had found my Acceptable Suffering Ratio. Six months of moderate sadness for a lifetime of clear skin? Yes, I’ll take that deal. But nothing worse than that. If the suffering is longer or deeper, if the upside is lower or less certain: no way dude. I’m looking for Accutane-level value out of life, or better.
That probably sounds stupid. It is stupid. And yet, until that point, I don’t think I was sufficiently skeptical of the deals life was offering me. Whenever I had the chance to trade pain for gain, I wasn’t asking: how bad, for how long, and for how much in return? I literally never wondered “How much should I suffer to achieve my goals?” because the implicit answer was always “As much as I have to”.
My oversight makes some sense—I was in academia at the time, where guaranteed suffering for uncertain gains is the name of the game. But I think a lot of us have Acceptable Suffering Ratios that are way out of whack. We believe ourselves to be in the myth of Sisyphus, where suffering is both futile and inescapable, when we are actually in the myth of Hercules, where heroic labors ought to earn us godly rewards.
For example, I know this guy, call him Omar, and women are always falling in unrequited love with him. They’re gaga for Omar, but he’s lukewarm on them, and so they make a home for him in their hearts that he only ever uses as a crash pad. I don’t know these women personally, but sometimes I wish I could go undercover in their lives as like their hairdresser or whatever just so I could tell them: “It’s not supposed to feel like this.” This pining after nothing, this endless waiting and hoping that some dude’s change of heart will ultimately vindicate your years of one-sided affection—that’s not the trade that real love should ask of you.
Real love does bring plenty of pain: the angst of being perceived exactly as you are, the torment of confronting your own selfishness, the fear of giving someone else the nuclear detonation codes to your life. But if you can withstand those agonies, you will be richly rewarded. Omar’s frustrated lovers have this romantic notion that love is a burden to be borne, and the greater the burden, the greater the love. When a love is right, though, it’s less like heaving a legendary boulder up a mountain, and more like schlepping a picnic basket up a hill—yes, you gotta carry the thing to and fro, and it’s a bit of a pain in the ass, but in between you get to enjoy the sunset and the brie.
Our Acceptable Suffering Ratios are most out of whack when it comes to choosing what to do with our lives.
Careers usually have a “pay your dues” phase, which implies the existence of a “collect your dues” phase. In my experience, however, the dues that go in are no guarantee of the dues that come back out. There is no cosmic, omnipotent bean-counter who makes sure that you get your adversity paid back with interest. You really can suffer for nothing.
If you’re staring down a gauntlet of pain, then, it’s important to peek at the people coming out the other side. If they’re like, “I’m so glad I went through that gauntlet of pain! That was definitely worth it, no question!” then maybe it’s wise to follow in their footsteps. But if the people on the other side of the gauntlet are like, “What do you mean, other side of the gauntlet? I’m still in it! Look at me suffering!”, perhaps you should steer clear.
Psychologists refer to this process of gauntlet-peeking as surrogation. People are hesitant to do it, though, and I think that’s because it feels so silly. If you see people queuing up for a gauntlet of pain, it’s natural to assume that the payoff must be worth the price. But that’s reasoning in the wrong direction. We shouldn’t be asking, “How desirable is this job to the people who want it?” That answer is always going to be “very desirable”, because we’ve selected on the outcome. Instead, the thing we need to know is, “How desirable is this job to the people who have it?”
It turns out the scarcity of a resource is much more potent in the wanting than it is in the having. I had a chance to learn this lesson about twenty years ago, when the stakes were far lower, but I declined. Back then, I thought the only barrier between me and permanent happiness was a Nintendo Wii. I stood outside a Target at 5am in the freezing cold for the mere chance of buying one, counting and re-counting the people in front of me, desperately trying to guess whether there would be enough Wiis to go around, my face going numb as I dreamed of the rapturous enjoyment of Wii Bowling.

I didn’t realize that, by the time I got home, the most exciting part of owning a Wii was already over. When I strapped on my Wii-mote, there was not a gaggle of covetous boys salivating outside my window, reminding me that I was having an enviable experience. It turns out that what makes a game fun is its quality, not its rarity.
If I had understood that obvious fact at the time, I probably wouldn’t have wasted so many hours lusting after a game console, caressing pictures of nunchuck attachments in my monthly Nintendo Power, calling Best Buys to pump them for information about when shipments would arrive, or guilting my mom into pre-dawn excursions to far-flung electronics stores.
Post-Accutane, though, I think I got better at spotting situations like these and surrogating myself out of them. When I got to be an upper-level PhD student, I would go to conferences, look for the people who were five years or so ahead of me, and ask myself: do they seem happy? Do I want to be like them? Are they pleased to have exited the gauntlet of pain that separates my life and theirs? The answer was an emphatic no. Landing a professor position had not suddenly put all their neuroses into remission. If anything, their success had justified their suffering, thus inviting even more of it. If it took this much self-abnegation, mortification, and degradation to get an academic job, imagine how much you’ll need for tenure!
I think our dysfunctional relationship with suffering is wired deep in our psyches, because it shows up even in the midst of our fantasies.
I’ve gotten back into improv recently, which has reacquainted me with a startling fact. An improv scene could happen anywhere and be about anything—we’re lovers on Mars! We’re teenage monarchs! We’re Oompa-Loompas backstage at the chocolate factory! And yet, when given infinite freedom, what do people do? They shuffle out on stage, sit down at a pretend computer, start to type, and exclaim, “I hate my job!”
It’s remarkable how many scenes are like this, how quickly we default to criticizing and complaining about the very reality that we just made up.2 A blank stage somehow turns us into heat-seeking missiles for the situations we least want to be in, as if there’s some unwritten rule that says that when we play pretend, we must imagine ourselves in hell.3
I’m not usually this kind of psychologist, but I can’t help but see these scenes as a kind of reverse Rorschach test. When allowed to draw whatever kind of blob we want, we draw one that looks like the father who was never proud of us. “Yep, that’s it! That’s exactly the thing I don’t like!”
I don’t think that instinct should be squelched, but it should be questioned. Is that the thing you really want to draw? Because you could also draw, like, a giraffe, if you wanted to. Or a rocket ship. Or anything, really. The things that hurt you are not owed some minimum amount of mental airtime. If you’re going to dredge them up and splatter them across the canvas, it should be for something—to better understand them, to accept them, to laugh at them, to untangle them, not to simply stand back and despise them, as if you’re opening up a gallery of all the art you hate the most.
In 1986, a psychologist named Wayne Hershberger played a nasty trick on some baby chickens. He rigged up a box with a cup of food suspended from a railing, and whenever the chicks would try to approach the cup, an apparatus would yank it out of their reach. If the chicks walked in the opposite direction, however, the same apparatus would swing the cup closer to them. So the only way for the chicks to get their dinner was for them to do what must have felt like the dumbest thing in the world: they had to walk away from the food. Alas, most of the chicks never learned how to do this.
I think many humans never learn how to do this, either. They assume that existence is nothing more than an endless lurching in the direction of the things you want, and never actually getting them. Life is hard and then you die; stick a needle in your eye!
There are people with the opposite problem, of course: people who refuse to take on any amount of discomfort in pursuit of their goals, people who try not to have any goals in the first place, for they only bring affliction. It’s a different strategy, but the same mistake. These two opposing approaches to life—call them grindset and bedrot—both assume that the ratio between pain and gain is fixed. The grindset gang simply accepts every deal they’re offered, while the bedrot brigade turns every deal down.
Neither camp understands that when you get your Suffering Ratio right, it doesn’t feel like suffering at all. The active ingredient in suffering is pointlessness; when it has a purpose, it loses its potency. Taking Accutane made me sad, yes, but for a reason. The nearly-certain promise of a less-pimply face gave meaning to my misery, shrinking it to mere irritation.
Torturing yourself for an unknown increase in the chance that you’ll get some outcome that you’re not even sure you want—yes, that should hurt! That’s the signal you should do something else. When your lunch is snatched out of your grasp for the hundredth time in a row, perhaps you should see what happens when you walk away instead.
Apparently kids these days cover up their pimples with little medicated stickers, rather than parading around all day with raw, white-headed zits. This is a terrific technological and social innovation, and I commend everyone who made it happen.
Perhaps Trent Reznor was thinking of improv comedy when he wrote the lyric:
You were never really real to begin with
I just made you up to hurt myself
You might assume that improv is a good place to work out your demons, until you think about it for a second and realize that it basically amounts to deputizing strangers into doing art therapy with you. So while I’ve never personally witnessed someone conquer their traumas through improv comedy, I have witnessed many people spread their traumas to others.
2025-11-26 03:17:23
I say this with tremendous respect: it’s kinda surprising that the three largest religions in the world are Christianity, Islam, and Hinduism.
None of them are very sexy or fun, they come with all kinds of rules, and if they promise you any happiness at all, it’s either after you’re dead, or it’s the lamest kind of happiness possible, the kind where you don’t get anything you want but you supposedly feel fine about it. If you were trying to design a successful religion from scratch, I don’t think any of these would have made it out of focus groups. “Yeah, uh, women 18 to 34 years old just aren’t resonating with the part where the guy gets nailed to a tree.”
Why, in the ultimate meme battle of religions, did these three prevail? Let’s assume for the sake of argument that it’s not because they have the divine on their side. (Otherwise, God appears to be hedging his bets by backing multiple religions at once.)
Obviously there’s a lot of historical contingency here, and if a couple wars had gone the other way, we might have a different set of creeds on the podium. But I think each of these mega-religions has something in common, something we never really talk about, maybe because we don’t notice it, or maybe because it’s impolite to mention—namely, that they all have a brainy version and a folksy version.
If you’re the scholarly type, Christianity offers you Aquinas and Augustine, Islam has al-Ash’ari and al-Ghazali, Hinduism has Adi Shankara and Swami Vivekananda, etc. But if you don’t care for bookish stuff, you can also just say the prayers, sing the songs, bow to the right things at the right time, and it’s all good. The guy with the MDiv degree is just as Christian as the guy who does spirit hands while the praise band plays “Our God Is an Awesome God”.
It’s hard to talk about this without making it sound like the brainy version is the “good” one, because if we’re doing forensic sociology of religion, it’s obvious which side of the spectrum we prefer. But brainy vs. folksy is really about interest rather than ability. The people who favor the brainier side may or may not be better at thinking, but this is the thing they like thinking about.
More importantly, the brainy and the folksy sides need each other. The brainy version appeals to evangelists, explainers, and institution-builders—people who make their religion respectable and robust. The folksy version keeps a religion relevant and accessible to the 99.9% of humanity who can’t do faith full-time—people who might not be able to name all the commandments, but who will still show up on Sunday and put their dollars in the basket. The brainy version fills the pulpits; the folksy version fills the pews.1
Naturally, brainy folks are always a little annoyed at folksy folks, and folksy folks are always a little resentful of brainy folks. It’s tempting to split up: “Imagine if we didn’t have to pander to these know-nothings”/”Imagine if we didn’t have to listen to these nerds!” But a religion can start wobbling out of control if it tilts too far toward either its brainy yin or its folksy yang. Left unchecked, brainy types can become obsessed with esoterica, to the point where they start killing each other over commas. Meanwhile, uncultivated folksiness can degenerate into hogwash and superstition. Pure braininess is one inscrutable sage preaching to no one; pure folksiness is turning the madrasa into a gift shop.
Here’s why I bring all this up: we’ve got a big brainy/folksy split on our hands right now:
That divide is biggest at the highest level of education:
To think clearly about this situation, we have to continue resisting the temptation to focus on which side is “correct”. And we have to avoid glossing the divide as “Democrats = smart, Republicans = dumb”—both because going to college doesn’t mean you actually know anything, and because intelligence is far more complicated than we like to admit.
I think this divide is better understood as cultural rather than cognitive, but it doesn’t really matter, because separating the brainy and the folksy leaves you with the worst version of each. This is one reason why politics is so outrageous right now—only a sicko would delight in the White House’s Studio Ghibli-fied picture of a weeping woman being deported, and only an insufferable scold would try to outlaw words like “crazy”, “stupid”, and “grandfather” in the name of political correctness. It’s not hard to see why most people don’t feel like they fit in well with either party. But as long as the folksy and brainy contingents stay on opposite sides of the dance floor, we can look forward to a lot more of this.
Bifurcation by education is always bad, but it’s worse for the educated group, because they’ll always be outnumbered. You simply cannot build a political coalition on the expectation that everybody’s going to do the reading. If the brainy group is going to survive, it has to find a way to reunite with the folksy.
So maybe it’s worth taking some cues from the most successful ideologies of all time, the ones that have kept the brainy and folksy strains intertwined for thousands of years. I don’t think politics should be a religion—I’m not even sure if religion should be a religion—but someone’s gonna run the country, and as long as we’ve got a brainy/folksy split, we’ll always have to choose between someone who is up their own ass, and someone who simply is an ass.
As far as I can see, the biggest religions offer two strategies for bridging the divide between the high-falutin and the low-falutin. Let’s call ‘em fractal beliefs and memetic flexibility.
The shape below is a fractal, a triangle made up of triangles. Look at it from far away and you see one big triangle; look at it close up and you see lots of little triangles. It’s triangles all the way down.
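(If you want to generate one of these yourself, a few lines of Python will do; here’s a rough sketch that prints the odd entries of Pascal’s triangle, which happen to trace out the same triangles-within-triangles pattern.)

```python
def sierpinski(order):
    """Print a Sierpinski-style triangle: odd binomial coefficients
    become little triangles, even ones are left blank."""
    size = 2 ** order
    for row in range(size):
        line = " " * (size - row - 1)
        coeff = 1  # C(row, 0)
        for col in range(row + 1):
            line += "* " if coeff % 2 else "  "
            coeff = coeff * (row - col) // (col + 1)  # advance to C(row, col+1)
        print(line)

sierpinski(4)  # one big triangle made of ever-smaller triangles
```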
The most successful ideologies have similarly fractal beliefs: you get the same ideas no matter how much you zoom in or out. If a Christian leans more into their faith—if they read the Bible cover to cover, go to church twice a week, and start listening to sermons on the way to work—they don’t suddenly transform into, say, a Buddhist. They’re just an extra-enthusiastic Christian with extra-elaborated beliefs. This is a critical feature: if your high-devotion and low-devotion followers believe totally different things, eventually they’re gonna split.
If the brainy tribe is to survive, then, it’s gotta fractal-ize its beliefs. That means generating the simplest versions of your platform that are still true. For example, many brainy folks want to begin arguments about gender by positing something like “gender is a social construct”, and right out of the gate they’re expecting everyone to have internalized like three different concepts from sociology 101. Instead, they should start with something everybody can understand and get on board with, like “People’s opportunities in life shouldn’t depend on their private parts”. Making your arguments fractal doesn’t require changing their core commitments; it just means making each step of the argument digestible to someone who has no inclination to chew. If you’ve gotta bring up Durkheim, at least put him last.
Brainy folks hate doing this. They’d much prefer to produce ever-more-exquisite versions of their arguments because, to borrow some blockchain terminology, brainy people operate on a proof-of-work system, where your standing is based on the effort you put in. That’s why brainy folks are so attracted to the idea that your first instincts cannot be right, and it’s why their beliefs can be an acquired taste. You’re supposed to struggle a bit to get them—that’s how you prove that you did your homework.
(Only the brainy tribe would, for instance, insist that you need to “educate yourself” in order to participate in polite society, and that no one should be expected to help you with this.)
It would be convenient for this analogy if folksy types operated on a proof-of-stake system (which is the other way blockchains can work), but they don’t, really. They don’t operate on proof of anything—the whole concept of proof is dubious to them. You don’t need to prove the obvious. That might sound dumb, and it often is. But sometimes it’s a useful counterweight: sometimes your first instinct is indeed the right one, and you can think yourself into a stupider position.
Regardless, demanding proof-of-work is always going to tilt the playing field in favor of the other side. A liberal friend of mine was recently lamenting that liberals have all the complicated-but-true positions while conservatives have all the easy-but-wrong positions. She acted like this was the natural end state of things; too bad, so sad, I guess the stupid shall inherit the Earth!
This sounds like a cop-out to me. If you can’t find an accessible way to express your complicated-but-true beliefs, maybe they aren’t actually true, or maybe you’re not trying hard enough. Of course, it takes a lot of thought to make a convoluted idea more intuitive—if only there were some people who liked thinking!
Big religions have a healthy dose of memetic flexibility: while their central tenets are firm, the rest of their belief systems can bend to fit lots of different situations.
For example, why are there four gospels? You’d think it would be a liability to have four versions of the same story floating around, especially when they’re not perfectly consistent with each other. There are disagreements about the name of Jesus’ paternal grandfather, the chronology of his birth, his last words, and uh, whether you’re supposed to take a staff with you when you go off to proselytize (Mark says yes, but Matthew and Luke say no way). This kind of thing is very embarrassing when you’re trying to convince people that you have the inerrant word of God. Wouldn’t it make more sense to smush all these accounts together into one coherent whole?
Syriac Christians did exactly that in the second century AD: it was called the Diatessaron. And yet, despite its cool name, it never really caught on and eventually died out. Nobody even tried that hard to suppress it as a heresy. I mean, if no one claims that your book is the work of the devil, do you even exist??

Maybe the harmonized gospel never went viral because four gospels, inconsistencies and all, are better than one. Each gospel has its own target audience: you’ve got Matthew for the Jews, Luke for the ladies and the Romans, John for the incense-and-crystals crowd, and Mark for the uh...all those Mark-heads out there, you know who you are. Having four stories with the same moral but different details and emphases gives you memetic flexibility without compromising the core beliefs.
Brainy folks could learn a lot from this. For instance, there’s a certain kind of galaxy-brained doomer who thinks that the only acceptable way to fight climate change is to tighten our belts. If we can invent our way out of this crisis with, say, hydrogen fuel cells or super-safe nuclear reactors, they think that’s somehow cheating. We’re supposed to scrimp, sweat, and suffer, because the greenhouse effect is not just a fact of chemistry and physics—it’s our moral comeuppance. In the same way that evangelical pastors used to say that every tornado was God’s punishment for homosexuality, these folks believe that rising sea levels are God’s punishment for, I guess, air conditioning.
This kind of small-tent, memetically inflexible thinking is a great way to make your political movement go extinct. But if you’re willing to be a little open-minded about how, exactly, we prevent the Earth from turning into a sun-dried tomato, you might actually succeed. Imagine if we could suck the carbon out of the atmosphere and turn it into charcoal for your Fourth of July barbecue. Imagine if electricity was so cheap and clean that you could drive your Hummer from sea to shining sea while causing net zero emissions. Imagine genetically engineered cows that don’t fart. That’s a future far more people can get behind, both literally and figuratively.
A few months ago, I ran into an old classmate at the grocery store—let’s call him Jay. He was a bit standoffish at first, and I soon found out why: Jay works in progressive politics now, and he was sussing me out to see whether I had become a right-wing shill since graduation, which is what some of our mutual acquaintances have done. Once he was satisfied I wasn’t trying to follow the well-trod liberal-loser-to-Fox-News-provocateur track, he warmed up. “I unfriended all the conservatives I know. I don’t even talk to moderates anymore,” Jay said, as if this was the most normal thing in the world. I didn’t want to get into it with him in the cereal aisle, but I wanted to know: what is his plan for dealing with the ongoing existence of his enemies? If he ignores them, they’ll....what? Give up? Fill their pockets with stones and wade into the ocean?
Apparently, Jay has decided that his side should lose. There are only three ways that an idea can gain a greater share of human minds: conquest, conversion, or conception. I hope we can all agree that killing is off the table, so that leaves changing minds and makin’ babies. It seems to me that progressives have largely given up on both.2
Obviously, I don’t think people should have kids just to thicken the ranks of their political party, nor do I want proselytizing progressives on every street corner. But if you’re serious about your political beliefs, you should have some plausible theory of how they’re going to succeed in the future. That means making it easy to believe the things you think are true, and it means trying to appeal to people who don’t already agree with you.
Most of all, it means participating in the world in a way that makes people want to join you. That’s why there are 5.5 billion Christians, Muslims, and Hindus—those religions offer people something that earns their continued devotion, whether those people want to think really hard about their faith or not. People aren’t gonna show up to Jay’s church if it’s all pulpit and no pew.
There are always going to be political cleavages, and there should be. The ideal amount of polarization is not zero. (When everybody’s on the same page, we groupthink ourselves into doing stupid things like, you know, invading two countries in two years.) Running a country is the greatest adversarial collaboration known to man, and it only works when both sides bring their best ideas. To do that, we need each party to have a fully integrated brainy and folksy contingent; some people plumbing the depths, other people keeping the boat anchored. Otherwise, we end up with parties that are defined either by their pointy-headed pedants or their pinheaded reactionaries.
“I must study Politicks and War,” John Adams famously wrote, “that my sons may have liberty to study Mathematicks and Philosophy”:
My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry and Porcelaine.
I appreciate Adams’ aspirations, but I disagree with his order of operations. Politics is not a science set apart from all others. Good governance requires good thinking, and that means drawing on every ounce of our knowledge, no matter how far-flung. Right now, I think we could stand to learn a thing or two from the ancient memelords who created our modern religions. Otherwise, yes, a world where we’re more interested in painting and poetry than we are in politics—that sounds great. I think we know the way there. We just have to grab our walking sticks.
Many people assume this arrangement has broken down, and that highly educated people have all ditched their churches and become atheists. In fact, more-educated people are more likely to attend religious services.
2025-11-12 08:45:53
This is the quarterly links ‘n’ updates post, a collection of things I’ve been reading and doing for the past few months.
As late as 1813, some parts of the European medical establishment believed that potatoes cause leprosy. (Don’t even get ‘em started on scrofula!) Potato historian Redcliffe Salaman suggests that people were skeptical because potatoes look kinda weird, they grow in the ground, and you plant them as tubers rather than seeds, which are all extremely suspicious things for a food to do.
You may remember the Spurious Correlations website, which dredges up random datasets and finds correlations between them—for instance, Lululemon’s stock price and the popularity of the first name Stevie. Now thanks to AI, each one of those correlations can be instantly turned into a full academic paper, like: LULU-LEMONADE: A STATISTICAL STUDY OF THE STEVIE-NIZED MARKET.
Sadly, this technology makes many academic departments completely redundant.
Via : there’s a good chance that 2025 will have the fewest murders ever recorded in the US. (We only have reliable data going back to 1960).
I just added a new entry to my list of all-time great blog posts: Ask not why would you work in biology, but rather: why wouldn’t you?, by of . An excerpt:
Yes, biology is very interesting, yes, biology is very hard to do well. Yet, it remains the only field that could do something of the utmost importance: prevent a urinary catheter from being shunted inside you in the upcoming future.
Before World War I, the US government had basically no cryptographic capacity. So when the war broke out and suddenly they needed people to do code-breaking, where did they turn? To the Riverbank Institute, which had been set up by “Colonel” George Fabyan to decode the most important cipher of all: the one that supposedly proved the works of Shakespeare were written by Francis Bacon. Elizebeth and William Friedman, the couple who worked on that cipher at Riverbank, went on to become the first cryptologists at the precursor to the National Security Agency.12
One of the wildest blog posts I’ve read this year is about an American guy going to fight in the Ukraine war. Honestly, it sounds like a huge bummer: you squat in a trench and pretend to shoot at Russians and hope to not be killed by a drone.
When I saw ’s post called How Pen Caps Work, I was like, “what do you mean? Pen caps work by...being....caps for pens”. Apparently not: for fountain pens, anyway, pen caps work through vacuum power. Putting the cap on and taking it off causes a tiny amount of suction that draws the ink into the nib.
, who was one of the winners of my 2025 blog post competition, has a great series on why neuroscientists still can’t simulate a worm:
I told [my mom] that this is what my job feels like—each animal has a different kind of radio in its head and/or body, and neuroscientists are trying to figure out things about them. Some neuroscientists want to fix radios; some want to build better radios. Others, like me, are just trying to understand them.
To which Li’s mom responded:
Other nematode fun facts from Li’s piece: they use static electricity to teleport themselves onto bumblebees as a way of getting around:
And...nematodes survived the Space Shuttle Columbia explosion??
More great work from a 2025 Blog Extravaganza honoree: has a terrific article in Works in Progress: Why Isn’t AI Replacing Radiologists? Radiology was supposed to be the first medical specialty to be rendered obsolete by AI. Instead, radiology jobs are more numerous and salaries are higher than ever.
The Polarization Dashboard is a useful sanity check against current events. Whenever something big happens in politics, people are like “WOW OUR VERY SOULS HAVE PERMANENTLY CHANGED” when in fact people almost always have the same opinions that they did yesterday. Here’s the change in support for murdering members of the opposite political party over time. Currently, <2% of Democrats and Republicans support it. (h/t )
See also: You’re Probably Wrong About How Things Have Changed.
Some beginner researchers successfully replicated my Things Could Be Better paper without any expert help. I’m really proud of this! , who ran the workshop, writes:
I did not help replicate this study because the group replicating Measures of Anchoring in Estimation Tasks [the other study being replicated] needed help understanding the language the paper was written in. In contrast the group replicating Things Could Be Better started their own replication within 15 minutes of being handed the paper and did not have any followup questions for me before they began the replication.
Two years ago a Harvard Business School professor named Francesca Gino was fired for faking her data. (I wrote about the debacle here.) She sued the bloggers who outed her, but that lawsuit was thrown out. She also sued Harvard, claiming discrimination. Now Harvard is suing Gino back, alleging that when Gino submitted data to prove that her original data wasn’t fake...the new data was fake, too.
and his friends ask: What if the NIH had been 40% smaller? I appreciate how circumspect the authors are, but the short answer seems to be, “We would be significantly worse off, because many important medicines rely on research that would not have happened under a smaller budget”. This is further evidence of just how important it is to invest in science: even when we do it in a totally boneheaded way, it somehow still pays off.
Recently, the Financial Times set the internet alight with these graphs:
re-analyzed the data and claims the changes in conscientiousness are minimal, if they exist at all. I’m inclined to trust Ferguson’s account on this one: it’s super weird to see such huge changes in such small amounts of time on basically any psychological variables.
Speaking of personality, ClearerThinking now has one mega personality test that will give you all your results from a bunch of different tests at once. I previously cited their work showing that supposedly scientific personality tests do not obviously outperform the bullshit ones, which continues to boggle people’s minds whenever I bring it up.
One of the most famous psychology experiments of all time is Festinger and Carlsmith (1959), the classic demonstration of cognitive dissonance. The psychologist Matti Heino points out, though, that the main results literally don’t add up. In the table below, the circled means are impossible given the reported sample size—there’s no way to get an average of 3.08, for instance, if you have 20 people giving ratings on a 0-10 scale.
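(If you want to check this sort of thing yourself, the arithmetic is simple enough to script. Here’s a rough Python sketch of the consistency check, sometimes called the GRIM test, assuming whole-number ratings; the 3.08 and n = 20 figures are just the example from above.)

```python
def mean_is_possible(reported_mean, n, scale_min=0, scale_max=10, decimals=2):
    """Could a reported mean come from n whole-number ratings on this scale?
    Enumerate every achievable sum, turn it into a rounded mean, and see
    whether the reported value shows up."""
    achievable = {round(total / n, decimals)
                  for total in range(n * scale_min, n * scale_max + 1)}
    return round(reported_mean, decimals) in achievable

print(mean_is_possible(3.08, 20))  # False: 3.08 * 20 = 61.6, not a whole number
print(mean_is_possible(3.05, 20))  # True: 61 / 20 = 3.05
```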
I used to be one of those people who was like “well cognitive dissonance has been replicated in hundreds of studies” but it’s not like I ever actually read those studies. It just kinda seems like, c’mon! It’s cognitive dissonance! Everybody knows cognitive dissonance! Anyway, when a bunch of labs tried to replicate another classic demonstration of cognitive dissonance, they found no effect.3 And a new paper claims that When Prophecy Fails, a landmark book that documented the effects of cognitive dissonance in a doomsday cult, may have been greatly embellished or deliberately orchestrated by the authors.4 None of this means that cognitive dissonance doesn’t or can’t exist, but it does make me feel a whole lot of, uh, dissonance.
According to this study, losing a Michelin star improves TripAdvisor ratings. This is probably because getting a star invites harsher judgment (“It’s good, but...is it Michelin good?”). The perfect restaurant is one that raises your expectations high enough to get you to come in, but leaves your expectations low enough that you can still be wowed.
This is a hard target to hit: I’ve only ever seen two restaurants that have a 4.9 on Google after receiving thousands of ratings. Both of them are Thai restaurants that are nice but not fancy and cool but not trendy—you go expecting good Thai food and you get fantastic Thai food, and everybody goes, “wow, this place should have a Michelin star!”
at debuts a new civic vision called “Potentialism”: everybody’s got something special about them, and they have the right and responsibility to cultivate it and use it for the good of all. In a country where most people don’t like either party very much (which we know about thanks to Yudkin’s research), we should be doing a lot more ideological experimentation like this.
A new paper tracked all PhD students in Sweden and found that they’re more likely to take psychiatric medication compared to a matched group who stopped their education at the master’s level. This isn’t randomized and we should assume these two groups differ in lots of ways. But the fact that the differences grow over time and then disappear at year seven, when most people who started their PhDs have now finished them, does suggest that getting your doctorate is distressing.
Goodhart’s Law in action: The Army Corps of Engineers hired contractors to clear debris after the 2017 wildfires in California, but unfortunately they paid by the ton, incentivizing the contractors to dig up a bunch of wet mud and use it to weigh down their loads. Filling in the holes left by the contractors is “estimated to cost another $3.5 million”.
I’ve seen some wild visual illusions in my time, but this one had me screaming “no...no...no no no NO NO NO NOOO”
Items of note for people who do science independently and people who like that sort of thing. Original post here.
The Existential Hope Foundation is offering a $10k prize for the “most inspiring, uplifting, and forward-looking memes”.
has been building UV sterilization pumps and growing yeast in a DIY laboratory. See also: Why Independent Science Now?
Alex Chernavsky started taking metformin and found that it didn’t do much for his blood sugar but it did, surprisingly, improve his reaction times:
I got a lot of thoughtful responses to my recent post on the decline of deviance, including this one from the classical musician . Many people think the internet is to blame, but we forget how recently the internet became the internet. As late as 2007, only half of households had broadband at home. Instagram only launched in 2010. The early, slow, text-based internet lived in a room in your house and you had to choose between using it and allowing your grandma to call you; it was nothing like the one that lives in your pocket. So the transition from the 1990s to the 2000s, where we see many forms of deviance declining, is not really the transition from “pre-internet” to “post-internet”. That transition happened somewhere around 2012, when a majority of people got a smartphone. That’s why I think the internet as we know it today may act as an accelerant, but I don’t think it caused the decline of deviance in the first place.
In other news:
I talked to about why Americans are so afraid of talking to each other.
answered all my questions about music, including “What are the weirdest lyrics in a #1 hit single?”
Finally, a post from two years ago. A friend of mine recently told me, “Sometimes I want to send this post to someone, but then I remember what you titled it.”
I encountered this anecdote in ’s excellent Case for Crazy Philanthropy.
100 years later, the cryptographer/game developer Elonka Dunin visited the Friedmans’ grave and found they left a secret code on their tombstone.
This isn’t the Festinger and Carlsmith version, but another way of eliciting the effect, where people have to write an essay that goes against their beliefs.
Thanks to an “alert reader” for bringing this to my attention.
Which means Mr. Show was right all along.