2026-01-07 00:45:44
Here’s the most replicated finding to come out of my area of psychology in the past decade: most people believe they suffer from a chronic case of awkwardness.
Study after study finds that people expect their conversations to go poorly, when in fact those conversations usually go pretty well. People assign themselves the majority of the blame for any awkward silences that arise, and they believe that they like other people more than other people like them in return. I’ve replicated this effect myself: I once ran a study where participants talked in groups of three, and then each participant reported how much they liked the other two and guessed how much those two liked them. Those participants believed, on average, that they were the least liked person in the trio.
In another study, participants were asked to rate their skills on 20 everyday activities, and they scored themselves as better than average on 19 of them. When it came to cooking, cleaning, shopping, eating, sleeping, reading, etc., participants were like, “Yeah, that’s kinda my thing.” The one exception? “Initiating and sustaining rewarding conversation at a cocktail party, dinner party, or similar social event”.
I find all this heartbreaking, because studies consistently show that the thing that makes humans the happiest is positive relationships with other humans. Awkwardness puts a persistent bit of distance between us and the good life, like being celiac in a world where every dish has a dash of gluten in it.
Even worse, nobody seems to have any solutions, nor any plans for inventing them. If you want to lose weight, buy a house, or take a trip to Tahiti, entire industries are waiting to serve you. If you have diagnosable social anxiety, your insurance might pay for you to take an antidepressant and talk to a therapist. But if you simply want to gain a bit of social grace, you’re pretty much outta luck. It’s as if we all think awkwardness is a kind of moral failing, a choice, or a congenital affliction that suggests you were naughty in a past life—at any rate, unworthy of treatment and undeserving of assistance.
We can do better. And we can start by realizing that, even though we use one word to describe it, awkwardness is not one thing. It’s got layers, like a big, ungainly onion. Three layers, to be exact. So to shrink the onion, you have to peel it from the skin to the pith, adapting as you go, because removing each layer requires its own technique.
Before we make our initial incision, I should mention that I’m not the kind of psychologist who treats people. I’m the kind of psychologist who asks people stupid questions and then makes sweeping generalizations about them. You should take everything I say with a heaping teaspoon of salt, which will also come in handy after we’ve sliced the onion and it’s time to sauté it. That disclaimer disclaimed, let’s begin on the outside and work our way in, starting with—
The outermost layer of the awkward onion is the most noticeable one: awkward people do the wrong thing at the wrong time. You try to make people laugh; you make them cringe instead. You try to compliment them; you creep them out. You open up; you scare them off. Let’s call this social clumsiness.
Being socially clumsy is like being in a role-playing game where your charisma stat is chronically too low and you can’t access the correct dialogue options. And if you understand that reference, I understand why you’re reading this post.

Here’s the bad news: I don’t think there’s a cure for clumsiness. Just about every human trait is roughly normally distributed, so it’s inevitable that some chunk of humanity is going to have a hard time reading emotional cues and predicting the social outcomes of their actions. I’ve seen high-functioning, socially ham-handed people try to memorize interpersonal rules the same way chess grandmasters memorize openings, but it always comes off stilted and strange. You’ll be like, “Hey, how you doing” and they’re like “ROOK TO E4, KNIGHT TO C11, QUEEN TO G6” and you’re like “uhhh cool man me too”.
Here’s the good news, though: even if you can’t cure social clumsiness, there is a way to manage its symptoms. To show you how, let me tell you a story of a stupid thing I did, and what I should have done instead.
Once, in high school, I was in my bedroom when I saw a girl in my class drive up to the intersection outside my house. It was dark outside and I had the light on, and so when she looked up, she caught me in the mortifying act of, I guess, existing inside my home? This felt excruciatingly embarrassing, for some reason, and so I immediately dropped to the floor, as if I was in a platoon of GIs and someone had just shouted “SNIPER!” But breaking line of sight doesn’t cause someone to unsee you, and so from this girl’s point of view, she had just locked eyes with some dude from school through a window and his response had been to duck and cover. She told her friends about this, and they all made fun of me ruthlessly.
I learned an important lesson that day: when it comes to being awkward, the coverup is always worse than the crime. If you just did something embarrassing mere moments ago, it’s unlikely that you have suddenly become socially omnipotent and that all of your subsequent moves are guaranteed to be prudent and effective. It’s more likely that you’re panicking, and so your next action is going to be even stupider than your last.
And that, I think, is the key to mitigating your social clumsiness: give up on the coverups. When you miss a cue or make a faux pas, you just have to own it. Apologize if necessary, make amends, explain yourself, but do not attempt to undo your blunder with another round of blundering. If you knock over a stack of porcelain plates, don’t try to quickly sweep up the shards before anyone notices; you will merely knock over a shelf of water pitchers.
This turns out to be a surprisingly high-status move, because when you readily admit your mistakes, you imply that you don’t expect to be seriously harmed by them, and this makes you seem intimidating and cool. You know how when a toddler topples over, they’ll immediately look at you to gauge how upset they should be? Adults do that too. Whenever someone does something unexpected, we check their reaction—if they look embarrassed, then whatever they did must be embarrassing. When that person panics, they look like a putz. When they shrug and go, “Classic me!”, they come off as a lovable doof, or even, somehow, a chill, confident person.
In fact, the most successful socially clumsy people I know can manage their mistakes before they even happen. They simply own up to their difficulties and ask people to meet them halfway, saying things like:
Thanks for inviting me over to your house. It’s hard for me to tell when people want to stop hanging out with me, so please just tell me when you’d like me to leave. I won’t be mad. If it’s weird to you, I’m sorry about that. I promise it’s not weird to me.
It takes me a while to trust people who attempt this kind of social maneuver—they can’t be serious, can they? But once I’m convinced they’re earnest, knowing someone’s social deficits feels no different than knowing their dietary restrictions (“Arthur can’t eat artichokes; Maya doesn’t understand sarcasm”), and we get along swimmingly. Such a person is always going to seem a bit like a Martian, but that’s fine, because they are a bit of a Martian, and there’s nothing wrong with being from outer space as long as you’re upfront about it.
When we describe someone else as awkward, we’re referring to the things they do. But when we describe ourselves as awkward, we’re also referring to this whole awkward world inside our heads, this constant sensation that you’re not slotted in, that you’re being weird, somehow. It’s that nagging thought of “does my sweater look bad” that blossoms into “oh god, everyone is staring at my horrible sweater” and finally arrives at “I need to throw this sweater into a dumpster immediately, preferably with me wearing it”.
This is the second layer of the awkward onion, one that we can call excessive self-awareness. Whether you’re socially clumsy or not, you can certainly worry that you are, and you can try to prevent any gaffes from happening by paying extremely close attention to yourself at all times. This strategy always backfires because it causes a syndrome that athletes call “choking” or “the yips”—that stilted, clunky movement you get when you pay too much attention to something that’s supposed to be done without thinking. As the old poem goes:
A centipede was happy – quite!
Until a toad in fun
Said, “Pray, which leg moves after which?”
This raised her doubts to such a pitch,
She fell exhausted in the ditch
Not knowing how to run.
The solution to excessive self-awareness is to turn your attention outward instead of inward. You cannot out-shout your inner critic; you have to drown it out with another voice entirely. Luckily, there are other voices around you all the time, emanating from other humans. The more you pay attention to what they’re doing and saying, the less attention you have left to lavish on yourself.
You can call this mindfulness if that makes it more useful to you, but I don’t mean it as a sort of droopy-eyed, slack-jawed, I-am-one-with-the-universe state of enlightenment. What I mean is: look around you! Human beings are the most entertaining organisms on the planet. See their strange activities and their odd proclivities, their opinions and their words and their what-have-you. This one is riding a unicycle! That one is picking their nose and hoping no one notices! You’re telling me that you’d rather think about yourself all the time?
Getting out of your own head and into someone else’s can be surprisingly rewarding for all involved. It’s hard to maintain both an internal and an external dialogue simultaneously, and so when your self-focus is going full-blast, your conversations degenerate into a series of false starts (“So...how many cousins do you have?” “Seven.” “Ah, a prime number.”) Meanwhile, the other person stays buttoned up because, well, why would you disrobe for someone who isn’t even looking? Paying attention to a human, on the other hand, is like watering a plant: it makes them bloom. People love it when you listen and respond to them, just like babies love it when they turn a crank and Elmo pops out of a box—oh! The joy of having an effect on the world!

Of course, you might not like everyone that you attend to. When people start blooming in your presence, you’ll discover that some of them make you sneeze, and some of them smell like that kind of plant that gives off the stench of rotten eggs. But this is still progress, because in the Great Hierarchy of Subjective Experiences, annoyance is better than awkwardness—you can walk away from an annoyance, but awkwardness comes with you wherever you go.
It can be helpful to develop a distaste for your own excessive self-focus, and one way to do that is to relabel it as “narcissism”. We usually picture narcissists as people with an inflated sense of self worth, and of course many narcissists are like that. But I contend that there is a negative form of narcissism, one where you pay yourself an extravagant amount of attention that just happens to come in the form of scorn. Ultimately, self-love and self-hate are both forms of self-obsession.
So if you find yourself fixated on your own flaws, perhaps it’s worth asking: what makes you so worthy of your own attention, even if it’s mainly disapproving? Why should you be the protagonist of every social encounter? If you’re really as bad as you say, why not stop thinking about yourself so much and give someone else a turn?
Social clumsiness is the thing that we fear doing, and excessive self-focus is the strategy we use to prevent that fear from becoming real, but neither of them is the fear itself, the fear of being left out, called out, ridiculed, or rejected. “Social anxiety” is already taken, so let’s refer to this center of the awkward onion as people-phobia.
People-phobia is both different from and worse than all other phobias, because the thing that scares the bajeezus out of you is also the thing you love the most. Arachnophobes don’t have to work for, ride buses full of, or go on first dates with spiders. But people-phobes must find a way to survive in a world that’s chockablock with homo sapiens, and so they yo-yo between the torment of trying to approach other people and the agony of trying to avoid them.
At the heart of people-phobia are two big truths and one big lie. The two big truths: our social connections do matter a lot, and social ruptures do cause a lot of pain. Individual humans cannot survive long on their own, and so evolution endowed us with a healthy fear of getting voted off the island. That’s why it hurts so bad to get bullied, dumped, pantsed, and demoted, even though none of those things cause actual tissue damage.1
But here’s the big lie: people-phobes implicitly believe that hurt can never be healed, so it must be avoided at all costs. This fear is misguided because the mind can, in fact, mend itself. Just like we have a physical immune system that repairs injuries to the body, we also have a psychological immune system that repairs injuries to the ego. Black eyes, stubbed toes, and twisted ankles tend to heal themselves on their own, and so do slip-ups, mishaps, and faux pas.
That means you can cure people-phobia the same way you cure any fear—by facing it, feeling it, and forgetting it. That’s the logic behind exposure and response prevention: you sit in the presence of the scary thing without deploying your usual coping mechanisms (scrolling on your phone, fleeing, etc.) and you do this until you get tired of being scared. If you’re an arachnophobe, for instance, you peer at a spider from a safe distance, you wait until your heart rate returns to normal, you take one step closer, and you repeat until you’re so close to the spider that it agrees to officiate your wedding.2
Unfortunately, people-phobia is harder to treat than arachnophobia because people, unlike spiders, cannot be placed in a terrarium and kept safely on the other side of the room. There is no zero-risk social interaction—anyone, at any time, can decide that they don’t like you. That’s why your people-phobia does not go into spontaneous remission from continued contact with humanity: if you don’t confront your fear in a way that ultimately renders it dull, you’re simply stoking the phobia rather than extinguishing it.3
Exposure only works for people-phobia, then, if you’re able to do two things: notch some pleasant interactions and reflect on them afterward. The notching might sound harder than the reflecting, but the evidence suggests it’s actually the other way around. Most people have mostly good interactions most of the time. They just don’t notice.
In every study I’ve ever read and in every study I’ve ever conducted myself, when you ask people to report on their conversation right after the fact, they go, “Oh, it was pretty good!” In one study, I put strangers in an empty room and told them to talk about whatever they wanted for as long as they wanted, which sounds like the social equivalent of being told to go walk on hot coals or stick needles in your eyes. And yet, surprisingly, most of those participants reported having a perfectly enjoyable, not-very-awkward time. When I asked another group of participants to think back to their most recent conversations (which were overwhelmingly with friends and family, rather than strangers), I found the same pattern of results4:

But when you ask people to predict their next conversation, they suddenly change their tune. I had another group of participants guess how this whole “meet a stranger in the lab, have an open-ended conversation” thing would go, and they were not optimistic. Participants estimated that only 50% of conversations would make it past five minutes (actually, 87% did), and that only 15% of conversations would go all the way to the time limit of 45 minutes (actually, 31% did). So when people meet someone new, they go, “that was pretty good!”, but when they imagine meeting someone new, they go, “that will be pretty bad!”
A first-line remedy for people-phobia, then, is to rub your nose in the pleasantness of your everyday interactions. If you’re afraid that your goof-ups will doom you to a lifetime of solitude and then that just...doesn’t happen, perhaps it’s worth reflecting on that fact until your expectations update to match your experiences. Do that enough, and maybe your worries will start to appear not only false, but also tedious. However, if reflecting on the contents of your conversations makes you feel like that guy in Indiana Jones who gets his face melted off when he looks directly at the Ark of the Covenant, then I’m afraid you’re going to need bigger guns than can fit into a blog post.

Obviously, I don’t think you can instantly de-awkward yourself by reading the right words in the right order. We’re trying to override automatic responses and perform laser removal on burned-in fears—this stuff takes time.
In the meantime, though, there’s something all of us can do right away: we can disarm. The greatest delusion of the awkward person is that they can never harm someone else; they can only be harmed. But every social hangup we have was hung there by someone else, probably by someone who didn’t realize they were hanging it, maybe by someone who didn’t even realize they were capable of such a thing. When Todd Posner told me in college that I have a big nose, did he realize he was giving me a lifelong complex? No, he probably went right back to thinking about his own embarrassingly girthy neck, which, combined with his penchant for wearing suits, caused people to refer to him behind his back as “Business Frog” (a fact I kept nobly to myself).
So even if you can’t rid yourself of your own awkward onion, you can at least refrain from fertilizing anyone else’s. This requires some virtuous sacrifice, because the most tempting way to cope with awkwardness is to pass it on—if you’re pointing and laughing at someone else, it’s hard for anyone to point and laugh at you. But every time you accept the opportunity to be cruel, you increase the ambient level of cruelty in the world, which makes all of us more likely to end up on the wrong end of a pointed finger.
All of that is to say: if you happen to stop at an intersection, and you look up and see someone you know just standing there inside his house, and he immediately ducks out of sight, you can think to yourself, “There are many reasonable explanations for such behavior—perhaps he just saw a dime on the floor and bent down to get it!” and you can forget about the whole ordeal and, most importantly, keep your damn eyes on the road.
PS: This post pairs well with Good Conversations Have Lots of Doorknobs.
Psychologists who study social exclusion love to use this horrible experimental procedure called “Cyberball”, where you play a game of virtual catch with two other participants. Everything goes normally at first, but then the other participants inexplicably start throwing the ball only to each other, excluding you entirely. (In reality, there are no other participants; this is all pre-programmed.) When you do this to someone who’s in an fMRI scanner, you can see that getting ignored in Cyberball lights up the same part of the brain that processes physical pain. But you don’t need a big magnet to find this effect: just watching the little avatars ignore you while tossing the ball back and forth between them will immediately make you feel awful.
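(If you’re curious what “pre-programmed” means here, the sketch below is my own toy illustration of an include-then-exclude throw schedule. It is not the actual Cyberball software, just the general idea in a few lines of Python.)

```python
# Toy sketch of a Cyberball-style throw schedule (illustrative only, not the real program).
# The "other players" are scripted: they include the participant for the first few throws,
# then only throw the ball to each other.
import random

PLAYERS = ["participant", "player_A", "player_B"]

def scripted_throw(thrower, throw_number, include_for=3):
    """Decide who a scripted player throws to."""
    others = [p for p in PLAYERS if p != thrower]
    if throw_number < include_for:
        return random.choice(others)                      # inclusion phase: anyone can get the ball
    return next(p for p in others if p != "participant")  # exclusion phase: never the participant

holder = "player_A"
for t in range(12):
    if holder == "participant":
        target = random.choice(["player_A", "player_B"])  # the participant's own (real) choice
    else:
        target = scripted_throw(holder, t)
    print(f"throw {t}: {holder} -> {target}")
    holder = target
```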
My PhD cohort included some clinical psychologists who interned at an OCD treatment center as part of their training. Some patients there had extreme fears about wanting to harm other people—they didn’t actually want to hurt anybody, but they were afraid that they did. So part of their treatment was being given the opportunity to cause harm, and realizing that they weren’t really going to do it. At the final stage of this treatment, patients are given a knife and told to hold it to their therapist’s throat, and the therapist says, “See? Nothing bad is happening.” Apparently this procedure is super effective, and no one at the clinic has ever been harmed doing it, but please do not try this at home.
As this Reddit thread so poetically puts it, “you have to do exposure therapy right otherwise you’re not doing exposure therapy, you’re doing trauma.”
You might notice that while awkwardness ratings are higher when people talk to strangers vs. loved ones, enjoyment ratings are higher too. What gives? One possibility is that people are “on” when they meet someone new, and that’s a surprisingly enjoyable state to be in. That’s consistent with this study from 2010, which found that:
Participants actually had a better time talking to a stranger than they did talking to their romantic partner.
When they were told to “try to make a good impression” while talking to their romantic partner (“Don’t role-play, or pretend you are somewhere where you are not, but simply try to put your best face forward”), they had a better time than when they were given no such instructions.
Participants failed to predict both of these effects.
Like most psychology studies published around this time, the sample sizes and effects are not huge, so I wouldn’t be surprised if you re-ran this study and found no effect. But even if people enjoyed talking to strangers as much as they enjoy talking to their boyfriends and girlfriends, that would still be pretty surprising.
2025-12-10 01:46:17
All of us, whether we realize it or not, are asking ourselves one question over and over for our whole lives: how much should I suffer?
Should I take the job that pays more but also sucks more? Should I stick with the guy who sometimes drives me insane? Should I drag myself through an organic chemistry class if it means I have a shot at becoming a surgeon?
It’s impossible to answer these questions if you haven’t figured out your Acceptable Suffering Ratio. I don’t know how one does that in general. I only know how I found mine: by taking a dangerous, but legal, drug.
I’ve always had bad skin. I was that kinda pimply kid in school—you know him, that kid with a face that invites mild cruelty from his fellow teens.1 I did all sorts of scrubs and washes, to little avail. In grad school, I started getting nasty cysts in my earlobes that would fill with pus and weep for days and I decided: enough. Enough! It’s the 21st century. Surely science can do something for me, right?
And science could do something for me: it could offer me a drug called Accutane. Well, it could offer me isotretinoin, which used to be sold as Accutane, but the manufacturers stopped selling it because too many kids killed themselves.
See, Accutane has a lot of potential side effects, including hearing loss, pancreatitis, “loosening of the skin”, “increased pressure around the brain”, and, most famous of all, depression, anxiety, and thoughts of suicide and self-injury. In 2000, a Congressman’s son shot himself while he was on Accutane, which naturally made everyone very nervous. My doctor was so worried that she offered to “balance me out” with an antidepressant before I even started on the acne meds. I turned her down, figuring I was too young to try the “put a bunch of drugs inside you and let ‘em duke it out” school of pharmacology.

But her concerns were reasonable: Accutane did indeed make me less happy. Like an Instagram filter for the mind, it deepened the blacks and washed out the whites: sad things felt a little sharper and happy things felt a little blunted. I was a bit quicker to get angry, and I was more often visited by the thought that nothing in my life was going well—although, as a grad student, I was visited fairly often by that thought even in normal times. It wasn’t the kind of thing I noticed every day. But occasionally, when I was, say, struggling to get some code to work, I’d feel an extra, unfamiliar tang of despair, and I’d go, “Accutane, is that you?”
The drugs also made my skin like 95% better. It was night and day—we’re talking going from Diary of a Pimply Kid to uh, Diary of a Normal Kid. That wonderful facial glow-up that most people experience as they exit puberty, I got to experience that same thing in my mid-20s. Ten years later, I’m still basically zit-free.
I didn’t realize it at first, but I had found my Acceptable Suffering Ratio. Six months of moderate sadness for a lifetime of clear skin? Yes, I’ll take that deal. But nothing worse than that. If the suffering is longer or deeper, if the upside is lower or less certain: no way dude. I’m looking for Accutane-level value out of life, or better.
That probably sounds stupid. It is stupid. And yet, until that point, I don’t think I was sufficiently skeptical of the deals life was offering me. Whenever I had the chance to trade pain for gain, I wasn’t asking: how bad, for how long, and for how much in return? I literally never wondered “How much should I suffer to achieve my goals?” because the implicit answer was always “As much as I have to”.
My oversight makes some sense—I was in academia at the time, where guaranteed suffering for uncertain gains is the name of the game. But I think a lot of us have Acceptable Suffering Ratios that are way out of whack. We believe ourselves to be in the myth of Sisyphus, where suffering is both futile and inescapable, when we are actually in the myth of Hercules, where heroic labors ought to earn us godly rewards.
For example, I know this guy, call him Omar, and women are always falling in unrequited love with him. They’re gaga for Omar, but he’s lukewarm on them, and so they make a home for him in their hearts that he only ever uses as a crash pad. I don’t know these women personally, but sometimes I wish I could go undercover in their lives as like their hairdresser or whatever just so I could tell them: “It’s not supposed to feel like this.” This pining after nothing, this endless waiting and hoping that some dude’s change of heart will ultimately vindicate your years of one-sided affection—that’s not the trade that real love should ask of you.
Real love does bring plenty of pain: the angst of being perceived exactly as you are, the torment of confronting your own selfishness, the fear of giving someone else the nuclear detonation codes to your life. But if you can withstand those agonies, you will be richly rewarded. Omar’s frustrated lovers have this romantic notion that love is a burden to be borne, and the greater the burden, the greater the love. When a love is right, though, it’s less like heaving a legendary boulder up a mountain, and more like schlepping a picnic basket up a hill—yes, you gotta carry the thing to and fro, and it’s a bit of a pain in the ass, but in between you get to enjoy the sunset and the brie.
Our Acceptable Suffering Ratios are most out of whack when it comes to choosing what to do with our lives.
Careers usually have a “pay your dues” phase, which implies the existence of a “collect your dues” phase. In my experience, however, the dues that go in are no guarantee of the dues that come back out. There is no cosmic, omnipotent bean-counter who makes sure that you get your adversity paid back with interest. You really can suffer for nothing.
If you’re staring down a gauntlet of pain, then, it’s important to peek at the people coming out the other side. If they’re like, “I’m so glad I went through that gauntlet of pain! That was definitely worth it, no question!” then maybe it’s wise to follow in their footsteps. But if the people on the other side of the gauntlet are like, “What do you mean, other side of the gauntlet? I’m still in it! Look at me suffering!”, perhaps you should steer clear.
Psychologists refer to this process of gauntlet-peeking as surrogation. People are hesitant to do it, however, and I think that’s because it feels so silly. If you see people queuing up for a gauntlet of pain, it’s natural to assume that the payoff must be worth the price. But that’s reasoning in the wrong direction. We shouldn’t be asking, “How desirable is this job to the people who want it?” That answer is always going to be “very desirable”, because we’ve selected on the outcome. Instead, the thing we need to know is, “How desirable is this job to the people who have it?”
It turns out the scarcity of a resource is much more potent in the wanting than it is in the having. I had a chance to learn this lesson about twenty years ago, when the stakes were far lower, but I declined. Back then, I thought the only barrier between me and permanent happiness was a Nintendo Wii. I stood outside a Target at 5am in the freezing cold for the mere chance of buying one, counting and re-counting the people in front of me, desperately trying to guess whether there would be enough Wiis to go around, my face going numb as I dreamed of the rapturous enjoyment of Wii Bowling.

I didn’t realize that, by the time I got home, the most exciting part of owning a Wii was already over. When I strapped on my Wii-mote, there was not a gaggle of covetous boys salivating outside my window, reminding me that I was having an enviable experience. It turns out that what makes a game fun is its quality, not its rarity.
If I had understood that obvious fact at the time, I probably wouldn’t have wasted so many hours lusting after a game console, caressing pictures of nunchuck attachments in my monthly Nintendo Power, calling Best Buys to pump them for information about when shipments would arrive, or guilting my mom into pre-dawn excursions to far-flung electronics stores.
Post-Accutane, though, I think I got better at spotting situations like these and surrogating myself out of them. When I got to be an upper-level PhD student, I would go to conferences, look for the people who were five years or so ahead of me, and ask myself: do they seem happy? Do I want to be like them? Are they pleased to have exited the gauntlet of pain that separates my life and theirs? The answer was an emphatic no. Landing a professor position had not suddenly put all their neuroses into remission. If anything, their success had justified their suffering, thus inviting even more of it. If it took this much self-abnegation, mortification, and degradation to get an academic job, imagine how much you’ll need for tenure!
I think our dysfunctional relationship with suffering is wired deep in our psyches, because it shows up even in the midst of our fantasies.
I’ve gotten back into improv recently, which has reacquainted me with a startling fact. An improv scene could happen anywhere and be about anything—we’re lovers on Mars! We’re teenage monarchs! We’re Oompa-Loompas backstage at the chocolate factory! And yet, when given infinite freedom, what do people do? They shuffle out on stage, sit down at a pretend computer, start to type, and exclaim, “I hate my job!”
It’s remarkable how many scenes are like this, how quickly we default to criticizing and complaining about the very reality that we just made up.2 A blank stage somehow turns us into heat-seeking missiles for the situations we least want to be in, as if there’s some unwritten rule that says that when we play pretend, we must imagine ourselves in hell.3
I’m not usually this kind of psychologist, but I can’t help but see these scenes as a kind of reverse Rorschach test. When allowed to draw whatever kind of blob we want, we draw one that looks like the father who was never proud of us. “Yep, that’s it! That’s exactly the thing I don’t like!”
I don’t think that instinct should be squelched, but it should be questioned. Is that the thing you really want to draw? Because you could also draw, like, a giraffe, if you wanted to. Or a rocket ship. Or anything, really. The things that hurt you are not owed some minimum amount of mental airtime. If you’re going to dredge them up and splatter them across the canvas, it should be for something—to better understand them, to accept them, to laugh at them, to untangle them, not to simply stand back and despise them, as if you’re opening up a gallery of all the art you hate the most.
In 1986, a psychologist named Wayne Hershberger played a nasty trick on some baby chickens. He rigged up a box with a cup of food suspended from a railing, and whenever the chicks would try to approach the cup, an apparatus would yank it out of their reach. If the chicks walked in the opposite direction, however, the same apparatus would swing the cup closer to them. So the only way for the chicks to get their dinner was for them to do what must have felt like the dumbest thing in the world: they had to walk away from the food. Alas, most of the chicks never learned how to do this.
I think many humans never learn how to do this, either. They assume that our existence is nothing more than an endless lurching in the direction of the things we want, and never actually getting them. Life is hard and then you die; stick a needle in your eye!
There are people with the opposite problem, of course: people who refuse to take on any amount of discomfort in pursuit of their goals, people who try not to have any goals in the first place, for they only bring affliction. It’s a different strategy, but the same mistake. These two opposing approaches to life—call them grindset and bedrot—both assume that the ratio between pain and gain is fixed. The grindset gang simply accepts every deal they’re offered, while the bedrot brigade turns every deal down.
Neither camp understands that when you get your Suffering Ratio right, it doesn’t feel like suffering at all. The active ingredient in suffering is pointlessness; when it has a purpose, it loses its potency. Taking Accutane made me sad, yes, but for a reason. The nearly-certain promise of a less-pimply face gave meaning to my misery, shrinking it to mere irritation.
Torturing yourself for an unknown increase in the chance that you’ll get some outcome that you’re not even sure you want—yes, that should hurt! That’s the signal you should do something else. When your lunch is snatched out of your grasp for the hundredth time in a row, perhaps you should see what happens when you walk away instead.
Apparently kids these days cover up their pimples with little medicated stickers, rather than parading around all day with raw, white-headed zits. This is a terrific technological and social innovation, and I commend everyone who made it happen.
Perhaps Trent Reznor was thinking of improv comedy when he wrote the lyric:
You were never really real to begin with
I just made you up to hurt myself
You might assume that improv is a good place to work out your demons, until you think about it for a second and realize that it basically amounts to deputizing strangers into doing art therapy with you. So while I’ve never personally witnessed someone conquer their traumas through improv comedy, I have witnessed many people spread their traumas to others.
2025-11-26 03:17:23
I say this with tremendous respect: it’s kinda surprising that the three largest religions in the world are Christianity, Islam, and Hinduism.
None of them are very sexy or fun, they come with all kinds of rules, and if they promise you any happiness at all, it’s either after you’re dead, or it’s the lamest kind of happiness possible, the kind where you don’t get anything you want but you supposedly feel fine about it. If you were trying to design a successful religion from scratch, I don’t think any of these would have made it out of focus groups. “Yeah, uh, women 18 to 34 years old just aren’t resonating with the part where the guy gets nailed to a tree.”
Why, in the ultimate meme battle of religions, did these three prevail? Let’s assume for the sake of argument that it’s not because they have the divine on their side. (Otherwise, God appears to be hedging his bets by backing multiple religions at once.)
Obviously there’s a lot of historical contingency here, and if a couple wars had gone the other way, we might have a different set of creeds on the podium. But I think each of these mega-religions has something in common, something we never really talk about, maybe because we don’t notice it, or maybe because it’s impolite to mention—namely, that they all have a brainy version and a folksy version.
If you’re the scholarly type, Christianity offers you Aquinas and Augustine, Islam has al-Ash’ari and al-Ghazali, Hinduism has Adi Shankara and Swami Vivekananda, etc. But if you don’t care for bookish stuff, you can also just say the prayers, sing the songs, bow to the right things at the right time, and it’s all good. The guy with the MDiv degree is just as Christian as the guy who does spirit hands while the praise band plays “Our God Is an Awesome God”.
It’s hard to talk about this without making it sound like the brainy version is the “good” one, because if we’re doing forensic sociology of religion, it’s obvious which side of the spectrum we prefer. But brainy vs. folksy is really about interest rather than ability. The people who favor the brainier side may or may not be better at thinking, but this is the thing they like thinking about.
More importantly, the brainy and the folksy sides need each other. The brainy version appeals to evangelists, explainers, and institution-builders—people who make their religion respectable and robust. The folksy version keeps a religion relevant and accessible to the 99.9% of humanity who can’t do faith full-time—people who might not be able to name all the commandments, but who will still show up on Sunday and put their dollars in the basket. The brainy version fills the pulpits; the folksy version fills the pews.1
Naturally, brainy folks are always a little annoyed at folksy folks, and folksy folks are always a little resentful of brainy folks. It’s tempting to split up: “Imagine if we didn’t have to pander to these know-nothings”/”Imagine if we didn’t have to listen to these nerds!” But a religion can start wobbling out of control if it tilts too far toward either its brainy yin or its folksy yang. Left unchecked, brainy types can become obsessed with esoterica, to the point where they start killing each other over commas. Meanwhile, uncultivated folksiness can degenerate into hogwash and superstition. Pure braininess is one inscrutable sage preaching to no one; pure folksiness is turning the madrasa into a gift shop.
Here’s why I bring all this up: we’ve got a big brainy/folksy split on our hands right now:
That divide is biggest at the highest level of education:
To think clearly about this situation, we have to continue resisting the temptation to focus on which side is “correct”. And we have to avoid glossing the divide as “Democrats = smart, Republicans = dumb”—both because going to college doesn’t mean you actually know anything, and because intelligence is far more complicated than we like to admit.
I think this divide is better understood as cultural rather than cognitive, but it doesn’t really matter, because separating the brainy and the folksy leaves you with the worst version of each. This is one reason why politics is so outrageous right now—only a sicko would delight in the White House’s Studio Ghibli-fied picture of a weeping woman being deported, and only an insufferable scold would try to outlaw words like “crazy”, “stupid”, and “grandfather” in the name of political correctness. It’s not hard to see why most people don’t feel like they fit in well with either party. But as long as the folksy and brainy contingents stay on opposite sides of the dance floor, we can look forward to a lot more of this.
Bifurcation by education is always bad, but it’s worse for the educated group, because they’ll always be outnumbered. You simply cannot build a political coalition on the expectation that everybody’s going to do the reading. If the brainy group is going to survive, it has to find a way to reunite with the folksy.
So maybe it’s worth taking some cues from the most successful ideologies of all time, the ones that have kept the brainy and folksy strains intertwined for thousands of years. I don’t think politics should be a religion—I’m not even sure if religion should be a religion—but someone’s gonna run the country, and as long as we’ve got a brainy/folksy split, we’ll always have to choose between someone who is up their own ass, and someone who simply is an ass.
As far as I can see, the biggest religions offer two strategies for bridging the divide between the high-falutin and the low-falutin. Let’s call ‘em fractal beliefs and memetic flexibility.
The shape below is a fractal, a triangle made up of triangles. Look at it from far away and you see one big triangle; look at it close up and you see lots of little triangles. It’s triangles all the way down.
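(If you want to see the self-similarity for yourself, here is a tiny sketch that prints a Sierpinski-style triangle. It’s my own illustration, not the figure above; it relies on the fact that the binomial coefficient C(row, k) is odd exactly when adding k and row - k in binary produces no carries.)

```python
# Print a Sierpinski-style triangle: one big triangle made of littler triangles.
# A cell is filled when C(row, k) is odd, i.e. when k and row - k share no binary 1-bits.
ROWS = 32  # a power of two gives the cleanest picture

for row in range(ROWS):
    cells = ("*" if (k & (row - k)) == 0 else " " for k in range(row + 1))
    print(" " * (ROWS - row - 1) + " ".join(cells))
```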
The most successful ideologies have similarly fractal beliefs: you get the same ideas no matter how much you zoom in or out. If a Christian leans more into their faith—if they read the Bible cover to cover, go to church twice a week, and start listening to sermons on the way to work—they don’t suddenly transform into, say, a Buddhist. They’re just an extra-enthusiastic Christian with extra-elaborated beliefs. This is a critical feature: if your high-devotion and low-devotion followers believe totally different things, eventually they’re gonna split.
If the brainy tribe is to survive, then, it’s gotta fractal-ize its beliefs. That means generating the simplest version of your platform that is still true. For example, many brainy folks want to begin arguments about gender by positing something like “gender is a social construct”, and right out the gate they’re expecting everyone to have internalized like three different concepts from sociology 101. Instead, they should start with something everybody can understand and get on board with, like “People’s opportunities in life shouldn’t depend on their private parts”. Making your arguments fractal doesn’t require changing their core commitments; it just means making each step of the argument digestible to someone who has no inclination to chew. If you gotta bring up Durkheim, at least put him last.
Brainy folks hate doing this. They’d much prefer to produce ever-more-exquisite versions of their arguments because, to borrow some blockchain terminology, brainy people operate on a proof-of-work system, where your standing is based on the effort you put in. That’s why brainy folks are so attracted to the idea that your first instincts cannot be right, and it’s why their beliefs can be an acquired taste. You’re supposed to struggle a bit to get them—that’s how you prove that you did your homework.
(Only the brainy tribe would, for instance, insist that you need to “educate yourself” in order to participate in polite society, and that no one should be expected to help you with this.)
It would be convenient for this analogy if folksy types operated on a proof-of-stake system (which is the other way blockchains can work), but they don’t, really. They don’t operate on proof of anything—the whole concept of proof is dubious to them. You don’t need to prove the obvious. That might sound dumb, and it often is. But sometimes it’s a useful counterweight: sometimes your first instinct is indeed the right one, and you can think yourself into a stupider position.
Regardless, demanding proof-of-work is always going to tilt the playing field in favor of the other side. A liberal friend of mine was recently lamenting that liberals have all the complicated-but-true positions while conservatives have all the easy-but-wrong positions. She acted like this was the natural end state of things; too bad, so sad, I guess the stupid shall inherit the Earth!
This sounds like a cop-out to me. If you can’t find an accessible way to express your complicated-but-true beliefs, maybe they aren’t actually true, or maybe you’re not trying hard enough. Of course, it takes a lot of thought to make a convoluted idea more intuitive—if only there were some people who liked thinking!
Big religions have a healthy dose of memetic flexibility: while their central tenets are firm, the rest of their belief systems can bend to fit lots of different situations.
For example, why are there four gospels? You’d think it would be a liability to have four versions of the same story floating around, especially when they’re not perfectly consistent with each other. There are disagreements about the name of Jesus’ paternal grandfather, the chronology of his birth, his last words, and uh, whether you’re supposed to take a staff with you when you go off to proselytize (Mark says yes, but Matthew and Luke say no way). This kind of thing is very embarrassing when you’re trying to convince people that you have the inerrant word of God. Wouldn’t it make more sense to smush all these accounts together into one coherent whole?
Syriac Christians did exactly that in the second century AD: it was called the Diatessaron. And yet, despite its cool name, it never really caught on and eventually died out. Nobody even tried that hard to suppress it as a heresy. I mean, if no one claims that your book is the work of the devil, do you even exist??

Maybe the harmonized gospel never went viral because four gospels, inconsistencies and all, are better than one. Each gospel has its own target audience: you’ve got Matthew for the Jews, Luke for the ladies and the Romans, John for the incense-and-crystals crowd, and Mark for the uh...all those Mark-heads out there, you know who you are. Having four stories with the same moral but different details and emphases gives you memetic flexibility without compromising the core beliefs.
Brainy folks could learn a lot from this. For instance, there’s a certain kind of galaxy-brained doomer who thinks that the only acceptable way to fight climate change is to tighten our belts. If we can invent our way out of this crisis with, say, hydrogen fuel cells or super-safe nuclear reactors, they think that’s somehow cheating. We’re supposed to scrimp, sweat, and suffer, because the greenhouse effect is not just a fact of chemistry and physics—it’s our moral comeuppance. In the same way that evangelical pastors used to say that every tornado was God’s punishment for homosexuality, these folks believe that rising sea levels are God’s punishment for, I guess, air conditioning.
This kind of small-tent, memetically inflexible thinking is a great way to make your political movement go extinct. But if you’re willing to be a little open-minded about how, exactly, we prevent the Earth from turning into a sun-dried tomato, you might actually succeed. Imagine if we could suck the carbon out of the atmosphere and turn it into charcoal for your Fourth of July barbecue. Imagine if electricity was so cheap and clean that you could drive your Hummer from sea to shining sea while causing net zero emissions. Imagine genetically engineered cows that don’t fart. That’s a future far more people can get behind, both literally and figuratively.
A few months ago, I ran into an old classmate at the grocery store—let’s call him Jay. He was a bit standoffish at first, and I soon found out why: Jay works in progressive politics now, and he was sussing me out to see whether I had become a right-wing shill since graduation, which is what some of our mutual acquaintances have done. Once he was satisfied I wasn’t trying to follow the well-trod liberal-loser-to-Fox-News-provocateur track, he warmed up. “I unfriended all the conservatives I know. I don’t even talk to moderates anymore,” Jay said, as if this was the most normal thing in the world. I didn’t want to get into it with him in the cereal aisle, but I wanted to know: what is his plan for dealing with the ongoing existence of his enemies? If he ignores them, they’ll....what? Give up? Fill their pockets with stones and wade into the ocean?
Apparently, Jay has decided that his side should lose. There are only three ways that an idea can gain a greater share of human minds: conquest, conversion, or conception. I hope we can all agree that killing is off the table, so that leaves changing minds and makin’ babies. It seems to me that progressives have largely given up on both.2
Obviously, I don’t think people should have kids just to thicken the ranks of their political party, nor do I want proselytizing progressives on every street corner. But if you’re serious about your political beliefs, you should have some plausible theory of how they’re going to succeed in the future. That means making it easy to believe the things you think are true, and it means trying to appeal to people who don’t already agree with you.
Most of all, it means participating in the world in a way that makes people want to join you. That’s why there are 5.5 billion Christians, Muslims, and Hindus—those religions offer people something that earns their continued devotion, whether those people want to think really hard about their faith or not. People aren’t gonna show up to Jay’s church if it’s all pulpit and no pew.
There are always going to be political cleavages, and there should be. The ideal amount of polarization is not zero. (When everybody’s on the same page, we groupthink ourselves into doing stupid things like, you know, invading two countries in two years.) Running a country is the greatest adversarial collaboration known to man, and it only works when both sides bring their best ideas. To do that, we need each party to have a fully integrated brainy and folksy contingent; some people plumbing the depths, other people keeping the boat anchored. Otherwise, we end up with parties that are defined either by their pointy-headed pedants or their pinheaded reactionaries.
“I must study Politicks and War,” John Adams famously wrote, “that my sons may have liberty to study Mathematicks and Philosophy”:
My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry and Porcelaine.
I appreciate Adams’ aspirations, but I disagree with his order of operations. Politics is not a science set apart from all others. Good governance requires good thinking, and that means drawing on every ounce of our knowledge, no matter how far-flung. Right now, I think we could stand to learn a thing or two from the ancient memelords who created our modern religions. Otherwise, yes, a world where we’re more interested in painting and poetry than we are politics—that sounds great. I think we know the way there. We just have to grab our walking sticks.
Many people assume this arrangement has broken down, and that highly educated people have all ditched their churches and become atheists. In fact, more-educated people are more likely to attend religious services.
2025-11-12 08:45:53
This is the quarterly links ‘n’ updates post, a collection of things I’ve been reading and doing for the past few months.
As late as 1813, some parts of the European medical establishment believed that potatoes cause leprosy. (Don’t even get ‘em started on scrofula!) Potato historian Redcliffe Salaman suggests that people were skeptical because potatoes look kinda weird, they grow in the ground, and you plant them as tubers rather than seeds, which are all extremely suspicious things for a food to do.
You may remember the Spurious Correlations website, which dredges up random datasets and finds correlations between them—for instance, Lululemon’s stock price and the popularity of the first name Stevie. Now thanks to AI, each one of those correlations can be instantly turned into a full academic paper, like: LULU-LEMONADE: A STATISTICAL STUDY OF THE STEVIE-NIZED MARKET.
Sadly, this technology makes many academic departments completely redundant.
There’s a good chance that 2025 will have the fewest murders ever recorded in the US. (We only have reliable data going back to 1960.)
I just added a new entry to my list of all-time great blog posts: Ask not why would you work in biology, but rather: why wouldn’t you? An excerpt:
Yes, biology is very interesting, yes, biology is very hard to do well. Yet, it remains the only field that could do something of the utmost importance: prevent a urinary catheter from being shunted inside you in the upcoming future.
Before World War I, the US government had basically no cryptographic capacity. So when the war broke out and suddenly they needed people to do code-breaking, where did they turn? To the Riverbank Institute, which had been set up by “Colonel” George Fabyan to decode the most important cipher of all: the one that supposedly proved the works of Shakespeare were written by Francis Bacon. Elizebeth and William Friedman, the couple who worked on that cipher at Riverbank, went on to become the first cryptologists at the precursor to the National Security Agency.12
One of the wildest blog posts I’ve read this year is about an American guy going to fight in the Ukraine war. Honestly, it sounds like a huge bummer: you squat in a trench and pretend to shoot at Russians and hope to not be killed by a drone.
When I saw a post called How Pen Caps Work, I was like, “what do you mean? Pen caps work by...being...caps for pens”. Apparently not: for fountain pens, anyway, pen caps work through vacuum power. Putting the cap on and taking it off causes a tiny amount of suction that draws the ink into the nib.
Li, one of the winners of my 2025 blog post competition, has a great series on why neuroscientists still can’t simulate a worm:
I told [my mom] that this is what my job feels like—each animal has a different kind of radio in its head and/or body, and neuroscientists are trying to figure out things about them. Some neuroscientists want to fix radios; some want to build better radios. Others, like me, are just trying to understand them.
To which Li’s mom responded:
Other nematode fun facts from Li’s piece: they use static electricity to teleport themselves onto bumblebees as a way of getting around:
And...nematodes survived the Space Shuttle Columbia explosion??
More great work from a 2025 Blog Extravaganza honoree: a terrific article in Works in Progress, Why Isn’t AI Replacing Radiologists? Radiology was supposed to be the first medical specialty to be rendered obsolete by AI. Instead, radiology jobs are more numerous and salaries are higher than ever.
The Polarization Dashboard is a useful sanity check against current events. Whenever something big happens in politics, people are like “WOW OUR VERY SOULS HAVE PERMANENTLY CHANGED” when in fact people almost always have the same opinions that they did yesterday. Here’s the change in support for murdering members of the opposite political party over time. Currently, <2% of Democrats and Republicans support it.
See also: You’re Probably Wrong About How Things Have Changed.
Some beginner researchers successfully replicated my Things Could Be Better paper without any expert help. I’m really proud of this! The person who ran the workshop writes:
I did not help replicate this study because the group replicating Measures of Anchoring in Estimation Tasks [the other study being replicated] needed help understanding the language the paper was written in. In contrast the group replicating Things Could Be Better started their own replication within 15 minutes of being handed the paper and did not have any followup questions for me before they began the replication.
Two years ago a Harvard Business School professor named Francesca Gino was fired for faking her data. (I wrote about the debacle here.) She sued the bloggers who outed her, but that lawsuit was thrown out. She also sued Harvard, claiming discrimination. Now Harvard is suing Gino back, alleging that when Gino submitted data to prove that her original data wasn’t fake...the new data was fake, too.
A group of researchers asks: What if the NIH had been 40% smaller? I appreciate how circumspect the authors are, but the short answer seems to be, “We would be significantly worse off, because many important medicines rely on research that would not have happened under a smaller budget”. This is further evidence of just how important it is to invest in science: even when we do it in a totally boneheaded way, it somehow still pays off.
Recently, the Financial Times set the internet alight with these graphs:
Ferguson re-analyzed the data and claims the changes in conscientiousness are minimal, if they exist at all. I’m inclined to trust Ferguson’s account on this one: it’s super weird to see such huge changes in such small amounts of time on basically any psychological variable.
Speaking of personality, ClearerThinking now has one mega personality test that will give you all your results from a bunch of different tests at once. I previously cited their work showing that supposedly scientific personality tests do not obviously outperform the bullshit ones, which continues to boggle people’s minds whenever I bring it up.
One of the most famous psychology experiments of all time is Festinger and Carlsmith (1959), the classic demonstration of cognitive dissonance. The psychologist Matti Heino points out, though, that the main results literally don’t add up. In the table below, the circled means are impossible given the reported sample size—there’s no way to get an average of 3.08, for instance, if you have 20 people giving ratings on a 0-10 scale.
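If you want to run that kind of check yourself, the logic is simple: an average of 20 whole-number ratings has to be some integer divided by 20, so it can only land on multiples of 0.05, and 3.08 isn’t one of them. Here’s a minimal Python sketch of that granularity check (the function is mine, not Heino’s; it’s the same basic idea as the published “GRIM” test):

```python
def mean_is_possible(reported_mean, n, decimals=2):
    """Can a mean reported to `decimals` places come from n whole-number ratings?"""
    # Any average of n integers is (some integer sum) / n, so find the closest
    # achievable mean and check whether it rounds back to the reported value.
    closest_sum = round(reported_mean * n)
    return round(closest_sum / n, decimals) == round(reported_mean, decimals)

print(mean_is_possible(3.08, 20))  # False: the 20 ratings would have to sum to 61.6
print(mean_is_possible(3.05, 20))  # True: a sum of 61 across 20 raters works
```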
I used to be one of those people who was like “well cognitive dissonance has been replicated in hundreds of studies” but it’s not like I ever actually read those studies. It just kinda seems like, c’mon! It’s cognitive dissonance! Everybody knows cognitive dissonance! Anyway, when a bunch of labs tried to replicate another classic demonstration of cognitive dissonance, they found no effect.3 And a new paper claims that When Prophecy Fails, a landmark book that documented the effects of cognitive dissonance in a doomsday cult, may have been greatly embellished or deliberately orchestrated by the authors.4 None of this means that cognitive dissonance doesn’t or can’t exist, but it does make me feel a whole lot of, uh, dissonance.
According to this study, losing a Michelin star improves TripAdvisor ratings. This is probably because getting a star invites harsher judgment (“It’s good, but...is it Michelin good?”). The perfect restaurant is one that raises your expectations high enough to get you to come in, but leaves your expectations low enough that you can still be wowed.
This is a hard target to hit: I’ve only ever seen two restaurants that have a 4.9 on Google after receiving thousands of ratings. Both of them are Thai restaurants that are nice but not fancy and cool but not trendy—you go expecting good Thai food and you get fantastic Thai food, and everybody goes, “wow, this place should have a Michelin star!”
Yudkin debuts a new civic vision called “Potentialism”: everybody’s got something special about them, and they have the right and responsibility to cultivate it and use it for the good of all. In a country where most people don’t like either party very much (which we know about thanks to Yudkin’s research), we should be doing a lot more ideological experimentation like this.
A new paper tracked all PhD students in Sweden and found that they’re more likely to take psychiatric medication compared to a matched group who stopped their education at the master’s level. This isn’t randomized and we should assume these two groups differ in lots of ways. But the fact that the differences grow over time and then disappear at year seven, when most people who started their PhDs have now finished them, does suggest that getting your doctorate is distressing.
Goodhart’s Law in action: The Army Corps of Engineers hired contractors to clear debris after the 2017 wildfires in California, but unfortunately they paid by the ton, incentivizing the contractors to dig up a bunch of wet mud and use it to weigh down their loads. Filling in the holes left by the contractors is “estimated to cost another $3.5 million”.
I’ve seen some wild visual illusions in my time, but this one had me screaming “no...no...no no no NO NO NO NOOO”
Items of note for people who do science independently and people who like that sort of thing. Original post here.
The Existential Hope Foundation is offering a $10k prize for the “most inspiring, uplifting, and forward-looking memes”.
has been building UV sterilization pumps and growing yeast in a DIY laboratory. See also: Why Independent Science Now?
Alex Chernavsky started taking metformin and found that it didn’t do much for his blood sugar but it did, surprisingly, improve his reaction times:
I got a lot of thoughtful responses to my recent post on the decline of deviance, including this one from the classical musician . Many people think the internet is to blame, but we forget how recently the internet became the internet. As late as 2007, only half of households had broadband at home. Instagram only launched in 2010. The early, slow, text-based internet lived in a room in your house and you had to choose between using it and allowing your grandma to call you; it was nothing like the one that lives in your pocket. So the transition from the 1990s to the 2000s, where we see many forms of deviance declining, is not really the transition from “pre-internet” to “post-internet”. That transition happened somewhere around 2012, when a majority of people got a smartphone. That’s why I think the internet as we know it today may act as an accelerant, but I don’t think it caused the decline of deviance in the first place.
In other news:
I talked to about why Americans are so afraid of talking to each other.
answered all my questions about music, including “What are the weirdest lyrics in a #1 hit single?”
Finally, a post from two years ago. A friend of mine recently told me, “Sometimes I want to send this post to someone, but then I remember what you titled it.”
I encountered this anecdote in ’s excellent Case for Crazy Philanthropy.
100 years later, the cryptographer/game developer Elonka Dunin visited the Friedmans’ grave and found they left a secret code on their tombstone.
This isn’t the Festinger and Carlsmith version, but another way of eliciting the effect, where people have to write an essay that goes against their beliefs.
Thanks to an “alert reader” for bringing this to my attention.
Which means Mr. Show was right all along.
2025-10-28 23:21:32
People are less weird than they used to be. That might sound odd, but data from every sector of society is pointing strongly in the same direction: we’re in a recession of mischief, a crisis of conventionality, and an epidemic of the mundane. Deviance is on the decline.
I’m not the first to notice something strange going on—or, really, the lack of something strange going on. But so far, I think, each person has only pointed to a piece of the phenomenon. As a result, most of them have concluded that these trends are:
a) very recent, and therefore likely caused by the internet, when in fact most of them began long before
b) restricted to one segment of society (art, science, business), when in fact this is a culture-wide phenomenon, and
c) purely bad, when in fact they’re a mix of positive and negative.
When you put all the data together, you see a stark shift in society that is on the one hand miraculous, fantastic, worthy of a ticker-tape parade. And a shift that is, on the other hand, dismal, depressing, and in need of immediate intervention. Looking at these epoch-making events also suggests, I think, that they may all share a single cause.
Let’s start where the data is clear, comprehensive, and overlooked: compared to their parents and grandparents, teens today are a bunch of goody-two-shoes. For instance, high school students are less than half as likely to drink alcohol as they were in the 1990s:
They’re also less likely to smoke, have sex, or get in a fight, less likely to abuse painkillers, and less likely to do meth, ecstasy, hallucinogens, inhalants, and heroin. (Don’t kids vape now instead of smoking? No: vaping also declined from 2015 to 2023.) Weed peaked in the late 90s, when almost 50% of high schoolers reported that they had toked up at least once. Now that number is down to 30%. Kids these days are even more likely to use their seatbelts.
Surprisingly, they’re also less likely to bring a gun to school:
All of those findings rely on surveys, so maybe more and more kids are lying to us every year? Well, it’s pretty hard to lie about having a baby, and teenage pregnancy has also plummeted since the early 1990s:
Adults are also acting out less than they used to. For instance, crime rates have fallen by half in the past thirty years:
Here’s some similar data from Northern Ireland on “anti-social behavior incidents”, because they happened to track those:
Serial killing, too, is on the decline:
Another disappearing form of deviance: people don’t seem to be joining cults anymore. Philip Jenkins, a historian of religion and author of a book on cults, reports that “compared to the 1970s, the cult issue has vanished almost entirely”.1 (Given that an increase in cults would be better for Jenkins’ book sales, I’m inclined to trust him on this one.) There is no comprehensive dataset on cult formation, but analyzed cults that have been covered on a popular and long-running podcast and found that most of them started in the 60s, 70s, and 80s, with a steep dropoff after 20002:
Crimes and cults are definitely deviant, and they appear to be on the decline. That’s good. But here’s where things get surprising: neutral and positive forms of deviance also seem to be getting rarer. For example—
Moving away from home isn’t necessarily good or bad, but it is kinda weird. Ditching your hometown usually means leaving behind your family and friends, the institutions you understand, the culture you know, and perhaps even the language you speak. You have to be a bit of a misfit to do such a thing in the first place, and becoming a stranger makes you even stranger.
I always figured that every generation of Americans is more likely to move than the last. People used to be born and die in the same zip code; now they ping-pong across the country, even the whole world.
I was totally wrong about this. Americans have been getting less and less likely to move since the mid-1980s:
This effect is mainly driven by young people:
These days, “the typical adult lives only 18 miles from his or her mother”.
Creativity is just deviance put to good use. It, too, seems to be decreasing.
A few years ago, I analyzed a bunch of data and found that all popular forms of art had become “oligopolies”: fewer and fewer of the artists and franchises own more and more of the market. Before 2000, for instance, only about 25% of top-grossing movies were prequels, sequels, spinoffs, etc. Now it’s 75%.
The story is the same in TV, music, video games, and books—all of them have been oligopol-ized. As points out, we’re still reading comic books about superheroes that were invented in the 1960s, buying tickets to Broadway shows that premiered decades ago, and listening to the same music that our parents and grandparents listened to.
You see less variance even when you look only at the new stuff. According to analyses by The Pudding, popular music is now more homogeneous and has more repetitive lyrics than ever.
Also, the cover of every novel now looks like this:
But wait, shouldn’t we be drowning in new, groundbreaking art? Every day, people post ~100,000 songs to Spotify and upload 3.7 million videos to YouTube.3 Even accounting for Sturgeon’s Law (“90% of everything is crap”), that should still be more good stuff than anyone could appreciate in a lifetime. And yet professional art critics are complaining that culture has come to a standstill. According to The New York Times Magazine,
We are now almost a quarter of the way through what looks likely to go down in history as the least innovative, least transformative, least pioneering century for culture since the invention of the printing press.
Remember when the internet looked like this?

That era is long gone. Take a stroll through the Web Design Museum and you’ll immediately notice two things:
Every site has converged on the same look: sleek, minimalist design elements with lots of pictures
Website aesthetics changed a lot from the 90s to the 2000s and the 2010s, but haven’t changed much from the 2010s to now
A few examples:
This same kind of homogenization has happened on the parts of the internet that users create themselves. Every MySpace page was a disastrous hodgepodge; every Facebook profile is identical except for the pictures. On TikTok and Instagram, every influencer sounds the same4. On YouTube, every video thumbnail looks like it came out of one single content factory:
No doubt, the internet is still basically a creepy tube that extrudes a new weird thing every day: Trollface, the Momo Challenge, skibidi toilet. But notice that the raw materials for many of these memes are often decades old: superheroes (1930s-1970s), Star Wars (1977), Mario (1981), Pokémon (1996), Spongebob Squarepants (1999), Pepe the Frog (2005), Angry Birds (2009), Minions (2010), Minecraft (2011). Remember ten years ago, when people found a German movie that has a long sequence of Hitler shouting about something, and they started changing the subtitles to make Hitler complain about different things? Well, they’re still doing that.
The physical world, too, looks increasingly same-y. As Alex Murrell has documented5, every cafe in the world now has the same bourgeois boho style:
Every new apartment building looks like this:
The journalist Kyle Chayka has documented how every AirBnB now looks the same. And even super-wealthy mega-corporations work out of offices that look like this:

People usually assume that we don’t make interesting, ornate buildings anymore because it got too expensive to pay a bunch of artisans to carve designs into stone and wood.6 But the researcher Samuel Hughes argues that the supply-side story doesn’t hold up: many of the architectural flourishes that look like they have to be done by hand can, in fact, be done cheaply by machine, often with technology that we’ve had for a while. We’re still capable of making interesting buildings—we just choose not to.
Brands seem to be converging on the same kind of logo: no images, only words written in a sans serif font that kinda looks like Futura.7
An analysis of branded twitter accounts found that they increasingly sound alike:
Most cars are now black, silver, gray, or white8:
When a British consortium of science museums analyzed the color of their artifacts over time, they found a similar, steady uptick in black, gray, and white:
Science requires deviant thinking. So it’s no wonder that, as we see a decline in deviance everywhere else, we’re also seeing a decline in the rate of scientific progress. New ideas are less and less likely to displace old ideas, experts rate newer discoveries as less impressive than older discoveries, and we’re making fewer major innovations per person than we did 50 years ago.
You can spot this scientific bland-ification right away when you read older scientific writing. As (same guy who did the cult analysis) points out, scientific papers used to have style. Now they all sound the same, and they’re all boring. Essentially 100% of articles in medical journals, for instance, now use the same format (introduction, methods, results, and discussion):
This isn’t just an aesthetic shift. Standardizing your writing also standardizes your thinking—I know from firsthand experience that it’s hard to say anything interesting in a scientific paper.
Whenever I read biographies of famous scientists, I notice that a) they’re all pretty weird, and b) I don’t know anyone like them today, at least not in academia. I’ve met some odd people at universities, to be sure, but most of them end up leaving, a phenomenon the biologist calls “the flight of the Weird Nerd from academia”. The people who remain may be super smart, but they’re unlikely to rock the boat.
Whenever you notice some trend in society, especially a gloomy one, you should ask yourself: “Did previous generations complain about the exact same things?” If the answer is yes, you might have discovered an aspect of human psychology, rather than an aspect of human culture.
I’ve spent a long time studying people’s complaints from the past, and while I’ve seen plenty of gripes about how culture has become stupid, I haven’t seen many people complaining that it’s become stagnant.9 In fact, you can find lots of people in the past worrying that there’s too much new stuff. As relates, one hundred years ago, people were having nervous breakdowns about the pace of technological change. They were rioting at Stravinsky’s Rite of Spring and decrying the new approaches of artists like Kandinsky and Picasso. In 1965, Susan Sontag wrote that new forms of art “succeed one another so rapidly as to seem to give their audiences no breathing space to prepare”. Is there anyone who feels that way now?
Likewise, previous generations were very upset about all the moral boundaries that people were breaking, e.g.:
In olden days, a glimpse of stocking
Was looked on as something shocking
But now, God knows
Anything goes
-Cole Porter, 1934
Back then, nobody was encouraging young Americans to party more. Now they do. So as far as I can tell, the decline of deviance is not just a perennial complaint. People worrying about their culture being dominated by old stuff—that’s new.
That’s the evidence for a decline in deviance. Let’s see the best evidence against.
As I’ve been collecting data for this post over the past 18 months or so, I’ve been trying to counteract my confirmation bias by keeping an eye out for opposing trends. I haven’t found many—so maybe that’s my bias at work—but here they are.
First, unlike other forms of violence, mass shootings have become more common since the 90s (although notice the Y-axis: we’re talking about an extremely small subset of all crime):
Baby names have gotten a lot more unique:
And when you look at timelines of fashion, you certainly see a lot more change from the 1960s to the 2010s than you do from the 1860s to 1910s:
That at least hints that the decline of deviance isn’t a monotonic, centuries-long trend. And indeed, lots of the data we have suggest that things started getting more homogeneous somewhere between the 1980s and the 2000s.
There are a few people who disagree at least with parts of the cultural stagnation hypothesis. Literature Substacker reports that “literature is booming”, and music Substacker is skeptical about stagnation in his industry. The internet ethnographer Katherine Dee argues10 that the most interesting art is happening in domains we don’t yet consider “art”, like social media personalities, TikTok sketch comedy, and Pinterest mood boards. I’m sure there’s some truth to all of this, but I’m also pretty sure it’s not enough to cancel out the massive trends we see everywhere else.
Maybe I’m missing all the new and exciting things because I’m just not cool and plugged in enough? After all, I’ll be the first to tell you there’s a lot of writing on Substack (and the blogosphere more generally) that’s very good and very idiosyncratic—just look at the winners of my blog competitions this year and last year. But I only know about that stuff because I read tons of blogs. If I was as deep into YouTube or podcasts, maybe I’d see the same thing there too, and maybe I’d change my tune.
Anyway, I know that it’s easy to perceive a trend when there isn’t any (see: The Illusion of Moral Decline, You’re Probably Wrong About How Things Have Changed). There’s no way of randomly sampling all of society and objectively measuring its deviance over time. The data we don’t have might contradict the data we do have. But it would have to be a lot of data, and it would all have to point in the opposite direction.
It really does seem like we’re experiencing a decline of deviance, so what’s driving it? Any major social trend is going to have lots of causes, but I think one in particular deserves most of the credit and the blame:
Life is worth more now. Not morally, but literally. This fact alone can, I think, go a long way toward explaining why our weirdness is waning.
When federal agencies do cost-benefit analyses, they have to figure out how much a human life is worth. (Otherwise, how do you know if it’s worth building, say, a new interstate that will help millions get to work on time but might cause some excess deaths due to air pollution?) They do this by asking people how much they would be willing to pay to reduce their risk of dying, which they then use to calculate the “value of a statistical life”. According to an analysis by the Substacker Linch, those statistical lives have gotten a lot more valuable over time:
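For what it’s worth, the arithmetic behind the “value of a statistical life” is simple: divide what people will pay for a small cut in their risk of death by the size of that cut. Here’s a minimal sketch with made-up numbers (real agencies estimate the inputs from surveys and labor-market data, and the actual figures vary):

```python
def value_of_statistical_life(willingness_to_pay, risk_reduction):
    """Implied value of one whole statistical life, from payments for a small risk cut."""
    return willingness_to_pay / risk_reduction

# Made-up numbers: if the average person would pay $1,000 to shave a
# 1-in-10,000 chance of dying off their year, the implied VSL is $10 million.
print(value_of_statistical_life(1_000, 1 / 10_000))  # 10000000.0
```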
There are, I suspect, two reasons we hold onto life more dearly now. First: we’re richer. Generations of economic development have put more cash in people’s pockets, and that makes them more willing to pay to de-risk their lives—both because they can afford it, and because the life they’re insuring is going to be more pleasant. But as Linch points out, the value of a statistical life has increased faster than GDP, so that can’t be the whole story.
Second: life is a lot less dangerous than it used to be. If you have a nontrivial risk of dying from polio, smallpox, snake bites, tainted water, raids from marauding bandits, literally slipping on a banana peel, and a million other things, would you really bother to wear your seatbelt? Once all those other dangers go away, though, doing 80mph in your Kia Sorento might suddenly become the riskiest part of your day, and you might consider buckling up for the occasion.
Our super-safe environments may fundamentally shift our psychology. When you’re born into a land of milk and honey, it makes sense to adopt what ecologists refer to as a “slow life history strategy”—instead of driving drunk and having unprotected sex, you go to Pilates and worry about your 401(k). People who are playing life on slow mode care a lot more about whether their lives end, and they care a lot more about whether their lives get ruined. Everything’s gotta last: your joints, your skin, and most importantly, your reputation. That makes it way less enticing to screw around, lest you screw up the rest of your time on Earth.
(“What is it you plan to do with your one wild and precious life?” Make sure I stand up from my desk chair every 20-30 minutes!)
I think about it this way: both of my grandfathers died in their 60s, which was basically on track with their life expectancy the year they were born. I’m sure they hoped to live much longer than that, but they knew they might not make it to their first Social Security check. Imagine how differently you might live if you thought you were going to die at 65 rather than 95. And those 65 years weren’t easy, especially at the beginning: they were born during the Depression, and one of them grew up without electricity or indoor plumbing.
Plus, both of my grandpas were drafted to fight in the Korean War, which couldn’t have surprised them much—the same thing had happened to their parents’ generation in the 1940s and their grandparents’ generation in the 1910s. When you can reasonably expect your government to ship you off to the other side of the world to shoot people and be shot at in return, you just can’t be so precious about your life.11
My life is nothing like theirs was. Nobody has ever asked me to shoot anybody. I’ve got a big-screen TV. I could get sushi delivered to my house in 30 minutes. The Social Security Administration thinks I might make it to 80. Why would I risk all this? The things my grandparents did casually—smoking, hitching a ride in the back of a pickup truck, postponing medical treatment until absolutely necessary—all of those feel unthinkable to me now.12 I have a miniature heart attack just looking at the kinds of playgrounds they had back then:

I know life doesn’t feel particularly easy, safe, or comfortable. What about climate change, nuclear war, authoritarianism, income inequality, etc.? Dangers and disadvantages still abound, no doubt. But look, 100 years ago, you could die from a splinter. We just don’t live in that world anymore, and some part of us picks that up and behaves accordingly.
In fact, adopting a slow life strategy doesn’t have to be a conscious act, and probably isn’t. Like most mental operations, it works better if you can’t consciously muck it up. It operates in the background, nudging each decision toward the safer option. Those choices compound over time, constraining the trajectory of your life like bumpers on a bowling lane. Eventually this cycle becomes self-reinforcing, because divergent thinking comes from divergent living, and vice versa.13
This is, I think, how we end up in our very normie world. You start out following the rules, then you never stop, then you forget that it’s possible to break the rules in the first place. Most rule-breaking is bad, but some of it is necessary. We seem to have lost both kinds at the same time.14
The sculptor Arturo di Modica ran away from his home in Sicily to go study art in Florence. He later immigrated to the US, working as a mechanic and a hospital technician to support himself while he did his art. Eventually he saved up enough to buy a dilapidated building in lower Manhattan, which he tore down so he could illegally build his own studio—including two sub-basements—by hand, becoming an underground artist in the literal sense. He refused to work with an art dealer until 2012, when he was in his 70s. His most famous work, the Charging Bull statue that now lives on Wall Street, was deposited there without permission or payment; it was originally impounded before public outcry caused the city to put it back. Di Modica didn’t mean it as an avatar of capitalism—the stock market had tanked in 1987, and he intended the bull to symbolize resilience and self-reliance:
My point was to show people that if you want to do something in a moment things are very bad, you can do it. You can do it by yourself. My point was that you must be strong.
Meanwhile, “Fearless Girl”, the statue of a girl standing defiantly with her hands on her hips that was installed in front of the bull in 2017, was commissioned by an investment company to promote a new index fund.
Who would live di Modica’s life now? Every step was inadvisable: don’t run away from home, don’t study art, definitely don’t study sculpture, don’t dig your own basement, don’t dump your art on the street! Even if someone was crazy enough to pull a di Modica today, who could? The art school would force you to return home to your parents, the real estate would be unaffordable, the city would shut you down.
The decline of deviance is mainly a good thing. Our lives have gotten longer, safer, healthier, and richer. But the rise of mass prosperity and the disappearance of everyday dangers have also made trivial risks seem terrifying. So as we tame every frontier of human life, we have to find a way to keep the good kinds of weirdness alive. We need new institutions, new eddies and corners and tucked-away spaces where strange things can grow.
All of this is within our power, but we must decide to do it. For the first time in history, weirdness is a choice. And it’s a hard one, because we have more to lose than ever. If we want a more interesting future, if we want art that excites us and science that enlightens us, then we’ll have to tolerate a few illegal holes in the basement, and somebody will have to be brave enough to climb down into them.

I’d love to read a version of Robert Putnam’s Bowling Alone specifically about the death of cults. Drawing Pentagrams Alone?
Whenever I tell people about the cult deficit, they offer two counterarguments. First: “Isn’t SoulCycle a cult? Isn’t Taylor Swift fandom a cult? Aren’t, like, Labubus a cult?” I think this is an example of prevalence-induced concept change: now that there are fewer cults, we’re applying the “cult” label to more and more things that are less and less cult-y. If your spin class required you to sell all your possessions, leave your family behind, and get married to the instructor, that would be a cult.
Second: “Aren’t conspiracy theories way more popular now? Maybe people are satisfying all of their cult urges from the comforts of their own home, kinda like how people started going to church on Zoom during the pandemic.” It’s a reasonable hypothesis, but the evidence speaks against it. A team of researchers tracked 37 conspiracy beliefs over time, and found no change in the percentage of people who believe them. Nor did they find any increase or decrease in the number of people who endorse domain-general tinfoil-hat thinking, like “Much of our lives are being controlled by plots hatched in secret places”. It seems that hardcore cultists have become an endangered species, while more pedestrian conspiracy theorists are merely as prevalent as they ever were.
There is some skepticism about the Spotify numbers, and I’m sure the YouTube numbers are dubious as well—a big chunk of that content has to be spam, duplicates, etc. But cut those totals by 90% and you still have an impossible number of songs and videos.
According to , the Instagram accent is what you end up with when you optimize your speaking for attracting and holding people’s attention. For more insights like that one, check out his new book.
Thanks to ’s article The Age of the Surefire Mediocre for these and other examples.
Or maybe the conspiracy theorists are right and it’s because some kind of apocalypse wiped out our architectural knowledge and the elites are keeping it hushed up.
Cracker Barrel tried to do the same thing recently and was hounded so hard on the internet that they brought their old logo back.
Note that this data comes from Poland, but if you look up images of American parking lots in previous decades vs. today, you’ll find the same thing.
For instance, T.S. Eliot, 1949:
We can assert with some confidence that our own period is one of decline; that the standards of culture are lower than they were fifty years ago; and that the evidences of this decline are visible in every department of human activity.
Dee’s original article is now paywalled, so I’m linking to a summary of her argument.
Almost all the data I’ve shown you is from the US, so I’m interested to hear what’s going on in other parts of the world. My prediction is that development at first causes a spike in idiosyncrasy as people gain more ways to express themselves—for example, all cars are going to look the same when the only one people can afford is a Model T, and then things get more interesting when more competitors emerge. But as the costs of being weird increase, you’ll eventually see a decline in deviance. That’s my guess, anyway.
Fast life strategies are still possible today, but they’re rarer. Once, in high school, I was over at a friend’s house and his mom lit a cigarette in the living room. I must have looked shocked, because she shrugged at me and said, “If the cigarettes don’t kill me, something else will.” I could at least understand where she was coming from: her husband had died in a car accident before he even turned 50. When you feel like your ticket could get punched at any time, why not enjoy yourself?
Perhaps that’s also why we’ve become so concerned about the safety of our children, when previous generations were much more laissez-faire. This map traces the change within a single family, but the pattern seems broadly true:
There’s a paradox here: shouldn’t safer, wealthier lives make us more courageous? Like, can’t you afford to take more risks when you have more money in the bank?
Yes, but you won’t want to. I saw this happen in real time when I was a resident advisor: getting an elite degree ought to increase a student’s options, but instead it leaves them too afraid to choose any but a few of those options. Fifty percent of Harvard graduates go to work in finance, tech, and consulting. Most of them choose those careers not because they love making PowerPoints or squeezing an extra three cents of profit out of every Uber ride, but because those jobs are safe, lucrative, and prestigious—working at McKinsey means you won’t have to be embarrassed when you return for your five-year reunion. All of these kids dreamed of what they would gain by going to an Ivy League school; none of them realized it would give them something to lose.
In fact, the richest students are the most likely to pick the safest careers:
Also c’mon this chart is literally made by someone named Kingdollar.
2025-10-14 23:48:04
Daniel Kahneman and Amos Tversky were two of the greatest psychologists of all time. Maybe the greatest. The fields we now call “behavioral economics” and “judgement and decision-making” are basically just people doing knockoff Kahneman and Tversky studies from the 70s and 80s.
(You know how The Monkees were created by studio executives to be professional Beatles impersonators, and then they actually put out a few good albums, but they never reached the depth or the importance of the band they were—pun intended—aping? Think of Kahneman and Tversky as the Beatles and the past 50 years of judgement and decision-making research as The Monkees.1)
Amos ‘n’ Danny were masters of the bamboozle: trick questions where the intuitive answer is also the wrong answer. Quick: are there more words that start with R, or words that have R in the third position? Most people think it’s the former because it’s easier to come up with r-words (rain, ring, rodent) than it is to come up with _ _ -r words (uh...farm...fart...?). But there are, in fact, more words where r comes third. That single silly example actually gives us an insight into how the mind calculates frequencies—apparently, not by conducting a census of your memories, but by judging how easy or hard it feels when you try to think of examples.
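If you want to count for yourself, here’s a minimal sketch that tallies the two kinds of words. It assumes a Unix-style word list at /usr/share/dict/words, and the exact counts (even which side wins) will depend on the dictionary you use, since Kahneman and Tversky’s claim was about ordinary English rather than exhaustive word lists.

```python
# Tally words that start with "r" versus words with "r" in the third position.
# Assumes a Unix-style word list; counts vary a lot with the dictionary used.
with open("/usr/share/dict/words") as f:
    words = {w.strip().lower() for w in f if w.strip().isalpha()}

r_first = sum(w[0] == "r" for w in words)
r_third = sum(len(w) >= 3 and w[2] == "r" for w in words)

print(f"words starting with r: {r_first}")
print(f"words with r third:    {r_third}")
```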
These little cognitive gotchas show us how the mind works by showing us how it breaks. It’s like a visual illusion, but for your whole brain. This is the understated genius of Kahneman and Tversky—it’s not like other research, where some egghead writes a paper about some molecule and then ten years later you can buy a pill with the molecule in it and it cures your disease. No, for K&T, the paper is also the pill.
But the duo was so successful that they have, in part, undone their own legacy. The tricks were so good that everybody learned how they worked, and now it’s hard to be bamboozled anymore. We’re no longer surprised when the rabbit comes out of the hat. And that’s a shame, because the best part of their work was that half-second of disbelief where you go “no no that can’t be right!” and then you realize it is right. That kind of feeling loosens your assumptions and dissolves your certainty, and that’s exactly what most of us need: an antidote to our omnipresent overconfidence.
I’m here to bring the magic back. I’m no Kahneman or Tversky, but I can at least do two things: resurface some of their long-forgotten deep cuts, and document a few tricks that bamboozled the Bamboozlers-in-Chief themselves—including one that, I think, I have just discovered and am documenting for the first time.
Let’s see if we can find another rabbit in this hat.