2025-04-15 22:35:58
In 2016, I accidentally became a bit character in UK history.
I had bumbled my way onto a British reality show called Come Dine with Me, where four strangers take turns hosting, attending, and rating each other’s dinner parties, and the person with the highest score at the end of the week wins an extremely modest £1,000. Usually, the show is low-stakes—its version of “drama” is when someone sticks a whole whisk in their mouth. It’s the kind of trashy, easy-viewing TV you might watch while you’re recovering from having your appendix removed.
My episode was different. On the final day, when a contestant named Peter realized he had lost, he delivered a now-iconic denunciation of the winner and kicked the rest of us out of his house. The clip is routinely scrubbed from YouTube for copyright violations, but here’s a version that has lived on (I’m the guy in blue sitting sheepishly on the couch):
This is tame by American standards, of course, where our reality shows involve stripping people naked and dropping them in the woods. But for the Brits, this was as scandalous as the Queen showing cleavage. Peter’s blowup became national and international news. Bootleg versions of the episode racked up millions of views before being taken down. Voice actors, vintage shop employees, and precocious children posted their best Peter impressions, while enterprising Etsy sellers slapped his visage on coasters, t-shirts, spatulas, Christmas jumpers, doilies, and religious candles. Internet citizens turned his rant into an auto-tuned ballad, a ukulele ditty, and an honestly very catchy indie single.
Most memes die, but a few transcend. This one transcended. “You won, Jane” became a permanent part of British memetic vernacular, right up there with “Keep Calm and Carry On”, destined to be resurrected and remixed to fit whatever’s in the headlines. For instance, when covid struck, this appeared:
When England lost a soccer game:
When Keir Starmer became prime minister last year:
When Chappell Roan revealed the original title of her hit song “Good Luck, Babe!”:
If you went to the Edinburgh Fringe Festival in 2023, you could go see live actors reenact the whole Come Dine with Me debacle, with one caveat: “To avoid copyright infringement, we will recreate the episode using sock puppets”.1
Every generation casts down the memetic gods of their forefathers, and so I expect “You won, Jane” to eventually be displaced by Gen Z’s pantheon of rizz and skibidi toilet. But that hasn’t happened yet. Even to this day, although I no longer live in the UK, every once in a while I’ll see some stranger squinting at me, and they’ll walk over, and I’ll secretly be hoping that they’ll say “Hey, I read your blog” but instead they will say, “Hey, were you on that one episode of Come Dine with Me?”
I have resigned myself to the fact that, no matter what I do for the rest of my life, if I am remembered for anything at all, it will be for the thirty seconds I spent sitting idly by while a man ruined his life on national television. I haven’t said much about the whole episode since then, for reasons that will become clear later. But now that we’re nearly 10 years out, it’s time to unburden myself of these secrets I’ve been carrying ever since. Because what they show on TV ain’t the whole story. It ain’t even close.
2025-04-02 00:15:53
I got a bone to pick with an ancient meme. Remember five years ago, when a viral post was claiming that you could judge someone’s capacity for “self-governing” by observing their behavior in the grocery store parking lot? Good people return their shopping carts, the theory went, and bad people don’t.
Like everything that comes out of the website 4chan, the Shopping Cart Test was designed to get people’s attention and then make their lives worse. These are the people who brought us Pizzagate, QAnon, Pedobear, the bikini bridge, and a high-profile heist of a flag owned by the actor Shia LaBeouf.1 So this meme, too, did exactly what it was supposed to do: it made people look and then it made them mad. The inevitable backlash came, the backlash to the backlash, the New York Times article, etc. People got bitter, then they got bored, then they moved on.
But not me. I think the 4channers accidentally discovered something useful. They’re right about one big thing: our moral mettle will be tested by dilemmas so small that they’re nearly imperceptible. As fun as it is to toss around Trolley Problems and Lifeboat Ethics, most of us will never actually have to decide whether to kill one to save five, or whether to toss a toddler into the sea to keep the boat afloat. You cannot purchase an experience machine, nor can you move to Omelas. But you will have to decide whether to take 30 seconds to push a cart 50 feet. If there really is a Saint Peter waiting for us at the pearly gates with a karmic dossier, it will be stuffed not with a few grand actions, but a million mundanities like this one.
Because they’re trying to turn the whole world into a kingdom of trolls, however, the originators of the Shopping Cart Test made it nefarious instead of useful by pointing it in the wrong direction. The test is presented as a cudgel to use against your fellow citizens, a way to judge them worthy of coexistence or condemnation. (In the meme’s own words, anyone who fails the test is “no better than an animal.”) But Saint Peter is not like a district attorney, willing to let you walk into heaven in exchange for kompromat on your accomplices. (“I saw Bill from down the block leave a cart in the handicapped parking spot! Do I get to have eternal life now?”) No, the point of the Shopping Cart Test is not to administer it, but to pass it—to practice a tiny way of being good, because most ways of being good are, in fact, tiny.
There should be many more Shopping Cart Tests, then, and we should be pointing them at ourselves, rather than at each other. So I’ve been keeping track of them—those almost invisible moments when you can choose to do a bit of good or a bit of bad. I think of these as keyholes to the soul, ways of peeking into your innermost self, so you can make sure you like what you see in there. Here are seven of ‘em.
God help the driver who gives me control over the music in the car, because the second I get that Bluetooth connection, I become a madman. I take ‘em on a wild ride through my Spotify, hellbent on showing them just how interesting of a guy I am, and how cool and eclectic my tastes are. “They’re playing authentic medieval instruments,” I shout over the music, “But it’s also mixed with death metal!”
I was once on a third date where we needed to drive somewhere, and when my date connected to the car’s Bluetooth, I figured she would do like I do and show off her discography. Instead, my favorite tunes started coming out of the speakers. “Whoa, you like this too??” I asked. “No,” she said, “I’m playing it because you like it.”
I was awestruck and dumbstruck. I had never even entertained the idea of creating a good experience for someone else. And I had never realized that the only pleasure greater than playing the music you love is other people playing the music you love. The scales fell from my eyes as I thought of all the times I had been in charge and decided, without thinking, to cater to my own tastes: order the food I like, set the temperature to what’s comfortable for me, pack the itinerary full of stuff I want to do. I’ve come to think of this as the Bluetooth Test: when you’re given the smallest amount of power, do you use it to make things nice for everybody, or just yourself?
Anyway, me ‘n’ that girl are married now.
You know that moment when you’re at a party or a conference or whatever and you have no one to talk to, so you sidle up to a circle of people and then stand there awkwardly at the periphery, hoping for a chance to jump in? There is no more vulnerable position than this, to be teetering on the edge of personhood and oblivion, waiting to see whether a jury of your peers will decide that you exist.
My friend Wanda just doesn’t let that happen. If she’s in a circle and someone tries to join, she’ll go “Oh hey everybody this is Adam. Adam, we were just talking about...” and then we all go on normally. If she doesn’t know you, she’ll introduce herself quickly, tell you everybody’s names, and then pick up where the conversation left off.
This is the itsiest bitsiest mercy of all time, but when you’re on the receiving end of it, it feels like an angel has snatched you out of the maw of Hades. That’s why I call it the Circle of Hell Test: when you see someone writhing in social damnation, do you grab their hand, or do you let ‘em burn?
I think most people fail this test because they’re too anxious about their own status, like “Hey man how can I affirm your personhood?? I’m still waiting to see if they affirm my personhood!” But this is the wrong way of looking at it. Bringing someone else into the fold doesn’t cost you status. It gives you status. Taking the floor and then handing it to someone else is a big conversational move, and you look cool when you do it. Wanda ends up seeming like the most high-status person in any conversation for exactly this reason.
No one can convince my friend Micah of anything. He treats conversations like trench warfare—you have to send a full battalion to their deaths if you want to gain an inch of ground. When anybody tries to give him advice, he stares into the distance and waits for them to stop. Meanwhile, Micah is quick to tell other people how to live, and then he gets kinda huffy if you blow him off.
Oh, who am I kidding? Micah is me.
I’m usually skeptical of self-help books, but when I read that John Gottman’s Principle #4 for a Successful Marriage is “let your partner influence you,” I didn’t just feel seen. I felt caught. This is Gottman Test #4, and it applies not just to partners, but to people in general: do you expect to influence others, but refuse to be influenced yourself?
I think I’m so resistant to the idea of being swayed because I feel like I’m made out of opinions. Changing them would be like opening up my DNA and scrambling my nucleotides. That’s why I surround ‘em with barbed wire and minefields and machine gunners. This assumes, of course, that I just happen to have all of my cytosines and guanines in exactly the right place, that none of my DNA is useless or mutated, and that every codon is critical—change one, and you change everything. I mean, if I admit my ranking of Bruce Springsteen albums is slightly out of order, then who am I anymore?
But it ain’t like that. Like genes, opinions acquire errors over time, and they have to be perpetually proofread and repaired, or else they start going wonky and you end up with a whole tumorous ideology. Unlike genes, however, those repairs generally have to come from outside. I always assume that it will feel frightening to let this happen, to reel in the barbed wire, to deactivate the minefield, to order the machine gunners to stand down. Instead, it feels like a relief to finally give up and agree that Born to Run is, in fact, superior to Born in the U.S.A.
I recently walked through a train station that was plastered with ads like: “There’s a lot of stigma against disabilities, but we’re here to change that!” and “80% of Americans have a prejudice against people with disabilities. It’s time to lower that number!”
If you actually want to reduce prejudice toward people with disabilities, you would never ever run a campaign like that, not in a million years. When you tell everybody that there’s a lot of stigma against something, you stand a pretty good chance of increasing that stigma. Maybe the nonprofit that paid for these ads has a different theory of human behavior, or maybe the ads looked reasonable because that nonprofit cannot actually imagine a future without stigma. Perhaps because if that stigma went away, the nonprofit would have to go, too.2
I think of this as the Codependent Problems Test: do you actually want to solve your problem, or are you secretly depending on its continued existence? If you showed up to fight the dragon and found it already slain, would you be elated or disappointed?
After all, a righteous crusade gives you meaning and camaraderie, to the point where you can become addicted to the crusading itself. It is possible to form an entire identity around being mad at things, and to make those things grow by pouring your rage on them, which in turn gives you more things to be mad at. This is, in fact, the business model for approximately half the internet.
When I was an academic, I used to worry about getting scooped—if someone debuts my idea before I do, they get all the glory. As soon as I stopped publishing in journals and started blogging my research instead, this fear went away. I realized that I wasn’t actually looking for knowledge; I was looking for credit. I was in a codependent relationship with ignorance: I wanted it to keep existing until the exact moment that I, and only I, could make it go away.
Everybody loves my friends Tim and Renee because they are willing to match your freak. If you do something weird, they’ll do something weirder. Dance an embarrassing little jig, talk like you’re a courtier to Louis XIV, pretend you’re from a universe where 9/11 never happened, and they’ll be right there with you: “That’s so crazy! We’re from a universe where Saddam Hussein became a famous lifestyle TikTokker!”
Every moment you’re with another person, you are implicitly asking, “If I’m a little bit weird, will you be a little bit weird too?” And when you’re with Tim and Renee the answer is yes, yes, a thousand times yes. It’s hard to describe how good it feels when someone passes the Match Your Freak Test, to know that no matter how far you put yourself out there, you won’t be left hanging.
Technically this is called “Yes, And,” but in our attempt to mass-produce that idea, we’ve made it mechanical and cringe. (Somehow you don’t get the vibe right when you put people in a circle with coworkers they hate and force them all to play “Zip, Zap, Zop”.) I knew plenty of talented improvisers who would never disobey the letter of improv law, but would still find a way to make it clear that they hated your choices.
Matching someone’s freak isn’t about reluctantly agreeing to their reality. It’s about declining the opportunity to judge them, and choosing instead to do something that’s even more judge-able. It’s the opposite of being a bully—it’s seeing someone with a mustard stain on their shirt, and instead of pointing and laughing, you grab a bottle of mustard and squirt it all over yourself.
In fourth grade, the teacher handed us all a blank map of the US and told us to color in every state that we had visited. Immediately, my mission was clear: I needed to be the kid who had been to the most states. As I was sharpening my Crayolas, though, I saw this kid Ian coloring in swaths of the Northeast—as if, knowing this showdown would one day come, he had gone on a road trip through New England specifically to juice his stats with all those tiny states.
Desperate, I hatched a plan: I had once flown from Ohio to Florida to visit my uncle, so hadn’t I technically been “in” all of those intervening states? This led to a pitched metaphysical debate with the teacher, and many thought experiments later (“What if you drive through Pennsylvania but you never get out of the car?”, “What if you walk through Delaware on stilts, so that your feet don’t technically touch the ground?”), she relented and allowed me to identify all of my “flyover” states in a different color. That put me just barely ahead of Ian, who soon moved out of town, I assume because of his shame, or to up his numbers for our inevitable rematch.
That story has stuck in my head for decades because that stupid, petty instinct has never left me. I am constantly failing the Pointless Status Test—whenever there’s some way I could consider myself better than other people, no matter how stupid or arbitrary it is, I feel compelled to compete.
The problem isn’t the competition itself; it’s only a vice when it doesn’t produce anything useful. So I’m proud of my fourth-grade self for demonstrating creativity in the face of adversity. I just wish I had used it to do something other than win a status game that existed entirely inside my own head.
That’s why, in my opinion, we should feel the same derision toward people who engage in pointless competition as we feel toward people who embezzle public funds. We all benefit from the public goods that society provides—safety, trust, knowledge—and so we all owe society some portion of our efforts in return. If instead you squander your talents on the acquisition of purely positional goods, you are robbing the world of its due. It’s like commandeering a nuclear power plant so you can heat up a Hot Pocket.
In college, I crammed my schedule so full that my GCal was one solid block of red: classes, extracurriculars, jobs, committees, shows, research. My improv group would routinely rehearse from 11:15pm to 1:15am because that was literally the only time left. It felt like every dial of my life was permanently turned up to 11 and it was great.
Well, mostly great. One spring, Maya, a good friend of mine, was performing a play she had written for her senior thesis. A sacred rule among theater kids is that you go to each other’s shows—otherwise, you might have to perform your Vietnam-era reimagining of The Music Man to no one. There was only one night I could possibly make it to Maya’s play, and when that night arrived I just...didn’t go. It wasn’t because I forgot. I decided. I wanted a few hours to finish an essay, to read, to respond to emails, to think, to sit motionless on my couch, and while I could have delayed all those things to some later time and nothing bad would have happened, I didn’t have the gumption to do it.
When Maya came by my room later that night, upset, and rightfully so, I realized for the first time: extreme busyness is a form of selfishness. When you’re running at 110% capacity, you’ve got nothing left for anybody else. Having slack in your life is prosocial, like carrying around spare change in your pocket in case someone needs it. My pockets were permanently empty—I was unable to bake anyone a birthday cake, proofread their essay, pick them up at the airport, or even, if I’m honest, think about them more than a few seconds. I was failing the Too Busy to Care Test. “Oh, you’re going through a breakup and need someone to talk to? No problem, just sign up for a 15-minute slot on my Calendly.”
I once read a study where they found that people’s perception of the care available to them was a better predictor of their mental health than the care they actually received. That made a lot of sense to me. It’s not every day that you need to call someone at 2am and bawl your eyes out. But every day you wonder: if I called, would someone pick up?
I’ve heard that assholes can occasionally transform into angels, but I’ve never seen it happen. Any improvement I’ve ever witnessed—in myself, in others, doesn’t matter—has been, to borrow Max Weber’s description of politics, “the strong and slow boring of hard boards.” That’s because there’s no switch in the mind marked “BE A BETTER PERSON”. Instead, becoming kinder and gentler is mainly the act of noticing—realizing that some small decision has moral weight, acting accordingly, and repeating that pattern over and over again.
It’s much easier, of course, to wait for a Road to Damascus moment, to put off any self-improvement to some dramatic day when the sky will open and God will reprimand you directly, so you can do all of your repenting and changing of ways at once. For me, anyway, that day is always permanently and conveniently located in the near future, so in the meantime I can enjoy my “Lord make me good, but not yet” phase.
If you accept that nothing is going to happen on the way to Damascus, to you or anyone else, if you let go of the myth of an imminent moral metamorphosis, you can instead enjoy a life lived under expectations that are both extremely consistent and extremely low. It is always possible to become a better person—even right this second!—but only a very very slightly better one. Whatever flaws you have today, you will probably have them tomorrow, and same goes for your loved ones. But you can shrink ‘em (your flaws, that is, not your loved ones) by the tiniest amount today, a bit more tomorrow, and a bit more after that.
It’s like you’re trying to move across the country, but each day, you can only move into the house that’s right next to yours. It might be months before you even make it to another zip code. But if you keep carrying your boxes from house to house, soon enough you’ll be on the other side of town, and then in the next state over, and then the next one after that. The most important thing to remember is: keep track of those states, because you never know when Ian might return.
In their earlier, more innocent days, they also brought us lolcats and rickrolling.
See also: the Shirky Principle.
2025-03-18 22:05:39
A couple years ago, I got a job interview at a big-name university and I had to decide whether to go undercover or not. On paper, I looked like a normal candidate. But on the internet, I was saying all sorts of wacko things, like how it’s cool to ditch scientific journals and just publish your papers as blog posts instead.
If the university wanted to hire me, I was pretty sure they wanted the normal me, not the wacko me. But I was also starting to suspect the wacko me was the real me. This was a big problem, because that job came with a paycheck, an office, and the approval of my peers. Could I maybe Trojan Horse myself into their department and only reveal my true nature after we’d signed all the documents (“Ha ha, you fools! Couldn’t you tell I’ve gone bananas??”)? Or could I maybe lock up my wackiness, maybe forever, or at least until I had tenure, when I could let it loose again?
Those lies were extremely tempting, but they were also lies. I knew that if I went undercover, at some point they were going to show me a picture of my wacko self and ask, “Do you know this man?” and I would have to say, “Never heard of him.” As much as I love having health insurance, I couldn’t bring myself to do that.
So in the end I went as the real me, a normal guy who was in the process of becoming a wacko, like a caterpillar who had snuggled up inside his cocoon and was soon to emerge as a, I dunno, a teeny tiny walrus in a fedora. I gave my normal talk about my dissertation—a project I loved and was happy to present—but I ended by saying, “I don’t think we need a hundred more papers like this one. We need to turn psychology into a paradigmatic science, and I think I can help do that.” I told them why I thought psychology hadn’t made much progress in the past few decades, and how it could, and how I might be wrong about everything, but I would hopefully be wrong in a useful way. When people asked me if I planned to publish my papers in journals, I said no, and I explained why.
It probably doesn’t sound like much, but this was the scariest thing I’ve ever done. This is the academic equivalent of getting on stage and mooning the audience. You’re not supposed to say this stuff, especially not at a job interview. You’re risking a fate worse than death—people might think you’re weird.
That’s how it felt when I decided to do all that, anyway. But on the actual day, I didn’t feel afraid at all. I felt free. Invincible, even. I had already condemned myself to death, so what could they do to me now? In the unlikely event that they liked the real me, great! And if they didn’t, well, better to find that out sooner rather than later. The only real danger was if they hired the fake me and then I had to pretend to be the fake me for the rest of my life.
I expected to last like 15 minutes before someone said, “Oh wow we’ve made a terrible mistake, please leave.” Instead, people were nice. Some of them were excited—if nothing else, they had never seen someone get on stage and drop their drawers before. A couple people were like “Between you and me, I like what you’re doing, but I don’t know about the others.” I never encountered those supposed others, but maybe they waited until I left their office to bust out laughing. Some were skeptical, but in a “I’m taking you seriously” kind of way. And some people didn’t react at all, as if I was being so weird that they couldn’t even perceive it, like I was up there slapping my bare buttcheeks and they were like, “Um yeah I had a question about slide four?”
And then...I got the job!!
Haha no just kidding I didn’t get the job, are you nuts?
But fortunately it didn’t hurt at all, it actually felt great!
Haha no it sucked! Of course it hurts when people hang out with you all day and then decide they don’t ever want to hang out with you again. I might be a wacko, but I’m not a psychopath.
And yet the hurt was only skin deep. It felt like I got a paper cut when I expected to be sawn in half. I guess I imagined that being myself would make me feel saintly but grim, like one of those martyrs in a Renaissance painting who has a halo around his head and a sword through his belly. Kinda like this:
Instead, it made me feel like this:
This felt like a triumph, but it also felt confusing and silly. Why was it so hard to be myself in front of my potential employer? Why was I even considering trying to bamboozle my way into a job that I wasn’t even sure I wanted?
Maybe it’s because, historically, doing your own thing and incurring the disapproval of others has been dangerous and stupid. Bucking the trend should make you feel crazy, because it often is crazy. Humans survived the ice age, the Black Plague, two world wars, and the Tide Pod Challenge. 99% of all species that ever lived are now extinct, but we’re still standing. Clearly we’re doing something right, and so it behooves you to look around and do exactly what everybody else is doing, even when it feels wrong. That’s how we’ve made it this far, and you’re unlikely to do better by deriving all your decisions from first principles.
Here’s an example1. Cassava is tasty, nutritious, and easy to grow, but it is also unfortunately full of cyanide. Thousands of years ago, humans developed a lengthy process that renders the root edible, which involves scraping, grating, washing, boiling, waiting, and baking. It’s a huge pain in the ass and it takes several days, but if you skip any of the steps, you get poisoned and maybe die. Of course, the generations of humans who came up with these techniques had no idea why they worked, and once they perfected the process, the subsequent generations would also have no idea why they were necessary.
The knowledge of cassava processing only survived—and indeed, humans themselves only survived—because of conformity, tradition, and superstition. The mavericks who were like, “You guys are dummies, I’m gonna eat my cassava right away” all perished from the Earth. The people who passed their genes down to us were the ones who were like, “Yes of course I’ll do a bunch of pointless and annoying tasks just because the elders say so and it’s what we’ve always done.” We are the sons and daughters of sheeple.
No wonder it’s hard to be yourself, even long after we’ve gotten all the cyanide out of our cassava. Looking normal, pleasing others, squelching your internal sense of self and surrendering to the standards of your society—that all feels like a matter of life and death because, until recently, it was.2
It’s funny that none of this dawned on me even though I’ve been staring at this fact for years. My home field, social psychology, got famous for demonstrating all the ways that people will conform to the norm: the Milgram shock experiments, the Asch line studies, the smoke-filled room study, the (now discredited) Stanford Prison Experiment, etc. Come to our classes and we’ll show you a hilarious Candid Camera clip where they send stooges to face the wrong way in the elevator, and the hapless civilians around them cave to peer pressure and end up facing the same way, too3:
“Ha ha! Look at what those dopes will do!” I thought to myself, profoundly missing the point. I didn’t realize at the time that demonstrating people’s willingness to conform is like demonstrating their willingness to doggy paddle when they get tossed in a lake—that’s what a lifesaving instinct looks like.
I guess what I’m saying is: everybody tells you to be yourself, but nobody tells you it’ll make you feel insane.
Maybe there are some lucky folks out there who are living Lowest Common Denominators, whose desires just magically line up with everything that is popular and socially acceptable, who would be happy living a life that could be approved by committee. But almost everyone is at least a little bit weird, and most people are very weird. If you’ve got even an ounce of strange inside you, at some point the right decision for you is not going to be the sensible one. You’re going to have to do something inadvisable, something alienating and illegible, something that makes your friends snicker and your mom complain. There will be a decision tucked behind glass that’s marked “ARE YOU SURE YOU WANT TO DO THIS?”, and you’ll have to shatter it with your elbow and reach through.
You shouldn’t do that thing on a whim—the snickers and complaints are often right, and evolution put the glass there for a reason. Nor should you do it in the hopes that it’ll make your life easier, because on the whole, it won’t. Sticking to the paved path puts you beyond reproach. If you, say, hit a pot hole and go flipping into oncoming traffic, well, hey, that’s not on you. Go off-road, though, and you’re vulnerable to criticism. If you crash, it’s your fault. Why couldn’t you just drive on the street like a normal person?
When you make that crazy choice, things get easier in exactly one way: you don’t have to lie anymore. You can stop doing an impression of a more palatable person who was born without any inconvenient desires. Whatever you fear will happen when you drop the act, some of it won’t ultimately happen, but some will. And it’ll hurt. But for me, anyway, it didn’t hurt in the stupid, meaningless way that I was used to. It hurt in a different way, like “ow!...that’s all you got?” It felt crazy until I did it, and then it felt crazy to have waited so long.
Which is to say: once you moon the audience, you don’t have to worry anymore, because the worst is already over. All that’s left is to live the rest of your life in a world where people have seen your bare butt. And I can tell you, it’s better than you might expect.
This example comes from Joe Henrich’s The Secret of Our Success, pp. 97–99.
Our instinct to conform is so strong that it can even reprogram our taste buds. Once, at a party, some of my friends handed me a glass of wine and were like “Here, try this!” and the way they said it made me think there was something wrong with the wine, like it had gone off or something, and they wanted me to see how bad it was. (These are the kind of people I hang out with.) So I took a sip and went, “Oh, that’s rancid!”
My friends got quiet. One of them looked especially horrified. “That’s my favorite wine,” she said. She had brought it to share with everybody.
I took another sip and tried to backpedal, “Actually it’s not so bad! It’s good, even!” I said, looking insane. “I’m bad at tasting things,” I added, helplessly. (I didn’t yet have documented scientific evidence that I have a poor sense of smell and taste.) I wasn’t lying—it really had tasted bad at first, and as soon as I knew it was supposed to taste good, it did.
Whoever uploaded this clip called it “Groupthink” but that’s…a different thing
2025-03-11 22:13:58
Madeline Warner was killed by an excess of alarms. Her death certificate says “cardiac arrest,” but the real cause was the beeps, boops, whines, and buzzes of her hospital ward, which drowned out the “low battery” alert on Warner’s cardiac monitor. When the 77-year-old Warner later had a heart attack, no alarm sounded, and no one noticed.
2025-03-04 21:39:48
This is the quarterly links and updates post, a selection of things I’ve been reading and doing for the past few months.
Everybody knows about the Milgram shock experiments, but I had never heard of the Hofling hospital study, which basically did Milgram in the field.
A guy introducing himself as “Dr. Smith” would call into the ward and ask a nurse to administer a mega dose of “Astroten” to a patient. The nurses had never met Dr. Smith (because he was made up), they had never dispensed Astroten (because it was also made up), they weren’t supposed to take orders over the phone, and the bottle itself clearly indicated that Dr. Smith’s requested dose was double the daily limit. When this situation was described to a sample of nurses in a separate study, almost all of them insisted that they would refuse the order. And yet 21/22 nurses who were subjected to the real situation made it all the way to the patient’s room with their overdose in tow before the research team stopped them.
In a sort-of replication 11 years later, 16/18 nurses protested in some way when a “little-known” doctor called and asked them to administer way too much Valium to a patient. Still, most of the nurses got pretty close to giving it, and nearly half of them said they would have continued if the doctor insisted.
People have tried to debunk the Milgram studies, but those debunkings have failed because they miss the whole point of experiments like these. It’s not that people are blindly obedient. It’s that we live in a world where the folks around us are usually acting normal and doing reasonable things, and it would be impolite, weird, and annoying to second-guess them all the time, so we generally don’t.
People always think the end times are around the corner (see my recent posts: 1, 2), but to be fair, they sometimes have good reasons. For instance, around the year 1500, there were terrifying reports of “monstrosities” being born—abominable creatures never seen before on Earth, whose arrival could only herald the coming apocalypse. The granddaddy of these was the “Papal Ass,” a Frankenstein of different beasts that supposedly washed up on the Tiber River in 1496.
Word on the street was that the Papal Ass was sent by God to demonstrate the depravity of Pope Alexander VI, AKA Rodrigo de Borgia. Pope Alex had indeed done some non-popely things, like having a bunch of illegitimate children, maybe killing off some of his political enemies, and making his 17-year-old son Cesare a cardinal. He is perhaps best known for appearing in Assassin’s Creed II, where he zaps the player with a magic staff.
On the other hand, sometimes weird births could be good omens. For instance, in 1595, a child was born in Silesia (now mostly Poland) with a golden tooth. A certain Dr. Horst investigated and made this report:
at the birth of the child the sun was in conjunction with Saturn, at the sign Aries. The event, therefore, though supernatural, was by no means alarming. The golden tooth was the precursor of a golden age, in which the emperor would drive the Turks from Christendom, and lay the foundations of an empire that would last for a thousand years.
On Christmas Eve 2020, the New York State Representative Brian Kolb wrote an op-ed warning people not to drive drunk over the holidays. On New Year’s Eve 2020, he got drunk and drove his state-issued SUV into a ditch.
Many people blame scientific stagnation on ideas getting harder to find. A new paper offers another explanation: maybe as more people have gone into research, the average quality of researchers has fallen. This makes some sense—if your company has 100 researchers and suddenly decides to hire 1,000 more, eventually you’ll have to lower your standards.
That said, I don’t think the evidence they present is all that convincing, because they assume they can measure researcher ability by looking at how much people make when they leave their research jobs. As in, if people are leaving their postdoc positions to start billion-dollar companies, they must be pretty smart. But if they’re leaving to go dig ditches, their research probably sucked. This is obviously pretty fraught—it assumes that people always choose the job that maximizes their income, and that the highest earners outside the research sector would be the top performers inside it. It also focuses on the average quality of researchers, when in fact science is a strong-link problem, so we should be asking whether the best has gotten worse.
Before I started a blog, I thought I would write a post like “How to like more things,” and I never did. Now I’m glad, because someone recently wrote a better version: How to Like Everything More
Smallpox is the only human disease that’s ever been eradicated. It used to kill an estimated 5 million people every year; now it kills zero. That’s in large part thanks to the smallpox vaccine developed by Edward Jenner. And yet Philosophical Transactions—one of the most prestigious scientific outlets of its day—rejected Jenner’s paper describing his results. And here’s how people felt about his vaccine:
The other day I was like, “wait, who lost money in Bernie Madoff’s Ponzi scheme?” I did not expect the list of victims to include...the Nobel Prize-winning author of Night, Elie Wiesel. He apparently entrusted both his life savings and his foundation’s finances to Madoff, who then made them disappear. Here’s the punishment Wiesel recommended for his erstwhile friend:
I would like him to be in a solitary cell with a screen, and on that screen ... every day and every night there should be pictures of his victims, one after the other after the other, always saying, “Look, look what you have done.” ... He should not be able to avoid those faces, for years to come.
He added, “This is only a minimum punishment.”
On a single day in London, 1902, “fourteen people slipped on banana skins on Fleet Street and the Strand alone, and were injured enough to need treatment.”
I love this post from Uri Bram at Atoms vs. Bits. Hoeing a field is no fun, and so we’d love to know how much hoeing is necessary for maximizing crop yield, and do exactly that much. But nobody knew the optimal amount of hoeing until Jethro Tull (the inventor, not the band) designed the world’s simplest experiment. Just plant some seeds all in a row, hoe a little bit around the first seed, hoe a little bit more around the second one, and so on, like this:
Then watch ‘em grow. Wherever their growth plateaus, that’s how much you need to hoe. Someone could have done this a thousand years earlier, and if they had, we would have way more turnips by now.
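Tull’s design is simple enough to sketch in a few lines of code. This is a toy simulation, not his actual data—the yield curve and all the numbers are invented—but it shows the logic: give each plant a slightly bigger dose of hoeing, then find the first dose where more hoeing stops helping.

```python
# Toy sketch of Tull's dose-response design (all numbers invented):
# each plant gets a little more hoeing than the last, and we look
# for the point where extra hoeing stops improving yield.

def simulate_yield(hoeing_hours):
    """Hypothetical yield curve: rises with hoeing, plateaus at 4 hours."""
    return min(hoeing_hours, 4.0) * 10  # bushels, say

def find_plateau(doses, yields, tolerance=0.5):
    """Return the smallest dose after which yield stops meaningfully rising."""
    for i in range(len(doses) - 1):
        if yields[i + 1] - yields[i] < tolerance:
            return doses[i]
    return doses[-1]

doses = [d * 0.5 for d in range(12)]         # 0, 0.5, ..., 5.5 hours of hoeing
yields = [simulate_yield(d) for d in doses]  # one plant per dose
print(find_plateau(doses, yields))           # -> 4.0
```

The same one-plant-per-dose trick works for any monotone dose-response question, which is why it’s such a scandal nobody ran it for a thousand years.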
One turnip I’m glad we have: a Substack that posts rarely but always posts good. Their most recent: Mechanisms Too Simple for Humans to Design
Another sporadic-but-never-miss Substack is Desystemize. I thought I was tired of hearing about AI, but actually I was tired of hearing the same takes over and over. His recent post (If You’re So Smart, Why Can’t You Die?) is very good and comes with some useful tools for thinking that I hadn’t encountered before.
The sociologist Claude S. Fischer casts doubt on the idea of a “loneliness epidemic”:
We can provisionally conclude that, over the last half-century or more, friends have remained roughly constant, probably even expanding their roles in Americans’ lives. Yet, as we saw, that long history has usually been accompanied by repeated alarms about the loss of friendship.
Speaking of things that were supposed to be running out, the journalist Ed Conway intended to write a series about “the world’s lost minerals.” He now reports that he failed: “So far, we haven’t really, meaningfully run out of, well, pretty much anything.”
Apparently, after the film adaptation of Lolita came out in 1962, a few hundred parents decided to name their daughters “Lolita”. I wonder if they...watched the movie.
Speaking of names: some words that are secretly named after people:
PageRank (Larry Page)
Taco Bell (Glen Bell)
shrapnel (Henry Shrapnel)
I would add boycott (Charles Boycott)
The French scientist Pierre Borel was one of the first people to argue that life exists on other planets. In A New Treatise Proving a Multiplicity of Worlds, published in 1657, he suggests that aliens are real and they’ve come to Earth—we call them “birds”. Specifically, birds of paradise:
This Bird is so beautiful, that no one in the Earth is to be compared to it; its figure is of so rare a form, and so extraordinary, that never the like hath been found […] no body ever saw its eggs, nor its nest; and it’s asserted, that it lives by the Air; this Bird never being found upon Earth, is it not consonant to Reason, that it may come from some other Starre, where it lives and breeds
When people are like “psychology is simply too complicated, we’ll never understand it much better than we do now,” I think of Dr. Borel, who was like “we’ve never seen this bird’s eggs, so we never will, so it must be from another planet.”
When I wrote about the fraud scandals surrounding Dan Ariely and Francesca Gino, some folks wondered how anyone could get away with fraud for very long. Doesn’t anyone notice? One answer: yes, people notice all the time, but nobody’s willing to speak up about it. A graduate student named Zoé Zaini originally noticed inconsistencies in Gino’s papers, but according to Zaini’s retrospective, her dissertation committee told her to bury her doubts instead of airing them:
After the defense, two members of the committee made it clear they would not sign off on my dissertation until I removed all traces of my criticism of [Gino’s paper] [...] one committee member implied that a criticism is fundamentally incompatible with the professional norms of academic research. She wrote that “academic research is like a conversation at a cocktail party”, and that my criticism was akin to me “storming in and shouting ‘you suck’”
This is a classic case where we were missing the second bravest person.
Speaking of which: there’s now a program “to promote investigations into research fraud and other serious misconduct.” If you’re in Zaini’s position, consider reaching out to them.
Most histories of the replication crisis (including mine) begin in 2011 with Daryl Bem publishing a paper in the Journal of Personality and Social Psychology “proving” that ESP is real. I didn’t realize that Bem was old news by then: in 1974, Nature published a paper confirming that the magician Uri Geller could read minds and see through walls.
The book was banned by the Catholic church in 1739.
Lastly, my friends Slime Mold Time Mold have a new series out called The Mind in the Wheel, which proposes a cybernetic paradigm for psychology. I’m extremely excited about this project and I’ll have a lot more to say about it in the future.
News ‘n’ updates from the burgeoning science-on-the-internet movement. Original post here; email me if you have an update.
Alex Chernavsky is doing lots of self-experiments and is looking for collaborators. Some of his recent studies include: the effects of creatine on reaction time, and potassium for weight loss. I love seeing work like this; if you join forces with Alex (or do similar stuff on your own), please send it to me and I’ll link to it here.
Another replication of putting your toaster in the dishwasher:
Someone is pulling together people interested in putting memetics to good use—reach out to him if you want to collaborate.
An official Experimental History-recommended Substack publishes a terrific Best of Science Blogging series. Now they’re sharing their revenue with everyone they publish.
Asimov Press has a list of stories they’d like to publish. That includes a piece on “Alternatives to Peer Review,” which I of course would love to read myself. Asimov’s editors are lovely, so if you’re itching to do some science writing, don’t sleep on this.
A little out of date, but you can also pitch Works in Progress.
I was on Derek Thompson’s Plain English podcast talking about the lack of progress in psychology.
Chris Turner is a very funny standup comedian and also my friend; I went on his Godforsaken podcast to talk about the time we got chained up in a Hungarian basement as Brexit happened.
I contributed a piece to THE LOOP, which I would describe as a cross between Teen Vogue, a scientific journal, and a fever dream of the internet.
And finally, from the vault: in 2022, I started wondering, “is every popular movie really a remake these days?” I analyzed the data and discovered: yes, but it’s not only movies. It’s everything.
2025-02-18 23:08:00
It’s a grim time for science in America. The National Science Foundation might be forced to fire half its staff, grants are being frozen and reviewed for ideological purity, and universities may see their cut of those grants reduced by 40%.
We were bound to end up here sooner or later. Science funding has been riding an atomic shockwave since 1945, buoyed by the bomb and the Cold War and the conviction that we could vanquish our enemies if we just kept cutting checks to the geeks. Now the specter of Soviet submarines isn’t so scary anymore, and our most feared and hated enemies happen to share the same country with us—we’re out of our Red Dawn era and into our Civil War era. And so people are wondering: why are we spending billions on basic research, again?
There’s a good answer to that question, but nobody’s giving it. We all assumed that science was self-evidently worthwhile, thus allowing our arguments to atrophy and leaving us with two half-assed defenses. On the one hand, we have romantics who think science is important because “something something the beauty of the universe triumph of the human spirit look at this picture of a black hole!” And on the other hand we’ve got people who think science is like having a Geek Squad that you can call upon to solve your problems (“My tummy hurts! Scientists, fix it!!”). These points aren’t completely wrong, but they’re achingly incomplete—why should we pay for people to stand agog at the wonder of the universe, and why would we let them do anything that isn’t immediately related to some pressing problem?
And then there’s an alarmingly large contingent who thinks there isn’t any argument to make. In their heart of hearts, they think the NSF and the NIH are, in fact, charitable organizations. In this view, science funding is just welfare for eggheads, and scientists are a bunch of Dickensian beggars going, “Please sir, can you spare a few pence so I can run my computational models?” Witness, for instance, the Johns Hopkins cardiologist who thinks that all NIH-funded scientists owe an annual thank-you note to the American public.1
Even the folks who have a soft spot for science often think of it as a nice-to-have—you know, let’s first make sure we build enough battleships, mail enough checks to the right people, etc., and then if there’s a little left over we can toss a few bones to the nerds. If we’re all really good, we can have some science, as a treat.
If this is all we got, it’s no wonder that people wanna smash the science piggybank and distribute the pennies to their pet projects instead. The case for science should be a slam dunk; we’ve turned it into a self-own. So lemme take a crack at it.
There is only one way that we improve our lot over time: we get more knowledge. That’s it. Everything that has ever made our lives better has come from collecting, cultivating, and capitalizing on information.
When I say “knowledge,” I mean everything we’ve figured out, from the piddling (“cotton is more comfortable than burlap”) to the profound (“energy cannot be created or destroyed”). I mean the things you have to learn in school (“kidneys filter your blood”) and the things that now go without saying (“it’s better if the king isn’t allowed to kill whoever he wants”). Some of this knowledge looks science-y, like when engineers use the theory of relativity to make GPS work, and some of it looks folksy, like when a lady on Long Island invents a little plastic table that prevents pizza from sticking to the top of the box.
No matter where it comes from, it’s all part of the same great quest to de-stupify ourselves. If you’re doing anything remotely good—basically, unless you are shaking down local businesses for protection money or blowing up UNESCO World Heritage sites—you are either enabling, conducting, or cashing in on the search for knowledge, and therefore you’re part of the project.
We don’t talk about the history of our species this way. In school, I learned things like “the Egyptians made hieroglyphics” and “the Romans did aqueducts” and “the Pilgrims wore hats,” but nobody mentioned that if I had been born an ancient Egyptian, a Roman, or a Pilgrim, there’s a 50/50 chance I would have died before I turned 15. I might have starved or frozen, or maybe I would have been executed for believing in the wrong god, or maybe I’d be done in by microscopic invaders that I didn’t even know existed. (Making it to your quinceañera was once a much greater reason for celebration, not that anyone but the king could afford streamers and cake.) And if I wasn’t dead, I would be working—farming, shepherding, child soldiering, etc.—not sitting in social studies making dioramas.
Nor did anyone explain why I got to have a different life than my ancestors did: I was born into a society where we know more. We know how to grow enough food for everybody, how to keep the cold out, how to fend off the microscopic invaders, and how to get along—more or less—with people who worship different gods.2 None of this happened by chance, and none of it came for free. In fact, for most of our history, our stock of knowledge didn’t increase at all, and the people who dared to add to it were often ridiculed and sometimes killed.
Despite all that, we have made a lot of progress in the millennia-long project of diminishing our ignorance, and that’s why I get to eat focaccia and play Call of Duty, while my ancestors had to eat moldy bread and play a game called “hide from the marauding Visigoths”. But the project isn’t finished yet. It’s not even close. There’s still so much suffering, and we could escape it if only we knew how.
That’s why we fund science. We all pitch in to hunt down the knowledge that can’t be found any other way. We don’t seek the knowledge that will turn us a profit tomorrow—that’s what businesses are for—but the knowledge that will support a permanently better life. We do science that is speculative and strange because that’s where the breakthroughs will come from, the frontiers of knowledge where our intuitions stop working, where predictions fail, and where the things that seem sensible are unlikely to be important. We do this with public funding because it produces public goods. The things we discover are too important to be owned; they must be shared.
So yes, it’s beautiful, but that’s not why we do it. Yes, it’s practical, but not right away. And no, it’s not charity. You don’t “save” money by skimping on science, just like you don’t save money by sending second graders to the coal mines instead of the classroom. You could think of it as investment because it does pay off in the long run, but even that undersells it. Pooling our resources to discover new truths about the universe so that we can all have better lives, to strike back against disease, suffering, poverty, and violence, to reduce ignorance for the benefit of all—that’s literally the most badass thing we do.
Our past wasn’t inevitable and our future isn’t guaranteed. We have to choose to keep increasing our knowledge. That choice might seem like an easy one, but we have to contend with three tempting but false arguments for choosing to do the opposite, three Sirens of Entropy trying to seduce us into running the ship of civilization aground.
The first is rejection. Knowledge comes with tradeoffs—the chemistry that cures can also poison, the physics that builds space rockets can also make cruise missiles, and so on. Plus, the past always looks idyllic—as long as you don’t look too closely—and so it always seems like history has just recently gone wrong. Anyone could be lulled into believing that these tradeoffs cannot be managed or improved and must be avoided entirely, that the solution to our problems is less knowledge, rather than more. This view does require you to ignore or deny things like the near disappearance of extreme poverty, the end of child labor, the historically low rates of violent death, etc., but there will always be people willing to rise to that challenge, because the appearance of one problem will always be more salient than the disappearance of another.
The second is complacency. When the lines keep going up and to the right, it’s easy to assume that’s just what they do, and to forget that every increase ultimately comes from our expanding stock of knowledge. You can slip into thinking that our living standards rise of their own accord, that death and disease recede because we want them to, that the GDP fairy puts 4% growth under our pillows to reward us for being such good boys and girls. When you think that progress is a perpetual motion machine, you’ll see no need to top up its gas tank.
And the third is the pie apocalypse. Every time we grow the pie that we all share, we also have to figure out how to split it fairly. We get pissed off when the products of progress are disproportionately captured by the rich, and well we should—it’s like playing Hungry Hungry Hippos against someone who gets to mash two buttons instead of one. But it’s easy to focus on the pie-splitting problem and forget the pie-growing problem entirely. We can thus descend into an all-out war for pie, where the only way to get a bigger slice is to steal someone else’s. Meanwhile, the pie shrinks and shrinks, and we end up fighting over crumbs.3
So it’s a good idea to get smarter, and we can all contribute to that mission. But that doesn’t mean we’re doing a good job right now. People are right to be mad about the state of science funding in America: the fraud, the waste, the low ambitions, the dogmatism, politicking, and rent-seeking. Maybe this chaos is at least a chance to sunset some of the most outrageous parts of the system, so long as we’re committed to figuring out the best way to spend our science dollars, rather than throttling or lavishing funds the way a king dispenses dukedoms and decapitations. (“Sussex for my real friends, no necks for my sham friends.”)
Fortunately, this ain’t hard to do. There’s so much low-hanging fruit that we’re tripping over it. Here are three of the easiest, most obvious moves that we could make right away, and that we should have done half a century ago.
For-profit publishers make their money by privatizing public goods. The government pays the scientists to do the research, then publishers paywall it, and finally the government pays again so the scientists can read what they wrote. This gets obfuscated because publishers don’t charge the government directly. Instead, universities fork over millions for journal access, then charge it back to the government as part of the much-hated “indirect costs”. The taxpayers foot the bill for all of this, but they don’t even get to read the studies themselves unless they have a .edu email address.4
Everybody knows this is ridiculous. If this business didn’t already exist and you tried to pitch it on Shark Tank, Mark Cuban would laugh in your face. It only worked out this way because some schemers realized that academics are a captive market, so they bought up all the journals in the 70s.5 Individual scientists have tried to upend the system through self-immolation—that is, by refusing to publish in or review for journals owned by Elsevier and the like—but it hasn’t worked, because it’s hard to convince everyone to immolate at the same time.
This is the kind of coordination problem that only the government can solve. We could score such an easy win by paying for the minor costs of publishing directly (hosting, copyediting, the pittances occasionally paid to editors, etc.), rather than paying middlemen to do them at a ~40% markup. I’ve got plenty of issues with scientific journals, but if we’re going to have them, we shouldn’t also set money on fire. So if you want to smash things, smash this one!
By one estimate, principal investigators spend 44% of their time applying for grants. It takes an average of 116 hours to fill out a single NSF proposal. Most applications get rejected, and so most of that time is wasted. These costs don’t appear in the budget because nobody can say “if you pay me, I will spend half of my time trying to make sure that you will pay me again in the future.” But that’s what they’ll do, because failure to secure a grant is death for an academic. So whether federal agencies realize it or not, by making their applications so laborious and competitive, they are paying people to spend almost half of their time trying to get paid.
And that doesn’t include the cost of grant panels who have to sift through those applications, the bureaucracy that has to process the paperwork, the mandatory Institutional Review Boards that take six months to tell you whether it’s okay to ask someone if they own a lawnmower, etc. The government requires all of these things, and since the government is funding the whole enterprise, it also pays for them.
You might think we adopted all of these policies because we had evidence suggesting they would make us better off, but we didn’t. We adopted them because they sounded good at the time, and why would you check something that sounds good?
Whenever people complain that a lot of government-funded science ends up un-cited and unused, or that it’s hideously ideological6 or pointlessly incremental, I gotta laugh because those are the projects we picked. We got exactly what we asked for; it’s like ordering something from Amazon and then being angry when it arrives. The problem isn’t the product—it’s the picking.
If we want better outcomes, we should pick different projects. The whole point of funding science is to discover things that wouldn’t get discovered anywhere else. Pharmaceutical companies can make plenty of profit turning molecules into medicines, but they can’t go looking into the mouths of gila monsters, asking “Any good drugs in here?” That’s how we got GLP-1 agonists, which are now used by millions of people.
So public funding should go to projects that are foundational, speculative, long-term, useful but unsexy, or big if true. Some of those projects can be identified by committee, but many can’t, and so we should pick them some other way: lotteries, golden tickets, trust windfalls, fast grants, bounties, prizes, retroactive public goods funding7, “people not projects,” and moonshots that are actually moonshots, to name just a few. We should be placing some of our bets outside the scientific consensus so that we don’t waste billions on one idea that turns out to be wrong. And we should really try to figure out how one guy funded almost every single person who won a Nobel Prize for molecular biology between 1954 and 1965...19 years before they won. It would be cool to do that again!
Many of these methods cost less than the standard “solicit one million pages of applications” procedure. If we tracked them long term—not “did this get into a good journal”, but “did this end up mattering”—we could figure out which ones work better for what ends, and we could get more science for less money. It is a national embarrassment that the agencies who fund experiments do basically zero experiments themselves. Would you trust a pulmonologist who smokes?
I understand why people think we can balance the budget by shrinking science, and I understand why we spend our limited science funding in such a cowardly way. We want everything to be accountable, and it’s hard to tell the difference between actual accountability and the mere appearance of it. Nobody gets blamed for “saving” money, even when it costs more in the long term. And nobody gets blamed for doing things by the book, even when the book turns out to be fiction.
But paying for science is different from paying for other things. When you pay for a bridge, you get a bridge. When you pay for Medicare, you get Medicare. When you pay for an F-35 fighter jet, you...pay an extra $500 billion, and then you get an F-35 fighter jet. In the short run, though, you can’t know what your science dollars are going to get you. That’s the whole point of doing the science!
The more we fight against that fact, the more we demand legibility in the form of applications and metrics, the more we try to squash and slash and cut just for the hell of it, the less we get the thing we’re actually trying to buy. There’s a sort of Heisenberg Uncertainty Principle at work here: you can’t spend your money wisely and make sure you’re spending your money wisely at the same time. It would be cool to only run experiments that were guaranteed to both work and teach us something, just like it would be cool to only buy stocks that are guaranteed to increase in value. Until that becomes possible, we’re gonna have to take some risks.
In the long run, however, you know exactly what you’re going to get. The only thing that lifts our boats is the rising tide of knowledge. One of the basic functions of the government is to help make that happen. It requires some patience and some money. In return, it gives us literally everything.
I’m fascinated by this logic. The government pays a lot of people to do a lot of things, so why are researchers uniquely indebted to the American public? Should your local police officer also send you a thank-you note? Should the Secretary of State? Do we all deserve a thoughtful box of mixed nuts from the guy who trims the bushes in front of the DMV?
It’s fitting that in Cixin Liu’s Three Body Problem series, hostile aliens try to ensure our defeat by halting our scientific progress.
This same tension between growing and splitting, by the way, is one of the core problems of negotiation.
The Biden administration tried to fix this by requiring government-funded research to be open-access, but they got outfoxed by the publishers, who started charging authors for publishing rather than charging readers for reading. The last paper I published in one of these journals made us pay $12,000 to make it “open-access”. Publishing my papers on Substack, on the other hand, costs me $0.
As one publisher recalls, “When you’re building a journal, you spend time getting good editorial boards, you treat them well, you give them dinners. Then you market the thing and your salespeople go out there to sell subscriptions, which is slow and tough, and you try to make the journal as good as possible. That’s what happened at Pergamon. And then we buy it and we stop doing all that stuff and then the cash just pours out and you wouldn’t believe how wonderful it is.”
Although we’re probably not funding all that much hideously ideological science, see this analysis and clarification
I’ve only seen very online/crypto-y people try to do this, but there’s no reason we couldn’t use it for regular stuff