Blog of Noah Smith

Europe's crusade against air conditioning is insane

2025-08-23 14:09:13

Photo by Douglas Paul Perkins via Wikimedia Commons

Many years ago, I was watching a nature show. It was about some hunter-gatherers on some Pacific island. The film crew went right up and talked to one of the hunter-gatherers about his life — hunting, gathering, finding and killing witches among his fellow tribesmen, and so on. But as they talked, I realized that there must be a giant video camera right in the face of this tribesman. And he wasn’t even reacting to it. What was this strange, unnaturally shaped object, made of strange unknown materials, and potentially possessing magical powers? Didn’t he wonder? And didn’t he ask himself if he could get something like it, and use it for whatever these strange foreigners were using it to do?

I often think about the example of the tribesman and the video camera. It’s a small version of a story that happens again and again, on a far grander scale, determining the fate of entire nations and geopolitical systems of power: absorption of foreign technology. Most of the things you use on a day-to-day basis were not invented in the country in which you live (even if you live in America). They were invented all over the world, and one crucial reason you have access to them is that your society deemed it fitting to allow those technologies into the country.

Adopting foreign technologies sounds like a no-brainer, but there are lots of risks involved. Hierarchies of power and status can be disrupted, creating political chaos. Existing economic relationships can shift, creating unexpected winners and losers. But perhaps most frighteningly, foreign technology can change a country’s traditional culture.

One Pacific island civilization that was determined to absorb foreign technology without letting it change their culture was Japan. When the “black ships” from the West arrived in the 1850s and demonstrated how helpless Japan was in the face of foreign powers, the country’s leadership (after a brief civil war) decided that their only choice was to absorb foreign technologies and institutions. But they wanted to preserve Japan’s traditional culture as well. They thus came up with the concept of “wakon yosai” (和魂洋才), which translates roughly as “Japanese soul, Western technology”. Over the course of the next century and a half, Japan intentionally strove to preserve elements of its unique culture even as it reshaped its society around new gadgets and production processes.

Travel to Japan today, and I guarantee that unless you are staying in a very backwoods rural place, the room where you stay will have an air conditioner. It will almost always be a “mini split”, or wall unit, looking much like the image at the top of this post. It will be quiet, but powerful enough to keep your room cool even in the increasingly hot summers that Japan now suffers due to climate change. This is a technology never available in Japan’s premodern days, and yet it has been near-universally embraced with no apparent degradation to the country’s traditional culture or national pride.

Europe is different. Data sources differ, but nobody puts AC usage in Europe (or the UK) at more than around 20%. This technology, which almost all Japanese people enjoy, is one that most Europeans do without.

You might think Europe is simply too far north to need AC. But latitude is no longer the defense against heat that it used to be, because climate change is stalking the region:

Source: Euronews

With this rise in temperature — and the aging of the European population — has come a rise in preventable death. Estimates of heat-related mortality vary, but the most commonly cited number is 175,000 annually across the entire region. Given that Europe has a population of about 745 million, this is a death rate of about 23.5 per 100,000 people per year. For comparison, the U.S. death rate from firearms is about 13.7 per 100,000.

So the death rate from heat in Europe is almost twice the death rate from guns in America. If you think guns are an emergency in the U.S., you should think that heat in Europe is an even bigger emergency.

Most of this death is preventable. The technology that prevents it is air conditioning. Barreca et al. (2016) find that heat deaths in America declined by about 75% after 1960, and that “the diffusion of residential air conditioning explains essentially the entire decline in hot day–related fatalities”. Essentially, wherever AC gets rolled out, heat-related death plunges. Taking Barreca’s estimate and applying it to Europe suggests that as many as 100,000 European lives — 0.013% of the population — could be saved every year if the 80% of European households who don’t have AC were to get it.1
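
For the curious, here is the arithmetic behind those numbers as a quick Python sketch. The death toll, population, and Barreca et al.'s ~75% figure come from the sources above; the assumption that only the 80% of households currently without AC can benefit is my own illustrative simplification.

```python
# Back-of-envelope behind the numbers above.
heat_deaths = 175_000        # commonly cited annual heat-related deaths in Europe
population = 745_000_000     # approximate population of Europe
us_gun_rate = 13.7           # U.S. firearm death rate per 100,000

heat_rate = heat_deaths / population * 100_000
print(f"European heat death rate: {heat_rate:.1f} per 100,000")          # ~23.5
print(f"Ratio to U.S. gun death rate: {heat_rate / us_gun_rate:.1f}x")   # ~1.7x

# Barreca et al. (2016): AC diffusion explains a ~75% drop in hot-day deaths.
# Illustrative assumption: only the 80% of households without AC benefit.
lives_saved = heat_deaths * 0.75 * 0.80
print(f"Potential lives saved: ~{lives_saved:,.0f} per year")       # ~105,000
print(f"As a share of population: {lives_saved / population:.3%}")  # ~0.014%
```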

And yet Europe has not done this. The official reason — at least, where one is given — is that AC uses electricity, which contributes to climate change. For example, this is from a 2022 article in MIT Technology Review:

Climate change is making extreme heat the norm across more of the world, increasing the need for adaptation. But in the case of AC, some experts are concerned about how to balance that need with the harms the solutions can cause…

[M]any Europeans are hesitant to welcome air conditioners with open arms. “Seeing AC as a solution to heat waves and to climate change is of course a bit problematic because of the energy that’s being used,” says Daniel Osberghaus, an energy and climate economics researcher at the Leibniz Centre for European Economic Research in Germany.

Today, cooling devices like ACs account for about 10% of global electricity consumption—and since most of the world’s electricity still comes from fossil fuels, that’s a significant chunk of worldwide emissions. Because of their massive energy use, “they do get a bad reputation,” says Kevin Lane, an energy analyst at the IEA.

Many other stories also mention climate as a reason Europe resists AC. Green organizations like the World Resources Institute, which have a lot of influence in Europe, consistently recommend far less effective “passive cooling solutions” due to emissions concerns. And European regulations do block AC, by mandating that newly built buildings be carbon-neutral. (This is in addition, of course, to good old NIMBYism, which also blocks AC installation, especially in the UK.) Tyler Cowen writes:

European governments do a great deal to discourage air-conditioning, whether central AC or window units. You might need a hard-to-get permit to install an AC unit, and in Geneva you have to show a medical need for it. Or in many regions of Europe, the air conditioner might violate heritage preservation laws, or be illegal altogether. In Portofino, Italy, neighbors have been known to turn each other in for having illegal air-conditioning units. The fines can range up to €43,000, though most cases are settled out of court by a removal of the unit.

In fact, Andrew Hammel alleges that Germany has raised climate-based opposition to AC to the level of an ideological crusade. Here are some excerpts from his thread:

I believe attitudes toward air-conditioning are class markers in many European countries. Air-conditioning is seen as prototypically American, and that's important…

The urban haute bourgeoisie -- bureaucrats, public media executives, NGO employees, humanities grads, journalists, professors, lawyers, judges, etc. -- are the holdouts [in terms of installing AC]…

First of all, *every one* of these people has a story about visiting the USA and nearly freezing to death in an over air-conditioned store or office. Every. Damn. One…To these people, A/C is the ultimate American solution to a problem. Instead of accepting nature as it is, Americans use expensive, wasteful technology to artificially change the environment to fit their fat, lazy lifestyles. They insist on defying and conquering nature, not "cooperating" with her. And they don't care if they cook the planet while they do so…

[T]he European urban haute bourgeoisie turns it into a rigid ideological aversion to any form of air-conditioning…These people regard these decisions not just as their personal lifestyle choices, but rather as a *model for all of society*. They regard themselves as a revolutionary vanguard of advanced ecological consciousness which must aid the less enlightened to reduce their carbon footprints. And these people *run German society*…Urban planners and people who create construction codes in Germany are also brigadiers in the anti-A/C jihad…

Which is why it's pretty common on sweltering days to hear Germans complain about the "goddamn 'eco-this' 'organic-that' pencil pushers" who continue to force them to sweat for hours in overheated hospitals, classrooms, and offices.

This is immediately recognizable as the poisonous ideology of degrowth. Degrowth frames climate change as a problem of personal overconsumption and extravagance to be curbed by austere self-restraint and government policy, rather than as a technological problem to be overcome by installing green energy. This is foolish, of course — it leads to human suffering while not doing much to actually curb climate change. But it’s very popular in northern Europe.

The climate-based crusade against AC is a little infuriating, because it probably kills a lot more people than the reduced emissions save. Right now, Europe is responsible for only about 13% of global carbon emissions from fossil fuel use, meaning that the climate impact of installing AC all over the region is pretty minimal. Does anyone think that incredibly tiny margin of emissions reduction is really worth tens of thousands of lives a year?
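
To see why the margin is tiny, here is a deliberately rough scale check. Every parameter below is my own illustrative assumption (household count, energy use per unit, grid carbon intensity, global emissions total), not a figure from the article:

```python
# Deliberately rough scale check; every parameter is an assumption.
households_without_ac = 160_000_000  # ~200M European households, ~80% without AC
kwh_per_unit_year = 500              # assumed annual use of one efficient mini-split
kg_co2_per_kwh = 0.25                # assumed average European grid intensity
global_emissions_t = 37e9            # ~37 Gt CO2/yr from fossil fuels, worldwide

added_t = households_without_ac * kwh_per_unit_year * kg_co2_per_kwh / 1000
print(f"Added emissions: ~{added_t / 1e6:.0f} Mt CO2 per year")          # ~20 Mt
print(f"Share of global emissions: {added_t / global_emissions_t:.2%}")  # ~0.05%
```

Under these assumptions, universal European AC would add well under a tenth of a percent to global emissions.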

But from reading anecdotes like Hammel’s, I kind of suspect that there’s a second, deeper reason why Europe so far refuses to install AC: protection of traditional culture. German elites pooh-poohing AC as an unnecessary American extravagance suggests that some Europeans view the lack of AC as quintessentially European — a tradition by which Europeans can define their own uniqueness vis-à-vis the rest of the world.

Many articles about Europe’s strange reluctance to use AC hint at this attitude. For example, here’s CNN:

A big part of the reason [they don’t install AC] is many European countries historically had little need for cooling, especially in the north…“In Europe… we simply don’t have the tradition of air conditioning… because up to relatively recently, it hasn’t been a major need,” said Brian Motherway, head of the Office of Energy Efficiency and Inclusive Transitions at the International Energy Agency. [emphasis mine]

And here’s Euronews:

The rest of this story lies in history and culture…Southern Europe built its cities to cope with heat: thick walls, shaded windows, and street layouts designed to maximise airflow…That’s also why white paint dominates the picturesque skylines of Mediterranean places like Santorini in Greece or Vieste in Italy: The bright surfaces reflect sunlight and radiant heat, helping interiors stay cooler…In northern Europe, on the other hand, summers were once mild enough that cooling was rarely needed…Air conditioning, when it appeared in Europe, was seen as a luxury or even a health risk. Many Europeans still believe exposure to cold air can make you sick, and the stereotype persists that AC is for rich people.

And the WSJ reports that there are widespread superstitions about the dangers of this technology that most of the rest of the world uses every day:

In France, media outlets often warn that cooling a room to more than 15 degrees Fahrenheit below the outside temperature can cause something called “thermal shock,” resulting in nausea, loss of consciousness and even respiratory arrest. That would be news to Americans[.]

Even if climate is the official, intellectual reason Europe refuses life-saving AC, the idea that AC goes against Europe’s traditional culture is probably an important underlying motivator.

(This trend isn’t unique to Europe, of course. Americans may pride themselves on being more futuristic than the Europeans, but they still haven’t adopted Japanese washing toilets in significant numbers, and so their quality of life has suffered in small ways that, having never experienced the luxury of this foreign technology, they cannot even comprehend.)

Whatever the reason, the resistance to AC technology is making Europe a more impoverished civilization. It’s a major reason why Europe now feels shabbier and more hardscrabble than America, despite its beautiful old cities and low crime rates.

Europe needs to emulate societies that embrace the technological future. Japan is a good one, but an even better example might be Singapore. That city-state’s legendary founder, Lee Kuan Yew, believed that air conditioning was the crucial technology that allowed his country to become one of the richest on the planet:

“Air conditioning was a most important invention for us, perhaps one of the signal inventions of history. It changed the nature of civilization by making development possible in the tropics.

Without air conditioning you can work only in the cool early-morning hours or at dusk. The first thing I did upon becoming prime minister was to install air conditioners in buildings where the civil service worked. This was key to public efficiency.”

Europe would do well to listen to his advice.

Absorption of foreign technology simply makes the difference between a poor society and a rich one — between a technologically advanced society and a backward one. Most countries have their blind spots here, but Europe’s spasmodic rejection of air conditioning is far more costly than most.



1

This is actually a bit of an overestimate, since the European households who already have AC are probably ones who need it more.

Moderation is good for its own sake

2025-08-21 04:09:54

Photo by Pax Ahimsa Gethen via Wikimedia Commons

The blogosphere is back! In the early 2010s when I started writing, there were tons of interesting debates carried on between blogs — one person would write a post, and someone else would link to it and respond to it on their own blog, and they’d go back and forth for a few rounds. It reminded me a little of the “Republic of Letters”, the network of intellectual exchange that existed in Europe and the Americas in the 1600s and 1700s. In the mid to late 2010s, this epistolary exchange was superseded by Twitter fights; instead of slow, measured responses, intellectuals would “dunk on” each other with 280-character denunciations. Something important was lost.

But I’m happy to report that with the degradation of X and similar platforms, and the rise of Substack and other new blogging utilities, we’re starting to see some of the old debate style return. A good example is the recent debate over whether moderate Democrats are more electable. So in traditional blogosphere fashion, I’ll try to give a rundown of the debate itself, and then see if I can add anything new.

Run moderate candidates, or turn out the base?

This argument has been simmering for a long time (perhaps for centuries), but recently a new crop of analysts has started to bring new statistical methods to bear on the question. Since the late 2010s, political scientists have begun to borrow a concept called “wins above replacement” from sports. Sports WAR is fairly complicated, since it involves isolating a single player’s contribution from the contributions of the rest of the team. But elections are an individual sport; when political scientists talk about WAR, they really just mean doing some regressions on election results to figure out what individual characteristics cause candidates to outperform.1

A data analysis company called Split Ticket, headed by Lakshya Jain, is probably the most famous for using this methodology. They’ve consistently found that moderate candidates do better than strongly ideological ones on both the Democratic and the Republican sides. For example, here’s a chart where Split Ticket breaks out recent performance by Congressional caucus:

This seems pretty solid, but it’s subject to a subtle caveat. Split Ticket tries to control for every important feature of a Congressional district that might influence the outcome of elections. But there might be interactions between those characteristics and a candidate’s level of moderation. For example, Blue Dog Democrats might outperform in their own districts, but if you plunked them down in the districts that elected AOC or Ilhan Omar, they might do even worse than those progressives. It’s hard to say. But this is sort of a second-order problem; these results still suggest that Democrats should try running more moderate candidates.

Some political scientists have disagreed with this conclusion. For example, Bonica et al. (2025) use an alternative methodology to assess the benefits of moderation. They measure candidates’ ideology based on a combination of how they vote in Congress and which donors donate to them.2 They find that moderation is beneficial, but the benefit is smaller than what Split Ticket finds. Bonica and his co-authors argue that this means that base turnout is ultimately more important, and that politicians should therefore be unafraid to embrace strong ideology in order to fire up the base. Here are some excerpts from a thread Bonica wrote:

This conclusion has a major problem, and hopefully you can already see what it is. Suppose a researcher comes to you and tells you that people in hospitals are more likely to die of disease than people outside of hospitals. Should this make you avoid hospitals when you’re sick? No, of course not. People go to hospitals because they’re sick, so of course those people are more likely to die of disease!

Similarly, Bonica’s observation about Democrats’ national election performance interprets correlation as causation, when there’s an obvious reason not to interpret it that way. Obama ran a campaign in 2008 that was more lefty than usual, and won. But maybe he was able to run on a more lefty platform precisely because the electorate was in a more lefty mood than normal! After all, voters in 2008 were very mad about the Iraq War and the financial crisis. Obama harnessed that anger to win, and the anger also caused high turnout.

But in 2010, the electorate was in a far more conservative mood, and brought in the Tea Party Congress. Democrats ran to the center that year, and still lost big. But they might have run to the center precisely because the electorate was in a conservative mood! And had they not run to the center, their performance in a conservative-leaning year might have been even worse!

Kamala Harris tacked to the center in 2024 and still lost a fairly close contest. But suppose Kamala Harris had come out as a progressive fire-breather in 2024, the way she tried to in the 2020 primary. Suppose she had railed against systemic racism, called for an end to military aid to Israel, proposed cutting police budgets, and offered a full-throated defense of Biden’s tolerant immigration policies. Do we really believe that this strategy would have kept the election as close as it was, or even won it by turning out the base? That’s what Bonica and his co-authors would have us believe. And though we can’t prove them wrong, it doesn’t really pass the smell test, does it?

Of course the WAR analyses for congressional races also don’t separate correlation from causation. But because congressional analyses look at the characteristics of the candidate and not just of a particular election year, they’re much more robust to this kind of obvious reverse-causation. In other words, Bonica’s call for Dems to ignore the persuasive benefits of moderation and focus on turning out the base rests on extremely shaky assumptions. The benefits of moderation at the candidate level are a lot more well-established than the benefits of turning out the base with ideological appeals.

Anyway, the debate over moderation picked up again recently, when G. Elliott Morris released the results of his own WAR measure:

Strength In Numbers
Moderation is not a silver bullet
Yesterday, Strength In Numbers released our first-ever estimate of U.S. House representatives' "Wins Above Replacement" (WAR). The reception has been very positive, and I'm pleased with how much the community has engaged with our (open-source!) model. We have the best and most comprehensive publicly available measure of House candidate skill, and we plan to expand on this — including publishing historical Senate races and estimates…
Read more

Like Bonica et al., he found that moderation is correlated with electoral victory, but that the effect is a lot smaller than what Split Ticket finds — maybe a one percentage point bonus instead of four, relative to the median Democrat. His conclusion is that the benefits of moderation have been overblown.

Matt Yglesias took issue with that conclusion:

Slow Boring
Moderation is not overrated
G. Elliott Morris published a piece last week, which I originally saw with the headline “Moderation Is Overrated,” but Substack’s A/B testing function led it to settle on “Moderation Is Not a Silver Bullet…
Read more

First, he noted that even a small benefit of moderation is worth trying to take advantage of, and that Democrats should acknowledge this more. Morris fired back, arguing that when statistical uncertainty is this great, we shouldn’t put a lot of emphasis on something like moderation, whose effect is so easily swallowed by the noise.

Yglesias also argued that moderation isn’t well-captured by the typical measures, and that the important part of moderation is to take strong iconoclastic stances on a few hot-button issues, instead of voting with the party most of the time or taking money from lefty donors. Finally, Yglesias claims that the Democrats as a whole would benefit from moving to the center, so that individual moderate Democratic candidates could reap the benefits of their moderation without being saddled with their party’s extreme stances. These are interesting arguments, and they might be true, but they need to be validated with data.

Meanwhile, Jain and Bonica continue to debate the value of their respective measures of “wins above replacement”. Bonica and another political scientist named Jake Grumbach argue that Split Ticket essentially fudged their numbers, introducing secret “adjustments” to their regressions to put their thumb on the scale for moderates; Jain replies that the adjustments are tiny and don’t affect the main result. Here Bonica and Grumbach have a good point — Split Ticket should make the adjustments explicit — but it probably doesn’t drive the result.

Bonica and Grumbach also argue that Split Ticket didn’t control for enough variables in their construction of WAR. They use a machine-learning model to predict electoral victories with a high degree of accuracy, and conclude that because the residual of that model is small, moderation isn’t very important. But the machine-learning model is basically mystery-meat — the variables it’s using to predict candidate victories might be strongly correlated with moderation, in which case moderation is important.
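
The statistical point is easy to demonstrate with a toy simulation. In this sketch (all numbers invented), district fundamentals are correlated with candidate moderation; a model that sees only the fundamentals predicts outcomes with a high R² and a small residual, even though moderation has a genuine causal effect:

```python
# Toy simulation: a model can fit election outcomes almost perfectly while
# its small residual says nothing about whether moderation matters, because
# a correlated predictor soaks up moderation's effect.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Candidate moderation and district fundamentals are correlated:
moderation = rng.normal(size=n)
fundamentals = 0.8 * moderation + rng.normal(scale=0.6, size=n)

# True data-generating process: moderation genuinely adds ~2 points.
vote_share = 50 + 5 * fundamentals + 2 * moderation + rng.normal(scale=0.5, size=n)

# A model that sees only the fundamentals still fits very well...
X = np.column_stack([np.ones(n), fundamentals])
beta, *_ = np.linalg.lstsq(X, vote_share, rcond=None)
resid = vote_share - X @ beta
r2 = 1 - resid.var() / vote_share.var()
print(f"R^2 without moderation: {r2:.3f}")  # ~0.96: tiny residual

# ...so a small residual does not mean moderation is unimportant; its
# effect is hiding inside the coefficient on the correlated predictor.
```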

Bonica and Grumbach would seem to be on more solid ground with a new working paper entitled “The Electoral Effects of Candidate Ideology in the Trump Era”. They show screenshots of this paper in their blog post, and discuss its methodologies. One of these methodologies — looking at close primary contests between progressive and moderate candidates as a sort of randomized trial — seems very promising. But unfortunately, I can’t find the actual paper online; the blog post claims it’s on Grumbach’s personal website, but I can’t find it there, or anywhere else. So we’ll have to wait and see what turns up there.

So far, the pro-moderation side seems to be getting the better of the debate, but only slightly. Running more moderate candidates seems to convey a small benefit. But much more important is the overall stance of the Democratic party and of Democratic presidential nominees, and here the data just doesn’t tell us very much. It’s a bit like macroeconomics versus microeconomics; the former is the only way to answer the big questions, but it’s so confounded that it’s hard to get solid answers.

Moderate policies are usually better for the people

This is an interesting technical debate, with some pretty high stakes. Winning elections is very important, as the consequences of Democrats’ defeats in 2016 and 2024 have made plain. But at the same time, we live in a representative democracy, and politicians aren’t perfect avatars of the popular will; they have a lot of leeway to make policy choices that are better or worse for their constituents. And they have a moral responsibility to help their constituents instead of hurting them.

Politics matters, but policy matters too.

And when it comes to policy, moderation tends to produce better results. This is because the effects of policy are highly uncertain; when you make big changes to the status quo, it’s a lot riskier than making small changes. Sometimes you need to take big risks — for example, in a war or other acute emergency, where the status quo will clearly lead to disaster in a short space of time. But most of the time, and along most dimensions, the world isn’t in an emergency, and so you should take risk into account.

That doesn’t mean big policy changes are always bad; often, they’re the right thing to do. It just means that unless you’re in an emergency, you should be careful about making big abrupt policy changes, and you should demand clear and compelling reasons before making them. In other words, moderation isn’t always the answer, but it has some value.

One example is the fervor for police defunding in 2020. A lot of progressives, including Kamala Harris, AOC, and Zohran Mamdani, called for deep cuts to police funding. But the balance of evidence strongly indicates that a robust police presence is very important for deterring crime. And logic and evidence make the mechanisms of this deterrence clear — more cops means crime carries a greater risk of arrest, cops deter crime in public spaces just by standing there, and cops physically remove hardcore criminals from society.

Harris, Mamdani, and the other Democrats who hopped on the activist bandwagon in 2020 should have been more circumspect. Not only did this turn out to be bad politics, but it was bad policy as well — and a rapid return to more robust police presences probably helped tamp down the crime wave of 2020-21. Even if Democrats could have won some elections in 2020 by yelling “Defund the police”, the few cases of actual police defunding probably resulted in more Democratic constituents getting killed.

Another example is fiscal policy during the pandemic era. The CARES Act during the pandemic was radical, and was good policy overall — but that was an emergency, where doing nothing clearly would have resulted in personal financial devastation for millions of Americans. In 2021 the danger of devastation had receded, yet Biden still passed a very large pandemic relief bill. Moderates like Olivier Blanchard warned that a bill of this size would lead to inflation, but these warnings were ignored. And Biden’s American Rescue Plan probably did lead to increased inflation, bringing down real incomes for millions of Americans. That had negative electoral consequences in 2024, but it was also just bad for regular people.

A third example is housing. Most cities have seen a dynamic emerge where the “progressive” position is to largely block new housing development, while the “moderate” position is to allow new development. Although the electoral benefits of YIMBYism versus left-NIMBYism are not yet clear, the evidence is strongly in favor of the more moderate position as being more conducive to housing abundance. And where politicians try it, it almost always seems to work. Winning elections is important, but people having somewhere to live is intrinsically a valuable thing.

Of course, good policy and good politics aren’t naturally at odds — in the long run, the two are probably aligned: if democracy works, people eventually recognize what’s good for them and elect leaders who deliver it. In their book Abundance, Ezra Klein and Derek Thompson make a persuasive case that if progressives can’t deliver real results for people, people will eventually abandon progressivism. Hearteningly, some progressives are taking this admonition to heart, and embracing policies that previously might have been seen as too “moderate”.

Some ideologues have argued that adopting extreme policy positions is a way to “move the Overton Window” — that unless you start from an extreme first offer, you can never negotiate your way to a more moderate change. It’s a thesis that deserves investigation, but recently it doesn’t seem to have been working very well. Socialists demanded a total ban on private health insurance in 2016 and 2020. But what they got wasn’t a more moderate public option or expanded Medicare coverage — it was total defeat. The American electorate, seeing unreasonable demands, basically decided to ignore further health reforms completely, robbing Democrats of a key issue and tabling further efforts toward government health care indefinitely. Even if you think government health care is good, this is a bad outcome.

On sociocultural issues, there is a tendency to sneer at the idea of moderation, due to the memory of the Civil Rights Movement. At that time, moderates cautioned the nation to go slowly on desegregation, but activists pushed ahead and won big reforms anyway. If you agree that desegregation was good (which you should), then this is a clear case where moderation wasn’t the right course. But that case doesn’t generalize. There’s no reason to think that just because racial segregation was bad, that therefore trans women should be allowed to play on women’s sports teams and change in women’s locker rooms. In fact, I don’t know if those things should be allowed or not. But it’s a mistake to think that the long arc of history bends toward whatever progressive activists are currently pushing for.

Right now, it’s the MAGA right that’s making policy, not progressives. And from mass deportation to tariffs to defunding of scientific research, we’re once again going to encounter the downsides of extremist policy and the benefits of moderation. Public opinion, realizing this, is already turning against MAGA’s excesses.

In this situation, it’s natural for Democrats to think more about how to win the next election than about what kind of policies would benefit the citizens of the United States. But unless Dems embrace moderation before winning back power, I fear we’ll just see another round of ping-ponging between two extremes, with Americans getting increasingly more disillusioned each time. Moderation doesn’t mean going easy on Trump — it means attacking Trump hard, but also planning a replacement regime that will be sensible and effective instead of reactive and ideological.

“Do the stuff that works” is simply a good approach to governing a country, and one that America’s political class used to value more than they do now.



1

Basically, this is just a two-stage regression. First you regress candidate wins on a vector of electorate-specific observables like partisan voting history and presidential vote share in the same year, then you label the residual “WAR”, then you regress WAR on various candidate-specific observables like ideological moderation, in order to predict which individual candidate characteristics are correlated with higher win probabilities. The name “WAR” seems to be used mostly for marketing purposes. But hey, as a researcher, you need to get your ideas out there!
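
As a rough illustration, here is what that two-stage procedure looks like on synthetic data — a minimal sketch, not Split Ticket's or anyone's actual model; all variable names and coefficients are invented:

```python
# Sketch of the two-stage "WAR" regression described above, on fake data.
import numpy as np

rng = np.random.default_rng(1)
n = 2_000

# Electorate-specific observables and one candidate-specific trait:
partisan_lean = rng.normal(size=n)                         # district voting history
pres_vote = partisan_lean + rng.normal(scale=0.3, size=n)  # same-year pres. share
moderation = rng.normal(size=n)                            # candidate moderation

# True process: margin = district fundamentals + a 1.5-point moderation bonus.
margin = (10 * partisan_lean + 3 * pres_vote + 1.5 * moderation
          + rng.normal(scale=2, size=n))

# Stage 1: regress results on electorate-specific observables only;
# the residual is what gets labeled "WAR".
X1 = np.column_stack([np.ones(n), partisan_lean, pres_vote])
b1, *_ = np.linalg.lstsq(X1, margin, rcond=None)
war = margin - X1 @ b1

# Stage 2: regress WAR on candidate-specific observables.
X2 = np.column_stack([np.ones(n), moderation])
b2, *_ = np.linalg.lstsq(X2, war, rcond=None)
print(f"Estimated moderation bonus: {b2[1]:.2f} points")  # recovers ~1.5
```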

2

Lakshya Jain criticizes this measure of moderation, because he says it doesn’t do a good job of measuring stances on social issues, and thus focuses too much on economic issues.

The U.S.-China competition is on pause

2025-08-18 16:33:34

Photo by Dan Scavino via Wikimedia Commons

During the Biden years, I wrote a lot about U.S.-China competition. Like many other analysts, I was pretty convinced that the next decade or two would be shaped by geostrategic, military, technological, and economic competition between these two giant nations and their respective coalitions of allies. This seemed like a reasonable assumption, because of the political actions of the people in charge of the two nations. Xi Jinping seemed dead set on having China supplant the U.S. as the world’s dominant power, while both the first Trump administration and the Biden administration made competition with China a primary policy goal. Export controls, tariffs, industrial policy, and most other economic policy innovations in the U.S. all seemed oriented around the same goal: competing with China.

Most of the structural preconditions for a Cold War style confrontation seemed in place — territorial flashpoints, intractable ideological and institutional differences, competing tech ecosystems. Opinions of China in the U.S. and elsewhere plunged around 2019, adding fuel to the fire. The Ukraine war increasingly felt like a proxy war between China and NATO. On top of all that, we had the Second China Shock, as China’s post-pandemic industrial policy flooded the world with cheap goods and threatened manufacturers in every other country.

When Trump returned to power, many people predicted that the competition between the two countries would intensify. But I was suspicious of that narrative. Trump had already come out against the TikTok divestment bill, and denounced the Biden-era industrial policies that were giving the U.S. a chance of competing with China in strategic industries like semiconductors and EVs. Furthermore, prominent China hawks like Matt Pottinger seemed to have been purged from the Trump orbit.

The simplest explanation, in my view, was that Trump had simply chosen to change America’s alignment to a neutral, isolationist stance. He seemed to want to revive Charles Lindbergh’s idea of dividing the world into regional spheres of influence, ceding influence in Asia to China and influence in Europe to Russia.

After 7 months of the Trump presidency, however, I’m starting to gravitate toward a different theory. It now seems to me that the U.S. and China have simply mutually decided to pause their incipient rivalry in order to focus on domestic issues. In America’s case, this manifests as a combination of popular exhaustion and elite distraction; China, meanwhile, seems to be focusing more on its economic problems.

America isn’t even trying anymore

Read more

Toward a Shallower Future

2025-08-17 11:53:11

Art by GPT-5. Prompt: “an image that evokes ‘The Triumph of Death’, with similar buildings and a similar layout, but where everyone is healthy and happy”

Today I saw my friend Noor Siddiqui getting some grief on the internet. Noor is the founder and CEO of Orchid, a company that will select your embryos for IVF in order to avoid passing on genetic diseases. As someone with a number of friends who have genetic disorders, I find this highly appealing — I think most parents would want their child not to have to suffer the same innate handicaps that they suffered. Noor was recently interviewed about her company by Ross Douthat of the New York Times.

When she tweeted out a link to the interview, Noor asked:

What if your baby never walks? What if they are never able to live independently? What if you could have stopped it… but chose not to? That’s the question @OrchidInc’s embryo screening forces. You optimize everything…career, diet, skincare…but you’re going to chance it on your child’s genome, one of the most significant determinants of their health?

A lot of people on X got mad at this, calling it “eugenics”, claiming that it invalidated the life of people born with genetic disorders, and generally saying that Noor’s vision of healthy babies is dystopian.

The argument reminded me of one of my favorite essays that I’ve ever written — my New Year’s post in January 2024. It was about how lots of people have the instinct to value human suffering, and to disdain technological solutions that make the struggles of the past obsolete. I thought I’d repost it, because I think it applies to the controversy over embryo screening as well.


“I would love to live to be 50 years old.” — Keith Haring

Yes, this post starts with the latest ridiculous contretemps on the social media platform formerly known as Twitter. But I promise, it gets more interesting!

The latest contretemps revolves around a famous painting: Keith Haring’s Unfinished Painting. Painted in 1989, it represented the artist’s impending death from AIDS. Haring died the following year, at the age of 31.

It’s an incredibly haunting, tragic image. The streaks of paint falling from the fragment of a pattern immediately evoke tears, blood, disintegration, futility; they emphasize just how much of the canvas was left blank. It’s a reminder of how much of our potential as individuals is wasted, and of an almost-forgotten pandemic that claimed 700,000 lives in the U.S. alone.

The other day, a pseudonymous account named DonnelVillager1 posted an AI-generated image that “completes” the pattern in the upper left of Haring’s painting:

DonnelVillager’s post — perfectly calculated to simulate ingenuousness, while actually poking fun at art appreciators — was itself a masterwork of internet pranksterism. It was instantly condemned by tens of thousands of angry Twitter users for “desecrating” Haring’s art. Defenders responded that DonnelVillager’s trollish tweet was itself a work of art, and that the furious response proved that AI art has the potential to be transgressive and to tweak the cultural orthodoxy’s tail.

Normally I would just shake my head at one more social media food fight and move on. But this reply by my friend Daniel caught my eye:

Of course, Daniel is also poking fun, but in a very important way, he’s right. If AIDS had never existed — or if HIV treatments had come just a little sooner — Haring might have created something like DonnelVillager’s AI image. After all, a fair amount of Haring’s other work did look like that.

And yes, without AIDS, Haring very well might never have produced anything as haunting or evocative as Unfinished Painting. His art might have remained forever cheerful and whimsical, peppered with the occasional political statement. This June, William Poundstone wrote that “Everybody loves Keith Haring, but nobody takes him seriously…The [latest] exhibition does not exactly demolish the notion that Haring was repetitious.” The AI image that DonnelVillager created is an incredibly shallow thing — an unthinking regurgitation of meaningless patterns in a Haring-like style by a large statistical model. But without the pressure of a life cut short, Haring’s art might never have been as deep as it was.

Yet that would have been a good trade. Unfinished Painting is a great work of art, but it wasn’t worth the price of Haring’s life. Without AIDS, the world might have been a bit shallower, with less tragedy for humans to struggle against. But no one in their right mind wishes for tragedies to continue just so that human life can continue to be filled with pathos. Pathos is not worth the price of adversity. Even a world where Keith Haring lived to old age, but every one of his paintings was pointless AI-generated crap, would have been preferable to the world we actually got.

This got me thinking about the meaning of progress.

One of my grandfathers was a bombardier in the European theater of World War 2. He came back uninjured, but the stress of so many near-death experiences, and so many dead friends, drove him to lifelong alcoholism. Once, in the 1990s, I heard a conservative pundit claim that young Americans had become soft and weak because they had never had to face adversity like the World War 2 generation did. I asked my grandfather what he thought of that. After uttering something unprintable, he said: “I did that [stuff] so you wouldn’t have to.”

In a letter to his wife in 1780, John Adams, one of America’s founders, expressed a sentiment that was very similar to what my grandfather felt — and with which many veterans undoubtedly agree. He wrote:

I must study politics and war, that our sons may have liberty to study mathematics and philosophy. Our sons ought to study mathematics and philosophy, geography, natural history and naval architecture, navigation, commerce and agriculture in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry and porcelain.

Embedded in these statements is the belief that the trials and challenges of the world are potentially impermanent; that rather than something to be endured again and again ad infinitum, they are something that can and should be conquered and put behind us forever. It’s the belief that with effort, we can create a durably better world.

That’s not a trivial assumption. Humans have always dreamed of creating a better world, but for most of our history, the world stubbornly refused to get better at anything faster than a snail’s pace. A human in China or Europe or the Middle East in the year 1400 didn’t live a significantly better life than one in 400 B.C. Civilizations would rise, but then they would fall, smashed back to earth by something that looked suspiciously like a Malthusian ceiling. As a Frenchman in the year 1000 you could dream of creating God’s kingdom on Earth, but short of supernatural intervention, you could not reasonably dream of a world without smallpox, bedbugs, or senile erectile dysfunction.

Then, of course, something changed. By now you’ve all seen the graph where world GDP creeps along and then explodes upward like a hockey stick; I won’t post it again. Instead I’ll post this one:

For American women to die from pregnancy used to be a normal occurrence; then in the 1930s and 1940s it became an extreme rarity. Suddenly, a fundamental fact of human suffering that had stubbornly resisted change since time immemorial simply gave way. We fought and lost, and fought and lost, and then one day we fought and won.

The proximate reason for the abrupt decline in maternal mortality was the invention of antibiotics in 1928, and the development of medical practices like blood transfusions whose safety depends on antibiotics. But although penicillin was discovered by accident, it didn’t simply appear out of nowhere; its discovery required the edifice of an industrial society that took centuries to build. The victory over maternal mortality was achieved by a long struggle, not by a happy accident. (In fact, in some countries, maternal mortality began to fall in the 1800s, thanks to the wealth created by industrialization.)

A romantic could argue, if they were so inclined, that the conquest of maternal mortality has made the world a shallower place. In the early 1800s, you could tell stories whose emotional power rested — explicitly or silently — on the universal knowledge that childbirth meant mortal danger. Today, our high school English teachers have to explain this to us when we read Jane Austen or Emily Brontë, just so we understand, on an intellectual level, how brave the women in their novels were.

Such conquests have become commonplace. HIV was a death sentence in 1995; the next year, David Ho and his team unveiled a new combination drug therapy that turned it into a manageable chronic disease. And now, almost three decades later, Unfinished Painting is already becoming something that most people need explained to them; we still understand the meaning of terminal disease, but the context of AIDS, and especially what it meant to gay people in the political climate of the 1980s, is already fading from living memory into dry history.

As the world becomes safer — as one after another edifice of human suffering crumbles before the collective might of science, technology, and industrial society — it becomes harder to harness the emotional power of tragedy, risk, adversity, and heroism. The lives of more individuals become childlike, pure, and unmarked — or at least a little bit more so than before.

I first realized this years ago, while watching Disney’s The Little Mermaid. In the original 1837 Danish fairy tale, the mermaid wagers her life on a chance to win the love of a prince; she fails, and her life is forfeit to the evil sea witch. In the 1989 Disney movie, the same exact thing happens — except instead of passively accepting defeat, the mermaid and the prince simply stab the sea witch in the chest with the broken prow of a ship, and live happily ever after.

Perhaps this is the kind of resolution that could only feel natural and satisfying in America, a country that grew up after the Industrial Revolution. Some call Hollywood endings shallow, but they reflect our everyday reality in the modern world; what is David Ho’s defeat of AIDS, but the stabbing of an evil sea witch in the chest?

Nor, I think, are we simply on a temporary upswing. Some romanticists imagine that society is a cycle, where hard times create strong men, who create good times, which creates weak men, who create hard times. But whether or not that sort of institutional cycle exists, the technologies discovered during the last upswing will be preserved. Countries may collapse, but humanity will not forget antibiotics.

Nor is there any sign that this process will be naturally limited by humans’ inability to appreciate the improvements in their material lives. There is no upper limit on the correlation between life satisfaction and GDP. Contrary to popular myth, suicide rates tend strongly to fall as countries become richer. The higher measured rate of depression in developed nations is likely due not to ennui, but to better diagnosis.

Some romanticists feel the urge to knock over the edifice of industrial society intentionally, in order to kick against the seeming shallowness of modern life — to return humanity to a world of toil and struggle, in order to ennoble us. But these dark romantics are rightfully recognized in fiction and public discourse as villains. The heroes of our stories are the people like David Ho — the ones who fought to hoist humanity up from the muck so that future generations could be a little more childlike, the ones who studied politics and war so that our grandchildren may study statuary, tapestry, and porcelain.

Romanticists need to accept that the nobility of suffering has always been a coping mechanism — a way to sustain hope through the long twilight of apparent futility. And they need to accept that heroism is always inherently self-destroying — that saving the world requires that the world is worth having been saved.

And they must at least try to understand that in a more general sense, happiness isn’t truly shallow — it just has a different kind of depth. The passions of people raised in a kinder, gentler world may be alien and incomprehensible to the older generation, but they are no less intense, and the culture around them is no less complex. Adversity forces us to rise to its challenge, but abundance allows us to discover who we might become, and that is a different sort of adventure.

Looking back on my own life so far, I remember the happy child I was, before clinical depression changed me. Depression is horrible, but it added a richness and depth to the person I am today, and I appreciate the value of those changes. But if that happy child had gotten a chance to grow up without depression, I think he would have been changed in different ways, and under the tutelage of gentler teachers, would have become no less worthwhile and interesting of a person.

So it must be with humanity. The modern world of push-button marvels has lost something, but it has gained more than it has lost. By celebrating it, we honor the countless millennia of heroes who worked in some small way to bring it about, even as we dedicate ourselves to continuing their great enterprise. Our legacy is to fill the Universe with children who laugh more than we were allowed to.



1

Interestingly, DonnelVillager’s handle is one of the things that inspired me to write this post. It’s a reference to one of my favorite video games, Fire Emblem: Awakening. The character Donnel is a simple villager who is forced to go fight in an apocalyptic war after his father is killed by bandits. If you take care to level him up, he becomes a very powerful hero, but at the end of the war he goes back to his farming village and lives out a simple life, giving up fighting and adventure forever. His story serves as a reminder that struggle is not done for struggle’s sake.

America has only one real city

2025-08-15 18:30:12

Americans who go to Tokyo or Paris or Seoul or London are often wowed by the efficient train systems, dense housing, and walkable city streets lined with shops and restaurants. And yet in these countries, many secondary cities also have these attractive features. Go to Nagoya or Fukuoka, and the trains will be almost as convenient, the houses almost as dense, and the streets almost as attractive as in Tokyo.

The U.S. is very different. We have New York City, and that’s about it. People from Chicago or Boston may protest that their own cities are also walkable, but transit use statistics show just how big the gap is between NYC and everybody else:

Source: Census Bureau via @StatisticUrban

Chicago, Boston, and the rest have their old urban cores with a few train lines and some shopping streets. But for the most part, even these cities are car-centric sprawl. You can also see this in the population density numbers; New York simply towers over all the rest:1

Source: Census Bureau via Wikipedia

There’s simply no other town in America that looks and feels like NYC.

Some of the reasons for this are historical. NYC became a big city before the rise of the mass-market passenger car, so it had to use transit to move people around; many cities, like L.A., Houston, and Phoenix, saw their growth happen later. America’s car-friendly policies, abundant land, and desire for suburban living created the car-centric development pattern that we see in many cities in the West and South today.

But many older cities don’t have this excuse. For example, take Philadelphia. In 1910, NYC was only three times bigger than Philly; by 1960 it was almost four times as big, and by 2010 it was five times as big. In other words, Philadelphia had its big growth spurt earlier than NYC did, but its outcome in terms of walkability and transit is just much weaker, with fewer than 20% of Philadelphians using transit for their commute. Very little of downtown Philly looks like Manhattan.

The reason NYC is so much bigger than every other city in America is partly mathematical — every country tends to have one city that towers over the rest in terms of total population. And it’s partly economic — Ed Glaeser has a great essay on the industrial history of NYC. But those reasons can’t explain why NYC is so much denser than other cities. In fact, because NYC includes such an unusually large percent of its metropolitan area (44%, compared to less than 33% for other major cities), you might naively expect it to be less dense — San Francisco is just the tiny metropolitan core of the Bay Area, while NYC includes Staten Island and other outlying areas. Yet NYC is still far denser than SF or any other large American city.

The reason NYC is America’s only truly dense large city is due to policy. Other cities have restrictive zoning codes that limit floor-area ratios, impose citywide height limits, impose parking minimums, and restrict certain areas to single-family homes.2 For example, here’s a map showing just how much of San Francisco’s land (in pink) is zoned to allow only single-family homes:

Source: SF Planning Dept. via San Francisco Public Press

Keep in mind that this is America’s second-densest big city. New York City really stands alone, in terms of allowing tall buildings.

New York City is also unique in having an extensive subway system. In terms of miles of rail, NYC has more than other cities, but just as important is the shape of the network. NYC’s subway is a dense grid that covers all of Manhattan and much of Brooklyn; other cities tend to have commuter rail systems that connect the city center directly to outlying areas but which aren’t as useful for getting around within the central city. For example, here are train maps for NYC, San Francisco, and Boston:

Source: MTA

American cities are no longer able to build subways. This is partly because we’ve outlawed the cheap methods used to build them:

The Works in Progress Newsletter
Why we stopped building subways cheaply
The Linear No Threshold model says that there is no safe level of radiation exposure. There is overwhelming evidence it is false, yet it inspires the ALARA principle, which makes nuclear power unaffordable worldwide. Read the lead article from Issue 19 of Works in Progress…
Read more

But a lot of it is because of the same problems of low state capacity and excessive citizen input that block every other construction project in America.

In other words, America has only one New York because no other American city wants to become like New York. Throughout the country, “Manhattanization” is a scary term that gets thrown at any developer who wants to increase density.

And yet the number of Americans who want to live in NYC is not small; it’s huge. NYC 1-bedroom rents have been soaring, even as they stagnate nationwide:

Source: Zumper via CRE Daily

Someone wants to live in NYC, obviously. Partly that’s because of the enormous consumption benefits for the young wealthy childless people who love living in cities. And partly that’s because dense cities allow industrial clustering effects — everyone knows that if you want to hire good employees in banking, publishing, corporate law, and so on, it helps to be in NYC.

Is one city enough to hold all of the Americans who want to live in big, dense cities, as well as all of the Americans who need to live there for work? It is not. The middle class is being pushed out of NYC at a rapid clip. Americans are trying to pile into other cities, but NIMBYism isn’t letting those cities build many new houses to accommodate them; as a result, rents in other cities go up faster than wages.

America needs more than one NYC. It needs Chicago, Philadelphia, and other big old cities with existing walkable urban cores to step up and Manhattanize themselves, so that the country won’t just have one Manhattan.

How can this be done? The first step is simply to adopt NYC-style big floor-area ratios, as well as all the city’s other permissive building policies. Allow more density, and some density will get built.

The second thing these cities can do is to build more trains. Because the “cut and cover” policies that build subways cheaply are always very unpopular, this probably also means building elevated trains and surface rail. NIMBYism will have to be overcome, but that’s true of just about anything that anyone wants to get done. Cities should also focus on building trains that allow their residents to get around the city, rather than just get into and out of the city; this means constructing trains in a grid or web pattern.

Another idea is that if other big cities can reduce crime, their citizens will be less apprehensive about allowing more density and transit. NYC is one of America’s safest big cities, with a homicide rate of less than 4 per 100,000 residents as of 2024. Chicago, in contrast, was at 17.5, and Philadelphia at 16.9. San Francisco has a fairly low homicide rate of 6.4, but it still has a big problem of public disorder, including fentanyl use, homeless encampments, store raids, and general lawlessness. Reducing this public disorder — as well as crime in general — would make it far more appealing to live in a dense area, to walk down shop-lined streets, to take the train, and so on.

Some Americans instinctively recoil from calls to make more cities like NYC. They prefer their single-family homes, their cars, their strip-malls and lawns. Fine. But those people should consider that if America had one or two more New York-style cities, the people who want to live in that sort of city would move there, freeing up more space for everyone else.

The U.S. needs both dense cities and suburbs in order to satisfy all the different Americans who want different lifestyles. We are overweight on Los Angeles-type cities, and underweight on NYC-type cities. We need to restore balance, by converting more of our big old cities into gleaming new Manhattans.



1

This is not true of, say, Japanese cities. Osaka is actually about twice as dense as Tokyo. That’s partly an artifact of how density is measured; Tokyo is more of an office town, where people commute in from residential areas outside the city proper.

2

NYC has a few other innovative policies that allow it to achieve greater density. These include density bonuses, special-purpose districts, as-of-right development, and the ability to sell unused floor-area ratio so that nearby buildings can use it.

Corporations aren't the reason your rent is too high

2025-08-13 16:29:06

Photo by Jim.henderson via Wikimedia Commons

Donald Trump is choking off U.S. manufacturing with tariffs, replacing statistical agency personnel with apparatchiks who will manipulate data to make the President look good, and so on. Yet some progressives remain convinced that the key to winning back the country is to harness a wave of populist anger by attacking big corporations. I’m not sure I see the political logic there, but I guess I’m not much of an expert on politics.

Anyway, I’m sympathetic to the notion that monopoly power has increased in the U.S. economy since the turn of the century, and that this is making life harder for some Americans. But corporate power is simply not the cause of many of the problems regular Americans face — there are a lot of other things going on too. And because antitrust progressives insist on fitting every problem into the paradigm of corporate power, they end up believing a number of false things about the world. One example I’ve written about before is that of health insurers, whom antitrust progressives view as the chief architects of everything that’s wrong with the U.S. health system; in fact, these companies make almost no profit and are fairly efficient.

Another important example is the housing market. Overall, housing has not gotten more expensive throughout America; if you compare median personal income to the CPI measure for rent of primary residence, you’ll see that income has actually risen slightly faster than rent since 1980:

But in the attractive cities where most people would like to live if they had the choice, rent has gone up much faster than in the decayed Rust Belt cities and small towns where most Americans would prefer not to live. The rental crisis is a local one, but it’s real.
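If you want to check the national numbers yourself, here’s a rough sketch that pulls the relevant series from FRED. The series IDs are my assumptions about which data to use (nominal median personal income and the CPI rent-of-primary-residence index), not necessarily the exact series behind the chart above:

```python
# Rough sketch: compare cumulative growth of median personal income
# vs. CPI rent of primary residence since 1980. The FRED series IDs
# below are assumptions, not a citation of the chart's exact sources.
import pandas_datareader.data as web

START = "1980-01-01"
income = web.DataReader("MEPAINUSA646N", "fred", START)  # median personal income, annual, nominal
rent = web.DataReader("CUUR0000SEHA", "fred", START)     # CPI: rent of primary residence, monthly

# Average the monthly rent index to annual, then index both series to
# 100 at the start so their cumulative growth is directly comparable.
rent_annual = rent.resample("YS").mean()
income_indexed = 100 * income / income.iloc[0]
rent_indexed = 100 * rent_annual / rent_annual.iloc[0]

print(income_indexed.dropna().tail(1))  # income growth since 1980
print(rent_indexed.dropna().tail(1))    # rent growth since 1980
```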

Abundance liberals blame this problem on lack of housing supply, and support YIMBY policies to build more housing in cities. But although some progressives are coming around on this, many are strongly opposed to the abundance agenda. Instead, they want to blame high rents on powerful companies who buy up all the houses and then jack up prices.

A few years ago, this manifested as a panic about BlackRock buying up large amounts of the housing stock in America. This was a silly mistake; BlackRock doesn’t buy homes, except indirectly by investing in real estate investment trusts (REITs). People were probably thinking of Blackstone, a much smaller asset manager with a similar name, which does buy up homes.

In addition to this silly mistake, the broader panic just wasn’t based on facts. In 2021, Derek Thompson did a great job of debunking the myth:

The U.S. has roughly 140 million housing units, a broad category that includes mansions, tiny townhouses, and apartments of all sizes. Of those 140 million units, about 80 million are stand-alone single-family homes. Of those 80 million, about 15 million are rental properties. Of those 15 million single-family rentals, institutional investors own about 300,000; most of the rest are owned by individual landlords. Of that 300,000, the real-estate rental company Invitation Homes—in which BlackRock is an investor—owns about 80,000. (To clear up a common confusion: The investment firm Blackstone, not BlackRock, established Invitation Homes. Don’t yell at me; I didn’t name them.)

Megacorps such as BlackRock, then, are not removing a large share of the market from individual ownership. Rental-home companies own less than half of one percent of all housing, even in states such as Texas, where they were actively buying up foreclosed properties after the Great Recession. Their recent buying has been small compared with the overall market.
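Thompson’s nested figures are easy to sanity-check. Here’s the back-of-the-envelope arithmetic, using his round numbers:

```python
# Back-of-the-envelope check of Derek Thompson's figures (all approximate).
total_units = 140_000_000      # all U.S. housing units
single_family = 80_000_000     # stand-alone single-family homes
sf_rentals = 15_000_000        # single-family homes that are rentals
institutional = 300_000        # single-family rentals held by institutional investors
invitation_homes = 80_000      # of those, held by Invitation Homes

print(f"Institutional share of all housing units:     {institutional / total_units:.2%}")    # ~0.21%
print(f"Institutional share of single-family homes:   {institutional / single_family:.2%}")  # ~0.38%
print(f"Institutional share of single-family rentals: {institutional / sf_rentals:.0%}")     # ~2%
print(f"Invitation Homes share of all housing units:  {invitation_homes / total_units:.3%}") # ~0.057%
```

Even at the most generous cut, institutional investors own roughly 2% of single-family rentals, and a vanishing fraction of the overall housing stock.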

The actual number of homes Blackstone (or BlackRock) was buying was tiny — far too tiny to affect rental prices in any significant way, except perhaps in a few very localized areas.

But somehow, despite its lack of connection to reality, the meme stuck around, and the size of the problem grew as the story was repeated around the internet. There are still people who think BlackRock is buying up much of the housing in America. In fact, even some right-wingers are convinced of this:

The exact form of the claim varies. Sometimes it’s 44% of the housing that the evil corporations are buying up, sometimes it’s just 20%. Sometimes it’s BlackRock alone that’s responsible, sometimes it’s the private equity industry:

But the meme remains false. Many news outlets have debunked it over the years. For example, Logan Mohtashami posted the following charts in Yahoo Finance in 2024:

The first of the two charts shows that institutional buyers — which include private equity, BlackRock, and the like — own only a tiny sliver of the homes in America. The second chart shows that there was a corporate home-buying spree in 2022, but it never even hit 5% of home purchases at its peak — far lower than the rumors claim.

Kriston Capps also wrote a great debunking of the “corporate landlords” myth in 2024. Here was his chart:

Almost none of the U.S. housing stock gets bought by owners who own more than 9 units. Corporate landlords just aren’t significant enough to be driving the rental crisis in America’s most desirable cities. (Of course, this doesn’t stop anticorporate types from mocking the very idea that high rents are caused by something other than market power.)

In fact, it gets even worse for the antitrust story here. It turns out that corporate landlords probably don’t even do what antitrust people think they do! The common story is that corporations buy up all the houses in an area, creating a local monopoly, and then use that monopoly to jack up rents — which in turn causes gentrification and pushes poor people and minorities out of the neighborhood.

Except Konhee Chang, an economics job market candidate, found evidence that corporate landlords actually make housing cheaper for lower-income folks, and lead to diversification instead of gentrification!

Using property-level data on tenants, home prices, rents, and acquisition timing, I show that increasing rental supply in American suburbs, where rentals are scarce and expensive relative to owner-occupied housing, reduces segregation by enabling lower-income, disproportionately non-White renters to move into neighborhoods where they otherwise could not afford to own. In response, nearby incumbent households are more likely to move out, perceiving renters as a disamenity. Large-scale landlords expand rental supply by converting owner-occupied homes into rentals, exploiting cost efficiencies from geographic concentration.

Chang found that corporate landlords drive down rents and slightly raise the price of buying a house:

This is only to be expected, since what the corporate landlords are doing is buying up housing (which raises purchase prices because it increases demand) and converting it to rental units (which lowers rents because it increases supply).1
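The mechanics here are just textbook supply and demand, and a toy linear model makes them explicit. Every parameter below is invented purely for illustration; only the directions of the price movements matter:

```python
# Toy linear model of the two effects: corporate landlords add demand
# in the purchase market (prices rise) and, by converting the homes
# they buy into rentals, add supply in the rental market (rents fall).
# All parameters are invented for illustration.

def equilibrium_price(d_intercept: float, s_intercept: float,
                      d_slope: float = 1.0, s_slope: float = 1.0) -> float:
    """Price where demand Q = d_intercept - d_slope*P meets supply Q = s_intercept + s_slope*P."""
    return (d_intercept - s_intercept) / (d_slope + s_slope)

# Purchase market: corporate buying shifts the demand curve out.
p_before = equilibrium_price(d_intercept=100, s_intercept=20)
p_after = equilibrium_price(d_intercept=110, s_intercept=20)  # +10 units of corporate demand
print(f"Home price: {p_before:.0f} -> {p_after:.0f} (up)")

# Rental market: converted homes shift the rental supply curve out.
r_before = equilibrium_price(d_intercept=80, s_intercept=10)
r_after = equilibrium_price(d_intercept=80, s_intercept=20)   # +10 converted rental units
print(f"Rent: {r_before:.0f} -> {r_after:.0f} (down)")
```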

So as far as we can tell, corporate landlords — at least, right now — aren’t causing the harms that the antitrust progressives (and some right-wing pundits) claim. The cause of the housing crisis in desirable metro areas must lie elsewhere. The obvious culprit here is just supply limitations — i.e., land-use regulations and NIMBYism. And the obvious way to address that is the abundance agenda.2

For antitrust progressives, the problem with that conclusion is that it doesn’t place the blame on their class enemies. If corporate power isn’t the problem when it comes to high rents, there are fewer opportunities to harness populist rage against the business class. And many antitrust progressives probably still believe that a wave of anticorporate sentiment can buoy Democrats back to victory. I would be very surprised if that strategy worked.

But in any case, the story that corporate landlords are making American housing unaffordable is simply false. It’s just another free-floating memetic myth that keeps getting in the way of our ability to solve our very real problems. And the zeal with which antitrust progressives have embraced and propagated that myth should make us a little more pessimistic about their ability to accomplish positive change in the current political economy.



1

This ought to be an open-and-shut case, but a progressive economist popped up to argue otherwise:

As I explained, the reason welfare goes down in the paper is that corporate landlords push down prices enough that poor Black and Hispanic people are able to move into previously richer, whiter neighborhoods. Chang, the author of the paper, assumes — probably correctly — that rich white homeowners do not like to live next to poor Black and Hispanic renters. And so the rich white homeowners lose utility from corporate landlords, because they no longer get to exclude people from their neighborhoods along racial and class lines!

Somehow, this point was lost on the progressive economist, who ended up defending white flight as a socially desirable thing.

2

Of course, even the abundance agenda won’t be able to fully negate the local impacts of big increases in demand for housing in coastal cities. Demand for life in those cities is just extremely high. But supply increases will blunt those impacts, and create affordability further from the centers of New York City, San Francisco, etc.