2025-07-12 04:16:48
Trump is back with a new round of big tariff announcements. They include a 35% tariff on Canada (America’s top trading partner), 25% tariffs on Japan and South Korea (two key U.S. allies), a 50% tariff on copper (almost half of which the U.S. imports), and a grab bag of other high tariffs on other countries.1
So far, stock markets — which plunged after “Liberation Day” in April — appear not to be reacting to the new announcements. Perhaps investors are assuming that Trump will chicken out and “pause” these tariffs, as he has paused most of his others since taking office. Or perhaps they’re assuming the tariffs will be halted by the courts — constitutionally, tariff powers belong to Congress, and Congress has allowed the President to declare tariffs only in an emergency, so if courts rule that America isn’t in an emergency (which of course it’s not), Trump’s powers vanish. It’s also possible (though unlikely) that American investors have changed their minds since April and decided that tariffs aren’t that bad for the real economy.
Whatever the reason, the country seems to have decided that Trump’s tariff announcements are basically just noise, and that there’s no reason to panic unless and until a tariff-driven recession actually materializes.
However, amid all the noise of tariff announcements and pauses and court injunctions, actual U.S. tariffs are higher than they’ve been since the 1930s. Yale Budget Lab estimates that even accounting for Americans switching to different types of consumption goods as a result of the tariffs, the current rate that Americans are now legally required to pay on their imports is around 17%:
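For intuition about what a number like 17% means: an economy-wide effective tariff rate is just an import-weighted average of the individual rates, which is also why consumer substitution toward lower-tariff goods pulls it down. A toy example (the shares and rates below are invented for illustration, not Yale Budget Lab’s actual inputs):

```python
# An effective tariff rate is the import-weighted average of the rates
# applied to each partner/product. All numbers here are hypothetical.

imports = {
    # partner: (share of total imports, tariff rate)
    "A": (0.40, 0.35),
    "B": (0.35, 0.10),
    "C": (0.25, 0.00),
}

effective_rate = sum(share * rate for share, rate in imports.values())
print(f"{effective_rate:.1%}")  # → 17.5%

# If consumers shift spending toward partner C (zero tariff),
# the effective rate falls even though no statutory rate changed.
```

This is why the Yale estimate is framed as “even accounting for Americans switching to different types of consumption goods”: substitution mechanically lowers the effective rate below the average of the announced rates.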
Now, legal requirements aren’t the same as actual tariff revenue collections; a lot of companies find ways around the tax, and the U.S. government’s capacity to actually collect this amount of taxes is patchy. But the most recent data we have shows that even in May, revenue from U.S. customs duties had more than doubled since before “Liberation Day”:
So the tariffs are tariffing. But so far they have failed to do one thing that many commentators predicted they would do: raise inflation. As of May, there was no sign that prices were rising faster than normal — indeed, inflation looked tame:
So what’s going on? If you tax something, people should pay more. If people pay more, that means prices go up. So where’s the inflation?
I see three possible explanations. As usual, they aren’t mutually exclusive; the reality could be a combination of the three.
2025-07-10 20:56:37
One area of seeming bipartisan consensus in America over the past decade is the idea that free-market economics — or “neoliberalism” — has failed, and that our economic system needs to be overhauled. Leftists have always believed this, of course, and in recent years they’ve been joined by more mainstream progressives like the folks at the Hewlett Foundation. On the right, thanks to Trump, tariffs and immigration restriction have overthrown free trade as the reigning orthodoxy. And the GOP in general seems to want to put their thumb on the scale for fossil fuel industries and other traditional sectors.
And in case this wasn’t clear, I have been one of the voices calling for a new economic system! For years I criticized free-market ideology, fretted about the decline of the Rust Belt, expressed suspicion about free trade, and called for industrial policy to revive manufacturing. I wrote many times that the Biden Administration’s industrial policies represented a needed break with the dogmas of the past, and that Trump had enabled that needed shift by destroying the political consensus for free trade. I’m constantly warning that without attention to strategic industries, America and its allies will lose a war to China. There’s a good chance I’ll end up advising the Hewlett Foundation in some capacity on what our next economic ideology ought to be. Despite being elected “Chief Neoliberal Shill” in a humorous online poll back in 2018, I was never an actual neoliberal or a free-markets kind of guy.
So to be clear, when I say that criticism of free markets has been overdone, I’m partly talking to myself. A couple of months ago, horrified by Trump’s tariff policies, I wrote an apology to libertarians, admitting that I had failed to see the political usefulness of their project in terms of maintaining economic sanity on the Right:
But it’s not just the political benefits of free markets that have been undersold; I think the purely economic advantages are also too often ignored.
Exhibit A is Javier Milei’s track record in Argentina. A year and a half ago, when Milei was elected President of Argentina, a bunch of left-wing economists warned darkly that his radical free-market program would lead to economic devastation:
The election of the radical rightwing economist Javier Milei as president of Argentina would probably inflict further economic “devastation” and social chaos on the South American country, a group of more than 100 leading economists has warned…[S]ignatories include influential economists such as France’s Thomas Piketty, India’s Jayati Ghosh, the Serbian-American Branko Milanović and Colombia’s former finance minister José Antonio Ocampo…
The letter said Milei’s proposals – while presented as “a radical departure from traditional economic thinking” – were actually “rooted in laissez-faire economics” and “fraught with risks that make them potentially very harmful for the Argentine economy and the Argentine people”…[T]he economists warned that “a major reduction in government spending would increase already high levels of poverty and inequality, and could result in significantly increased social tensions and conflict.”
“Javier Milei’s dollarization and fiscal austerity proposals overlook the complexities of modern economies, ignore lessons from historical crises, and open the door for accentuating already severe inequalities,” they wrote.
Milei won anyway. His first big policy, and the one the lefty economists fretted about the most, was deep fiscal austerity. Argentina’s long-standing economic model, created by dictator Juan Peron in the 1950s, involved a large and complex array of public works projects and subsidies for various consumer goods like energy and transportation. Milei slashed many of these, as well as cutting pensions, civil service employment, and transfers to provinces. Overall, he cut public spending by about 31%, resulting in a near-total elimination of Argentina’s chronic budget deficit:
The point of all this cutting wasn’t just to remove state intervention in the economy — it was to stop inflation. Basically, macroeconomic theory says that if deficits are high and persistent enough, then they convince everyone that the government will eventually inflate its debt away by printing money (which becomes a self-fulfilling prophecy).1 And most or all countries that experience hyperinflation end up escaping it only when they get their fiscal house in order. Perpetual deficits were part of Argentina’s “Peronist” system, and it’s probably a good bet that this has been responsible for the periodic bouts of hyperinflation that it experiences.
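The macro logic here can be written down compactly. In the standard “fiscal theory of the price level” formulation (a stylized textbook sketch, not an equation from this post), the real value of outstanding nominal debt must equal the expected present value of future primary surpluses:

```latex
% Stylized valuation equation (fiscal theory of the price level):
% B_{t-1} = nominal government debt, P_t = price level,
% s_{t+j} = real primary surpluses, m_{t,t+j} = real discount factors.
\frac{B_{t-1}}{P_t} \;=\; \mathbb{E}_t \sum_{j=0}^{\infty} m_{t,t+j}\, s_{t+j}
```

If markets come to expect permanently smaller surpluses (persistent deficits), the right-hand side falls, so the price level must rise to shrink the real value of the debt; that is the self-fulfilling mechanism described above, and why restoring surpluses can stop the spiral.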
So Milei’s austerity shock therapy was as much about macroeconomics as about micro. So far, it seems to be working. Inflation, which was spiking dangerously before Milei took office and looked like it was headed back into “hyper” territory, has plunged:
Now, this is still a level of inflation that would have Americans up in arms; 2.4% monthly inflation translates to a 33% annual rate! But for Argentina, this is an incredible relief.
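For readers checking the arithmetic: monthly inflation compounds rather than simply multiplying by 12, which is where the 33% figure comes from:

```python
# Convert a monthly inflation rate to an annualized rate by compounding.
monthly_rate = 0.024  # 2.4% per month

# Compounding over 12 months — simple multiplication would give only 28.8%
annual_rate = (1 + monthly_rate) ** 12 - 1

print(f"{annual_rate:.1%}")  # → 32.9%, i.e. roughly 33%
```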
Milei also did a bunch of deregulation, privatization, anti-union stuff, and other libertarian policy, mostly by executive order. He made it easier to hire and fire workers, made it harder for unions to strike, took steps toward privatization of state-owned industries, and deregulated finance, health care, and air travel. He also scrapped rent control.
Finally, Milei made some changes to currency policy. Argentina is primarily a commodity exporter, and like many other commodity exporters, it has long kept its currency (the peso) overvalued.2 Keeping the currency overvalued required maintaining a bunch of restrictive laws that keep people — both Argentinians and foreign investors — from exchanging their pesos for U.S. dollars (which would drive down the price of the peso against the dollar). Those laws let Argentinians overconsume, but only by hampering banks and discouraging foreign investment. Milei scrapped some of these rules, and promised to scrap others, and allowed the currency to depreciate by over 50%.
The lefty economists thought that all this was going to be a disaster for regular Argentinians. Austerity represents temporary pain — it destroys aggregate demand in the short term, raising poverty and unemployment. Cuts to social programs and consumption subsidies hurt. Lifting rent control could raise rents. And in fact, for the first year of Milei’s term in office, the poverty rate did soar, from an already sky-high 42% to 53%. Unemployment went up too, to over 7%, as the economy went into recession in 2024.
Then a funny thing happened. The recession ended, and Argentina bounced back:
Argentina's economy grew year-on-year for the second consecutive quarter and by the most since 2022 as the economy recovers from last year's recession…Gross domestic product expanded 5.8% in the first quarter…Monday's data showed signs of recovery at the consumer level, with private consumption growing 11.6% from a year ago.
The poverty rate is falling too, as growth picks up:
Argentina’s poverty rate dropped to 38.1% in libertarian President Javier Milei’s first year in office. The decline in poverty for the second half of 2024 — from July to December — marks an improvement from the 41.7% that Milei’s left-wing populist predecessors delivered for the second half of 2023.
Leftists dispute the reality of the poverty drop, but they are going entirely on anecdotes instead of data. Unemployment is still high, but continued growth should bring it down again — J.P. Morgan forecasts that Argentina’s economy will continue accelerating next year.
Is this a huge victory for free-market economics? It’s tempting to declare it one, but the reality isn’t so simple. Austerity isn’t fundamentally a free-market policy; socialist countries can run fiscal surpluses, and the most capitalist countries in the world can run deficits. As Tyler Cowen has pointed out, Milei’s macro policies look more like orthodox IMF stabilization policy than libertarianism. Any microeconomic benefits from shrinking the Argentinian state will be slower to materialize; right now, the main effect has just been to tame inflation.
Also, it’s worth mentioning that the lefty economists who were terrified of Milei were also worried about a plan he didn’t carry out — a scheme to let Argentinians use U.S. dollars instead of pesos for domestic transactions. This wouldn’t have worked; there just weren’t enough dollars in the country. Fortunately, Milei didn’t have the political capital to enact the plan.
But still, Milei’s success so far should make us somewhat more confident about free-market policies — especially when we evaluate them against the new socialist ideas that have been gaining currency in the U.S. In the past, socialists and other left-leaning economic thinkers advocated central planning and nationalization of industry; in recent years, they have taken to calling for expansion of the state through fiscal policy, mixing macroeconomic justifications with micro. At all times, they call for deficit-financed expansion of social programs; when fiscal hawks want to tame the deficits, the lefties warn of the short-term macroeconomic harms of austerity.
If you’re always more terrified of austerity than you are of deficits, expansion of the state — and of the deficit — becomes a one-way ratchet. This approach is very different from Keynesianism, which advocates stimulus to overcome recessions, followed by austerity during boom times. You’ll recognize it as bearing a distinct similarity to MMT; that pseudo-theory has largely fallen out of favor, but there are plenty of more respectable progressive types whose ideas nonetheless have a lot of this “macroleftist” flavor.
Milei’s success in bringing down Argentina’s inflation, while also restoring growth after one painful year, shows that the macroleftists’ constant dark warnings about austerity are at least sometimes overblown. Fiscal conservatism isn’t always desirable, but Milei is showing that the costs often aren’t as high as many progressives believe.
Meanwhile, Milei’s microeconomic policies — the deregulation, the moves toward privatization, the anti-union policies — are the dog that didn’t bark here. With poverty now falling and consumption rising in Argentina, it doesn’t appear that those free-market policies have crushed the middle class. So far there’s no sign that inequality has increased substantially either, with the Gini coefficient looking stable.
And in a few cases, we can see Milei’s free-market policies actually getting results. The scrapping of national rent control seems to have created such a big housing supply boom that it has actually driven rents down. Here’s Newsweek with some numbers:
Milei also ripped up Argentina's rent-control law in late 2024, removing limits on lease terms and rent increases that had discouraged landlords from renting. Within months, the supply of rental housing in Buenos Aires jumped by 195 percent, according to the city's real estate observatory, and median asking prices fell by about 10 percent as more apartments returned to the market.
Some leftist and socialist organizations, including the Socialist Workers' Party (PTS) and the Workers' Left Front, criticized the move at the time, arguing it favored landlords at the expense of tenants. But the reality has so far borne out the opposite: supply has soared. On Zonaprop, one of Argentina's largest real estate platforms, traditional rental listings surged from about 5,500 before the reform to over 15,300 — a 180 percent increase — with a third of that rise happening in just the first month after deregulation.
It’s too early to know the full effects of all of these policies. Perhaps inequality will eventually increase as a result of all Milei’s deregulation, or perhaps poverty will stabilize at an unacceptably high level. But as of now, things are looking a lot better than even many libertarians hoped.
And when we compare Milei to the Latin American regimes that socialists and progressives have endorsed in recent years, there’s just absolutely no comparison. In the 2000s, Joe Stiglitz was overflowing with praise for Venezuela’s Hugo Chavez:
In 2006, Nobel Prize–winning economist Joseph Stiglitz praised the economic policies of Hugo Chávez. The Venezuelan president ran one of the “leftist governments” in Latin America that were unfairly “castigated for being populist,” Stiglitz wrote in Making Globalization Work…In fact, the Chávez government aimed “to bring education and health benefits to the poor, and to strive for economic policies that not only bring higher growth but also ensure that the fruits of the growth are more widely shared.” In October 2007, Stiglitz repeated his praise of Chávez at [a] forum in Caracas, sponsored by the Bank of Venezuela. The nation’s economic growth rate was “very impressive,” he noted, adding that “President Hugo Chávez appears to have had success in bringing health and education to the people in the poor neighborhoods of Caracas.” After the conference, the Nobel laureate and the Venezuelan president had an amicable meeting.
We all know how that turned out; Venezuela suffered one of the most catastrophic economic collapses ever recorded outside of wartime. But in 2022, Stiglitz praised Milei’s Argentinian predecessors for resisting calls for austerity after the pandemic. Two years later, Argentina’s annual inflation rate hit 1500%.
These dramatic failures of judgment have never been reckoned with. But when libertarian approaches to economic policy are faring so much less disastrously than leftist approaches in Latin America, that should tell us that we’ve swung too far in the direction of anti-neoliberalism.
In truth, Milei is hardly the only example of neoliberal success in recent years. Although China is nominally communist and now engages in a lot of industrial policy, from the 1980s through the early 2000s its approach was almost entirely one of privatization. India got a big growth expansion from liberalization in the 1990s and 2000s, as did Vietnam. Poland’s development miracle is largely a neoliberal one — its industrial policy has focused mostly on simply promoting foreign direct investment, while its other policies have simply been a mix of institutional improvements and free trade.
Does this mean that hardcore libertarians are right, and that countries all over the world should slash government and unleash market forces? Well, no. The more complicated, nuanced truth is that which economic policies are best depends a whole lot on where you start. Argentina before Milei was a Peronist mess; China before Deng was a Maoist disaster. Plenty of government expansions throughout history have reduced poverty without wrecking economic growth; witness the New Deal, or Korea’s industrial policy push in the 1970s.
The boring truth is that the ideal economy is a mixed one; it’s built on the foundation of markets, but also contains a significant amount of redistribution, public goods provision, and industrial policy. The exact optimal balance depends on the country, and on the times; even if you happen to get it exactly right for a while, the optimal mix will change over time as countries develop, as technology changes, as trading patterns shift, and so on. Someday, if Argentina over-indexes on Milei’s early successes, it might very well become too laissez-faire.
Instead of picking one ideology and sticking to it, countries should recognize when they’ve veered too far in one direction, and take steps to change course. If some of your people are suffering in poverty while others prosper, you should establish a social safety net. If you’re choking on pollution from unregulated industry, you should establish some environmental protections. If you’ve nationalized your industries and they aren’t doing well, you should privatize them. If you’re falling behind technologically, you should try some industrial policy. If you’ve shackled your economy with inefficient subsidies and entitlements, then you should do some deregulation.
Evolution isn’t as sexy as revolution; it’s fun to wave around a chainsaw and shout about how your ideology will send your enemies to the graveyard of history. But evolution is what works.
Some economists believe that fiscal policy is actually the only important long-term determinant of inflation. This is almost certainly way overstated.
Commodity exporting countries tend to do this because the political incentive to let consumers buy more imported goods outweighs the political incentive to keep the currency cheap in order to help domestic manufacturers sell more products. In this way, the U.S. behaves more like a commodity exporter than an export manufacturer.
2025-07-08 00:44:59
Wow, I just realized it’s been a long time since I did a roundup! I hope I haven’t gotten rusty.
First, podcasts. I went on the Big Technology podcast to talk about AI taking jobs, and why humanity could end up being just fine no matter how good AI gets:
A few weeks ago, I went on the WhoWhatWhy podcast with Jeff Schechtman, to discuss the economy:
And finally, here’s an episode of Econ 102, in which Erik and I talk about the revolution China is creating in physical technologies:
Anyway, on to the list of interesting things!
For years, we’ve been deluged with charts and rhetoric and memes about how American wages haven’t gone up for decades. Bernie Sanders, for instance, regularly claims that wages are lower than they were 50 years ago. Is it true, though?
No. The basis for this claim is one particular data set: average hourly earnings for production and nonsupervisory workers in the private sector, divided by the consumer price index. That measure of wages was indeed lower in 2019 than in 1973. But if you use the PCE price index instead — which measures the changes in the prices of what people actually consume, rather than what they used to consume in the past — you see a very different story:
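Mechanically, a “real wage” series is just the nominal wage deflated by a price index, so the entire dispute hinges on which index you divide by. A toy sketch (the wage and index numbers below are invented for illustration, not actual BLS/BEA data):

```python
# How the choice of price deflator changes a "real wage" story.
# All numbers below are hypothetical, chosen only to show the mechanics.

def real_wage(nominal_wage: float, price_index: float, base: float = 100.0) -> float:
    """Deflate a nominal wage by a price index (index = base in the base period)."""
    return nominal_wage * base / price_index

nominal_1973, nominal_2019 = 4.00, 23.00   # hypothetical hourly wages
cpi_1973, cpi_2019 = 100.0, 600.0          # hypothetical CPI levels
pce_1973, pce_2019 = 100.0, 520.0          # hypothetical PCE deflator levels

# Same nominal wages, two different real-wage stories:
cpi_growth = real_wage(nominal_2019, cpi_2019) / real_wage(nominal_1973, cpi_1973) - 1
pce_growth = real_wage(nominal_2019, pce_2019) / real_wage(nominal_1973, pce_1973) - 1

print(f"CPI-deflated: {cpi_growth:+.1%}")  # a real decline
print(f"PCE-deflated: {pce_growth:+.1%}")  # real growth
```

With the same nominal wages, the faster-rising index shows a real decline while the slower-rising one shows real growth; that single choice of deflator is doing all the work in the “wages haven’t risen in 50 years” claim.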
This is enough to show that the U.S. economy as a whole is delivering wage growth (though less than we’d like, of course). But what we really care about on a personal level, when it comes to wage trends, is probably some combination of two questions:
How much do a typical person’s wages increase over time?
Do young people make more than their parents did at a similar age?
Ben Glasner of the Economic Innovation Group has a great chart that allows us to see the answer to those two questions in a nutshell:
We can see that every American generation’s income, except for the Silent Generation, has risen strongly over the first two decades of their working life. We can also see that Gen Xers started earning more than the Boomers after about age 27, and that Millennials started earning more than the Xers around age 25. Those generation gaps widened over time, as the younger generations pulled away from the older ones.
Most importantly, we can see that Zoomers — the current young generation — beat the wages of Millennials, Xers, and Boomers right out of the gate. Gen Z has benefitted from a strong re-acceleration of American wage growth since the early 2010s.
If people tell you that the American economy is not delivering higher wages over time, they’re just wrong.
The public discussion around birth rates is incredibly cursed. Aging and shrinking populations are a huge long-term economic problem that nobody has yet figured out how to solve. And yet debates about the problem almost instantly degenerate into racism, sexism, and accusations of racism and sexism. Thus, as a society, we’re not yet able to take this looming threat seriously.
One reason for this is the correlation between women’s education and fertility. If you look at a chart, you see that there are no high-fertility countries where the average woman completes high school:
To rightists, this correlation suggests not just a causal effect, but an ironclad law — if you want to preserve the human race, you must prevent girls from going to school. Thus, they believe that the only societies that survive will be sexist ones that treat women as breeding machines instead of economic providers. Naturally, this idea makes a lot of people very mad.
But is it true? Everyone knows how to mouth the phrase “correlation isn’t causation”, but when the rubber hits the road, do they really understand what that means? Just observing that women’s education and fertility are negatively correlated doesn’t tell you whether preventing women from going to school would make people have a bunch of kids.
Maybe the countries with low female education and high fertility are just deeply economically dysfunctional. Maybe this prevents governments from rolling out good education systems, and keeps young girls working out of economic necessity. And maybe this dysfunction also means that more kids are needed to work the fields, or that kids represent the only way people can economically survive in old age — thus raising birth rates. If this is true, then taking girls out of school won’t restore high fertility unless you also return your country to a pre-industrial standard of living.
In fact, there is some research about the causality here (which few of the people in the public debate seem interested in referencing or even reading). Some studies in poor African countries find that sending girls to school for an additional year does reduce childbearing. And that effect probably would be big enough to explain the observed fertility drop from 0 to 8 years of schooling.
But this estimate might not actually hold for higher levels of schooling; it doesn’t really tell us what happens when you go from, say, 9 to 14 years of female education. Chen (2022) looked at an expansion of higher education in China, and found that it actually raised birth rates by a significant amount. Monstad et al. (2008) found zero effect of education on fertility in Norway, and Cummins (2025) finds zero effect in England.
So it may be that while giving girls an education reduces fertility rates from the unsustainable, explosive level of 7-8 children per woman to maybe 3 or 4, the “last mile” — the drop of fertility below the replacement rate — is due to something else entirely. And since we really do not want fertility rates of 7-8, this would mean that sending girls to school is unambiguously good for population stability.
If we want to fix the fertility crisis, we should probably not try to transform our society into The Handmaid’s Tale.
One of my favorite bloggers has teamed up with one of my favorite economists to write a paper about housing costs! It’s Christmas in July! Brian Potter, the author of the excellent blog Construction Physics, knows a whole lot about construction costs. And Chad Syverson of the University of Chicago is the undisputed master1 when it comes to measuring productivity. Together, they have written a paper entitled “Building Costs and House Prices”. They write:
Perhaps the clearest conclusion of our analysis is that building costs have never had all that much explanatory power over US housing prices, but even the imperfect correlations of the past have weakened further in recent decades along multiple dimensions.
This is an important result, but — surprisingly — I’m a little unhappy about how the authors frame this conclusion. When you read a sentence like “building costs have never had all that much explanatory power over U.S. housing prices”, you might conclude that we shouldn’t worry about whether it costs $1 million to build a single small unit of affordable housing in San Francisco. And this might imply that we shouldn’t worry about policies that increase housing construction costs, such as onerous contracting requirements or construction regulations. It might also imply that we shouldn’t worry about the stagnation in construction productivity (which Brian Potter has written extensively on).
In fact, this is not what Potter and Syverson are trying to tell us. What they actually mean when they say that construction costs have little “explanatory power” over house prices is three things:
House price differences between cities aren’t very correlated with construction cost differences.
Differences in the rate of growth of house prices between cities aren’t very correlated with differences in the rate of growth of construction costs.
Changes in the rate of growth of U.S. house prices over time don’t seem to line up with changes in the rate of growth of construction costs.
You can really see these conclusions from looking at a single, excellent chart that Potter and Syverson make:
You can see that for some cities like Minneapolis, Houston, Detroit, and Atlanta, prices (the red line) and costs (the blue line) line up almost exactly. But for other cities, like San Francisco, Seattle, Los Angeles, and NYC, prices soar far above costs. These are the “superstar” cities, where people are paying a huge and growing premium to buy houses.
That difference between superstar cities and “normal” cities can explain Potter and Syverson’s findings. Since construction costs alone don’t tell you whether your city is a superstar, just looking at costs can’t predict whether prices in your city will grow like crazy or stay in line with costs. And the emergence of superstar cities as a new phenomenon over time means that the relationship between costs and prices in the country overall has broken down.
And yet if you live in Minneapolis, Houston, Detroit, or Atlanta, you probably do care a lot about construction costs. Sure, higher costs won’t turn you into San Francisco. But they probably will drive up prices. Just from basic theory, or from looking at Potter and Syverson’s chart, it’s pretty hard to argue that raising the cost of building a home in Minneapolis to $10 million with boneheaded regulations wouldn’t make housing a lot less affordable for the people of Minneapolis.
Also, the construction cost data that Potter and Syverson use is cost per house, rather than cost per square foot. When they use cost per square foot, it turns out that construction costs are a lot more correlated with housing prices than in their baseline results:
As one further check on the housing cost-price relationship, we note our [construction] cost levels for every city are for a house with standardized attributes, including size. If the size of median homes differs systematically across cities, this could be another reason for a wedge between prices and building costs. To check this, we compare both Zillow prices (though just for new construction) and RSMeans cost estimates in terms of dollars per square foot for 74 of the 100 cities in our original sample. The regression coefficient of prices on costs using this per unit-area data is 2.16 (s.e. = 0.33), with an R2 of 0.59…Adjusting for the fact that RSMeans cost estimates focus on new construction and for differences in house sizes across metros substantially raises the ability of building costs to explain house prices…That said, even in this best case, almost half of the variation in housing prices remains unexplained.
Um…an R-squared of 0.59 is really high, as empirical results go. And the chart looks like a solid correlation:
I’m not quite sure why the authors decided not to make this substantial correlation the baseline result.
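For readers who don’t work with regressions every day: R-squared is the share of the variance in the outcome (here, house price per square foot) that a linear fit on the predictor (construction cost per square foot) accounts for. A minimal sketch of the mechanics, using made-up numbers rather than the paper’s data:

```python
# Compute R^2 for a simple one-variable OLS fit by hand.
# The cost/price numbers below are invented for illustration only.

def r_squared(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Ordinary least squares slope and intercept
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # R^2 = 1 - (residual sum of squares / total sum of squares)
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

costs = [100, 120, 140, 160, 180]   # $/sq ft, hypothetical
prices = [210, 300, 280, 420, 390]  # $/sq ft, hypothetical

print(round(r_squared(costs, prices), 2))  # → 0.79
```

An R-squared of 0.59, like Potter and Syverson’s per-square-foot result, means costs account for about three-fifths of the cross-city price variance, which is indeed a strong relationship by the standards of this kind of empirical work.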
Yes, there are clearly other important factors explaining housing prices — supply restrictions and price bubbles probably being the two main ones. Yes, costs don’t explain the “superstar city” effect that has driven prices off the charts in a few coastal metros. But at the end of the day, this paper still gives me reason to be concerned about stagnating construction productivity.
After the financial crisis of 2008, a common parlor game was to guess where the next financial crisis was going to come from. It never came — at least, not so far. This is probably at least partly an observer effect — because everyone was thinking about financial crises, they were paying a lot of attention to risks of all kinds, which prevented the systemic risk-buildup that causes a crisis.
But now it’s been a long time, so maybe people are getting complacent enough that the seeds of another financial crisis could start to grow. In general, the risks that lead to a financial crisis tend to be concentrated in some hot, new, poorly understood sector of the economy — railroads in the 1870s, financial engineering in the 2000s, and so on. Today, the two obvious hot new poorly understood sectors would be crypto and AI.
And when we look for potential financial crises, we want to look at buildups of debt. It’s debt defaults, not declines in equity prices, that put financial institutions in danger of insolvency. Leverage magnifies risk, and chains of lending increase the complexity and the fragility of the system.
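The “leverage magnifies risk” point is simple balance-sheet arithmetic: when assets are funded mostly with debt, a small move in asset values produces a much larger move in equity. A sketch with a hypothetical 10x-levered balance sheet:

```python
# Leverage arithmetic: equity return = asset return x (assets / equity).
# Balance sheet below is hypothetical.

assets, equity = 100.0, 10.0   # 10x leverage
debt = assets - equity         # 90.0 owed to creditors

asset_return = -0.05           # a 5% decline in asset values
new_equity = assets * (1 + asset_return) - debt
equity_return = new_equity / equity - 1

print(f"{equity_return:.0%}")  # → -50%: a 5% asset loss halves the equity
```

A 10% asset decline would wipe the equity out entirely, which is why debt-funded losses, unlike equity losses, can cascade into insolvency.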
Crypto, being basically just a form of poorly regulated finance, seems almost designed to produce financial crises. For a long time, crypto basically didn’t involve any debt, so when it crashed, people lost their money but nothing systemic went down.2 But I start to get a little worried when I see stories like this one in USA Today:
Due to the rising cost of housing in America, many young people now think they might not ever be able to afford a new house. But don't worry, Bitcoin…could change all that and make home ownership a reality…At the end of June, the U.S. Federal Housing Finance Agency issued a new directive, instructing both Fannie Mae and Freddie Mac to count Bitcoin as an asset on single-family home mortgage applications. Previously, mortgage applicants had to convert any Bitcoin holdings into U.S. dollars if they wanted their crypto to count.
Given Bitcoin's rapid price appreciation over the past decade, this move could end up being a real game-changer…[I]nvesting in Bitcoin could help you afford your next house. Bitcoin is a disinflationary asset and a potential hedge against inflation. Best of all, Bitcoin is widely available to everyone…The really exciting part about all this is that many top investors now expect Bitcoin to hit $1 million within the next five years. For example, Cathie Wood of Ark Invest thinks Bitcoin will hit $1.48 million by the year 2030…Given Bitcoin's current price of $107,000, that's a more than 10x increase within a very short period of time. With those types of gains, you could be well on your way to home ownership in just a few years.
If you grimaced while you read that, you weren’t alone. What if a bunch of unqualified borrowers get mortgages because the government sees that they have a lot of Bitcoin, and assumes that they can pay off their mortgages with the price appreciation of Bitcoin? And then what if Bitcoin goes down in price a bunch, and people start defaulting on their mortgages en masse? The sort of glowing, breathless, “prices can never go down” prose of that article I just quoted from will sound familiar to everyone who lived through the early 2000s.
It’s also possible that the next financial crisis might come from AI — specifically, from the enormous debt-financed boom in data centers. Paul Kedrosky has a great post warning that the companies building these giant data centers are trying to hide their debt:
I was struck late this past week by Meta's rumored $29b fundraising for a rapid buildout of more AI data centers. The company is supposedly talking to various private equity firms, looking to structure it as $26 billion in debt, $3 billion in equity…A friend asked me, "Why do that? Don't they have the money?"…[C]ompanies like Meta, which can raise money from banks at low rates any time they want to, increasingly choose ... not to. Instead, they turn to private investment groups—private equity, essentially—who can create custom financing for the project. And for which the company pays a significant premium over investment grade interest rates. How much more? As much as 200-300 basis points, or 2-3%…
So, why would an investment-grade company [like Meta] agree to do that? They do it because the capital needed for these buildouts is so large that doing it with orthodox balance sheet debt, or by issuing sufficient equity, let alone spending your cash, would make a mess of your balance sheet…By structuring it this way, via special purpose vehicles (SPVs) in which they have joint ownership, companies like Meta don't have to show the debt as their debt. It is the debt of those guys over there, that SPV. Not us. Granted, they retain shared control, and they get to use the AI data center, and nothing there happens without their say-so, but still. It's not ours.
This is accounting trickery, of course. It is a transparent attempt to raise large amounts of money without balance sheet damage by putting the debt in a vehicle you indirectly control, but that, for accounting reasons, doesn't have to be disclosed as your debt on your balance sheet…
At the same time, an increasing percentage of private credit providers are funded, in part, by controlling interests in insurance companies, whose capital they use to fund investments. Finding investments that generate higher yields without higher risk—lending at above-market rates to companies like Meta—is exactly what they want to get higher yield while not running afoul of insurance regulators.
And while it all makes perfect sense as financial engineering, this is where the risks start. Why? Because this system creates a powerful incentive loop between structurally overcapitalized insurers, return-hungry private equity firms, and mega-cap companies trying to avoid looking like they're leveraging up. Everyone gets what they want—until something breaks.
That also sounds like a classic buildup of financial risks. The AI boom is a staggeringly huge one-way bet on a single technology. Sometimes, as with the railroads in the 1870s, a technology that really does end up transforming the world can still create a messy financial crisis along the way.
For my first few years as a blogger, I was primarily known as a critic of macroeconomics as a discipline. The data was far too patchy and confounded to warrant most of the strong conclusions that macroeconomists routinely proclaimed. And macroeconomists rarely tested the “microfoundations” (assumptions about consumer and corporate behavior) that undergirded their models. No one ever seemed to be able to draw a definitive conclusion in macro — a lot of it was just about who could shout the loudest in seminars, or appeal to the right political opinions, or deluge their critics with a blizzard of equations. It was not very scientific.
Microeconomics, on the other hand, seemed a lot like a real science. Micro theory often really worked — auction theory gave us Google’s business model, consumer theory really could predict consumer behavior (and, often, prices), matching theory was used in a bunch of applications, and so on. And on the empirical side, the “credibility revolution” brought causal methods to the fore, allowing us to make good educated guesses about the effects of many policy changes.
I found that most economists I talked to agreed with me on this basic divide. Now, a decade later, we have some data to show that economists as a whole have been walking away from the grandiosity of macro toward the humble reliability of micro. Garg and Fetzer (2025) use LLMs to analyze the topics of economics papers. Like other researchers, they find that causal empirical methods have been taking over the discipline since the 1990s. But they also note that macroeconomic topics have become steadily less “central” to econ research:
Don’t let the nerdy labels on the graph distract you here; this just means that the authors’ AI says that macro stuff is getting less important as the focus of econ papers, and micro stuff is getting more important, and that the big change happened since the late 1990s. (This chart is actually for papers without causal empirical methods; there’s another similar one for those with causal methods, and the pattern is the same.)
If you think economics chases current events, this could come as a shock. After all, the 90s and early 00s were the “Great Moderation”, where macro events seemed to matter less and less, while the years since 2008 have seen a once-in-a-century financial crisis, a long-lasting global recession, and the return of high inflation. Why are economists thinking less about macro, even as macro has become more important?
The obvious interpretation here is that economists, in general, are walking away from the temptation to spin big, unprovable theories about great big questions, and working on smaller, more answerable questions instead, making use of the new data and methods that the computer revolution has provided them. This suggests that most economists had the same realization I did, and they dealt with it by making their profession more humble and more scientific. Next time you feel the urge to declare that “economics isn’t a science”, remember how the field has changed.
In these roundups, I love keeping track of YIMBY political victories as they pop up. Last week we had a big one. California has a law called CEQA, which is basically a version of the national law NEPA but on steroids. CEQA basically allows NIMBYs to sue any development project for practically any reason — in Berkeley, some people sued to block a low-income housing development, on the grounds that college students who would live there are a form of human pollution. They lost, but the potential for such lawsuits exerts a huge chilling effect on development of all kinds, which is probably one reason California has gotten so unaffordable in recent decades.
Last week, Gavin Newsom signed two bills that will make CEQA a little less of a problem. Here’s a quick explanation of what the bills do:
The first bill, AB 130, expands existing exemptions for infill housing projects in urban areas. Under the revised law, housing projects on infill sites of up to 20 acres will be exempt from CEQA review if certain conditions are met…To be eligible for the expanded exemption under AB 130, an infill housing project must comply with zoning rules, must not be located in a sensitive habitat area, must not require the demolition of any structure on a historic register, and must not be used for temporary lodging (e.g., as a hotel).
The second bill, AB 131, establishes a simplified CEQA review process for infill housing projects that would otherwise not qualify for a CEQA exemption because of a failure to meet a single condition. Under the revised law, these projects will only need to analyze that single failed condition under CEQA instead of conducting a complete environmental review…Among additional changes aimed at streamlining the CEQA process for housing projects, AB 131 also (1) exempts from CEQA review any rezonings consistent with the housing element of a local government’s general plan, and (2) requires the Governor’s Office of Land Use and Climate Innovation to develop, by July 2027, a map of underutilized land within existing urban areas where new infill developments could be built.
As deregulations go, this might sound like small-bore stuff, given all the conditions and caveats. But Chris Elmendorf, a UC Davis law professor who has been among CEQA’s most dogged and well-informed critics, believes that the new laws could speed up permitting times for housing in some urban areas by “a lot more than 25%”. And he praises the new laws for avoiding the onerous requirements (“bagel toppings”) that previous YIMBY bills in California had imposed.
So although it’s only a small start, this is a meaningful victory. And my favorite thing about it was what Gavin Newsom said when he signed the bills:
And as Sam D’Amico points out, many of the legal changes apply to advanced manufacturing as well as housing. That’s excellent, and makes me a tiny bit more optimistic about California’s future.
For years, many of us regarded America’s huge prison population as something that required policy changes to solve. But what if the problem ended up solving itself?
From the early 1990s to 2014 there was a huge crime decline in America. Violent crime started to inch up in 2015, and then had a sharp 2-year spike in 2020-21, but in the last few years it has been plummeting back down. And property crime continues its long decline.
As Keith Humphreys notes in The Atlantic, that crime decline is leading pretty mechanically to a drop in youth arrests:
And as he also notes, this is leading to a huge drop in the imprisonment of young people:
Over the next few decades, much of the rapidly graying American prison population will die or be released, and — assuming crime doesn’t soar again — they won’t be replaced by fresh batches of younger criminals. America’s incarceration rate will plummet — not because we decided to be more lenient, but simply because there was a lot less crime for a very long time.
The way to solve mass incarceration is just to reduce crime.
One might even say he is the Chad.
A small exception was the run on a couple of crypto-focused medium-sized regional banks back in early 2023, but that was easily taken care of.
2025-07-06 12:48:26
I remember a moment during the 2012 presidential campaign, when a woman sobbed on camera and cried “I want my America back!”. It bewildered me at the time; as far as I could tell, the America of 2012 was the same America I grew up in — unruly, anti-intellectual, independent to a fault, but kind to its neighbors, hard-working, fiercely protective of its freedoms, and generally accepting of those who were different. Like many others, I shrugged and concluded that the woman who was sobbing on camera was simply upset about the fact that the President was Black.
Thirteen years later, I’m still not sure exactly what that woman was upset about. But I definitely feel that the shoe is now on the other foot. Today I look out at my own country and feel an intense sense of loss and longing for something that may no longer exist. The shared values that I felt permeated and undergirded my culture haven’t vanished completely, but among a large segment of the populace, it feels like they’ve been replaced with politicized anger.
There is data to support this feeling. Algan et al. (2025) use AI to analyze the sentiment of tweets — both a random sample of tweets, and the tweets of a few hundred of the most prominent political shouters in America (whom they label “partisan citizens”). They find that from around 2016-2019, Americans of both political stripes became much angrier online:
Around that same time, Democrats and political Independents became much less proud to be American:
And a well-publicized poll in 2023 found that Americans prioritize community involvement and tolerance less, and money more, than they did in earlier years:
Some 38% of respondents said patriotism was very important to them, and 39% said religion was very important. That was down sharply from when the Journal first asked the question in 1998, when 70% deemed patriotism to be very important, and 62% said so of religion…The share of Americans who say that having children, involvement in their community and hard work are very important values has also fallen. Tolerance for others, deemed very important by 80% of Americans as recently as four years ago, has fallen to 58% since then.
I’m not the only one who feels this way, either. Right now I’m in Japan promoting my book, and on the 4th of July I gave a talk at a university. A Japanese woman who had lived in America in the 1990s told me that she found American society to be generous, kind, tolerant, and helpful. But when she goes back now, she said, she sees a lot of anger and the culture feels a lot colder.
It’s tempting to look at these data points and anecdotes and conclude that the old America has been swept away by the tide of history, like the Roman Empire or the Qing Dynasty, replaced with something unrecognizable and baleful that just happens to exist on the same plot of land. But I think there’s also evidence that the America I grew up in has not been entirely replaced — that it still exists, battered and a bit shrunken, obscured by the constant flood of social media hate.
For example, when the 2023 poll came out, Erin Norman wrote a skeptical post, arguing that the shift in values was overstated:
If you don’t make faulty comparisons to previous surveys, the data in the new WSJ/NORC poll is encouraging. Self-fulfillment…is important to 91 percent of Americans. Hard work…tops the list, with 94 percent saying it is important to them…Seventy percent value marriage and 65 percent value having children…90 percent of Americans believe “tolerance for others” is important, and over half qualify it as “very” important…
Perhaps the anonymity of online communication has shown us we aren’t exactly who we thought we were and that there is more diversity of thought in America than stereotypes would suggest. But the WSJ/NORC poll shows that the big-tent, melting pot version of America is very much alive and well.
And a 2024 survey by the Cato Institute found that most Americans of both parties still say they value basic American freedoms:
And at times, the America I grew up in pops up to remind me that it still exists. A few weeks ago I attended one of the largest protests in American history, and everyone was waving American flags and talking about freedom:
And while social media is filled with shouting and hate, mass media still generally portrays the same country I remember from my youth — or even an improved version. The best example of this is the TV show Cobra Kai, which just concluded a 6-season run. Cobra Kai is a follow-up to the Karate Kid movies of the 1980s, focusing on the character of Johnny Lawrence, one of the antagonists from the original 1984 film. Now an aging adult, Johnny belatedly learns to temper the violent, toxic masculinity of his youth with adult values of personal responsibility, community, family, and so on, without ever losing his inherent toughness.
The show is one of the most heartwarming things I’ve ever seen on a screen, and I strongly recommend it. It doesn’t portray a return to the America of the 1980s, but rather an alternate future for that America — a projection of how the country could have kept getting better in the ways it seemed to be getting better back then. The cast is diverse, but racial politics never dominates the story.1 Social media exists, but merely as an adjunct to real life instead of a fantasy-land that absorbs young people’s every waking minute. Real-life community and healthy relationships end up winning out over everything else.
This isn’t a vision of what America is actually like in the 2020s. We can’t simply put down our phones and “touch grass” and go live in the world of Cobra Kai, because that country doesn’t exist. In the real America, young people are glued to TikTok all day, and romantic relationships in high school are now rare, and an Ecuadorian family like that of the protagonist Miguel Diaz would probably worry about the immigration system in some regard. There’s no Trump in Cobra Kai, and no ICE raids. Covid seems never to have happened, and there’s no mention of the BLM protests or the racial tensions of the late 2010s.
Instead, I think fantasies like Cobra Kai show us visions of an America we could have, if we could overcome the forces that filled us with rage and made so many of us despise our nation. And being the technological determinist that I am, I think we should regard our enemies first and foremost as technological. Our country’s most potent and terrible enemy is in our own pockets.
And having located that enemy, I believe that we can pretty quickly see how to strike it where it lives, and take our country back.
2025-07-04 22:02:13
I usually do a “state of the nation” post on July 4th, but I was traveling to Japan today, and the wifi didn’t work on the plane, so I’ll have to do it tomorrow. In the meantime, I thought I’d repost a fun post that I wrote back in 2022, about how an excessive boom in higher education, coupled with a saturation in the markets for many humanities-based jobs, might have contributed to America’s era of unrest.
Three years later, I think most of what I said in this post still looks right. The decline in college enrollment suggests that Americans might have collectively realized that a bachelor’s degree isn’t an automatic ticket to a comfortable lifestyle. But we may still be in for a second round of elite overproduction, because the “practical” STEM majors that lots of students shifted into in response to the humanities bust are now seeing higher unemployment:
This is due in large part to the crash in tech sector hiring over the past two years:
In fact, unemployment is now higher for recent college graduates than for the general public:
Some people think these trends are due to the rise of generative AI; others disagree. But whatever the reason, the failure of STEM to provide a secure alternative career path in the wake of the big humanities bust might be setting us up for more unrest among the youth.
Anyway, here’s that original post from 2022:
“We're talented and bright/ We're lonely and uptight/ We've found some lovely ways/ To disappoint” — The Weakerthans
Here’s an eye-opening bit of data: The percent of U.S. college students majoring in the humanities has absolutely crashed since 2010.
Ben Schmidt has many more interesting data points in his Twitter thread. To me the most striking was that there are now almost as many people majoring in computer science as in all of the humanities put together:
When you look at the data, it becomes very apparent why the shift is happening. College kids increasingly want majors that will lead them directly to secure and/or high-paying jobs. That’s why STEM and medical fields — and to a lesser degree, blue-collar job-focused fields like hospitality — have been on the rise.
But looking back at that big bump of humanities majors in the 2000s and early 2010s (the raw numbers are here), and thinking about the social unrest America has experienced over the last 8 years, makes me think about Peter Turchin’s theory of elite overproduction. Basically, the idea here is that America produced a lot of highly educated people with great expectations for their place in American society, but that our economic and social system was unable to accommodate many of these expectations, causing them to turn to leftist politics and other disruptive actions out of frustration and disappointment. From the Wikipedia article on Turchin’s theory:
Elite overproduction has been cited as a root cause of political tension in the U.S., as so many well-educated Millennials are either unemployed, underemployed, or otherwise not achieving the high status they expect. Even then, the nation continued to produce excess PhD holders before the COVID-19 pandemic hit, especially in the humanities and social sciences, for which employment prospects were dim.
Turchin and the others who have suggested this theory make some questionable assumptions about how labor markets work — in general, they focus on labor supply while ignoring the importance of labor demand. But still, the Elite Overproduction Hypothesis is fascinating — there’s some circumstantial evidence in its favor, and it dovetails nicely with some other economic and sociological theories I know of. I think it’s a good candidate for explaining at least some of the unrest — and in particular, the resurgence of leftist politics — that we’ve seen in the U.S. recently.
I went over this idea in a Twitter thread back in 2018 (in response to an earlier batch of data about the humanities), but I thought it deserved a longer treatment. As you read this, keep in mind that I’m just making the case for this hypothesis; I’m not sure how much of the last decade it really explains, but I think it’s plausible enough to deserve serious thought.
If you graduated with a degree in English or History back in 2006, what would you do with that degree? If you wanted a secure, stable, prestigious, high-paying job, you could go to law school and be a lawyer. If you wanted to live on the East Coast and work in an industry with a romantic reputation, you could work in media or publishing. If you just wanted intellectual stimulation and prestige, you could try for academia. If you just wanted security and stability and didn’t care that much about money or glamour, you could be a K-12 teacher, or work for the government.
This wealth of career paths probably made young people feel that it was safe to major in the humanities — that despite the stereotype that you couldn’t do anything with an English degree, there was still tons of work out there for them if they wanted it. Studying humanities was fun, it made you feel like an intellectual, and the social opportunities were probably a lot better than if you were stuck in a lab or in front of a computer screen coding all day. And if after a few years enjoying the fullness of youth you felt like going nose-to-the-grindstone and getting that big suburban house and dog and kids and two-car garage like your parents had, well, you could just go to law school.
But in the years after the Great Recession, every one of these career paths has become much more difficult.
First, let’s start with the most important one — the humanity major’s ultimate fallback, the legal profession. Starting around 1970, there was a massive boom in the number of lawyers per capita in the U.S., but by the turn of the century it had started to level off:
As Jordan Weissmann wrote back in 2012, the shakeout that began in 2008 led to a stagnation in employment in legal services. You could see it in little things like the decline of the “billable hour” (which basically raised lawyers’ incomes by allowing them to overcharge a bit). There was a glut of young people going to law school, but not enough jobs to fulfill their expectations. So after a few years, people realized this, and there was a big crash in law school enrollment:
How about publishing? Here, the decline of titans like Conde Nast is no mere anecdote. The industry also suffered from the Great Recession, but it was probably in long-term decline since the turn of the century:
There’s the internet, of course. Digital publishing is growing, but it seems highly unlikely to make up for the devastation in newsrooms, books, and magazines:
As for academia, tenure-track hiring in the humanities was never exactly robust, but with the decline in higher education funding after the Great Recession, it went into deep decline:
Universities were saving money by replacing tenured faculty with low-paid adjuncts. This led to the horror stories of adjuncts sleeping in their cars, hanging on year after year in the desperate hope that somehow they would catch a lucky break and ascend into the ranks of the tenure track.
How about a job as a public servant? 2008 marked the end of a long boom in government employment:
The same story holds in the K-12 teaching profession. In addition to stagnating employment after 2008, that industry is anything but cushy:
So all of these traditional career paths for humanities graduates suffered in the late 2000s and 2010s. But at the same time, there had been a giant boom in the number of people studying humanities. I showed the percentages above, but looking at the raw numbers gives a better idea of how many people surged into these fields in the 2000s and early 2010s:
That surge set a lot of people up for career disappointment at exactly the wrong time.
So who was hiring in the 2010s? Not Wall Street, which had been at least temporarily tamed by the financial crisis and the Dodd-Frank financial reform. Finance became a tamer and moderately less prestigious industry, and overall employment stagnated. Michael Lewis was famously able to take his art and archaeology degrees and walk into a job as a bond salesman in the 1980s, but in the shakeout after 2008 that was just a lot less possible.
There was Silicon Valley, of course; the 2010s featured the Second Tech Boom, the rise of Google and Facebook and the rest of Big Tech, and the explosion of the venture-funded startup economy. But overall information technology jobs were also in the dumps; sure, you could make a lot of money as an engineer at Google, but “learn to code” is not exactly something you want to be told right after graduating with a degree in art history.
So yes, there were jobs out there in the 2010s. But everything was a lot more competitive than in the decades before — even just to get a regular boring job in corporate America, you had to fight and scrabble. And the kind of intellectually rewarding or socially prestigious careers that humanities majors had prepared for were in especially short supply.
The Elite Overproduction Hypothesis says that this situation produced a combustible social environment that exploded into the unrest of the late 2010s. But why would that happen? Here we have to turn to theory.
The basic concept here hearkens back to the mid 20th century concept of the “revolution of rising expectations”. Here’s a concise statement of the idea:
In the 1960s researchers in sociology and political science applied the concept of the revolution of rising expectations to explain not only the attractiveness of communism in many third world countries but also revolutions in general, for example, the French, American, Russian, and Mexican revolutions. In 1969 James C. Davies used those cases to illustrate his J-curve hypothesis, a formal model of the relationships among rising expectations, their level of satisfaction, and revolutionary upheavals. He proposed that revolution is likely when, after a long period of rising expectations accompanied by a parallel increase in their satisfaction, a downturn occurs. When perceptions of need satisfaction decrease but expectations continue to rise, a widening gap is created between expectations and reality. That gap eventually becomes intolerable and sets the stage for rebellion against a social system that fails to fulfill its promises.
In fact, this idea hearkens back at least to Alexis de Tocqueville, and is sometimes called the Tocqueville Effect. Some people claim to have found evidence for this process.
Why would this happen? If things get better for 20 years and then stop, why would you be mad? After all, at least things are better than they were 20 years ago, right?
But expectations matter. In the finance world, a number of economists have recently been playing around with the idea of “extrapolative expectations”. Basically, when a trend goes on long enough, people start to think there’s some sort of structural process underlying the trend, and therefore they assume the trend will continue indefinitely. For upwardly mobile people, or people in an economy that’s growing rapidly, or people whose stocks or houses are appreciating steadily in value, good times might come to seem normal.
And then what happens when it turns out that good times aren’t baked into the nature of the Universe? Suddenly, the mediocrity of reality intrudes upon the complacent expectations of eternal upward growth — housing prices plateau or fall, incomes hit a ceiling, economic growth stalls out. At this point, people could get quite angry. The economists Miles Kimball and Robert Willis have a theory that happiness is just the difference between reality and expectations. If things are better than you predicted, you’re happy; if things are worse, you’re upset. Kimball and Willis formalize the idea with math, but in fact “Happiness = Reality - Expectations” is already a common saying. Evidence from surveys generally supports the idea.
Together, this expectations-based theory of happiness, along with the idea that expectations are extrapolative, makes for a combustible mix. Extrapolative expectations are almost always unrealistic — growth trends don’t continue forever, so people are setting themselves up for disappointment.
A number of people have invoked this idea to explain the massive global wave of protests in 2019 and 2020. Some World Bank researchers wrote that “[Latin American] protesters…are emboldened by recent social gains, rather than by worsening conditions, to demand levels of fairness and equality which are still far from their reality.” In Chile, the Latin American country with the most intense and widespread protests, there was also a growth slowdown in the mid-2010s after decades of rapidly rising living standards:
Anyway, it’s pretty simple to apply this to the U.S. in the 2010s. Productivity growth, which had been robust since the early 90s, slowed down sharply around 2005. Housing prices — a big determinant of middle-class wealth — plateaued in 2006 and began to decline in 2007. And the economy crashed in the Great Recession.
But for elites, especially those on the humanities track, the years after the Great Recession were a particularly brutal slap in the face. Income had largely stagnated for Americans with low and medium incomes, but for people in the upper middle class there had still been steady growth — and the upper middle class is the class in which college graduates typically expect to find themselves. The fact that so many young people flooded into humanities majors in the 2000s and early 2010s suggests that lots of them expected a double bounty — to be able to earn a good income while also having a career that fit their personal interests.
In a 2013 blog post, Tim Urban showed some data that supports this story. Here is a Google Ngrams search for the phrase “a fulfilling career”:
Urban somewhat mockingly depicts educated Millennial expectations with the following meme:
I don’t think people deserve to be mocked for having great expectations for their lives, or for being frustrated when those expectations don’t pan out. Try to think of things from the perspective of a 25-year-old who just graduated from UC Davis in 2010 with an English degree. For the past four years, you’ve lived the life of an intellectual — you’ve read dozens of books, expanded your mind with a hundred deep ideas about society and history and the purpose of life, spent long nights discussing and debating those ideas with people just as smart as you are. And all that time, whether you’re the first in your family to go to college or a scion of an upper-middle-class family looking to make your parents proud, you’ve been told that college is the ticket to a spot in the top 20% of American society. You and your parents have certainly paid a price tag that reflects that expectation! And on top of that, everyone has told you that you can (and should!) find a career doing something you love, something that helps the world, and something that uses the education you paid so much to get.
Then you graduate, and nobody wants lawyers, magazines are dying, newsrooms are dying, universities aren’t hiring, and your best bet is either to roll the dice again with years of grad school or to claw your way into some corporate drone job where you’ll be filing TPS reports all day while your diploma rots in a box in your parents’ attic. Meanwhile you’re stuck with $40,000 in undergraduate debt, and the payments are now coming due. It’s neither entitled nor bratty nor arrogant to be unhappy with that outcome.
So I think this is a strong candidate for explaining why unrest exploded among the American elite in the late 2010s.
I’ve been talking about humanities majors so far, because Ben Schmidt’s data is so striking, and because humanities careers seem to have borne the brunt of the post-2008 economic shakeout. But although it may have been most intense among downwardly mobile or unemployable English majors, the unrest in the 2010s was really a broader phenomenon that touched most of America’s young elites.
Various polls throughout the decade showed that young Americans with college degrees were a bit less happy at work than their high-school-educated peers, despite making a lot more money.
It’s easy to draw a line between this unhappiness and the socialist movement in the U.S. Socialism rapidly became more popular among young Americans in the 2010s, and the Bernie Sanders movement exploded upon the national scene. The socialist movement has people from all classes, but overall it’s far from a proletarian movement — this is fundamentally a revolt of the professional-managerial class, or at least the people who expected their education to make them a part of that class. It’s telling that two of the new socialist movement’s most passionate crusades have been student debt forgiveness and free college.
For me, a telling anecdote that first clued me into this hypothesis was when I debated Jacobin writer Meagan Day in 2018. When I pointed out that very few Americans are financially destitute, she responded that “it’s not just destitution, it’s disappointment”, and proceeded to describe her own frustration with the two unpaid internships she went through as a struggling college-educated writer. (This is far from the only such origin story.) From that moment onward, when socialists with college degrees talked to me about the “working class”, it became clear to me that the class they were describing was themselves.
But educated youth unrest in the late 2010s went far beyond socialism. In the 60s it was the urban poor who rioted, but surveys found that the people who flooded into the streets during the massive protests of summer 2020 were disproportionately college-educated. It’s even possible to see wokeness itself as partly an expression of frustration with the stagnant hierarchies of elite society in early 2010s America. After all, if the number of spots at university departments and companies and schools and government agencies suddenly stops growing, it means that young people’s upward mobility will be blocked by an incumbent cohort of older people who — given the greater discrimination and different demographics of earlier decades — are disproportionately White and male.
We like to think of revolutions as being carried out by downtrodden factory workers and farmers, and in some cases that’s true. But frustrated and underemployed elites are uniquely well-positioned to disrupt society. They have the talent, the connections, and the time to organize radical movements and promulgate radical ideas. So far, education polarization means that a large fraction of the non-college majority hasn’t chosen to join these movements (or has expressed unrest in different, far less intellectual ways). But a society that generates a large cohort of restless, frustrated, talented, highly educated young people is asking for trouble.
So if the Elite Overproduction Hypothesis is broadly correct, how do we get out of this mess? If happiness equals reality minus expectations, simple math tells us that we basically have two options for pacifying our educated youth — improve reality, or reduce expectations.
Improving reality is very hard, but we’re working on it. The industrial policies of the Biden administration are aimed at jump-starting faster economic growth, and more progressives are talking about an “abundance agenda” that would reduce the cost of living for Americans of all classes. But barring a lucky break like the simultaneous tech boom and cheap oil of the 1990s, boosting growth and abundance will be painstakingly slow going. It will also require overcoming the opposition of a whole lot of vested interests — particularly local NIMBYs — who themselves will be disappointed and angry if the government railroads their parochial preferences to fulfill its national objectives.
A more feasible strategy is to reset expectations to a more realistic — or even pessimistic — level. If we take humanities majors as a measure of economic optimism, we can already see this happening, as young people turn to more practical degrees. Interestingly, Google Ngrams for “a fulfilling career” have now ticked down as well. The over-optimistic angry Millennial generation may soon be supplanted by a Generation Z whose modest expectations echo those of their Gen X parents in the late 70s and early 80s.
There may be things that cultural creators and media figures like myself might be able to do to help this “expectations reset” along. Perhaps we should emphasize grit and struggle instead of talking so much about wealth and personal fulfillment.
The government and universities have to be part of this too. Canceling student debt is fine, but long-term reforms to reduce the cost of college, and the debt burdens students incur, will reduce the stakes of the post-college job scramble. Universities should avoid marketing materials that depict them as a golden ticket to wealth and intellectual fulfillment, and should offer career counseling that prepares students for a realistic job market. And government should implement apprenticeships, vocational education, free community college, and other programs that make working-class life a decent bet — in addition to reducing inequality, this will make college graduates feel less “elite” relative to their non-college peers.
Of course, an expectations reset isn’t permanent. If we manage to restore good times, future generations will get used to that upward escalator, and they’ll form extrapolative expectations for their own glide path to success. But that’s a worry for the future. If the Elite Overproduction Hypothesis is true, then our best bet to calm our age of unrest is to bring our dreams down to Earth.
2025-07-02 16:16:00
“I wanted the money.” — Edward Pierce
Donald Trump has done away with much of the Reaganite conservative ideology that defined the Republican party of my youth. But one Reagan tradition remains in place: Every time the GOP finds itself in control of both Congress and the Presidency, they pass a giant tax cut. Bush did this in 2001, Trump did this in 2017, and now Trump is about to do it once again in 2025. Trump’s rather ludicrously titled One Big Beautiful Bill Act is, first and foremost, a tax cut bill. The Economist tallied up the numbers on the version of the OBBBA that just passed the Senate, and found that tax cuts dominated everything else in terms of their impact on the government’s finances:
A more detailed breakdown is available from the Committee for a Responsible Federal Budget. The New York Times has found similar numbers for the House version of the bill.
A lot of people are mad about the OBBBA, for many reasons — the health insurance cutoffs, the huge cuts to scientific research, the crazy energy policy, and so on. But really, this bill is first and foremost about tax cuts for the rich.
Those new tax cuts will require a staggering amount of government debt. Even with all the money that the OBBBA will cut from Medicaid, energy, and so on, the Senate version will add an estimated $3.9 trillion to the federal debt over the next decade. Federal debt is already on an unsustainable path, but this will get much worse after Trump’s big tax cut:
In fact, that’s a low estimate; the truth is probably much worse. As the CRFB points out, Trump’s bill pretends that some of its tax cuts will expire in the future, when in fact they will probably be made permanent. That would raise the debt cost of the bill by an additional trillion dollars or more.
What is the point of borrowing money to cut taxes? Every time they do this, Republicans claim that their tax cuts will supercharge economic growth so much that they’ll pay for themselves, and actually reduce the deficit. And if you go on the White House website, you can still find them making this claim:
MYTH: The One Big Beautiful Bill “increases the deficit.”
FACT: The One Big Beautiful Bill reduces deficits by over $2 trillion by increasing economic growth and cutting waste, fraud, and abuse across government programs at an unprecedented rate….President Trump’s pro-growth economic formula will reduce the deficit…
MYTH: “But the CBO says….”
FACT: The Crooked Budget Office has a terrible record with its predictions and hasn’t earned the attention the media gives it. The CBO misreads the economic consequences of not extending the Trump Tax Cuts. The One Big Beautiful Bill delivers real savings that will unleash our economy and prevent the largest tax hike in history, resulting in historic prosperity, while lowering the debt burden.
In the 1980s, you could probably find a few economists who actually believed that tax cuts would pay for themselves. Nowadays, no one really thinks this is true; every big tax cut since the Reagan days has increased the federal debt.
In fact, Trump’s 2017 tax cuts were better in this regard than most. Economists generally agree that corporate taxes are more harmful to the economy than personal income taxes, so when Trump cut the corporate tax rate, some were optimistic. But while Trump’s first tax cut did spur a bit of growth, Chodorow-Reich and Zidar (2024) find that this only offset a tiny amount of the deficit that the tax cut created:
Domestic investment of firms with the mean tax change increases 20% versus a no-change baseline. Due to novel foreign incentives, foreign capital of U.S. multinationals rises substantially. These incentives also boost domestic investment, indicating complementarity between domestic and foreign capital. In the model, the long-run effect on domestic capital in general equilibrium is 7% and the tax revenue feedback from growth offsets only 2p.p. of the direct cost of 41% of pre-TCJA corporate revenue.
This time around, things are likely to be even worse, because of long-term interest rates.
In the late 2010s, for reasons we still don’t entirely understand, America was still in a disinflationary regime, where the Fed could and did keep interest rates at zero without causing inflation. ZIRP meant that big deficits didn’t push up interest rates. Since the pandemic, however, America has been in an inflationary regime, where the Fed has had to keep interest rates around 4-5% in order to prevent inflation from rising. In other words, it looks like interest rates have normalized, and that means there are probably no more free lunches.
There are basically three ways that increased debt can make long-term interest rates rise.
First, more government borrowing can increase inflation. People may assume the government will print money in order to help pay off the debt in the future. This can become a self-fulfilling prophecy, where businesses raise prices in order to get ahead of the inflation — thus causing the very inflation they feared. In this case, the Fed will have to hike short-term rates in order to squash inflation, and raising short-term rates causes long-term rates to rise as well.
Second, higher deficits can crowd out private investment. If banks and bond investors have only a limited amount to invest, then higher government debt can starve the private sector of capital, forcing everyone to pay higher interest rates in order to borrow.
Finally, higher deficits can introduce a default risk premium into long-term interest rates. Investors may assume that the government’s reckless borrowing is a sign that it doesn’t take its own future solvency seriously. At that point, they may start charging the U.S. government a premium to borrow money. If that happens, private businesses have to pay higher rates too.
The Yale Budget Lab thus predicts that Trump’s OBBBA will cause long-term rates to rise:
This would hurt U.S. economic growth, since higher interest rates make it harder for companies to borrow and invest and grow:
A 3% lower GDP is only a small hit to growth, but it’s in the wrong direction — instead of Trump’s new tax cuts paying for themselves, they’ll actually make themselves slightly harder to pay for. And increased rates will also put the U.S. government in a more perilous fiscal position, forcing it to borrow more and more just to service its debt in the future. This line is only going to go higher:
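A quick toy projection shows why the line keeps going higher. This is a sketch, not a forecast: the parameters below (debt near 100% of GDP, a 4.5% average interest rate, 4% nominal growth, a 3% primary deficit) are illustrative assumptions of mine, not figures from the Yale Budget Lab or the CRFB.

```python
# Hedged toy projection (illustrative parameters, not a forecast):
# debt-to-GDP drifts upward whenever the primary deficit persists and
# the interest rate exceeds nominal GDP growth.
debt_to_gdp = 1.00      # assumed: debt roughly 100% of GDP today
interest_rate = 0.045   # assumed average rate on federal debt
nominal_growth = 0.04   # assumed nominal GDP growth
primary_deficit = 0.03  # assumed deficit before interest, share of GDP

for year in range(30):
    interest_cost = interest_rate * debt_to_gdp
    # New debt = old debt + interest + primary deficit, scaled by GDP growth
    debt_to_gdp = (debt_to_gdp + interest_cost + primary_deficit) / (1 + nominal_growth)

# Under these assumptions, debt-to-GDP roughly doubles in 30 years,
# and the interest bill itself becomes a major driver of new borrowing.
print(f"debt-to-GDP after 30 years: {debt_to_gdp:.0%}")
```

The mechanism is the self-reinforcing one described above: a higher rate raises the interest bill, which raises borrowing, which raises the debt on which interest is charged.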
So I doubt that anyone in the administration or the GOP still believes the old line that tax cuts pay for themselves.
If juicing economic growth isn’t actually the point, why is the GOP running up government deficits in order to cut taxes by trillions of dollars? Essentially, they’re promising to tax tomorrow’s taxpayers — whether via higher taxes, inflation, or a sovereign default — in order to give today’s taxpayers a rebate. Why would they do that?
I can think of a number of reasons, none of them very encouraging about the state of our nation. The GOP might simply have an eye on the electoral cycle, hoping to hand out goodies in order to secure a donor base and shore up support among the wealthy voters who have been defecting to the Democrats in recent elections. They might think that a short-term Keynesian stimulus from tax cuts might outweigh the long-term economic drag for a couple of years, allowing them to do better in the 2026 midterms or the 2028 presidential election.
The darkest possibility is that some Republican leaders think that America is effectively a walking corpse of a nation — that its future is nonwhite and non-Christian, and therefore Republicans might as well simply raid and scavenge as much as they can before America turns into South Africa.
But that’s probably too dark. I think the most likely explanation is that Republicans and their existing donor base simply want the money. The U.S. electorate’s obsession with culture wars gives elites an opening to simply raid the U.S. Treasury without suffering major consequences. Few people really like the OBBBA, and the public probably realizes that the bill is an engine of upward redistribution. But maybe the GOP thinks that when the next election comes around, voters will grumblingly ignore bread-and-butter issues and vote based on immigration, DEI, trans issues, or whatever the current culture war happens to be.
In any case, America simply can’t afford to keep giving big tax cuts to rich people. The government was already running massive deficits, and paying an increasing interest bill on its existing stock of debt. On top of that, the aging of the U.S. population means that Medicare (and to a lesser extent, Social Security) is going to end up costing a lot more over the next three decades:
All of this means that the U.S. has to both raise taxes and cut spending in order to maintain solvency and keep interest costs down.1 The Medicaid cuts that the Republicans are enacting are cruel, but unless our government comes up with some way to control health costs much more effectively than ever before, something along those general lines will eventually be necessary. There just isn’t much else that the government spends a lot of money on; defense spending has been cut to the bone, even as foreign threats proliferate:
So health spending was always going to have to be cut, and even if the poor could have been spared the brunt of those cuts, it was always going to hurt the middle class. That’s bad.
The way to make that bargain seem fair was always to tax the rich. In the 1990s, Bill Clinton hiked taxes on the rich, and managed to raise federal revenue from 16.7% of GDP in 1992 to 19% of GDP by 1998. 2.3% of GDP might not sound like a lot, but today that would be $690 billion a year, or $6.9 trillion over a decade. The rich didn’t even get particularly mad about this. And it ended up making Clinton’s fiscal austerity a lot more palatable to the masses, because people knew the rich were paying their fair share to help bring the deficit down.
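The arithmetic behind those figures is easy to check (assuming, as the text does, U.S. nominal GDP of roughly $30 trillion today):

```python
# Back-of-the-envelope check of the Clinton-era revenue figures
# (assumes U.S. nominal GDP of roughly $30 trillion):
gdp_trillions = 30.0
revenue_gain_share = 0.19 - 0.167   # revenue rise: 16.7% -> 19% of GDP

annual_gain = revenue_gain_share * gdp_trillions
print(f"${annual_gain * 1000:.0f} billion per year")
print(f"${annual_gain * 10:.1f} trillion per decade")
```

Run it and you recover the $690 billion a year and $6.9 trillion per decade cited above.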
And yet somewhere in the years since Clinton, the Democrats lost their appetite for taxing the rich. Obama taxed the rich a little bit, and Biden only by a token amount. The U.S. went from having one fiscally responsible party in the 1990s to having zero fiscally responsible parties. Debt soared under Obama, soared again in Trump’s first term, and then soared again under Biden:
If politicians keep getting rewarded for blowout deficits, massive health care spending increases, and tax cuts for the rich, the U.S. government’s solvency is eventually going to be called into question. We don’t know exactly when that will happen, but the amount of fiscal irresponsibility that bond market investors will be willing to tolerate is not infinite.
There are lots of things America’s government can no longer afford. One of those things is tax cuts for the rich.
Update: Why am I only saying “raise taxes on the rich” instead of “raise taxes on the middle class”? Commenter Dan writes:
The unpleasant reality that neither party wants to contemplate is that we don't just need to raise taxes and cut spending: we need to raise taxes _on the middle class_ and significantly cut spending. America already has one of the most steeply progressive tax systems in the world. There are lots of loopholes we can and should close to prevent the ultra-rich from paying less than their fair share -- I favor eliminating preferential capital gains tax, eliminating step-up in basis, taxing unrealized gains used as collateral to secure loans, and significantly reducing inheritance tax exemptions, myself -- but there just aren't enough ultra-rich to tax to plug the budget gaps, and they're already paying a significantly outsized share of taxes relative to their percentage of total national income.
People often idolize the welfare states of Northern Europe while neglecting to notice that those welfare states are not paid for with punitive taxes on the rich, but rather by tax rates on the middle class that are close to double what Americans pay. We've been living beyond our means as a nation for some time now, and that bill is coming due. And both parties are simply hoping the other one is forced to answer the door when the debt collectors come knocking.
Dan is absolutely right that if America wanted to have a welfare state like a European country, then we would have to raise taxes on the middle class — ideally through a VAT.
But America does not have a European-style welfare state. We have a welfare state, but it’s focused mostly on the poor, with highly targeted benefits. European countries, in contrast, provide broad, less-targeted services to their middle classes and poor alike. This is why Europe’s government spending tends to be higher than America’s by about 8 to 12 percent of GDP (a really big gap):
Yes, if we want to be like Europe, we’re going to need big middle-class tax hikes. Progressives like Bernie Sanders don’t seem to understand this yet, but it’s true.
BUT, what America needs now is austerity. Before we decide whether we want a European-style welfare state, we need to get our fiscal house in order. That will involve painful spending cuts, especially to health care. Those cuts will fall heaviest on the middle class. To ask middle-class Americans to pay higher taxes on top of those benefit cuts would be both unfair and politically impossible.
Hence, for the sake of austerity, what we need are higher taxes on the rich. I’m absolutely looking forward to paying my fair share of that.
MMT people and some progressives will tell you that we don’t have to raise taxes or cut spending, because a sovereign country without much external debt can never be forced to default. This is technically true — with a few legal changes, the Fed could just print money to cover U.S. debts — but that would cause spiraling inflation, which is even more painful than austerity.