MIT Technology Review

A world-renowned, independent media company whose insight, analysis, reviews, interviews and live events explain the newest technologies and their commercial, social and political impact.

The Download: introducing the Nature issue

2026-04-23 20:10:00

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Introducing: the Nature issue

When we talk about “nature,” we usually mean something untouched by humans. But little of that world exists today. 

From microplastics in rainforest wildlife to artificial light in the Arctic Ocean, human influence now reaches every corner of Earth. In this context, what even is nature? And should we employ technology to try to make the world more “natural”?  

In our new Nature issue, MIT Technology Review grapples with these questions. We investigate birds that can’t sing, wolves that aren’t wolves, and grass that isn’t grass. We look for the meaning of life under Arctic ice, within ourselves, and in the far future on a distant world, courtesy of new fiction by the renowned author Jeff VanderMeer. 

Together, these stories examine how technology has altered our planet—and how it might be used to repair it. Subscribe now to read the full print issue.

What’s next for large language models?

After ChatGPT launched in late 2022, the OpenAI chatbot became an everyday everything app for hundreds of millions of people. It led to LLMs being heralded as the future. The entire tech industry was consumed by the inferno, with companies racing to spin up rival products.

But what’s the next big thing after LLMs? More LLMs—but better. Let’s call them LLMs+. Find out how they’re set to become cheaper, more efficient, and more powerful.

—Will Douglas Heaven

LLMs+ is on our list of the 10 Things That Matter in AI Right Now, MIT Technology Review’s guide to what’s really worth your attention in the busy, buzzy world of AI. We’ll be unpacking one item from the list each day here in The Download, so stay tuned.

Will fusion power get cheap? Don’t count on it.

Fusion power could provide a steady, zero-emissions source of electricity in the future—if companies can get plants built and running. But a new study published in Nature Energy suggests that even if that future arrives, it might not come cheap.

The research team aimed to improve predictions of fusion’s future price by estimating the technology’s experience rate—the percentage by which its cost declines every time capacity doubles. Their findings offer new clues on the technology’s path to deployment. Read the full story.

—Casey Crownhart

This story is from The Spark, our weekly climate newsletter. Sign up to receive it in your inbox every Wednesday.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Trump signaled he’s open to reversing the Anthropic ban
What that really means in practice remains to be seen. (Reuters $)
+ Anthropic says there’s no “kill switch” for its AI. (Axios)
+ “Humans in the loop” in AI warfare is an illusion. (MIT Technology Review)

2 SpaceX plans to manufacture its own GPUs
To support the company’s growing AI ambitions. (Reuters $)
+ Musk is shifting SpaceX’s focus from Mars to AI ahead of its IPO. (NYT $)
+ SpaceX and Tesla may be on a collision course. (FT $)

3 Chinese tech giant Tencent has unveiled its first flagship AI model
A former OpenAI researcher is at the helm. (SCMP)
+ Chinese open models are spreading fast. (MIT Technology Review)

4 High earners are racing ahead on AI, deepening workplace divides
The division in adoption risks widening inequality. (FT $)
+ Startups are bragging they spend more on AI than staff. (404 Media)

5 Thousands of Samsung workers are demanding a new share of AI profits
Chip-division employees want 15% of the operating profit. (Bloomberg $)
+ Here’s why opinion on AI is so divided. (MIT Technology Review)

6 AI is helping mediocre Korean hackers steal millions
They’re vibe coding their malware. (Wired $)
+ AI is making online crimes easier. (MIT Technology Review)

7 Kalshi suspended three political candidates for betting on their own races
Including a Democrat and a Republican running for Congress. (CNN)
+ And an independent candidate who said he did it to make a point. (Gizmodo)
+ Lawmakers argue that prediction markets are a loophole for gambling. (NPR)

8 A ping-pong robot is beating elite human players for the first time
The Sony AI system was trained with reinforcement learning. (New Scientist)
+ Just days earlier, a humanoid smashed the human half-marathon record. (AP)

9 Crypto scammers are luring ships into the Strait of Hormuz
By falsely promising safe passage. (Ars Technica)

10 ‘Age tech’ could help us grow old comfortably at home
Apps, wearables, and remote monitoring could fill caregiving gaps. (NYT $)

 

Quote of the day

“It’s a hallucinogenic business plan.”



—Ross Gerber, the chief executive of Gerber Kawasaki, an investment firm that owns SpaceX shares, tells the New York Times that he’s unimpressed by Musk’s changing goals for the aerospace company. 

One More Thing

Photos of victims are displayed under white crosses at a memorial for the August 2023 wildfire victims
AP PHOTO/LINDSEY WASSON


This grim but revolutionary DNA technology is changing how we respond to mass disasters

After hundreds went missing in Maui’s deadly fires, victims were identified with rapid DNA analysis—an increasingly vital tool for putting names to the dead in mass-casualty events.

The technology helped identify victims within just a few hours and bring families some closure more quickly than ever before. But it also previews a dark future marked by the rising frequency of catastrophic events.

Find out how this forensic breakthrough is preparing us for a more volatile world.


—Erika Hayasaki

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line.)

+ This fascinating dive into botanical history reveals the origins of the first true plants.
+ Here’s how to use Google’s reference desk to find what ordinary search engines miss.
+ Watch duct tape get deconstructed to reveal the physics behind its legendary stickiness.
+ When Radiohead covers Joy Division, the result is a beautiful intersection of two legendary musical eras.

Will fusion power get cheap? Don’t count on it.

2026-04-23 18:00:00

Fusion power could provide a steady, zero-emissions source of electricity in the future—if companies can get plants built and running. But a new study suggests that even if that future arrives, it might not come cheap.

Technologies tend to get less expensive over time. Lithium-ion batteries are now about 90% cheaper than they were in 2013. But different technologies have historically moved down this cost curve at different rates. And the cost of fusion might not sink as quickly as the prices of batteries or solar.

It’s tricky to make any predictions about the cost of a technology that doesn’t exist yet. But when there’s billions of dollars of public and private funding on the line, it’s worth considering what assumptions we’re making about our future energy mix and its cost.

One crucial measure is a metric called experience rate—the percentage by which an energy technology’s cost declines every time capacity doubles. A higher figure means a quicker price drop and better economic gains with scaling.

Historically, the experience rate is 12% for onshore wind power, 20% for lithium-ion batteries, and 23% for solar modules. Other energy technologies haven’t gotten cheap quite as quickly—fission is at just 2%.
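To make the metric concrete, the experience rate can be turned into a back-of-the-envelope projection: each doubling of cumulative capacity multiplies unit cost by (1 − rate). Here is a minimal sketch in Python using the historical rates above; the function name and the five-doubling scenario are illustrative, not part of the study:

```python
def cost_after_doublings(initial_cost: float, experience_rate: float, doublings: int) -> float:
    """Project unit cost after a number of capacity doublings.

    Per the experience-curve definition, each doubling of cumulative
    capacity multiplies cost by (1 - experience_rate).
    """
    return initial_cost * (1.0 - experience_rate) ** doublings

# Illustrative comparison after 5 capacity doublings (32x capacity),
# starting from an index cost of 100:
for name, rate in [("solar modules", 0.23), ("lithium-ion batteries", 0.20),
                   ("onshore wind", 0.12), ("fusion (study's upper bound)", 0.08),
                   ("fission", 0.02)]:
    remaining = cost_after_doublings(100.0, rate, 5)
    print(f"{name}: {remaining:.0f}% of starting cost")
```

At a 23% rate, five doublings cut cost by more than two-thirds; at 2%, cost barely drops 10% — which is why a low experience rate implies a long, expensive path to cheap fusion electricity.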

In the new study, published in Nature Energy, researchers aimed to improve predictions of fusion’s future price by estimating the technology’s experience rate. The team looked at three key characteristics that can correlate with experience rate: unit size, design complexity, and the need for customization. The larger and more complex a technology is, and/or the more it needs to be customized for different use cases, the lower the experience rate.

The researchers interviewed fusion experts, including public-sector researchers and those working at companies in the private sector. They had the experts evaluate fusion power plants on those characteristics and used that info to predict the experience rate. (One note here: The study focused only on magnetic confinement and laser inertial confinement, two of the leading fusion approaches, which together receive the vast majority of funding today. Other approaches could come with different cost benefits.)

Fusion plants will likely be relatively large, similar to other types of facilities (like coal and fission power plants) that rely on generating heat. They will probably need less customization than fission plants—largely because regulations and safety considerations should be simpler—but more than technologies like solar panels. And as for complexity, “there was almost unanimous agreement that fusion is incredibly complex,” says Lingxi Tang, a PhD candidate in the energy and technology policy group at ETH Zurich in Switzerland and one of the authors of the study. (Some experts said it was literally off the scale the researchers gave them.)

The final figure the researchers suggest for fusion’s experience rate is between 2% and 8%, meaning it should see a faster price reduction than fission but not as dramatic an improvement as many common energy technologies being deployed today.

That means that it would take a lot of deployment—and likely quite a long time—for the price of building a fusion reactor to drop significantly, so electricity produced by fusion plants could be expensive for a while. And it’s a much slower rate than the 8% to 20% that many modeling studies assume today.

“On the whole, I think questions should be raised about current investment levels in fusion,” Tang says. (The US allocated over $1 billion to fusion in the 2024 fiscal year, and private-sector funding totaled $2.2 billion between July 2024 and July 2025.) “If you’re talking about decarbonization of the energy system, is this really the best use of public money?”

But some experts say that looking to the past to understand the future of energy prices might be misleading. “It’s a good exercise, but we have to be humble about how much we don’t know,” says Egemen Kolemen, a professor at the Princeton Plasma Physics Laboratory.

In 2000, many analysts predicted that solar power would remain expensive—but then production exploded and prices came crashing down, largely because China went all in, he says. “People weren’t exactly wrong then,” he adds. “They were just extrapolating what they saw into the future.”

How fast prices drop depends on regulations, geopolitical dynamics, and labor cost, he says: “We haven’t built the thing yet, so we don’t know.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The Download: introducing the 10 Things That Matter in AI Right Now

2026-04-22 20:10:00

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Introducing: 10 Things That Matter in AI Right Now

What actually matters in AI right now? It’s getting harder to tell amid the constant launches, hype, and warnings. To cut through the noise, MIT Technology Review’s reporters and editors have distilled years of analysis into a new essential guide: the 10 Things That Matter in AI Right Now.

The list builds on our annual 10 Breakthrough Technologies, but takes a wider view of the ideas, topics, and research defining AI, spotlighting the trends and breakthroughs shaping the world.

We’ll be unpacking one item from the list each day here in The Download, explaining what it means and why it matters. Read the full rundown now—and stay tuned for the days ahead.

MIT Technology Review Narrated: desalination plants in the Middle East are increasingly vulnerable

As the conflict in Iran has escalated, a crucial resource is under fire: the desalination technology that supplies water in the region.

President Donald Trump recently threatened to destroy “possibly all desalinization plants” in Iran if the Strait of Hormuz is not reopened. The impact on farming, industry, and—crucially—drinking in the Middle East could be severe. Find out why.

—Casey Crownhart

This is our latest story to be turned into an MIT Technology Review Narrated podcast, which we publish each week on Spotify and Apple Podcasts. Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it’s released.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 An unauthorized group has reportedly accessed Anthropic’s Mythos
Users in a private online forum may have gained access. (Bloomberg $)
+ Anthropic said the model was too dangerous for a full release. (Axios)
+ Mozilla used it to find 271 security vulnerabilities in Firefox. (Wired $)

2 Meta will track workers’ clicks and keystrokes for AI training
Tracking software is being installed on workers’ computers. (Reuters $)
+ Employees are up in arms about the program. (Business Insider)
+ LLMs could supercharge mass surveillance in the US. (MIT Technology Review)

3 ChatGPT allegedly advised the Florida State shooter
About when and where to strike, and which ammunition to use. (Washington Post $)
+ Florida’s attorney general is probing ChatGPT’s role in the shooting. (Ars Technica)
+ Does AI cause delusions or just amplify them? (MIT Technology Review)

4 SpaceX has secured the option to buy AI startup Cursor for $60 billion
Or pay $10 billion for the work they’re doing together. (The Verge)
+ SpaceX made the deal as it prepares to go public. (NYT $)
+ Musk’s endgame for the company may be a land grab in space. (The Atlantic $)

5 The Pentagon wants $54 billion for drones
That would rank among the top 10 military budgets for entire nations. (Ars Technica)
+ Shoplifters could soon be chased down by drones. (MIT Technology Review)

6 Apple’s new chief hardware officer signals a sprint to build in-house chips
Apple silicon lead Johny Srouji has been promoted to the role. (CNBC)

7 China’s government is tightening its grip on AI firms that try to leave
It’s doing all it can to stop firms like Manus sending talent and research overseas. (Washington Post $)

8 The FBI is probing the deaths of scientists tied to sensitive research
Including a nuclear physicist and MIT professor shot outside his home. (CNN)

9 The US is accelerating research into psychedelic medical treatment
Including the mysterious ibogaine. (Nature)
+ But psychedelics are (still) falling short in clinical trials. (MIT Technology Review)

10 The first retail boutique run by an AI agent has opened—and it’s chaos
The San Francisco shop is reassuringly mismanaged. (NYT $)


Quote of the day

“I was very impressed with myself to have the head of Apple calling to ‘kiss my ass’.” 

—Donald Trump pays a classy tribute to Tim Cook on Truth Social.

One More Thing

JOHN F. MALTA


This researcher wants to replace your brain, little by little

A US agency pursuing moonshot health breakthroughs has hired a researcher advocating an extremely radical plan for defeating death. His idea? Replace your body parts. All of them. Even your brain. 

Jean Hébert, a program manager at the US Advanced Research Projects Agency for Health (ARPA-H), believes we can beat aging by adding youthful tissue to people’s brains. Read the full story on his futuristic plan to extend human life.


—Antonio Regalado

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line.)

+ A Lego set was sent to the edge of space—and survived.
+ Go behind the scenes with Werner Herzog as he guides a new generation of filmmakers.
+ This video about enshittification perfectly captures the frustration of the degrading internet.
+ NASA’s latest deep-space capture offers a rare view of planetary systems in their absolute infancy.

AI needs a strong data fabric to deliver business value

2026-04-22 18:05:06

Artificial intelligence is moving quickly in the enterprise, from experimentation to everyday use. Organizations are deploying copilots, agents, and predictive systems across finance, supply chains, human resources, and customer operations. By the end of 2025, half of companies used AI in at least three business functions, according to a recent survey.

But as AI becomes embedded in core workflows, business leaders are discovering that the biggest obstacle is not model performance or computing power but the quality and the context of the data on which those systems rely. AI essentially introduces a new requirement: Systems must not only access data — they must understand the business context behind it. 

Without that context, AI can generate answers quickly but still make the wrong decision, says Irfan Khan, president and chief product officer of SAP Data & Analytics. 

“AI is incredibly good at producing results,” he says. “It moves fast, but without context it can’t exercise good judgment, and good judgment is what creates a return on investment for the business. Speed without judgment doesn’t help. It can actually hurt us.”

In the emerging era of autonomous systems and intelligent applications, that context layer is becoming essential. To provide context, companies need a well-designed data fabric that does more than just integrate data, Khan says. The right data fabric allows organizations to scale AI safely, coordinate decisions across systems and agents, and ensure that automation reflects real business priorities rather than making decisions in isolation. 

Recognizing this, many organizations are rethinking their data architecture. Instead of simply moving data into a single repository, they are looking for ways to connect information across applications, clouds, and operational systems while preserving the semantics that describe how the business works. That shift is driving growing interest in data fabric as a foundation for AI infrastructure.

Losing context is a critical AI problem

Traditional data strategies have largely focused on aggregation. Over the past two decades, organizations have invested heavily in extracting information from operational systems and loading it into centralized warehouses, lakes, and dashboards. This approach makes it easier to run reports, monitor performance, and generate insights across the business, but in the process, much of the meaning attached to that data — how it relates to policies, processes, and real-world decisions — is lost. 

Take two companies using AI to manage supply-chain disruptions. If one uses raw signals such as inventory levels, lead times, and supply scores, while the other adds context across business processes, policies, and metadata, both systems will rapidly analyze the data but likely come up with different conclusions. 

Information such as which customers are strategic accounts, what tradeoffs are acceptable during shortages, and the status of extended supply chains will allow one AI system to make strategic decisions, while the other will not have the proper context, Khan says. 

“Both systems move very quickly, but only one moves in the right direction,” he says. “This is the context premium and the advantage you gain when your data foundation preserves context across processes, policies and data by design.”

In the past, companies could manage without explicit context because human experts supplied the missing information. With AI, that gap creates serious limitations. AI systems do not just display information; they act on it. If a system does not explain why data matters, an AI model may optimize for the wrong outcome. Inventory numbers, payment histories, or demand signals might be accurate, but they do not necessarily reveal which customers must be prioritized, which contractual obligations apply, or which products are strategically important. As a result, the system can produce answers that are technically correct but operationally flawed.

This realization is changing how companies think about AI readiness. Most acknowledge that they do not have the mature data processes and infrastructure in place to trust their data and their AI systems. Only one in five organizations consider their approach to data to be highly mature, and only 9% feel fully prepared to integrate and interoperate with their data systems.

Don’t consolidate, integrate

The emerging solution is a data fabric: an abstraction layer that spans infrastructure, architecture, and logical organization. For agentic AI, the fabric becomes the primary interface, allowing agents to interact with business knowledge rather than raw storage systems. Knowledge graphs play a central role, enabling agents to query enterprise data using natural language and business logic.

The value of the data fabric relies on three components: intelligent compute to provide speed, a knowledge pool to provide business understanding and context, and agents to provide autonomous action grounded in that understanding. What makes this powerful is how these capabilities work together, says Khan.

The technology provides the architecture — a foundation that makes agent-to-agent communication and coordination possible. The process defines how business and IT share ownership and establishes governance. And people supply a culture with enough trust to adopt it. All three must work together for a business data fabric to be truly successful.

“It empowers confident, consistent decisions, and when these elements all come together, AI doesn’t just analyze and interpret the data — it drives smarter, faster decisions that really create business impact,” he says. “This is the promise of a thoughtfully designed business data fabric, where every part reinforces the other, and every insight is grounded in trust and clarity.”

Technically, building a data-fabric layer requires several capabilities. Data must be accessible across multiple environments through federation rather than forced consolidation. A semantic or knowledge layer is needed to harmonize meaning across systems, often supported by knowledge graphs and catalog-driven metadata. Governance and policy enforcement must also operate across the fabric so that AI systems can access data securely and consistently.

Together, these elements create a foundation where AI interacts with business knowledge instead of raw storage systems — an essential step for moving from experimentation to real enterprise automation.
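The capabilities described above — federation across sources, a semantic layer that maps business terms to data, and cross-fabric policy enforcement — can be sketched in miniature. Everything here (the class of toy "sources," the sample data, the role-based policy) is hypothetical and invented for illustration; it is not SAP's API or any real product:

```python
# A toy data fabric: federated sources + a semantic layer + a policy check.
# All names and data below are hypothetical, for illustration only.

SOURCES = {  # federation: data stays in separate "systems," not one warehouse
    "erp": {"ACME-001": {"open_orders": 12, "strategic_account": True}},
    "crm": {"ACME-001": {"region": "EMEA", "tier": "gold"}},
}

SEMANTICS = {  # semantic layer: business terms resolve to (source, field)
    "open orders": ("erp", "open_orders"),
    "strategic account?": ("erp", "strategic_account"),
    "customer tier": ("crm", "tier"),
}

POLICY = {"erp": {"sales", "finance"}, "crm": {"sales", "support"}}  # role access

def query(term: str, customer: str, role: str):
    """Resolve a business term to a governed, federated lookup."""
    source, field = SEMANTICS[term]
    if role not in POLICY[source]:  # governance enforced at the fabric layer
        raise PermissionError(f"role '{role}' may not read from '{source}'")
    return SOURCES[source][customer][field]

print(query("open orders", "ACME-001", "sales"))      # 12
print(query("customer tier", "ACME-001", "support"))  # gold
```

The point of the sketch is the indirection: callers ask for business concepts ("open orders"), not tables or columns, and the fabric decides where the data lives and whether the caller may see it.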

Beyond data isolation and dashboards

In the emerging era of agentic AI, the responsibility for monitoring, analyzing, and making decisions based on data increasingly shifts to software. AI agents can monitor events, trigger workflows, and make decisions in real time, often without direct human intervention. That speed creates new opportunities, but it also raises the stakes. When multiple agents operate across finance, supply chain, procurement, or customer operations, they must be guided by the same understanding of business priorities.

Without a common knowledge layer connecting disparate data, coordination between systems quickly breaks down. One system might optimize for margin, another for liquidity, and another for compliance, each working from a different slice of data.

Importantly, most enterprises already possess much of the knowledge needed to make this work, says Khan. Years of operational data, master data, workflows, and policy logic already exist across business applications — companies just need to make it accessible. Companies that deploy data fabrics gain greater trust in their data, with more than two-thirds of enterprises reporting improved data accessibility and visibility and greater control over their data.

“The opportunity isn’t just inventing context from scratch, it’s activating and connecting the context across your business that already exists,” he continues, adding that a data fabric is the “architecture that ensures data semantics, business processes and policies are connected as a unified system across all the clouds.”

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff. It was researched, designed, and written by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Los Angeles is finally going underground

2026-04-22 18:00:00

Los Angeles deserves its reputation as the quintessential car city—the rhythms of its 2,200 square miles are dictated by wide boulevards and concrete arcs of freeways. But it once had a world-class rail transit system, and for the last three decades, the city has been rebuilding a network of trolleys and subways. In May, a new four-mile segment with three new subway stations will open along Wilshire Boulevard, a key east-west corridor that connects downtown LA to the Pacific Ocean. What today can be an hours-long drive through a busy, museum-packed stretch of the city will be, if all goes well, a 25-minute train ride.

The existence of subway stops in this part of town—known as Miracle Mile—is a technological triumph over geography and geology. The ground underneath it is literally a disaster waiting to happen—it’s tarry and full of methane. One of those methane deposits actually exploded in 1985, destroying a department store in the neighborhood. In response, the city pushed its new train routes to other parts of town.

These days, dirt full of flammable goo is no longer a problem. “The technology finally caught up with the concerns,” says LA Metro’s James Cohen, a longtime manager of the engineering for this stretch of subway. The key was an earth-pressure-balance tunnel-boring machine, an automated digger that is designed to chew through ground packed with explosive gas. It sends removed dirt topside via conveyor belts and slides precast concrete liner segments into the tunnel, which are joined together with gaskets to create a gas- and waterproof tube. All that let the machine dig about 50 feet every day.

A Metro train pulls into La Cienega station
Art by Susan Silton at the Fairfax station
Art by Eamon Ore-Giron at the La Brea station

Meanwhile, engineers excavated the stations from the street level down. They worked mostly on weekends, digging out a space and then decking it with concrete so that work could go on underneath while LA drivers continued to exercise their God-given right to get around by car above.

Did the project finish on time? No. Did it come in under budget? Also no; this segment alone cost nearly $4 billion. Is the city now racing to build housing and walkable areas to take full advantage of the extension? Oh, please. Yet the new stations still manage to feel, in the end, transformative—as if Los Angeles’s train has finally come in.

There is no nature anymore

2026-04-22 18:00:00

When people talk about “nature,” they’re generally talking about things that aren’t made by human beings. Rocks. Reefs. Red wolves. But while there is plenty of God’s creation to go around, it is hard to think of anything on Earth that human hands haven’t affected.

Mat Honan

In the Brazilian rainforest, scientists have found microplastics in the bellies of animals ranging from red howler monkeys to manatees. In remotest Yakutia, where much of the earth remains untrodden by human feet, the carbon in the sky above melts the permafrost below. In the Arctic Ocean, artificial light from ship traffic—on the rise as the polar ice cap melts away—now disrupts the nightly journey of zooplankton to the ocean surface, one of the largest animal migrations on the planet. The remote mountain lakes of the Alps are contaminated with all kinds of synthetic chemicals. Polar bears are full of flame retardants. Cesium-137, fallout from nuclear bomb explosions, lightly rimes the entire planet. 

These examples are mostly pollution—nuclear, carbon, chemical, light—but I raise them not to highlight the ways human industry and technology degrade the environment but to note how the things humans build change it. Nobody really knows what the exact effects of all that will be, but my point is that no part of the globe is free of human fingerprints. We have literally changed the world.  

We’ve changed ourselves as well. Humans are especially adept at bending human nature. Everything about us is up for grabs—appearance, health, our very thoughts. Pharmaceuticals, surgeries, vaccines, and hormones give us longer lives, take away our pain, ease our anxiety and depression, make us faster, stronger, more resilient. We’re getting glimpses of technologies that will let us change who our children will become before they’re even born. Electrodes implanted in people’s brains let them control computers and translate thoughts into speech. Prosthetics and exoskeletons straight out of comic books restore and enhance physical abilities, while gene-editing technologies like CRISPR are rewriting our very DNA. And meanwhile, people have taken the sum total of all the information we have ever written down and poured it into vast calculating machines in an effort—at least by some—to build an intelligence greater than our own.

So what even is nature, or natural, in this context? Is it “environmentalist,” in the conventional sense, to try to preserve what one could argue no longer exists? Should we employ technology to try to make the world more “natural”?  

Those questions led us to approach this Nature issue with humility. We try to grapple with them all the time—MIT Technology Review is, after all, a review of how people have altered and built upon nature.

And it’s a place to think about how we might repair it. Take solar geoengineering, for example—a subject we have covered with increasing frequency over the past few years. The basic idea of geoengineering is to find a technological fix for a problem technology caused: Burning petrochemicals to fuel the Industrial Revolution turned Earth’s atmosphere into a heat sink, fundamentally breaking the climate. Some geoengineers think that releasing particulate matter into the stratosphere would reflect sunlight back into space, thus reducing global temperatures. After years of theoretical discussions, some companies have begun to actively experiment with such technologies. This might seem like a great way to restore the world to a more natural state. It’s also fraught with controversy and peril. It could, for example, benefit some nations while harming others. It may give us license to continue burning fossil fuels and releasing greenhouse gases. The list goes on.

Nature isn’t easy. 

In our May/June issue, we have attempted to take a hard look at nature in our unnatural world. We have stories about birds that can’t sing, wolves that aren’t wolves, and grass that isn’t grass. We look for the meaning of life under Arctic ice and within ourselves—and in the far future, on a distant world, courtesy of new fiction by the renowned author Jeff VanderMeer. I don’t know if any of that will answer the questions I’ve been asking here—but we can’t help but try. It’s in our nature.