2025-12-23 22:00:02

IEEE Spectrum’s transportation coverage this year covered breakthroughs in electric vehicles, batteries, charging, automation, aviation, maritime tech and more. Readers followed the race to rebuild U.S. magnet manufacturing, rethink EV-charging architecture, and reinvent automotive software. They tracked China’s sprint toward five-minute charging, the rise of high-power home chargers, and the push to automate airports. Our most-read stories also explored next-generation navigation, zero-carbon shipping fuels, record-size electric vessels, and early road tests of solid-state batteries. Read on for our roundup of the transportation stories published in 2025 that readers found most compelling.
Business Wire
The most-visited transportation post of the year focused on the United States’ efforts to rebuild a domestic supply of neodymium-iron-boron (NdFeB) magnets—critical components for EVs, wind turbines, HVAC systems, and many military systems. MP Materials has begun trial production at its new Texas plant, with plans to ramp up to between 1,000 and 3,000 tonnes per year and supply companies like General Motors. Other U.S. projects from e-VAC Magnetics, Noveon, USA Rare Earth, and Quadrant are also emerging.
But these efforts are dwarfed by China’s rare earths industry: China makes 85 to 90 percent of NdFeB magnets and 97 percent of the underlying rare earth metals, with individual Chinese firms producing tens of thousands of tonnes—far more than all non-Chinese plants combined. China also has massive unused refinement and production capacity, keeping global prices low.
MP Materials’ unique mine-to-magnet strategy could offer intelligence and supply-chain resilience, but competing with China’s subsidies and scale will be extremely difficult. The U.S. Department of Defense may pay a premium for “friendly-nation” magnets, but cost-obsessed automakers like GM might resist higher domestic prices.
Jim West/REA/Redux
A strong public EV-charging network is essential for mass electric-vehicle adoption, especially for drivers who can’t reliably charge at home. Yet today’s fast-charging stations are expensive and complex largely because of one feature: galvanic isolation—the transformer-based safety barrier that protects against electric shock when ground connections fail. This isolation hardware accounts for roughly 60 percent of charger power-electronics cost and about half of power losses, making fast chargers costly to build and maintain. The authors of this piece—veterans of AC Propulsion, whose early technology influenced the Tesla Roadster—argue that galvanic isolation is no longer necessary.
The authors propose a new approach they call direct power conversion (DPC): eliminate the isolation link entirely and replace it with (1) a double-ground system with ground-continuity detection to prevent shock hazards, and (2) a buck regulator to handle voltage mismatches between the grid and the EV battery. Removing isolation would simplify chargers from four power-conversion stages to just one (plus a buck regulator if needed). This could cut charger hardware costs by more than half, improve efficiency by 2 to 3 percent, enable much cheaper fast-charging stations, allow EV onboard chargers to become powerful enough for Level 3 charging, and accelerate the expansion of public charging infrastructure. The authors argue that simplifying chargers—and shedding old assumptions about galvanic isolation—is the fastest path to an affordable and reliable EV-charging network, which is critical to broad EV adoption.
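As a quick sanity check on those claims, here is a back-of-the-envelope sketch. The 60 percent cost share and "about half of power losses" figures come from the article itself; the 95-percent-efficient baseline charger is purely an illustrative assumption.

```python
# Sanity-check the authors' numbers. The 60% cost share and "half of
# power losses" come from the article; the 95%-efficient baseline
# charger is an assumption for illustration only.
isolation_cost_share = 0.60     # isolation's share of power-electronics cost
baseline_loss = 0.05            # assumed losses of a conventional fast charger

cost_after = 1 - isolation_cost_share        # 0.40, i.e. "more than half" cheaper
loss_after = baseline_loss * 0.5             # isolation causes ~half the losses
efficiency_gain_pts = (baseline_loss - loss_after) * 100   # ~2.5 points

print(round(cost_after, 2), round(efficiency_gain_pts, 1))
```

Under that assumed baseline, the arithmetic lines up with the article's headline claims: power-electronics cost falls by 60 percent and efficiency improves by roughly 2.5 percentage points.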
BYD
BYD has unleashed megawatt-class EV charging in China, delivering 400 kilometers of range in five minutes—triple the power (and thus triple the speed) of today’s best U.S. setups. A Han L sedan briefly hit 1,002 kilowatts on BYD’s new 1,000-volt platform, which uses 1,500-V silicon-carbide chips and redesigned lithium iron phosphate batteries to enable safe, ultrafast charging. BYD’s vertically integrated approach—building cars, batteries, and chargers—lets it scale quickly and keep prices low. The company has already deployed 500 megachargers and plans 4,000 more, pushing China far ahead as rivals like Huawei and Zeekr race to match speeds up to 1,500 kW.
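The claim is easy to sanity-check with rough arithmetic. Holding the roughly 1,000-kilowatt peak for the full five minutes is an optimistic assumption (real charge curves taper), but it bounds the numbers:

```python
# Rough arithmetic on BYD's demo: 400 km of range added in five minutes.
# Optimistic assumption: the ~1,000-kW peak is held the whole time
# (real charge curves taper, so actual energy delivered would be lower).
power_kw = 1000           # peak charging power reported for the Han L
voltage_v = 1000          # BYD's 1,000-volt platform
minutes = 5

energy_kwh = power_kw * minutes / 60          # ~83.3 kWh delivered
km_per_kwh = 400 / energy_kwh                 # ~4.8 km of range per kWh
current_a = power_kw * 1000 / voltage_v       # ~1,000 amperes through the cable

print(round(energy_kwh, 1), round(km_per_kwh, 1), round(current_a))
```

Roughly 4.8 km per kilowatt-hour is within the normal range for an efficient sedan, which is why the range claim is plausible; the 1,000-ampere cable current is why BYD's high-voltage platform and cooling matter so much.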
MCKIBILLO
Airports are rolling out a wave of new automation to speed trips from curb to gate. Copenhagen Optimization’s Virtual Queuing lets travelers reserve security times, with machine-learning models adjusting slots and staffing in real time. Electronic Bagtags generate paperless luggage tags via NFC, while Idemia’s biometric systems verify identity with a quick face scan. Smiths Detection’s X-ray diffraction machines identify materials by molecular “fingerprint,” reducing false alarms. Amazon’s Just Walk Out shops enable cashierless purchases, and Avidbots’ Neo robots autonomously scrub terminal floors. Even boarding gets smarter with systems that flag passengers trying to jump the queue.
2025-12-23 21:00:01

The goal of the quantum-computing industry is to build a powerful, functional machine capable of solving large-scale problems in science and industry that classical computing can’t solve. We won’t get there in 2026. In fact, scientists have been working toward that goal since at least the 1980s, and it has proved difficult, to say the least.
“If someone says quantum computers are commercially useful today, I say I want to have what they’re having,” said Yuval Boger, chief commercial officer of the quantum-computing startup QuEra, on stage at the Q+AI conference in New York City in October.
Because the goal is so lofty, tracking its progress has also been difficult. To help chart a course toward truly transformative quantum technology and mark milestones along the path, the team at Microsoft Quantum has come up with a new framework.
This framework lays out three levels of quantum-computing progress. The first level includes the kinds of machines we have today: the so-called noisy, intermediate-scale quantum (NISQ) computers. These computers are made up of roughly 1,000 quantum bits, or qubits, but are noisy and error prone. The second level consists of small machines that implement one of many protocols that can robustly detect and correct qubit errors. The third and final level represents a large-scale version of those error-corrected machines, containing hundreds of thousands or even millions of qubits and capable of millions of quantum operations, with high fidelity.
If you accept this framework, 2026 is slated to be the year when customers can finally get their hands on level-two quantum computers. “We feel very excited about the year 2026, because lots of work that happened over the last so many years is coming to fruition now,” says Srinivas Prasad Sugasani, vice president of quantum at Microsoft.
Microsoft, in collaboration with the startup Atom Computing, plans to deliver an error-corrected quantum computer to the Export and Investment Fund of Denmark and the Novo Nordisk Foundation. “This machine should be utilized toward establishing a scientific advantage—not a commercial advantage yet, but that’s the path forward,” Sugasani says.
QuEra has also delivered a quantum machine ready for error correction to Japan’s National Institute of Advanced Industrial Science and Technology (AIST), and plans to make it available to global customers in 2026.
Arguably, the main trouble with today’s quantum computers is their propensity for noise. Quantum bits are inherently fragile and thus sensitive to all kinds of environmental factors, such as electric or magnetic fields, mechanical vibrations, or even cosmic rays. Some have argued that even noisy quantum machines can be useful, but almost everyone agrees that for truly transformative applications, quantum computers will need to become error resilient.
To make classical information robust against errors, one can simply repeat it. Say you want to send a 0 bit along a noisy channel. That 0 might get flipped to a 1 along the way, causing a miscommunication. But if you instead send three zeros in a row, it will still be obvious that you were trying to send a 0 even if one gets flipped.
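That reasoning can be made concrete in a few lines of Python: a sketch of the three-bit repetition code over a simulated noisy channel. The 10 percent flip probability is an illustrative value, not a model of any real link.

```python
import random

def encode(bit):
    # Repetition code: send three copies of the bit.
    return [bit] * 3

def decode(received):
    # Majority vote recovers the bit if at most one copy flipped.
    return 1 if sum(received) >= 2 else 0

def noisy_channel(bits, flip_prob, rng):
    # Flip each bit independently with probability flip_prob.
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(42)
flip_prob = 0.1  # illustrative channel noise
trials = 10_000

# Without coding, a bit fails with probability p = 0.1.
# With the repetition code, decoding fails only if 2+ copies flip:
# 3*p^2*(1-p) + p^3 = 0.028, a clear improvement.
raw_errors = sum(noisy_channel([0], flip_prob, rng)[0] != 0
                 for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0), flip_prob, rng)) != 0
                   for _ in range(trials))
print(raw_errors / trials, coded_errors / trials)
```

Quantum error correction is far subtler than this, as the next paragraph explains, but the payoff is the same: redundancy buys reliability.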
Simple repetition does not work with qubits, because they cannot be copied and pasted. But there are still ways to encode the information contained in a single qubit onto many physical qubits, making it more resilient. These groups of physical qubits encoding one qubit’s worth of information are known as logical qubits. Once information is encoded in these logical qubits, as the computation proceeds and errors occur, error-correction algorithms can then tease apart what mistakes were made and what the original information was.
Just creating these logical qubits is not enough—it’s important to experimentally verify that encoding information in logical qubits leads to lower error rates and better computation. Back in 2023, the team at QuEra, in collaboration with researchers at Harvard, MIT, and the University of Maryland, showed that quantum operations carried out with logical qubits outperformed those done with bare physical qubits. The Microsoft and Atom Computing team managed the same feat in 2024.
This year, these scientific advances will reach customers. The machine that Microsoft and Atom Computing will be delivering, called Magne, will have 50 logical qubits, built from some 1,200 physical qubits, and should be operational by the start of 2027. QuEra’s machine at AIST has around 37 logical qubits (depending on implementation) and 260 physical qubits, Boger says.
It may be no coincidence that both of the level-two quantum computers will be built out of the same type of qubit: neutral atoms. While the classical computing world long ago settled on the transistor as its fundamental device, the quantum-computing world has yet to pick the perfect qubit, be it a superconductor (pursued by IBM, Google, and others), a photon (used by the likes of PsiQuantum and Xanadu), an ion (developed by IonQ and Quantinuum, to name a few), or something else.
All of these options have their advantages and disadvantages, but there is a reason some of the earliest error-corrected machines are built with neutral atoms. The physical qubits that make up a logical qubit need to be close to each other, or connected in some way, in order to share information. Unlike, say, superconducting qubits printed on a chip, any two atomic qubits can be brought right next to each other (an advantage shared by trapped ions).
“Neutral atoms can be moved around,” says QuEra’s Boger. “That allows us to build error-correction methods that are just not possible with static qubits.”
A neutral-atom quantum computer consists of a vacuum chamber. Inside the chamber, a gas of atoms is cooled to just above absolute zero. Then, individual atoms are captured, held, and even moved around by tightly focused laser beams in a technique known as optical tweezing. Each atom is a single physical qubit, and these qubits can be arranged in a 2D or even 3D array.
Neutral-atom quantum computers consist of individual atoms that are manipulated and controlled primarily by lasers. Complex optical setups guide the laser beams to their precise destinations. Atom Computing
The computation itself—the sequence of “quantum gates”—is performed by shining a separate laser at the atoms, illuminating them in a precisely orchestrated fashion. In addition to maneuverability, the neutral-atom approach offers parallelism: The same laser pulse can illuminate many pairs of atoms at once, performing the same operation on each pair simultaneously.
The main downside of neutral-atom qubits is that they are slow. Computations on atomic systems are about one-hundredth to one-thousandth as fast as their superconducting counterparts, says Jerry Chow, director of quantum systems at IBM Quantum.
However, Boger argues that this slowdown can be compensated for. “Because of the unique capabilities of neutral atoms, we have shown that we can create a 50x or 100x speedup over what previously was thought,” he says, referring to recent work at QuEra in collaboration with Harvard and Yale. “We think that when you compare what some people call time to solution, not just clock speed but how long it would take you to get to that useful result…that neutral atoms today are comparable to superconducting qubits.” Even though each operation is slow, more operations are done in parallel and fewer operations are needed for error correction, allowing for the speedup.
Microsoft’s three-level framing is not accepted by everyone in the industry.
“I think that kind of level framing…is a very physics-device-oriented view of the world, and we should be looking at it more from a computational view of the world, which is, what can you actually use these circuits for and enable?” says IBM’s Chow.
Chow argues that, although a large error-corrected machine is the ultimate goal, it doesn’t mean error correction must be implemented first. Instead, the team at IBM is focusing on finding use cases for existing machines and using other error-suppressing strategies along the way, while also working toward a fully error-corrected machine in 2029.
Whether or not you accept the framing, the teams at QuEra, Microsoft, and Atom Computing are optimistic about the neutral-atom approach’s potential to reach large-scale devices. “If there’s one word, it’s scalability. That’s the key benefit of neutral atoms,” says Justin Ging, chief product officer at Atom Computing.
Both the teams at QuEra and Atom Computing say they expect to be able to put 100,000 atoms into a single vacuum chamber within the next few years, setting a clear path toward that third level of quantum computing.
This article appears in the January 2026 print issue.
2025-12-23 03:00:02

The annual IEEE STEM Summit, held this year on 23 and 24 October, brought together preuniversity educators, IEEE volunteers, and STEM enthusiasts to discuss ways to spark children’s interest in science, technology, engineering, and mathematics.
The free virtual summit attracted nearly 1,000 attendees from more than 100 countries. Participants engaged in keynote discussions, networking sessions, and presentations designed to address the most pressing challenges in STEM education. Speakers addressed building a sustainable future, as well as harnessing the power of artificial intelligence in classrooms.
The event was organized and hosted by the IEEE Educational Activities preuniversity education coordinating committee, whose mission is to foster outreach to school-age children worldwide by helping educators and IEEE volunteers create engaging activities.
The coordinating committee provides resources and services through TryEngineering, an Educational Activities program focused on STEM outreach. The program provides educators with lesson plans, activities, and other resources at no charge for use in their classrooms and in community activities.
Young students’ interest in STEM careers can be ignited when they’re introduced to technologies and learn how they operate.
“With the continued growth of our TryEngineering programs, the IEEE STEM Summit brought together global experts in the field of STEM outreach,” says Jamie Moesch, IEEE Educational Activities managing director. “This event is a wonderful opportunity for people with a passion for inspiring engineering and technology in the next generation to get together, learn, and collaborate.”
The summit opened with a welcome from Mary Ellen Randall, the 2025 IEEE president-elect; Tim Kurzweg, vice president of Educational Activities; and Stamatis Dragoumanos, chair of the preuniversity education coordinating committee. They emphasized the importance of collaboration and innovation in STEM education.
Khadijah Thibodeaux and Heidi Gibson from the Smithsonian Science Education Center introduced Smithsonian Science for Global Goals, a repository of hands-on activities and research designed to help students ages 11 to 18 learn about sustainability issues and experiment with solutions in their community.
Mylswamy Annadurai, a program director at the Indian Space Research Organization who is known as the “moon man of India” for leading the country’s two lunar missions, shared stories about innovations that have fueled his nation’s space exploration. He urged summit participants to be resilient, think big, and embrace challenges.
Stuart Kohlhagen, founder and director of the Science Nomad initiative, spoke from Thailand’s MAKERHerSpace. Kohlhagen champions hands-on science and critical thinking in both formal and informal education contexts. He discussed practical strategies to enhance critical thinking, creativity, and collaboration for students globally.
Other sessions addressed interesting topics in STEM education today. Participants explored inclusive strategies to bridge the digital divide, design thinking as a problem-solving tool, and the effective use of artificial intelligence in preuniversity education.
There was a hands-on workshop on prompt engineering, the practice of designing and refining instructions to guide AI models toward more accurate and useful responses.
Networking sessions and exhibit booths provided additional opportunities for participants to connect, share ideas, and explore resources. More than 2,500 attendees visited exhibit booths hosted by IEEE technical societies and industry partners.
Recordings of summit sessions are available on the IEEE TryEngineering YouTube channel.
Educators may access free resources at TryEngineering.org.
The IEEE Foundation is TryEngineering’s philanthropic partner. Contribute to future events and expand STEM outreach globally by visiting the TryEngineering Fund donation page.
2025-12-22 22:00:01

The telecom networks originally built to carry phone calls and packets of data are in the midst of a dramatic shift. The past year saw early steps toward networks becoming a more integrated data fabric that can measure the world, process and sense collaboratively, and even stretch into outer space.
The following list of key IEEE Spectrum telecom news stories from 2025 underscores the evolution the connected (and wireless) world is going through today. A larger story is emerging, in other words: networks are turning into instruments and engines rather than just passive pipes.
And if there’s a clear starting point to watch this shift happening, it’s in the early thinking around 6G.
Source image: Nokia
Unlike previous step changes in telecom’s evolution (especially the bandwidth upgrades from 3G to 4G and from 4G to 5G), the key equation for 6G isn’t “5G plus faster downloads.” Nokia Bell Labs, whose president of core research, Peter Vetter, sat down for a conversation with Spectrum in November, is starting to test and deploy key pieces of 6G’s infrastructure five years before 6G devices are expected to come online. And time is tight. As Vetter explains, your phone’s ability—and your future smart glasses’ ability—to download streaming video and other content is increasingly not telecom’s hardest problem. Rather, if the Internet of Things scales up as predicted, and smart home and smart city tech takes hold, before long everything everywhere will be dialing in to 6G infrastructure with ever more sizable uplinks. That kind of traffic surge could break today’s telecom networks. Which is why smart money, beginning with but certainly not limited to Nokia Bell Labs, is on solving the massive uplink problem before it emerges.
Oliver Killig/HZDR
There’s a range of electromagnetic spectrum between 0.1 and 10 terahertz that has historically been very difficult to harness technologically. Radio waves and microwaves on one side of the “terahertz gap” and infrared light on the other each have their own types of electronics and waveguides to manipulate photons and translate them back and forth to electrical signals in integrated circuits.
But this past year has seen progress in closing the terahertz gap. In a story from October, Spectrum contributor Meghie Rodrigues chronicled how a new breed of chips are being developed to unlock bandwidths in the tens and hundreds of gigahertz—well beyond 5G’s range and coming up just shy of that long-puzzled-over terahertz gap. Crucially, the new chips can be operated at or near room temperature and on standard semiconductor substrates. To make big progress on telecom’s challenges to come, this kind of tech will be important to scale up and out and into devices that can meet 6G’s uplink and downlink demands.
Seyed Reza Sandoghchi and Ghafour Amouzad Mahdiraji/Microsoft Azure Fiber
While the promise of terahertz data links looms on the horizon, the world today also can’t wait for technologies that might, in 2030 or beyond, be able to fulfill their early promise. Some communications engineers have been leaning on a fundamental rule of physics that fiber optic lines haven’t fully tapped into yet: Light travels some 30 percent faster through air than it does through glass. Fiber optic lines, in other words, could be substantially sped up if they weren’t solid glass but rather tiny glass tubes sheltering an air core.
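The latency argument is simple enough to sketch. The refractive indices below are typical textbook values (silica roughly 1.468, air roughly 1.0003), not figures from the Microsoft and Southampton work:

```python
C_KM_S = 299_792.458      # speed of light in vacuum, km/s
N_GLASS = 1.468           # typical refractive index of a silica fiber core
N_AIR = 1.0003            # refractive index of air (hollow core)

def one_way_latency_ms(distance_km, n):
    # Light propagates at c/n through a medium of refractive index n.
    return distance_km / (C_KM_S / n) * 1000

d = 1000  # km, e.g. a long-haul finance or cloud-interconnect link
glass_ms = one_way_latency_ms(d, N_GLASS)    # ~4.90 ms
hollow_ms = one_way_latency_ms(d, N_AIR)     # ~3.34 ms
print(round(glass_ms, 2), round(hollow_ms, 2))
```

Over 1,000 kilometers, an air core cuts one-way latency from about 4.9 milliseconds to about 3.3, roughly a one-third reduction, consistent with the figure cited above.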
Spectrum contributor John Boyd in September reported on research from a team at Microsoft and the University of Southampton in England that is testing the practicalities of hollow-core fiber links for extremely low latency applications like finance tech, cloud interconnects, and sensor networks. Hollow-core fiber isn’t expected to become the new fiber standard anytime soon, to be clear. But if the manufacturing challenges facing hollow-core lines can be overcome, both higher capacities and cleaner signals (with fewer of glass’s nonlinear distortions) could be part of fiber’s future.
Taara
Some researchers are investigating where and when fiber connections are even needed at all. To that end, the Google Alphabet spin-off company Taara is rolling out point-to-point laser data connections. Taara’s tech is not meant to span every gap in tomorrow’s networks, but laser data links do potentially solve some difficult “middle-mile” problems. Taara’s CEO Mahesh Krishnaswamy spoke to Spectrum in July about the company’s near-term goals. Their tech, Krishnaswamy explained, can enable gigabit-per-second speeds across kilometers.
It is weather sensitive, however: Fog and rain, for instance, can scatter the beam. So it’s not perfect for every application, but the company is now providing crucial connectivity in some sub-Saharan African and Southeast Asian settings. All told, free-space optical (FSO) tech is fast to deploy and offers high capacity. On the flip side, FSO doesn’t work without a line-of-sight connection between sender and receiver. So where fiber connections are expensive to make (think rivers and ravines, for instance), or where permitting is very challenging, FSO could provide just the solution.
Elisa McGhee
Beyond simply transmitting data, what other possibilities are emerging for tomorrow’s networks? In March, Spectrum contributor Charles Choi investigated fiber optic cables pulling double-duty as sensor networks. Los Alamos and Colorado State University researchers reported finding identifiable acoustic signals in fiber cables when the NASA OSIRIS-REx space probe returned to Earth to deliver its capsule of asteroid samples in September 2023. The proof-of-concept research revealed a potential for nearer-to-home applications such as railway intrusion alerts, earthquake early warnings, and perimeter security. Best of all, no new fiber cables need to be installed to realize the acoustic-sensing capabilities that the world’s high-capacity data lines may now contain.
Mirko Pittaluga, Yuen San Lo, et al.
In April, Choi reported on a Toshiba team in Germany that transmitted quantum cryptographic keys across 250 kilometers. That’s a big deal, because nobody’s quite solved the quantum signal repeater or quantum signal amplifier problem yet. (Choi reported on that topic for us in 2023!) So any qubits traveling from point A to point B need to do so along a stretch of fiber with no tech in between. As the story notes, governments and financial institutions will be some of the earliest customers for high-security, quantum cryptographic applications.
Christoph Burgstedt/Science Photo Library/Alamy
How far are new networking technologies prepared to go? In September, contributor Michelle Hampson reported on new and sophisticated deep-space communications codes that could extend terrestrial networks out to 180 million kilometers away. That’s equivalent to 1.2 times the distance between Earth and the sun. NASA, ESA, and commercial players like SpaceX and Blue Origin are looking at expanding and hardening networking protocols for the rigors of space communications.
While 6G phones may not be up to the task of linking lunar or Mars missions to Earth, the communications technologies being developed today are expanding networks’ range of possibilities in the years ahead. Networking technologies are no longer just about connecting people and their devices. They’re increasingly about building a sensing and computing data fabric that spans Earth and extends far beyond it into the solar system.
2025-12-22 21:00:01

For many years, doctors and technicians who performed medical ultrasound procedures viewed bubbles with wary concern. The phenomenon of cavitation—the formation and collapse of tiny gas bubbles due to changes in pressure—was considered an undesirable and largely uncontrollable side effect. But in 2001, researchers at the University of Michigan began exploring ways to harness the phenomenon for the destruction of cancerous tumors and other problematic tissue.
The trouble was, creating and controlling cavitation generated heat, which harmed healthy tissue beyond the target area. Zhen Xu, who was working on a Ph.D. in biomedical engineering at the time, was bombarding pig heart tissue in a tank of water with ultrasound when she made a breakthrough.
The key was using extremely powerful ultrasound to produce negative pressure of more than 20 megapascals, delivered in short bursts measured in microseconds—but separated by relatively long gaps, between a millisecond and a full second long. These parameters created bubbles that quickly formed and collapsed, tearing apart nearby cells and turning the tissue into a kind of slurry, while avoiding heat buildup. The result was a form of incisionless surgery, a way to wipe out tumors without scalpels, radiation, or heat.
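A quick duty-cycle calculation shows why the heat doesn't build up. The microsecond burst length below is an assumed order-of-magnitude figure for illustration, not a published HistoSonics parameter; the gaps are the range cited above.

```python
# Why histotripsy avoids heat buildup: the ultrasound is "on" for only a
# tiny fraction of the time. Burst length is an assumed order of
# magnitude; the gaps span the range cited (1 millisecond to 1 second).
burst_s = 10e-6                      # assumed ~10-microsecond burst
for gap_s in (1e-3, 1.0):
    duty = burst_s / (gap_s + burst_s)
    print(f"gap {gap_s:g} s -> duty cycle {duty:.4%}")
```

Even at the shortest gap, the transducer is active less than 1 percent of the time, so the time-averaged power deposited in tissue stays low while the instantaneous peaks remain strong enough to drive cavitation.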
“The experiments worked,” says Xu, now a professor at Michigan, “but I also destroyed the ultrasound equipment that I used,” which was the most powerful available at the time. In 2009, she cofounded a company, HistoSonics, to commercialize more powerful ultrasound machines, test treatment of a variety of diseases, and make the procedure, called histotripsy, widely available.
So far, the killer app is fighting cancer. In 2023, HistoSonics’ Edison system received FDA approval for treatment of liver tumors. In 2026, clinicians will conclude a pivotal kidney cancer study and apply for regulatory approval. They’ll also launch a large-scale pivotal trial for pancreatic cancer, one of the deadliest forms of the disease, with a five-year survival rate of just 13 percent; an effective treatment would be a major advance.
HistoSonics is not the only developer of histotripsy devices or techniques, but it is first to market with a purpose-built device. “What HistoSonics has developed is a symphony of technologies, which combines physics, biology, and biomedical engineering,” says Bradford Wood, an interventional radiologist at the National Institutes of Health, who is not affiliated with the company. Its engineering effort has spanned multiple disciplines to produce robotic, computer-guided systems that turn physical forces into therapeutic effects.
Over the past decade, research has confirmed or found other benefits of histotripsy. With precise calibration, fibrous tissue—such as blood vessels—can be spared from damage even in the target zone. And while other noninvasive techniques may leave scar tissue, the liquefied debris created by histotripsy is cleared away by the body’s natural processes.
In HistoSonics’ early trials for pancreatic cancer, doctors used focused ultrasound pulses to ablate, or destroy, tumors deep within the pancreas. “It’s a great achievement for the entire field to show that it is possible to ablate pancreatic tumors and that it’s well tolerated,” says Tatiana Khokhlova, a medical ultrasound researcher at the University of Washington, in Seattle, who has worked on alternative histotripsy techniques.
Khokhlova says the key to harnessing histotripsy’s benefits “will be combining ablation of the primary tumor in the pancreas with some other therapy.” Combination treatment could fight recurrent cancer and tiny tumors that ultrasound might miss, while also tapping into a surprising benefit.
Histotripsy generally seems to stimulate an immune response, helping the body attack cancer cells that weren’t targeted directly by ultrasound. The mechanical destruction of tumors likely leaves behind recognizable traces of cancer proteins that help the immune system learn to identify and destroy similar cells elsewhere in the body, explains Wood. Researchers are now exploring ways to pair histotripsy with immunotherapy to amplify that effect.
The company’s capacity to explore the treatment’s potential for different conditions will only improve with time, says HistoSonics CEO Mike Blue. It has fresh resources to accelerate R&D: A new ownership group, which includes billionaire Jeff Bezos, acquired HistoSonics in August 2025 at a valuation of US $2.25 billion.
Engineers are already testing a new guidance system that uses a form of X-rays rather than ultrasound imaging, which should expand use cases. The R&D team is also developing a feedback system that analyzes echoes from the therapeutic ultrasound to detect tissue destruction and integrates that information into the live display, says Blue.
If those advances pan out, histotripsy could move well beyond the liver, kidney, and pancreas in the fight against cancer. What started as a curiosity about bubbles might soon become a new pillar of noninvasive medicine—a future in which surgeons wield not scalpels, but sound waves.
This article appears in the January 2026 print issue.
2025-12-21 22:00:01

IEEE Spectrum’s most popular biomedical stories of the past year centered on both incorporating new technologies and revamping old ones. While AI is all the rage in most sectors—including biomed, with applications like an in-brain warning system for worsening mental health and a model to estimate heart rate in real time—biomedical news this past year also focused on legacy technologies. Wi-Fi, ultrasound, and lasers have all made comebacks or found new uses in 2025.
Whether innovation stems from new tech or old, IEEE Spectrum will continue to cover it rigorously in 2026.
Georgia Institute of Technology, Icahn School of Medicine at Mt. Sinai and TeraPixel
When Patricio Riva Posse, a psychiatrist at Emory University School of Medicine, realized that his patient’s brain implants were sending him signals about her worsening depression before she even recognized anything was wrong, he wished he could have taken action sooner.
That experience led him and colleagues to develop “an automatic alarm system” for signs of changing mental health. The tool monitors brain signals in real time, using implants to record electrical impulses, and AI to analyze the outputs and flag warning signs of relapse. Other research groups across the United States are experimenting with different ways to use these stimulating brain implants to help treat depression, both with and without the help of AI. “There are so many levers we can press here,” neurosurgeon Nir Lipsman says in the article.
Dmitry Kireev/University of Massachusetts Amherst
In Dmitry Kireev’s lab at the University of Massachusetts Amherst, researchers are developing imperceptibly thin graphene tattoos capable of monitoring your vital signs and more. “Electronic tattoos could help people track complex medical conditions, including cardiovascular, metabolic, immune system, and neurodegenerative diseases. Almost half of U.S. adults may be in the early stages of one or more of these disorders right now, although they don’t yet know it,” he wrote in an article for IEEE Spectrum.
How do they work? Graphene is conductive, strong, and flexible, letting the tattoos sense features like heart rate and the presence of certain compounds in sweat. For now, the tattoos must be plugged into a conventional electronic circuit, but Kireev hopes they will soon be integrated into smartwatches, making them simpler to wear.
Erika Cardema/UC Santa Cruz
Wi-Fi can do more than just get you connected to the internet—it can help monitor your heart inexpensively and without requiring constant physical contact. The new approach, called Pulse-Fi, uses an AI model to analyze the tiny variations that heartbeats cause in Wi-Fi signals, estimating heart rate in real time from up to 10 feet away.
The system is low cost (around US $40), easy to deploy, and causes no discomfort. It also works regardless of the user’s posture and in all kinds of environments. Katia Obraczka, a computer scientist at the University of California, Santa Cruz who led the development of Pulse-Fi, says the team plans to commercialize the technology.
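The article describes Pulse-Fi’s AI model only at a high level, and its actual pipeline (a trained neural network operating on Wi-Fi channel measurements) isn’t public here. As a rough, hypothetical illustration of the underlying signal-processing idea—recovering a periodic pulse from a noisy time series—here is a classical stand-in that scans the plausible heart-rate band and picks the frequency with the strongest spectral response. The function name and synthetic data are inventions for this sketch, not part of Pulse-Fi.

```python
import math
import random

def estimate_heart_rate(samples, fs, lo_bpm=40, hi_bpm=180):
    """Estimate heart rate (BPM) by scanning the plausible band and
    picking the candidate frequency with the largest DFT magnitude."""
    n = len(samples)
    mean = sum(samples) / n
    x = [s - mean for s in samples]        # remove the DC offset
    best_bpm, best_power = lo_bpm, 0.0
    for bpm in range(lo_bpm, hi_bpm + 1):  # 1-BPM resolution scan
        f = bpm / 60.0                     # candidate frequency, Hz
        re = sum(v * math.cos(2 * math.pi * f * i / fs) for i, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * f * i / fs) for i, v in enumerate(x))
        power = re * re + im * im
        if power > best_power:
            best_bpm, best_power = bpm, power
    return best_bpm

# Synthetic demo: a 72-BPM (1.2 Hz) pulse buried in noise, sampled at 50 Hz
random.seed(0)
fs = 50.0
samples = [
    math.sin(2 * math.pi * 1.2 * (i / fs)) + 0.5 * random.gauss(0, 1)
    for i in range(1500)                   # 30 seconds of data
]
bpm = estimate_heart_rate(samples, fs)
```

In the real system, the input would be Wi-Fi channel measurements rather than a clean synthetic trace, and the learned model replaces this brute-force spectral scan precisely so the estimate holds up across postures and environments.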
Shonagh Rae
Sangeeta S. Chavan and Stavros Zanos, biomedical researchers at the Institute of Bioelectronic Medicine in New York, hypothesize that ultrasound waves may activate neurons, offering “a precise and safe way to provide healing treatments for a wide range of both acute and chronic maladies,” as they write in an article for Spectrum. Targeted ultrasound could then serve as a treatment for inflammation or diabetes, instead of medication with wide-ranging side effects, they say.
It works by vibrating a neuron’s membrane and “opening channels that allow ions to flow into the cell, thus indirectly changing the cell’s voltage and causing it to fire,” they write. The authors think that activating specific neurons can help address the root causes of specific illnesses.
Extreme Light group/University of Glasgow
If a doctor wants to see inside your head, they have to choose between cheap and deep—an electroencephalograph is inexpensive but can’t see past the outer layers of the brain, while functional magnetic resonance imaging (fMRI) is expensive but can see all the way in. Shining a laser through a person’s head looks like a first step toward technology that delivers both.
For many years, this kind of work had seemed impossible because the human head is so good at blocking light, but researchers have now shown that lasers can send photons all the way through. “What was thought impossible, we’ve shown to be possible. And hopefully…that could inspire the next generation of these devices,” project lead Jack Radford says in the article.
Jiawei Ge
In the not-too-distant future, surgical patients may hear “The robot will see you now,” as the authors of this story suggest. The three researchers work at the Johns Hopkins University robotics lab responsible for developing the Smart Tissue Autonomous Robot (STAR), which performed the first autonomous soft-tissue surgery in a live animal in 2016.
While challenges certainly remain in the quest to bring autonomous robots into the operating room—like developing general-purpose robotic controllers and collecting data under strict privacy regulations—the end goal is on the horizon. “A scenario in which patients are routinely greeted by a surgeon and an autonomous robotic assistant is no longer a distant possibility,” the authors write.