
Virtual Power Plants Play the Imitation Game

2025-12-16 07:03:43



In 1950, the English mathematician Alan Turing devised what he called “the imitation game.” Later dubbed the Turing test, the experiment asks a human participant to conduct a conversation with an unknown partner and try to determine if it’s a computer or a person on the other end of the line. If the person can’t figure it out, the machine passes the Turing test.

Power grid operators are now preparing for their own version of the game. Virtual power plants, which aggregate small, distributed energy resources, are increasingly being tapped to balance electricity supply and demand. The question is: Can they do their job as well as conventional power plants?

Grid operators can now find out by running these power plants through a Turing-like evaluation called the Huels test. To pass, the performance of a virtual power plant must be indistinguishable from that of a conventional power plant. A human grid operator serves as the judge.

Virtual power plant developer EnergyHub, based in Brooklyn, N.Y., developed the test and outlined it in a white paper released today. “What we’re really trying to do is fool the operators into feeling that these virtual power plants can act and feel and smell like conventional power plants,” says Paul Hines, chief scientist at EnergyHub. “This is a kind of first litmus test.”

What Are Virtual Power Plants (VPPs)?

The virtual-versus-conventional power plant question is a timely one. Virtual power plants, or VPPs, are networks of devices such as rooftop solar panels, home batteries, and smart thermostats that come together through software to collectively supply or conserve electricity.

Unlike conventional power generation systems, which might crank up one big gas plant when electricity demand peaks, VPPs tap into small, widely dispersed equipment. For example, a VPP might harness electricity from hundreds of plugged-in electric vehicles or rooftop solar panels. Or it might direct smart thermostats in homes or businesses to turn down heating or cooling systems to reduce demand.

The technology is emerging at a time when concerns over data centers’ electricity demand are hitting a fever pitch. The consultancy BloombergNEF estimates data-center energy demand in the United States will reach 106 gigawatts by 2035, a 36 percent jump from what it had projected just seven months ago.

How utilities and grid operators will meet the growing demand is unclear and faces challenges on many fronts. Turbines for natural gas plants are backordered and new nuclear reactors are still years away. Wind and solar, while cheap and fast to build, don’t produce the 24/7 electricity that data centers demand, and face an uphill political battle under the Trump administration.

All of this together has created an opening for VPPs, which could add gigawatts to the grid without significantly jacking up electricity rates. “It’s a political issue. If you said you’re going to get electricity costs under control, this is literally the only way to do it in 12 months,” says Jigar Shah, a clean energy investor at Multiplier in Washington, D.C., who led the U.S. Department of Energy’s Loan Programs Office under the Biden administration.

VPPs could also reduce utilities’ need to invest in distribution equipment, avoiding supply chain shortages and inflated costs, Shah says. “There is no other idea that you could possibly deploy in 12 months that would have that big of an impact,” he says.

According to a 2024 U.S. Department of Energy report, VPPs could provide between 80 and 160 gigawatts of capacity across the U.S. by 2030—enough to meet between 10 and 20 percent of peak grid demand.

How Can VPPs Gain Grid Operator Trust?

But first, VPP developers have to win over grid operators. Benchmarks like the Huels test are crucial to building that trust. “In order for us to build our reliance on VPPs, they do need to pass the Huels test and operators need to be able to count on” the VPPs delivering power when called upon, says Lauren Shwisberg, a principal at the nonprofit research group Rocky Mountain Institute who co-authored a recent report on VPPs and was not involved in developing the test.

Matthias Huels, an engineer who spent more than four years at EnergyHub, first came up with the idea for the test in 2024. After workshopping the idea with colleagues and, somewhat ironically, ChatGPT, Huels presented the concept to the company.

Huels designed the test to be subjective. In its earliest iteration, it follows a guideline akin to the Supreme Court’s “I know it when I see it” test for what distinguishes pornography from erotic art. That is to say: Passing the test depends on who’s judging. If a grid operator finds the power from a VPP as dependable as electricity from an actual power plant burning gas to produce electrons, then the VPP has passed.

There are four levels to the Huels test. To reach level 1, a VPP must be able to shave off demand from the grid by, for example, successfully scheduling smart thermostats to dial down when the grid faces maximum demand. To reach level 2, a VPP must be able to respond to market and grid data and dial down demand when prices hit a certain level or tap into solar panels or batteries when power is needed. Human decision makers are involved at these levels.

Passing the Huels test comes at level 3. That’s when a VPP can function automatically because it has proven reliable enough to be indistinguishable from a gas peaker plant, the type of power station that comes online as backup only when the grid is under stress. Passing level 4 involves VPPs acting fully autonomously, adjusting output based on a number of continually changing variables throughout the day.
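The four levels can be summarized in a minimal sketch. The level descriptions and the pass threshold at level 3 follow the article; the code itself, including the class and function names, is purely illustrative:

```python
from enum import IntEnum

class HuelsLevel(IntEnum):
    """Illustrative encoding of the four Huels test levels."""
    SCHEDULED = 1    # demand reduction scheduled by humans (e.g., thermostats)
    RESPONSIVE = 2   # reacts to market and grid data, humans still in the loop
    AUTOMATIC = 3    # operates automatically, indistinguishable from a peaker
    AUTONOMOUS = 4   # fully autonomous, adapts to changing conditions all day

def passes_huels(level: HuelsLevel) -> bool:
    # Per the description above, a VPP passes the test at level 3 or higher.
    return level >= HuelsLevel.AUTOMATIC

print(passes_huels(HuelsLevel.RESPONSIVE))  # False
print(passes_huels(HuelsLevel.AUTOMATIC))   # True
```

The ordering matters: `IntEnum` lets the levels compare numerically, so the pass check is a single inequality.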

“The imitation game that Alan Turing came up with was: Can a computer fool an interrogator to think it’s actually human even though it’s a computer,” Hines says. “We propose this idea of a test that would allow us to say: Can we fool a grid operator into thinking that the thing that’s actually solving their problems is this aggregation of many devices instead of a big gas plant?”

Can VPPs Mimic Gas Peaker Plants?

Peaker plants only generate power about 5 percent of the time over their lifespans. That makes them easier for VPPs to mimic: Like a peaker plant’s output, the power a VPP frees up through demand response or draws from batteries comes in bursts lasting a few hours at a time.

Far more difficult is stacking up to a full-scale gas plant, which operates 65 percent of the time or more, or a nuclear plant, which usually operates at least 95 percent of the time. Getting there would involve equipping a VPP network with long-duration storage that could be powered up during the day when solar panels are at peak output and discharged all night long. “You start talking about VPPs with large amounts of batteries that can run 365 days per year,” Hines says. “That’s a road we can go down.”
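The capacity-factor gap is easy to quantify. A back-of-the-envelope sketch using the figures cited above (a year has 8,760 hours):

```python
def annual_operating_hours(capacity_factor):
    """Hours per year a plant generates, treating the capacity factor as the
    fraction of the year's 8,760 hours spent producing power."""
    return 8760 * capacity_factor

# Figures from the article: peaker ~5%, full-scale gas ~65%, nuclear ~95%.
print(round(annual_operating_hours(0.05)))  # 438 hours: bursts a VPP can cover
print(round(annual_operating_hours(0.65)))  # 5694 hours: a much taller order
```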

EnergyHub has been putting its VPP systems through the Huels test. Last year, EnergyHub successfully ran trials with Arizona Public Service, Duke Energy in North Carolina, and National Grid in Massachusetts. In Arizona, EnergyHub’s software dialed into homes with solar panels and smart thermostats and ran air conditioners to “pre-cool” houses during the day when the sun was generating lots of electricity. This allowed the state’s biggest utility to reduce demand during peak hours when residents would typically return home from work to turn on televisions and crank up their air conditioners.

“You have too much power in the middle of the day because of solar, then the early evening comes and you get people ramping up their evening loads right as the solar is ramping down,” Hines says. “You need something that can feather through that schedule. We created something that can do this.”

That lands the company somewhere between a 2 and 3 on the Huels testing scale. Passing level 3 “is going to take a few years,” Hines says.

Could This Technology Prevent Blackouts?

2025-12-16 03:15:36



Spain’s grid operator, Red Eléctrica, proudly declared that on 16 April 2025, electricity demand across the country’s peninsular system was met entirely by renewable energy sources for the first time on a weekday.

Just 12 days later, at 12:33 p.m. on Monday, 28 April, Spain and Portugal’s grids collapsed completely, plunging some 55 million people into one of the largest blackouts the region has ever seen. Entire cities lost electricity in the middle of the day. In the bustling airports of Madrid, Barcelona, and other key hubs, departure boards went blank. No power. No Internet. Even mobile phone service, something most people take for granted, was severely compromised. On the roads, traffic lights stopped functioning, snarling traffic and leaving people wondering when the power would return.

The size and scale of the impact were unsettling, but the scariest part was the speed at which it happened. Within minutes, the whole of the Iberian Peninsula’s energy generation dropped from roughly 25 gigawatts to less than 1.2 gigawatts.

While this may sound like a freak accident, incidents like this will continue to happen, especially given the rapid changes to the electrical grid over the past few decades. Worldwide, power systems are evolving from large centralized generation to a multitude of diverse, distributed generation sources, representing a major paradigm shift. This is not merely a “power” problem but is also a “systems” problem. It involves how all the parts of the power grid interact to maintain stability, and it requires a holistic solution.

Power grids are undergoing a massive transformation—from coal- and gas-fired plants to millions of solar panels and wind turbines scattered across vast distances. It’s not just a technology swap. It’s a complete reimagining of how electricity is generated, transmitted, and used. And if we get it wrong, we’re setting ourselves up for more catastrophic blackouts like the one that hit all of Spain and Portugal. The good news is that a solution developed by our group at Illinois Institute of Technology over the last two decades and commercialized by our company, Syndem, has achieved global standardization and is moving into large-scale deployment. It’s called the virtual synchronous machine, and it might be the key to keeping the lights on as we transition to a renewable future.

Rapid Deployment of Renewable Energy

The International Energy Agency (IEA) created a Net Zero by 2050 roadmap that calls for nearly 90% of global electricity generation to come from renewable, distributed sources, with solar photovoltaic (PV) and wind accounting for almost 70%. We are witnessing firsthand a paradigm shift in power systems, moving from centralized to distributed generation.

The IEA projects that renewable power installations will more than double between 2025 and 2030, underscoring the urgent need to integrate renewables smoothly into existing power grids. A key technical nuance is that many distributed energy resources (DERs) produce direct current (DC) electricity, while the grid operates on alternating current (AC). To connect these resources to the grid, inverters convert DC into AC. To understand this further, we need to discuss inverter technologies.

Professor Beibei Ren’s team at Texas Tech University built a SYNDEM test bed with 12 modules and a substation module, consisting of 108 converters. Beibei Ren/Texas Tech University

Most of the inverters currently deployed in the field directly control the current (power) injected into the grid while constantly following the grid voltage; these are often referred to as grid-following inverters. In effect, this type of inverter is a current source: Its current is controlled, but its terminal voltage is determined by whatever it connects to. Grid-following inverters rely on a stable grid to inject power from renewable sources and operate properly. This is not a problem when the grid is stable, but it becomes one when the grid is less stable. When the grid goes down or experiences severe disturbances, grid-following inverters typically trip off, meaning they don’t provide support when the grid needs them most.

In recent years, attempts to address grid instability have led to the rise of grid-forming inverters. As the name suggests, these inverters can help form the grid. The term usually refers to an inverter that controls its terminal voltage, both amplitude and frequency, which indirectly controls the current injected into the grid. This inverter behaves as a voltage source: Its terminal voltage is regulated, but its current is determined by whatever it is connected to. Unlike grid-following inverters, grid-forming inverters can operate independently of the grid. This makes them useful in situations where the grid goes down or isn’t available, such as during blackouts. They can also help balance supply and demand, support voltage, and even restart parts of the grid if it shuts down.
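The contrast between the two inverter types can be caricatured in a few lines of code. This is a deliberately simplified sketch, not a power-electronics model; the function names and numbers are illustrative assumptions:

```python
def grid_following_output(grid_voltage, current_setpoint):
    """Toy grid-following inverter: it injects a set current but must lock
    onto a live grid voltage. With no stable voltage to follow, it trips."""
    if abs(grid_voltage) < 0.1:   # grid collapsed or severely disturbed
        return 0.0                # inverter trips offline, delivering nothing
    return grid_voltage * current_setpoint   # power tracks the grid voltage

def grid_forming_output(voltage_setpoint, load_current):
    """Toy grid-forming inverter: it regulates its own terminal voltage, so
    the current (and hence power) is set by whatever load is connected."""
    return voltage_setpoint * load_current

# During a blackout the grid voltage is zero:
print(grid_following_output(grid_voltage=0.0, current_setpoint=10))  # 0.0
print(grid_forming_output(voltage_setpoint=230.0, load_current=10))  # 2300.0
```

The asymmetry in the two signatures is the point: the grid-following function needs the grid as an input, while the grid-forming one supplies a voltage regardless.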

One issue is that the term “grid-forming” means different things to different people. Some definitions lack clear physical meaning or robust performance under complex grid conditions. Many grid-forming controls are model-based and may not scale properly in large systems. As a result, the design and control of these inverters can vary significantly. Grid-forming inverters made by different companies may not be interoperable, especially in large or complex power systems, which can include grid-scale battery systems, high-voltage DC (HVDC) links, solar PV panels, and wind turbines. The ambiguity of the term is increasingly becoming a barrier for grid-forming inverters, and no standards have been published yet.

Systemic Challenges when Modernizing the Grid

Let’s zoom out for a moment to examine the broader landscape of structural challenges we need to address when transitioning today’s grid into its future state. This transition is often called the democratization of power systems. Just as in politics, where democracy means everyone has a say, this transition in power systems means that every grid player can play a role. The primary difference between a political democracy and a power system is that the power system needs to maintain the stability of its frequency and voltage. If we apply a purely democratic approach to managing the power grid, we will sow the seeds of systemic failure.

The second systemic challenge is compatibility. The current power grid was designed long ago for a few big power plants—not for millions of small, intermittent energy sources like solar panels or wind turbines. Ideally, we’d build a whole new grid to fit today’s needs, but that would bring too much disruption, cost too much, and take too long. The only feasible option is to somehow make various grid players compatible with the grid. To better conceptualize this, think about the invention of the modem, which solved the compatibility issues between computers and telephone systems, or the widespread adoption of USB ports. These inventions made many devices, such as cameras, printers, and phones, compatible with computers.

The third systemic challenge is scalability. It’s one thing to hook up a few solar panels to the grid. It’s entirely different to connect millions of them and still keep everything running safely and reliably. It’s like walking one large dog versus walking hundreds of Chihuahuas at once. It is crucial for future power systems to adopt an architecture that can operate at different scales, allowing a power grid to break into smaller grids when needed or reconnect to operate as one grid, all autonomously. This is crucial to ensure resilience during extreme weather events, natural disasters, or grid faults.

To address these systemic challenges, the technologies need to undergo a seismic transformation. Today’s power grids are electric-machine-based, with electricity generated by large synchronous machines in centralized facilities, often with slow dynamics. Tomorrow’s grid will run on power electronic converters—small, distributed, and with fast dynamics. It’s a significant change, and one we need to plan carefully for.

The Key is Synchronization

Traditional fossil fuel power plants use synchronous machines to generate electricity, as they can inherently synchronize with each other or the grid when connected. In other words, they autonomously regulate their speeds and the grid frequency around a preset value, meeting a top requirement of power systems. This synchronization mechanism has underpinned the stable operation and organic expansion of power grids for over a century. So, preserving the synchronization mechanism in today’s grids is crucial for addressing the systemic challenges as we transition from today’s grid into the future.

Unlike traditional power plants, inverters are not inherently synchronous, but they need to be. The key enabling technology is called virtual synchronous machines (VSMs). These are not actual machines, but instead are power electronic converters controlled through special software codes to behave like physical turbines. You can think of them as having the body of power converters with the brain of the older spinning synchronous machines. With VSMs, distributed energy resources can synchronize and support the grid, especially when something unexpected happens.
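One common way a VSM’s “brain” works is to emulate the swing equation of a spinning machine in software. The following is a minimal sketch of that idea under simplifying assumptions (a single machine, constant damping, Euler integration); real VSM control laws vary by design, and all parameter values here are illustrative:

```python
import math

def vsm_step(omega, p_set, p_out, inertia, damping, omega_ref, dt):
    """One Euler step of a simplified swing equation a VSM can emulate:
        J * d(omega)/dt = (P_set - P_out) / omega_ref - D * (omega - omega_ref)
    The virtual rotor speed `omega` plays the role of grid frequency."""
    accel = ((p_set - p_out) / omega_ref
             - damping * (omega - omega_ref)) / inertia
    return omega + accel * dt

# Start 0.1% below nominal 50 Hz (in rad/s) and let the virtual rotor settle:
omega_ref = 2 * math.pi * 50
omega = omega_ref * 0.999
for _ in range(2000):
    omega = vsm_step(omega, p_set=1000.0, p_out=1000.0,
                     inertia=0.2, damping=5.0, omega_ref=omega_ref, dt=0.001)
print(abs(omega - omega_ref) < 1e-6)  # True: frequency pulled back to nominal
```

Because power is balanced (`p_set == p_out`), the damping term alone drives the frequency deviation exponentially to zero, which is exactly the self-synchronizing behavior the article attributes to spinning machines.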

Syndem’s all-in-one reconfigurable and reprogrammable power electronic converter educational kit. SYNDEM

This naturally addresses the systemic challenges of compatibility and scalability. Like conventional synchronous machines, distributed energy resources are now compatible with the grid and can be integrated at any scale. But it gets better. First, inverters can be added to existing power systems without major hardware changes. Second, VSMs support the creation of small, local energy networks—known as microgrids—that can operate independently and reconnect to the main grid when needed. This flexibility is particularly useful during emergencies or power outages. Lastly, VSMs provide an elegant solution for the common concern about inertia, traditionally provided by large spinning machines that help cushion the grid against sudden changes. By design, VSMs can offer similar or even better characteristics of inertia.

VSMs are poised to become mainstream in the coming decade, driven in part by the backing of a global standard. After years of hard work, IEEE approved and published the first global standard on VSM, IEEE Standard 2988-2024. It involved members affiliated with key manufacturers, including General Electric, Siemens, Hitachi Energy, Schneider Electric, and Eaton, in addition to regulators and utilities, including North American Electric Reliability Corporation (NERC), Midcontinent Independent System Operator (MISO), National Grid, Southern California Edison, Duke Energy Corporation, and Energinet.

The Holistic SYNDEM Architecture

Until now, much of the expert discourse has focused primarily on energy generation. But that’s only half of the equation—the other half is demand: how different loads consume the electricity. Their behavior also plays a crucial role in maintaining grid stability, in particular when generation is powered by intermittent renewable energy sources.

There are many different loads, including motors, Internet devices, and lighting, among others. They are physically different but share one technical trait: a rectifier at the front end. Motors run more efficiently from a motor drive, which includes a rectifier, while Internet devices and LED lights consume DC electricity and so need rectifiers as well. Like inverters, these rectifiers can also be controlled as VSMs; the only difference is the direction of the power flow. Rectifiers consume electricity, while inverters supply it.

As a result, most generation and consumption facilities in a future grid can be equipped and unified with the same synchronization mechanism to maintain grid stability in a synchronized-and-democratized (SYNDEM) manner. Yes, you read that correctly. Even devices that use electricity—like motors, computers, and LED lights—can play a similar active role in regulating the grid by autonomously adjusting their power demand according to instantaneous grid conditions. A less critical load can adapt its power demand by a larger percentage as needed, even up to 100%. In comparison, a more critical load can adjust its power demand at a smaller percentage or maintain its power demand. As a result, the power balance in a SYNDEM grid no longer depends predominantly on adjusting the supply but on dynamically adjusting both the supply and the demand, making it easier to maintain grid stability with intermittent renewable energy sources.

For many loads, it is often not a problem to adjust their demand by 5 to 10% for a short period. Cumulatively, this offers significant support for the grid. Due to the rapid response of VSMs, the support provided by such loads is equivalent to inertia or spinning reserve—extra power from synchronized generators not at full load. This can reduce the need for the large spinning reserves that are currently necessary in power systems and reduce the effort to coordinate generation facilities. It also mitigates the impact of dwindling inertia caused by the retirement of conventional large generating facilities.
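Load-side support of this kind can be sketched as a simple droop rule: demand is shed in proportion to the frequency sag, capped by how flexible the load is. All parameter values and names here are illustrative assumptions, not figures from the article:

```python
def adjusted_demand(nominal_kw, grid_freq_hz, flexibility,
                    nominal_freq=50.0, droop=0.05):
    """Toy load-side droop rule: shed demand in proportion to the frequency
    sag, capped by the load's flexibility (1.0 = fully flexible, 0.0 = a
    critical load that never adjusts). `droop` is the fractional frequency
    deviation at which a fully flexible load would shed all of its demand."""
    sag = max(0.0, (nominal_freq - grid_freq_hz) / nominal_freq)
    shed = min(flexibility, sag / droop)
    return nominal_kw * (1.0 - shed)

# Grid sags 1% to 49.5 Hz: a fully flexible 10-kW load sheds 20% of its
# demand, while a critical load with 5% flexibility sheds only 5%.
print(round(adjusted_demand(10.0, 49.5, flexibility=1.0), 3))   # 8.0
print(round(adjusted_demand(10.0, 49.5, flexibility=0.05), 3))  # 9.5
```

Because each load needs only its own frequency measurement, the rule works without any communication network, matching the autonomous operation the article describes.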

In a SYNDEM grid, all active grid players, regardless of size, whether conventional or renewable, supplying or consuming, would follow the same SYNDEM rule of law and play the same equal role in maintaining grid stability, democratizing power systems, and paving the way for autonomous operation. It is worth highlighting that the autonomous operation can be achieved without relying on communication networks or human intervention, lowering costs and improving security.

The SYNDEM architecture takes VSMs to new heights, addressing all three systemic challenges mentioned above: democratization, compatibility, and scalability. With this architecture, you can stack grids at different scales, much like building blocks. Each home grid can be operated on its own, multiple home grids can be connected to form a neighborhood grid, and multiple neighborhood grids can be connected to create a community grid, and so on. Moreover, such a grid can be decomposed into smaller grids when needed and can reconnect to form a single grid, all autonomously, without changing codes or issuing commands.

The holistic theory is established, the enabling technologies are in place, and the governing standard is approved. However, the full realization of VSMs within the SYNDEM architecture depends on joint ventures and global deployment. This isn’t a task for any one group alone. We must act together. Whether you’re a policymaker, innovator, investor, or simply someone who cares about keeping the lights on, you can play a role. Join us to make power systems worldwide stable, reliable, sustainable, and, eventually, fully autonomous.

Remembering Influential Engineering Educator Lyle Feisel

2025-12-16 03:00:02



Lyle Feisel, an influential engineering educator and dedicated IEEE volunteer, died on 5 November at the age of 90.

Feisel was a professor of electrical engineering and the founding dean of the Watson engineering school at the State University of New York, Binghamton. He established its organizational structure, academic programs, and culture. He also hired the majority of its faculty.

For more than six decades, the Life Fellow helped define IEEE’s long-term approach to education and professional development. He served in key leadership roles for IEEE Educational Activities, the IEEE History Committee, the IEEE Life Members Committee, and the IEEE Foundation.

“Lyle Feisel’s passion, compassion, and thoughtfulness were an inspiration to everyone who knew him,” says Karen Galuchie, the IEEE Foundation’s executive director. “I feel privileged to have known him. The IEEE community was fortunate to have such a dedicated and caring leader.”

Early career highlights

Feisel served in the U.S. Navy from 1954 to 1958 as a radio operator—which sparked his interest in electronics and communications. After his active duty ended, he earned bachelor’s, master’s, and doctoral degrees in electrical engineering from Iowa State University, in Ames.

In 1964 he joined the South Dakota School of Mines and Technology, in Rapid City, as a researcher and professor of electrical engineering. He taught there for 20 years.

Feisel’s research focused on thin-film materials, including tin-oxide films. He pursued early solar-cell research during a time when photovoltaics were still a niche experimental field. He supervised numerous undergraduate and graduate students and wrote an undergraduate curriculum that emphasized design projects and laboratory work.

By the late 1970s, Feisel was promoted to head of the EE department, where he helped update curriculum, recruit faculty, and expand research programs.

Building an institution from the ground up

Feisel left South Dakota Mines in 1983 to join SUNY Binghamton (now Binghamton University). That same year he founded the university’s Watson engineering school (now the Watson College of Engineering and Applied Science). He served as dean until his retirement in 2001.

In a 2023 interview about his career with the university’s newspaper, Feisel recalled that the school was “constantly growing and changing.”

“Every year, new programs were added,” he said. “If you look at the number of degrees in 1983 and the number we had in 2001, there’s no comparison at all.”

Feisel recognized the importance of interdisciplinary programs among EE, mechanical engineering, computer science, and industrial engineering to create new courses. He emphasized hands-on laboratory work and industry partnerships to better prepare students for their careers.

“Dean Feisel came at the right time, to the right place, and brought together all of the elements necessary to create superb academic programs and to attract faculty who excel in their fields,” Michael McGoff wrote in a tribute published on SUNY-Binghamton’s website. McGoff served as assistant dean and associate dean under Feisel for 17 years.

“Without his vision,” McGoff said, “there would not be a Watson College today.”

Extensive leadership across IEEE

Feisel dedicated decades to leadership and service across IEEE, especially in roles that shaped educational policy, historical preservation, and member engagement.

He was a longstanding member of the IEEE Education Society, serving as its 1978–1979 president.

As vice president of IEEE Educational Activities from 2000 to 2003, he played a central role in shaping programs that supported engineering faculty, students, and practitioners worldwide. He helped guide IEEE’s strategic plans for accreditation, continuing education, and the expansion of digital learning resources.

“Lyle Feisel’s passion, compassion, and thoughtfulness were an inspiration to everyone who knew him.” —Karen Galuchie

“IEEE is incredibly blessed to have volunteers, like Lyle, who passionately work on behalf of the mission of our organization,” says Jamie Moesch, managing director of IEEE Educational Activities.

Feisel also was involved with ABET, the organization responsible for accrediting engineering programs. In the late 1980s he served on the Engineering Accreditation Commission, ABET’s primary body responsible for setting accreditation criteria and overseeing evaluation processes. IEEE is a founding member of the organization.

Education wasn’t Feisel’s only passion. He served as 2006 chair of the IEEE Life Members Committee, which supports those at least 65 years old who have been members long enough so that their age and years of membership equal or exceed 100.

He was a member of the IEEE Foundation board of directors and the 2012–2013 chair of the IEEE History Committee. While chair, Feisel proposed creating a multimedia-based history program for young people, which evolved into the IEEE REACH Program. It offers preuniversity history teachers free access to educational resources so students can explore engineering and technology and how they impact society.

“Lyle exemplified the sort of long-term, engaged, active volunteer that makes IEEE be IEEE,” says Michael Geselowitz, senior director of the IEEE History Center.

A legacy of generosity

Feisel was recognized as a “Forever Generous” donor by the IEEE Foundation for his sustained support of scholarships, student programs, and educational initiatives. He and his wife, Dorothy, are members of the IEEE Goldsmith Legacy League, which recognizes IEEE members who have included the IEEE Foundation in their estate plans.

“The critical function of the IEEE Foundation—or any charity—is that it lets you help accomplish a goal that you could never achieve by yourself,” Feisel said in a recent interview with the Foundation. “Acting alone, we could never put a girl through high school in Guatemala, teach a class in New Jersey about the history of engineering, illuminate a light bulb in Haiti, or take a kid for a ride on a replica sailing ship.

“By giving to the IEEE Foundation and other charities,” he said, “we’re able to help do all of those things.”

7 Bell Labs Breakthroughs Honored as IEEE Milestones

2025-12-14 03:00:02



Bell Labs is already highly recognized, but in its centennial year, the organization hoped to add more awards to burnish its reputation as one of the world’s leading centers of technical innovation.

On 21 October, IEEE representatives, Nokia Bell Labs leaders, and alumni of the storied institution gathered to celebrate seven technological achievements recognized as IEEE Milestones.

The large number of milestones granted at once is due to an extraordinary effort to achieve the recognitions during Bell Labs’ 100th anniversary year, which IEEE Fellow Peter Vetter, president of Nokia Bell Labs core research, told the attendees was always intended as a full 12 months of celebrations.

Speakers emphasized that celebrating such history inspires today’s—and tomorrow’s—engineers.

“History gives us context,” IEEE President Kathleen Kramer said. “It reminds us why we do what we do.”

Theodore Sizer, Nokia Bell Labs executive vice president, said of the recognition, “We are also here to celebrate the 100 years ahead of us.”

Presenters at the event acknowledged the outsize role Bell Labs has played in the development of many technologies, noting that it helped make IEEE Region 1—the Eastern United States—a powerhouse of innovation. Sixty of the 285 IEEE Milestones granted so far were awarded for technologies developed in Region 1, noted its director, Bala Prasanna, an IEEE life senior member.

“Bell Labs stands at the heart of that legacy,” Prasanna said.

IEEE Life Member Emad Farag, chair of the IEEE North Jersey Section, said, “This section has given birth to technology that is at the heart of modern life.”

Molecular beam epitaxy

The high-purity crystal growth process called molecular beam epitaxy (MBE) was developed by IEEE Fellow Alfred Y. Cho in the 1960s. Used to grow thin films of crystal atop one another, the process makes possible high-electron mobility transistors, vertical-cavity surface-emitting lasers (VCSELs), and other technologies.

With MBE, ultrapure elements such as gallium and arsenic are heated within the side compartments of a vacuum chamber. Inside the chamber sits a heated target semiconductor wafer. The elements sublimate, forming beams of atoms and molecules that fly at the target wafer, where they attach and combine, slowly growing into a layer of crystal.

“It sounds straightforward, but it’s difficult to get it right,” said IEEE Fellow David Nielson, group leader for optical transmission at Bell Labs. “The thermodynamics going on at the surface in MBE is incredibly complex.”

VCSELs are dependent on MBE, Nielson noted. They rely on multiple vertical semiconductor layers to form internal reflectors and other structures. VCSELs are key to the facial recognition systems used to unlock smartphones today. The tiny array of lasers paints your face with a pattern of dots to create a 3D map.

Because MBE happens one atomic layer at a time and under clean-room conditions, it gives scientists unprecedented control over the thickness, composition, and purity of each layer—similar to 3D printing but at the nanometer scale, according to the University of Iowa physics and astronomy department’s MBE Lab.

Building up enough layers to make a useful device—a process that happens at the glacial pace of 1 micrometer (or less) per hour—was a test of Bell Labs scientists’ patience and determination, Nielson said.

“In the lab, we used to say MBE didn’t just stand for molecular beam epitaxy; it also meant many boring evenings,” he joked.

The scientists’ steadfast attention and patience paid off.

“It unlocked all sorts of new materials,” Nielson said. “It allows you to build materials that don’t naturally exist. Some of the impacts in the scientific domain include fractional quantum Hall effects—another Bell Labs innovation being celebrated today.”

As Cho recounted in a 2010 interview for the Computer History Museum’s oral history series, he began working at the Murray Hill facility in 1968. His colleague John R. Arthur Jr. soon proposed a new approach to fine-tuning the semiconductor formulations: Evaporate pure elements such as gallium and arsenic in an ultrahigh vacuum, then let the resulting molecular beams travel unimpeded, allowing them to condense on a heated crystalline substrate. Cho said in the oral history that Arthur’s idea inspired him to connect insights gleaned from research papers, lectures, and his own graduate work.

When asked how he invented what became known as MBE, he described it as “linking ideas from one field to another to create something entirely new.”

Cho understood how early effusion cells (the heated crucibles in which the elements are evaporated into beams of their molecular or atomic components) and cesium ion emitters (which improve the orderliness of the atomic layering) worked in an ultrahigh vacuum.

He applied that knowledge, along with his background in surface physics—the understanding of how to monitor and assess the quality of the atomic layers through electron diffraction and how to remove oxides to clean surfaces—to the growth of semiconductor materials. By connecting surface physics, ion engines, and crystal growth, he helped create a new field, he said in the oral history.

“History gives us context. It reminds us why we do what we do.” —IEEE President Kathleen Kramer

By the end of 1968, he and Arthur had built the first experimental MBE system. Their 1969 Bell Labs technical memo and follow-up Applied Physics Letters paper documented the first high-quality gallium arsenide layers with atomically sharp interfaces—something no other technique could achieve. What astonished their colleagues was the repeatability: By controlling shutter timing, temperature, and beam flux—the rate at which elements evaporate and their atoms flow toward the target wafer—they could reproduce identical structures repeatedly.

The invention had all the hallmarks of the Bell Labs tradition: a simple question pursued with rigor, a culture that valued exploration over deadlines, and an audacious belief that even the smallest layer of matter could be engineered to perfection.

The IEEE Milestone plaque honoring MBE reads:

“In 1968–1970, molecular beam epitaxy (MBE) techniques using reflection high-energy electron diffraction for growing epitaxial compound semiconductor films were introduced. MBE deposits single-crystal structures one atomic layer at a time, creating materials that cannot be duplicated through other known techniques. This precise crystal growth method revolutionized the fabrication of semiconductor devices, quantum structures, and electronic devices, including lasers for reading and writing optical disc media.”

Charge-coupled device

In 1969 two Bell Labs physicists and IEEE Life Fellows—Willard S. Boyle and George E. Smith—scribbled an idea on a blackboard that would quietly reshape the way the world records light. Their concept, sketched during a one-hour conversation, would become the charge-coupled device, or CCD—a breakthrough that, as Scientific American noted in its February 1974 issue, seemed poised to improve TV cameras and astronomical imaging. It eventually ushered in the digital photography revolution and changed how scientists see the universe.

At the time, Bell Labs was in one of its most fertile phases, having already given the world the transistor, the laser, and information theory. The company was turning its attention to solid-state imaging and memory—technologies it hoped might one day support the burgeoning field of electronic communications. Boyle, then head of the device concepts department, and Smith, a physicist known for his intuitive design skills, were exploring how to create a new kind of semiconductor memory.

The spark came partly from internal competition. As Smith recalled during his Nobel lecture, Bell Labs’ Electronics division had two groups: Boyle’s semiconductor department and another department that handled all other technologies. Under pressure to advance magnetic bubble memory, vice president Jack Morton urged Boyle’s group to develop a competing semiconductor device or see resources shift to the other group.

“To address this demand, on October 17, 1969, Bill and I got together in his office,” Smith later explained. “In a discussion lasting not much more than an hour, the basic structure of the CCD was sketched out on the blackboard, the principles of operation defined, and some preliminary ideas concerning applications were developed,” he said.

According to Bell Labs’ internal technical reports, the essence of their idea was a grid of capacitors that could hold and shift electrical charges, one to the next, in a controlled sequence. The charge-coupled device would store data.
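That bucket-brigade behavior, capacitors handing charge along toward a readout node on each clock tick, can be mimicked in a few lines of Python. This is a toy sketch only; the function name, the charge values, and the transfer-efficiency parameter are illustrative assumptions, not Bell Labs’ design.

```python
# Toy model of charge-coupled transfer: each capacitor ("bucket")
# passes its stored charge to its neighbor on every clock tick, the
# way a CCD shifts charge packets toward a readout node.

def shift_register(buckets, ticks, transfer_efficiency=1.0):
    """Shift charges one bucket toward the readout on each tick."""
    readout = []
    for _ in range(ticks):
        # the bucket nearest the output is read out first
        readout.append(buckets[-1] * transfer_efficiency)
        # every remaining charge moves one position toward the output,
        # losing a little on each transfer if efficiency < 1
        buckets = [0.0] + [q * transfer_efficiency for q in buckets[:-1]]
    return readout

charges = [5.0, 2.0, 8.0]          # charge held in three capacitors
print(shift_register(charges, 3))  # → [8.0, 2.0, 5.0]
```

Reading the charges out in sequence is exactly what later made the same structure work as an image sensor: each packet of light-generated charge marches, pixel by pixel, to a single output amplifier.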

The CCD’s image-capture capability was an accidental discovery, Sizer said during his presentation at the Milestone ceremony.

Boyle and Smith were testing the CCD for use as a memory circuit “when they noticed that light in the room flipped bits in the device,” Sizer said. “That accident connected light and information—and turned a memory circuit into an imaging sensor.”

“Today the essence of that blackboard sketch lives in every smartphone camera. The CCD turned light into data. It taught machines to see.”

Within weeks, Boyle and Smith had a working prototype, which under laboratory lamps produced a faint but discernible pattern—a “ghostly image,” as Smith later described it.

Bell Labs quickly organized teams to refine the fabrication process, improve signal-to-noise ratio, and explore an array of uses including in video cameras and data storage arrays.

Management appeared to recognize the potential almost immediately, though commercial products were still years away. As noted at the time by former Bell Labs president Mervin J. Kelly, the CCD fit squarely within the institution’s mission: pushing the frontiers of solid-state electronics to strengthen communication systems.

“AT&T’s Bell Labs News wrote that it could be used in a small color TV camera for future videophones—a remarkably clairvoyant prediction,” Sizer said.

By the mid-1970s, companies including Fairchild Semiconductor, RCA, and Sony had taken the concept further, producing the first CCD video cameras and astronomical imagers, according to the Digital Camera Museum.

The device soon found its way into camcorders, telescopes, fax machines, and medical instruments. By the 1990s, CCDs had become the gold standard for digital imaging.

When Boyle and Smith received the Nobel Prize in Physics in 2009, they credited the company’s culture for their success.

“Bell Labs gave us the freedom to think in any direction,” Smith said in an interview about the Nobel Prize. “That was its genius.”

The IEEE Milestone plaque honoring the CCD reads:

“The charge-coupled device (CCD), originally conceived for digital memory applications, was later shown to offer a compact, sensitive, and efficient way to convert light into digital signals by storing light-generated charges in a series of tiny capacitors. Invented and developed by Bell Labs scientists Willard Boyle, George Smith, and Michael Tompsett, CCDs found wide use in astronomical instruments, medical imaging, and consumer electronics.”

Super-resolution fluorescence microscopy

According to accounts from Bell Labs archives and interviews published by the Nobel Foundation, by the early 1990s Eric Betzig’s corner of the Bell Labs facility was alive with the hum of possibility. Betzig would later share the 2014 Nobel Prize in Chemistry.

Fluorescence microscopy—a biologist’s window into the cell—had hit the diffraction limit of about 200 nanometers (or roughly half the wavelength of visible light), which had been postulated a century earlier by physicist Ernst Abbe. But Betzig suspected there was a way around it. His idea was radical for its time: If a single fluorescent molecule could be detected, he theorized, then perhaps an image could be built one molecule at a time, with each point localized far more precisely than the laws of optics previously seemed to allow.

Bell Labs continued to evolve through the 1990s, yet remained one of the world’s great research institutions. The breakup of AT&T ushered in a more commercially aware era. As a result, researchers were asked to balance blue-sky curiosity with a clearer line of sight to practical applications.

For Betzig and other researchers, whose passion leaned toward fundamental physics rather than communications or materials science, that balance was hard to strike, according to a 2012 Time magazine article written by Jon Gertner, adapted from his book The Idea Factory: Bell Labs and the Great Age of American Innovation.

The lab did not become hostile to discovery. Far from it. But management steered toward projects that promised tangible short-term returns in telecommunications and optoelectronics, Gertner said.

Betzig’s work on single-molecule fluorescence, while elegant, was difficult to justify within the emerging priorities. Over time, he felt his path diverging from that of the company.

“It wasn’t that they were wrong,” he said in a 2014 Nobel interview with the Royal Swedish Academy of Sciences. “Just that my interests no longer fit.”

After demonstrating single-molecule imaging in 1993, as documented in his paper in Optics Letters that year, Betzig found himself at a crossroads. Rather than retool his research to fit Bell Labs’ shifting agenda, he chose to step away. He left in 1995 to work at his father’s machine shop in Michigan—a move described in a September 2015 New York Times profile.

“In a discussion lasting not much more than an hour, the basic structure of the CCD was sketched out on the blackboard, the principles of operation defined, and some preliminary ideas concerning applications were developed.” —George E. Smith, 2009 Physics Nobel laureate

The story might have ended there if not for another promising physicist determined to break through Abbe’s theoretical boundary. Physicist Stefan W. Hell, an IEEE member, began publishing papers describing his stimulated emission depletion (STED) microscopy technique. It used a laser to make fluorescent molecules glow and a second, donut-shaped laser to suppress fluorescence everywhere except a nanometer-scale central point, so that microscopes could resolve features much smaller than half a wavelength.

Hell’s technique was among several advances in microscopy that spurred Betzig to resume his career in science. He joined the Howard Hughes Medical Institute’s Janelia Research Campus, in Ashburn, Va., where he continued his research.

Together with Harald Hess, another Bell Labs alumnus, Betzig developed a working prototype demonstrating the feasibility of his microscopy method, which he called photoactivated localization microscopy, or PALM. It broke through the diffraction limit by precisely mapping thousands of blinking molecules to reconstruct nanometer-scale images.
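The localization step at the heart of PALM can be illustrated with a toy intensity-weighted centroid: a single molecule’s diffraction-blurred spot covers many pixels, but its center can be pinned down to a small fraction of a pixel. The pixel values and the plain centroid estimator below are illustrative assumptions; real PALM software fits the microscope’s point-spread function to each spot.

```python
# Toy sketch of single-molecule localization: estimate the center of
# a blurred spot far more precisely than one pixel by taking an
# intensity-weighted average of pixel positions.

def centroid(spot):
    """Intensity-weighted (row, col) center of a 2D pixel array."""
    total = sum(v for row in spot for v in row)
    r = sum(i * v for i, row in enumerate(spot) for v in row)
    c = sum(j * v for row in spot for j, v in enumerate(row))
    return r / total, c / total

spot = [[0, 1, 0],
        [1, 4, 1],
        [0, 1, 0]]        # blurred image of one blinking emitter
print(centroid(spot))     # → (1.0, 1.0), a sub-pixel center estimate
```

Repeating this for thousands of molecules that blink on one at a time, then plotting all the recovered centers, is what assembles a super-resolved image from diffraction-limited snapshots.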

Betzig shared the 2014 Nobel Prize in Chemistry for that work with Hell and IEEE Life Senior Member William E. Moerner. In 1988, while working at IBM’s Almaden Research Center in Silicon Valley, Moerner achieved the first optical detection of a single molecule.

For Betzig, the award was a reflection of Bell Labs’ enduring legacy—and the kind of deep, foundational curiosity it instilled in generations of scientists.

“Bell Labs taught me how to think,” he said in his Nobel Foundation biography and in interviews with The Washington Post. “Even after I left, I was still one of theirs.”

The IEEE Milestone plaque honoring super-resolution fluorescence microscopy reads:

“The first super-resolution image of a biological sample was obtained in 1992 by exciting and collecting light diffracted in the near field of the sample. This breakthrough achievement, called super-resolved fluorescence microscopy, exploited the properties of evanescent waves and made single-molecule microscopy possible. Its successful use in imaging single fluorophores inspired applications in cell biology, microbiology, and neurobiology.”

Fractional quantum Hall effect

In early 1982, in a low-temperature laboratory at Bell Labs, physicist Horst L. Störmer watched a set of electrical traces appear on an oscilloscope that defied every expectation. The measurements were taken from a wafer of gallium arsenide cooled to a few thousandths of a degree above absolute zero and placed in a powerful magnetic field. The pattern that emerged showed “beautiful, clean plateaus in Hall resistance, but at fractional values of e²/h”—the fundamental combination in which e represents the electron’s charge and h is Planck’s constant, the quantum that sets the size of the smallest possible discrete packets of energy at atomic and subatomic scales, according to Störmer’s Nobel lecture in 1998.

To Störmer and his colleague Daniel C. Tsui, it was a moment both thrilling and disorienting. The electrons should have behaved like independent particles. Instead they were somehow acting as if they had split into smaller, correlated entities: quasiparticles with fractional charge. The phenomenon had no place in classical theory—at least not yet.

The discovery of the fractional quantum Hall effect (FQHE) led to “the development of new theoretical concepts of significance in many branches of modern physics,” as stated by the Royal Swedish Academy of Sciences in the news release announcing that Störmer and Tsui had been named Nobel laureates. As chronicled in the Bell Labs Technical Journal and the Nobel Foundation’s background material about the technology, FQHE emerged from the collaborative environment at Bell Labs.

Störmer joined the company in 1977 to study high-mobility two-dimensional electron systems—structures made possible by molecular beam epitaxy. The exquisitely pure gallium arsenide/aluminum–gallium arsenide heterostructures allowed electrons to move almost without scattering, making them ideal playgrounds for exploring quantum phenomena.

Working with Tsui, who had a well-honed feel for experimentation, Störmer began studying how the two-dimensional electron gases behaved under magnetic fields of several teslas. In 1980 Klaus von Klitzing at the Max Planck Institute for Solid State Research, in Stuttgart, Germany, discovered the integer quantum Hall effect. Von Klitzing showed that the Hall conductance, instead of varying smoothly with the magnetic field, forms plateaus at precise, quantized values: integer multiples of e²/h. That discovery earned him the 1985 Nobel Prize in Physics.
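In modern notation (a standard textbook summary, not language from the ceremony), the plateaus correspond to a Hall resistance locked to quantized values set by a filling factor ν:

```latex
R_{xy} = \frac{h}{\nu e^{2}}, \qquad
\nu = 1, 2, 3, \ldots \ \text{(integer effect)}, \qquad
\nu = \tfrac{1}{3}, \tfrac{2}{5}, \ldots \ \text{(fractional effect)}
```

The fractional fillings are the simple fractions, such as one-third, that Störmer and Tsui observed in their data.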

In a 1982 Physical Review Letters paper, Störmer and Tsui reported that the plateaus appeared not just at integers but at simple fractions such as one-third. Something entirely new was happening.

At first, neither Störmer nor Tsui could believe the measurements. The duo was surprised by the data they were seeing, according to the news release announcing that they had been named Nobel laureates. The results did not conform with existing theories. Yet repeated experiments confirmed the result.

Within weeks, the pair had a preprint ready for Physical Review Letters. It was published in November 1982.

The theoretical explanation came soon after, from Robert B. Laughlin, then a young theorist at Stanford. In a landmark 1983 Physical Review Letters paper, Laughlin explained theoretically what the Bell Labs researchers were seeing in their experiments. He proposed that under extreme magnetic fields and low temperatures, electrons could condense into a new collective quantum state—a “liquidlike state of matter”—supporting quasiparticles that carry a fraction of the electron’s charge. Laughlin’s elegant wavefunction not only explained the 1/3 state but also predicted an entire family of fractional states, all later confirmed experimentally.

The work exemplified the Bell Labs ecosystem at its best: precision materials from Cho’s MBE group, cryogenic measurement expertise from the low-temperature labs, and an atmosphere that encouraged cross-disciplinary risk-taking.

“We were never told to stop,” Störmer recalled in a Physics World interview.

Störmer, Tsui, and Laughlin shared the 1998 Nobel Prize in Physics for their discovery and theoretical explanation of the FQHE.

The IEEE Milestone plaque honoring the discovery of the FQHE reads:

“In 1982 Bell Labs researchers revealed a new phase of matter, an incompressible quantum fluid that supports fractional charges. Daniel Tsui and Horst Störmer experimentally observed this result in two-dimensional electron systems confined within gallium arsenide heterostructures engineered by Arthur Gossard. This discovery, named the fractional quantum Hall effect (FQHE), transformed key concepts in physics while opening new directions in quantum computation and other potential applications.”

Convolutional neural networks

In the late 1980s, when most of the artificial intelligence community had grown disenchanted with neural networks, a small group of researchers at the Bell Labs facility in Holmdel, N.J., would not let the idea die. Their goal was deceptively simple: Teach computers to see as humans do by recognizing patterns in raw pixels.

The U.S. Postal Service was looking for a faster, more accurate way to read handwritten ZIP codes. Yann LeCun’s Bell Labs team trained a neural network on thousands of digit samples with varying slants and handwriting pressure. By the early 1990s, the team had built a prototype that matched human-level digit-reading accuracy.

The technology behind it—convolutional neural networks (CNNs)—was inspired by the human visual cortex. As LeCun explained in his 1998 Proceedings of the IEEE paper, “Gradient-Based Learning Applied to Document Recognition,” CNNs learn their filters directly from images through the mathematical operation of convolution. The idea drew on earlier work by researcher Kunihiko Fukushima, whose 1980 “neocognitron” model proposed a similar layered structure. LeCun frequently credited Fukushima as an influence, but his Bell Labs team made the concept practical.
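The convolution operation itself can be sketched without any machine-learning framework: slide a small filter across an image and sum the elementwise products at each position. In this toy example the image and the hand-set vertical-edge filter are illustrative assumptions; in a real CNN the filter weights are learned from data rather than written by hand.

```python
# Minimal sketch of the 2D convolution at the heart of a CNN
# (valid mode, no padding): slide a small kernel over the image
# and sum elementwise products at each position.

def convolve2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A dark-to-bright vertical edge and a vertical-edge detector.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]
print(convolve2d(image, kernel))  # → [[3, 3], [3, 3]]
```

Every output value is large because the filter straddles the edge at every valid position; on a flat region the same filter would return zeros. Stacking many such learned filters, with pooling and nonlinearities between layers, is what let LeNet read handwritten digits.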

Working with colleagues including Yoshua Bengio, LeCun implemented multilayer CNNs on state-of-the-art workstations and trained them using backpropagation, a technique formalized in a 1986 Nature paper coauthored by Geoffrey Hinton—the Nobel laureate under whom LeCun served as a postdoctoral researcher at the University of Toronto before joining Bell Labs.

By 1993, Bell Labs’ parent company, AT&T, had deployed CNN technology commercially in its check-sorting and mail-reading systems. Millions of envelopes were processed daily by CNN-enabled machines, according to Klover.ai’s history of the technology.

Despite that success, neural networks soon fell out of favor. As Communications of the ACM reported, limited data and computing power made newer methods, such as support vector machines, appear more effective. After Bell Labs’ 1996 spinoff into Lucent Technologies, research priorities shifted to short-term, market-driven goals.

Yet the intellectual groundwork endured. LeCun’s 1998 publication of LeNet-5 became a cornerstone for the next generation of AI researchers. When deep learning reemerged in the 2010s—fueled by powerful GPUs and vast image datasets—CNNs became the foundation of modern computer vision, enabling self-driving cars, advanced medical imaging, and smartphone cameras.

In 2018 LeCun, Bengio, and Hinton received the Turing Award—referred to as the “Nobel Prize of computing”—from the Association for Computing Machinery for their contributions to deep learning. By then, LeCun was a professor at New York University and director of Meta AI research—the Facebook parent company’s AI lab. He often credits Bell Labs as the place where the modern neural network learned to see.

The IEEE Milestone plaque honoring convolutional neural networks reads:

“In 1989 research on computational technologies at Bell Laboratories helped establish deep learning as a branch of artificial intelligence. Key efforts led by Yann LeCun developed the theory and practice of convolutional neural networks, which included methods of backpropagation, pruning, regularization, and self-supervised learning. Named LeNet, this deep neural network architecture advanced developments in computer vision, handwriting recognition, and pattern recognition.”

Previously publicized breakthroughs

Two additional innovations, the Echo project and the Bellmac-32 microprocessor, were honored with IEEE Milestone plaques at the October gathering. Stories of those inventions were detailed and celebrated this year in The Institute.

IEEE Life Fellow Sung-Mo “Steve” Kang, one of the lead developers of the Bellmac-32 microprocessor honored as an IEEE Milestone, gave a talk and answered questions about the 1980s-era chip. Ben Lowe

IEEE Life Fellow Sung-Mo “Steve” Kang, one of the lead engineers who worked on the development of the Bellmac-32—which pioneered CMOS chip architecture and featured several other firsts—spoke at the Milestone event.

The Bellmac-32 had 150,000 transistors—“massive for 1981,” Kang said. “Today, a student could do that in a semester with CAD tools, but at that time, it took a hundred engineers.”

Plaques recognizing the seven IEEE Milestones are displayed in the lobby at the Nokia Bell Labs facility in Murray Hill, N.J. The IEEE North Jersey Section sponsored the nominations.

Administered by the IEEE History Center and supported by donors, the Milestone program recognizes outstanding technical developments worldwide that are at least 25 years old.

IEEE Multimedia covered the Milestone dedication event; a video recording of the ceremony is available.

Video Friday: Robot Dog Shows Off Its Muscles

2025-12-13 01:00:02



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

The Suzumori Endo Lab at Science Tokyo has developed a dog musculoskeletal robot using thin McKibben muscles. This robot mimics the flexible “hammock-like” shoulder structure to investigate the biomechanical functions of dog musculoskeletal systems.

[ Suzumori Endo Robotics Laboratory ]

HOLEY SNAILBOT!!!

[ Freeform Robotics ]

We present a system that transforms speech into physical objects using 3D generative AI and discrete robotic assembly. By leveraging natural language, the system makes design and manufacturing more accessible to people without expertise in 3D modeling or robotic programming.

[ MIT ]

Meet the next generation of edge AI. A fully self-contained vision system built for robotics, automation, and real-world intelligence. Watch how OAK 4 brings compute, sensing, and 3D perception together in one device.

[ Luxonis ]

Thanks, Max!

Inspired by vines’ twisty tenacity, engineers at MIT and Stanford University have developed a robotic gripper that can snake around and lift a variety of objects, including a glass vase and a watermelon, offering a gentler approach compared to conventional gripper designs. A larger version of the robo-tendrils can also safely lift a human out of bed.

[ MIT ]

The paper introduces an automatic limb attachment system using soft actuated straps and a magnet-hook latch for wearable robots. It enables fast, secure, and comfortable self-donning across various arm sizes, supporting clinical-level loads and precise pressure control.

[ Paper ]

Thanks, Bram!

Autonomous driving is the ultimate challenge for AI in the physical world. At Waymo, we’re solving it by prioritizing demonstrably safe AI, where safety is central to how we engineer our models and AI ecosystem from the ground up.

[ Waymo ]

Built by Texas A&M engineering students, this AI-powered robotic dog is reimagining how robots operate in disaster zones. Designed to climb through rubble, avoid hazards and make autonomous decisions in real time, the robot uses a custom multimodal large language model (MLLM) combined with visual memory and voice commands to see, remember and plan its next move like a first responder.

[ Texas A&M ]

So far, aerial microrobots have only been able to fly slowly along smooth trajectories, far from the swift, agile flight of real insects — until now. MIT researchers have demonstrated aerial microrobots that can fly with speed and agility that is comparable to their biological counterparts. A collaborative team designed a new AI-based controller for the robotic bug that enabled it to follow gymnastic flight paths, such as executing continuous body flips.

[ MIT ]

In this audio clip generated by data from the SuperCam microphone aboard NASA’s Perseverance, the sound of an electrical discharge can be heard as a Martian dust devil flies over the Mars rover. The recording was collected on Oct. 12, 2024, the 1,296th Martian day, or sol, of Perseverance’s mission on the Red Planet.

[ NASA Jet Propulsion Laboratory ]

In this episode, we open the archives on host Hannah Fry’s visit to our California robotics lab. Filmed earlier this year, Hannah interacts with a new set of robots—those that don’t just see, but think, plan, and do. Watch as the team goes behind the scenes to test the limits of generalization, challenging robots to handle unseen objects autonomously.

[ Google DeepMind ]

This GRASP on Robotics Seminar is by Parastoo Abtahi from Princeton University, on “When Robots Disappear – From Haptic Illusions in VR to Object-Oriented Interactions in AR”.

Advances in audiovisual rendering have led to the commercialization of virtual reality (VR); however, haptic technology has not kept up with these advances. While a variety of robotic systems aim to address this gap by simulating the sensation of touch, many hardware limitations make realistic touch interactions in VR challenging. In my research, I explore how, by understanding human perception through the lens of sensorimotor control theory, we can design interactions that not only overcome the current limitations of robotic hardware for VR but also extend our abilities beyond what is possible in the physical world.
In the first part of this talk, I will present my work on redirection illusions that leverage the limits of human perception to improve the perceived performance of encountered-type haptic devices in VR, such as the position accuracy of drones and the resolution of shape displays. In the second part, I will share how we apply these illusory interactions to physical spaces and use augmented reality (AR) to facilitate situated and bidirectional human-robot communication, bridging users’ mental models and robotic representations.

[ University of Pennsylvania GRASP Laboratory ]

Webinar: Will AI End Distinct Programming Languages?

2025-12-12 23:50:07

Join Stephen Cass, Dina Genkina, and Kohava Mendelsohn as they discuss whether AI spells the end of distinct programming languages as we know it. IEEE Spectrum publishes a respected annual ranking of the year’s Top Programming Languages—but could this year be our last? This recording of the live webinar covers how AI is rapidly changing the landscape of programming languages, why knowing the best languages might not be necessary in your career, and what skills you’ll need instead.
