
High school sophomore Abigail Merchant has made it her mission to use technology to reduce flood-related deaths. The 15-year-old lives in Orlando, in Florida, a state where flooding is frequent in part because of its low elevation.
The changing climate is increasing the risk. Warmer air holds more water, leading to heavier-than-usual rainfall and more flooding, according to the U.S. Environmental Protection Agency.
School: Orlando Science Middle High Charter, in Florida
Grade: Sophomore
Hobbies: Basketball and playing the drums
Currently, satellites, synthetic aperture radar, and GPS are used to collect data on flood damage, track the location of victims, and communicate with emergency responders. But technology failures and slow data-transmission speeds lead to delays in response time, Merchant says. The increase in global flooding has intensified the need for more accurate and reliable methods.
Last year Merchant built what she says is a more effective way to track and collect data during floods: a small, inexpensive, standardized CubeSat integrated with artificial intelligence. The little satellites are built from multiples of a 10- by 10- by 10-centimeter unit, which allows manufacturers to develop batteries, solar panels, computers, and other parts as off-the-shelf components.
The CubeSat takes images of an area and uses pattern recognition to detect flooding, assess infrastructure damage, and track survivors.
Merchant presented her paper on the device at this year’s IEEE Region 3 annual conference, IEEE SoutheastCon.
“IEEE is a foundational part of my growth as a young researcher,” she says. “It turned engineering from my dream to reality.”
Merchant says her interest in disaster response was sparked after learning that it can take several hours for emergency workers to receive satellite data.
Determined to find a faster method, she began researching technologies and discovered what CubeSats can do.
“CubeSats are very agile, scalable, and capable of forming constellations (multiple-satellite groups) that update data in nearly real time,” she says. “The idea that these small satellites—which fit into the palm of your hand—could deliver life-saving insights faster than traditional systems really inspired me to push the concept further.”
Last year Merchant and three of her classmates were accepted into MIT’s Beaver Works Build a CubeSat Challenge, where teams of up to five U.S. high school students were given eight months to develop a satellite capable of completing a space-based research mission.
Merchant’s team—the Satellite Sentinels—built a CubeSat powered by a convolutional neural network (CNN) that can identify heavily impacted flood zones and remotely collect data for disaster relief and environmental monitoring. CNNs analyze image data for pattern recognition.
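The Satellite Sentinels haven't published their network, but as a rough sketch of what a small classifier of this kind looks like, here is a minimal example in PyTorch (a framework choice assumed here). Every layer size and name is invented for illustration; it simply maps an image tile to a flooded-or-not score.

```python
# A minimal sketch of a binary flood classifier, not the Satellite
# Sentinels' actual network. All layer sizes are illustrative.
import torch
import torch.nn as nn

class TinyFloodCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # RGB in, 16 feature maps out
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, 1)     # one logit: flooded or not

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)                          # keep batch dim, flatten the rest
        return self.classifier(x)

model = TinyFloodCNN()
tile = torch.rand(1, 3, 64, 64)                          # one fake 64x64 RGB tile
print(torch.sigmoid(model(tile)).item())                 # probability-like flood score
```

Trained on labeled flood imagery, the convolutional layers learn the sort of spatial patterns (water color, texture, submerged edges) described above.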
Merchant was the group’s payload programmer and led the mission’s design and simulation efforts, which included planning, configuring hardware, and developing autonomous software and algorithms to manage the payload.
The team began by creating a 3D model of the device to visualize and refine the placement of its parts. The technology used—including a Raspberry Pi, multiple sensors, and a camera—was housed in a clear plastic cube.
The middle CubeSat was developed by Merchant and her team during the MIT Beaver Works Build a CubeSat Challenge. On the left is a commercial 1U CubeSat; on the right is a prototype of Merchant's current design. Abigail Merchant
The device, which cost US $310 to build, weighs about 495 grams and was remotely connected to a laptop via Bluetooth during ground-based testing. The computer contains a machine learning algorithm—written by Merchant using Python—that analyzes collected images to detect flooding.
The CubeSat takes a high-definition image of its surroundings every 2 minutes and transmits it to the laptop. The satellite transfers up to 1,500 images daily and stores them on a 16-gigabyte SD card.
The algorithm then analyzes patterns, including changes in the water’s color and the image’s pixel density. When the algorithm detects flooding, the device can alert emergency responders.
“While many existing systems operate on multihour cycles, the CubeSat captures high-resolution images every 2 minutes,” Merchant says. “The system can then trigger alerts that are delivered to first responders via SMS or email.”
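Merchant's code isn't public, but a toy version of the pipeline she describes (score the pixels of each incoming frame, then fire an alert) might look like the sketch below. The color thresholds and the alert hook are assumptions, not her actual values.

```python
# Illustrative sketch only: flag a frame as flooded when enough pixels fall
# in a murky-water color range, then raise an alert. The thresholds and the
# alert mechanism are assumptions, not Merchant's actual implementation.
import numpy as np

def looks_flooded(rgb_image: np.ndarray, water_fraction: float = 0.25) -> bool:
    """rgb_image: H x W x 3 array of uint8 pixel values."""
    r = rgb_image[..., 0].astype(int)
    g = rgb_image[..., 1].astype(int)
    b = rgb_image[..., 2].astype(int)
    # Murky floodwater tends toward brownish tones: red and green above
    # blue, at moderate brightness. A crude stand-in for a trained model.
    muddy = (r > b) & (g > b) & (r > 60) & (r < 200)
    return muddy.mean() >= water_fraction

def process_frame(rgb_image: np.ndarray) -> None:
    if looks_flooded(rgb_image):
        # A real system would notify responders through an SMS or email
        # gateway, as Merchant describes; here we just print.
        print("ALERT: possible flooding detected")

process_frame(np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8))
```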
To test their system, Merchant and her team built a city model made of Lego blocks in an empty bathtub. They positioned the CubeSat over it, and it took images of the scene. They then added water and dirt to make it look more like a real flood. The CubeSat successfully transferred the images to the laptop, and the algorithm detected the flooding.
Out of 30 teams, the Satellite Sentinels placed third.
Merchant is continuing her research on flood-prevention technologies at Accenture in Richmond, Va., where she works remotely as a payload owner and designer for the company’s CubeSat launch team.
After the MIT program ended, Merchant decided to scale her project. She reached out to her former mentor Chris Hudson, the global technical lead in space cybersecurity at Accenture. He offered her an internship.
Merchant is working to make the transition from prototype to functional product but, she says, needs to overcome obstacles she encountered with her MIT project.
The main one was that the model struggled to detect flooding in variable conditions. That's because the CNN needs context, she says; without it, the model can misinterpret complex visual cues. To fix the issue, Merchant trained the algorithm to spot flooding by identifying colors in individual pixels.
Transmitting images using Bluetooth worked in her bathroom, but it isn’t quite as useful when CubeSats are orbiting 700 kilometers above the ground.
“If you’ve used a Bluetooth headset before, you know it disconnects the moment you walk away from the device it’s connected to,” she says. “That isn’t going to work when the CubeSat constellation is in orbit.”
She suggested the Accenture team switch to RF antennas that connect to the CubeSats through SubMiniature Version A (SMA) connectors.
“The development process has been one of the most formative experiences of my career so far,” Merchant says. “Working through the payload design and validation and meeting with these teams has given me so much experience, especially for my age.”
Her payload is expected to be launched early next year.
Merchant is an intern at the MIT Computer Science and Artificial Intelligence Laboratory, the school’s largest interdisciplinary lab, with 60 research groups. CSAIL is led by IEEE Fellow Daniela Rus, recipient of the 2025 IEEE Edison Medal.
The internship is remote, and Merchant conducts research in a laboratory at the University of Central Florida, in Orlando.
In that role, Merchant is focusing on cognitive cartography, a method for structuring complex information into semantic maps that reveal how ideas and concepts relate to one another. She uses embedding models, a type of machine learning that converts information into numerical representations. The embeddings allow computers to recognize similarities and relationships between concepts, even when they are described in different ways. The approach helps an AI product understand how ideas connect, rather than treating each piece of data as isolated.
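As a toy illustration of that idea (not CSAIL's code), the snippet below uses the open source sentence-transformers library, an assumed stand-in for whatever models the lab actually uses, to embed three phrases and compare them. Two phrases that express the same concept in different words score much higher in similarity than an unrelated one.

```python
# Toy demonstration of embeddings; the model is a common open source
# choice, standing in for whatever CSAIL actually uses.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

phrases = [
    "satellites that watch rivers overflow",
    "orbital sensors monitoring flood levels",   # same idea, different words
    "a recipe for chocolate cake",               # unrelated concept
]
embeddings = model.encode(phrases)               # each phrase -> numeric vector

# Cosine similarity is high for related ideas and low for unrelated ones;
# that geometry is what lets a semantic map cluster concepts together.
print(util.cos_sim(embeddings[0], embeddings[1]))  # high
print(util.cos_sim(embeddings[0], embeddings[2]))  # low
```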
“Being one of the youngest people in the lab is daunting,” Merchant says. “However, I’m really excited to learn from engineers and researchers who are working at the cutting edge of the field.”
She says she is hoping to attend MIT or Stanford.
Merchant was introduced to IEEE by Joe Jusai, former finance chair of the IEEE Orlando Section.
Her first personal experience with the organization happened in 2023, while she was conducting research for a science fair project. She was working on a robotic arm that could pick up items, controlled using an electroencephalogram and Bluetooth. The project was inspired by her grandmother, who has mobility issues and uses a wheelchair.
“I kept seeing IEEE mentioned in every regulation and standard I found,” Merchant says. When she learned about an upcoming Orlando Section meeting, she asked her mother to take her.
At the meeting, several members presented their research. Merchant asked Masood Ejaz and Varadraj Gurupur—the chapter chair and cochair—if she could discuss her science fair project.
“After presenting my work, IEEE quickly became a community that has shaped my understanding of what engineering can accomplish,” she says.
She felt on top of the world, she says, when she presented her paper about her CubeSat project at IEEE SoutheastCon.
“It’s one of those experiences that really changes you,” she says.
She is excited to become an IEEE student member when she starts college, she says. She also has her sights set on being elected as its president someday.
“I met Kathleen Kramer at one of my local IEEE events before she was elected IEEE president, and we spoke about my work,” she says. “After she was elected, I realized that I would love to become the president of IEEE someday.
“I hope one day that I can step into the same shoes as her and continue to help IEEE the same way it helped me.”

Artificial intelligence in 2025 was less about flashy demos and more about hard questions. What actually works? What breaks in unexpected ways? And what are the environmental and economic costs of scaling these systems further?
It was a year in which generative AI slipped from novelty into routine use. Many people got accustomed to using AI tools on the job, getting their answers from AI search, and confiding in chatbots, for better or for worse. It was a year in which the tech giants hyped up their AI agents, and the public seemed largely uninterested in using them. AI slop also became impossible to ignore—it was even Merriam-Webster's word of the year.
Throughout it all, IEEE Spectrum’s AI coverage focused on separating signal from noise. Here are the stories that best captured where the field stands now.
AI coding assistants have moved from novelty to everyday infrastructure—but not all tools are equally capable or trustworthy. This practical guide by Spectrum contributing editor Matthew S. Smith evaluates today’s leading AI coding systems, examining where they meaningfully boost productivity and where they still fall short. The result is a clear-eyed look at which tools are worth adopting now, and which remain better suited to experimentation.
As AI’s energy demands raise concerns, water use has emerged as a quieter but equally pressing issue. This article explains how data centers consume water for cooling, why the impacts vary dramatically by region, and what engineers and policymakers can do to reduce the strain. Written by the AI sustainability scholar Shaolei Ren and Microsoft sustainability lead Amy Luers, the article grounds a noisy public debate in data, context, and engineering reality.
When AI systems fail, they don’t fail like people do. This essay, by legendary cybersecurity guru Bruce Schneier and his frequent collaborator Nathan E. Sanders, explores how machine errors differ in structure, scale, and predictability from human mistakes. Understanding these differences, the researchers argue, is essential for building AI systems that can be responsibly deployed in the real world.
In this insider account, John Dean, the cofounder and CEO of WindBorne Systems, tells readers how his team built one of the most technically ambitious AI forecasting systems to date. The company’s approach combines autonomous, long-duration weather balloons that surf the wind with a proprietary AI model called WeatherMesh, which both sends the balloons high-level instructions on where to go next and analyzes the atmospheric data they collect.
WindBorne’s platform can produce high-resolution predictions faster, using far less compute, and with greater accuracy than conventional physics-based methods. In the article, Dean walks readers through the engineering trade-offs, design decisions, and real-world tests that shaped the system from concept to deployment.
This elegantly written article is my personal favorite from 2025. In it, Spectrum freelancer Matthew Hutson tackles one of the most consequential and contentious questions in AI today: how to define artificial general intelligence (AGI) and measure progress toward that elusive goal. Drawing on historical context, current debates about benchmarks, and insights from leading researchers, Hutson shows why traditional tests fall short and why creating meaningful benchmarks for AGI is so fraught. Along the way, he explores the deep conceptual challenges of comparing machine and human intelligence.
Bonus: Try the test that AIs take to see how smart they are!
Each year, I roll up my sleeves as Spectrum’s AI editor and go through the sprawling Stanford AI Index to surface the data that really matters for understanding AI’s progress and pitfalls. 2025’s visual roundup distills a 400-plus-page report into a dozen charts that illuminate key trends in AI economics, energy use, geopolitical competition, and public attitudes.

Last September, the Defense Advanced Research Projects Agency (DARPA) unleashed teams of robots on simulated mass-casualty scenarios, including an airplane crash and a night ambush. The robots’ job was to find victims and estimate the severity of their injuries, with the goal of helping human medics get to the people who need them the most.
Kimberly Elenberg is a principal project scientist with the Auton Lab of Carnegie Mellon University's Robotics Institute. Before joining CMU, Elenberg spent 28 years as an Army and U.S. Public Health Service nurse, which included 19 deployments and serving as the principal strategist for incident response at the Pentagon.
The final event of the DARPA Triage Challenge will take place in November, and Team Chiron from Carnegie Mellon University will be competing, using a squad of quadruped robots and drones. Elenberg leads the team.
Why do we need robots for triage?
Kimberly Elenberg: We simply do not have enough responders for mass-casualty incidents. The drones and ground robots that we’re developing can give us the perspective that we need to identify where people are, assess who’s most at risk, and figure out how responders can get to them most efficiently.
When could you have used robots like these?
Elenberg: On the way to one of the challenge events, there was a four-car accident on a back road. For me on my own, that was a mass-casualty event. I could hear some people yelling and see others walking around, and so I was able to reason that those people could breathe and move.
In the fourth car, I had to crawl inside to reach a gentleman who was slumped over with an occluded airway. I was able to lift his head until I could hear him breathing. I could see that he was hemorrhaging and feel that he was going into shock because his skin was cold. A robot couldn’t have gotten inside of the car to make those assessments.
This challenge involves enabling robots to collect this data remotely—can they detect heart rate from changes in skin color, or hear breathing from a distance? If I'd had those capabilities, they would have helped me identify the person at greatest risk and get to them first.
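One candidate technique for the heart-rate question is remote photoplethysmography, which recovers a pulse from subtle frame-to-frame changes in skin color. The challenge teams' implementations aren't public; the sketch below runs the core idea on a synthetic signal and assumes the per-frame skin brightness has already been extracted from video.

```python
# Bare-bones remote-photoplethysmography sketch on a synthetic signal.
# Real systems need skin detection, tracking, and motion rejection.
import numpy as np

fps = 30.0
t = np.arange(0, 10, 1 / fps)                    # 10 seconds of video frames
# ASSUMED input: mean green-channel brightness of a skin patch per frame,
# faked here as a 72-beat-per-minute pulse buried in noise.
green = 0.01 * np.sin(2 * np.pi * (72 / 60) * t)
green += np.random.normal(0, 0.005, t.size)

spectrum = np.abs(np.fft.rfft(green - green.mean()))
freqs = np.fft.rfftfreq(green.size, 1 / fps)
band = (freqs > 0.7) & (freqs < 3.0)             # 42-180 bpm: plausible rates
bpm = 60 * freqs[band][np.argmax(spectrum[band])]
print(f"Estimated heart rate: {bpm:.0f} bpm")    # ~72 for this fake signal
```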
How do you design tech for triage?
Elenberg: The system has to be simple. For example, I can’t have a device that’s going to force a medic to take their hands away from their patient. What we came up with is a vest-mounted Android phone that flips down at chest height to display a map that has the GPS location of all of the casualties on it and their triage priority as colored dots, autonomously populated from the team of robots.
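Elenberg doesn't detail the data format behind that display. Hypothetically, each dot might reduce to a record like the one below, which uses the standard START triage categories and their conventional colors; the field names are invented, not Team Chiron's schema.

```python
# Hypothetical sketch of the record behind each colored dot on the medic's
# vest display; field names are invented, not Team Chiron's actual schema.
from dataclasses import dataclass

# Standard START triage categories and their conventional colors.
TRIAGE_COLORS = {
    "immediate": "red",
    "delayed": "yellow",
    "minor": "green",
    "expectant": "black",
}

@dataclass
class CasualtyReport:
    lat: float           # GPS position reported by a robot or drone
    lon: float
    category: str        # one of the TRIAGE_COLORS keys

    @property
    def dot_color(self) -> str:
        return TRIAGE_COLORS[self.category]

report = CasualtyReport(lat=40.4433, lon=-79.9436, category="immediate")
print(report.dot_color)  # "red" -> drawn as a red dot on the map
```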
Are the robots living up to the hype?
Elenberg: From my time in service, I know the only way to understand true capability is to build it, test it, and break it. With this challenge, I’m learning through end-to-end systems integration—sensing, communications, autonomy, and field testing in real environments. This is art and science coming together, and while the technology still has limitations, the pace of progress is extraordinary.
What would be a win for you?
Elenberg: I already feel like we’ve won. Showing responders exactly where casualties are and estimating who needs attention most—that’s a huge step forward for disaster medicine. The next milestone is recognizing specific injury patterns and the likely life-saving interventions needed, but that will come.
This article appears in the January 2026 print issue as “Kimberly Elenberg.”

As a college student, are you concerned that your knowledge alone won’t be enough to impress potential employers? Do you feel you lack the necessary hands-on technical skills to secure a job? Maybe you’ve thought of an engineering solution for a problem in your school or community but are unsure how to take the next step.
I struggled to bridge the gap between classroom theory and real-world application. But when you combine academic knowledge with practical projects that solve a societal problem with technology, you can ace any interview.
You don’t have to navigate the journey alone. Here are some lessons I learned as a student.
I’m a cloud support engineer at a company in Hyderabad, India. I’m also an active IEEE volunteer as one of its young professionals, an impact creator, and a brand ambassador.
In my role as impact creator, I share my insights on engineering, computing, and technology with the news media to highlight trends and consumer habits. As a brand ambassador, I educate students and professionals on how to display IEEE branding on websites, newsletters, banners, event materials, and other items.
When I was in my first semester as a computer engineering student at Guru Gobind Singh Indraprastha University, in New Delhi, I became frustrated by the long lines to check books in and out of the library of the affiliated college, the HMR Institute of Technology and Management. Even getting a new library card took a long time. I was determined to solve the problem.
For six months, I singlehandedly developed a software program to scan student ID cards and speed up the processes. I received the school’s first Technocrat Award for my efforts.
Word got out about my programming skills, and I received many requests to help solve other problems. An intriguing one was from the director of India’s largest national broadcasting company, All India Radio. I was asked to streamline its accounting process. At the time, the company used only Microsoft Excel along with a pen-and-paper system. It took me just six months to build a full-stack accounting software program to make the process significantly more efficient.
That opportunity was a big break for me. The technology I created redefined the broadcaster’s operations and could be used in its other offices, expanding my reach.
In my first corporate job interview after graduating from university, the interviewer was surprised to learn that I’d published 15 research papers, completed 15 projects, and even had a pending patent application. (The government has since granted the patent.)
The human resources representative and the technical-round interviewer weren’t expecting a recent graduate to have research published, and they were impressed. I was excited to see their reactions.
Students need to understand the importance of doing something exceptional beyond learning theory and concepts. Having practical skills before leaving school is a great way to set yourself apart from other new engineering graduates.
Before taking on any new projects, I ask myself five simple questions. They might seem obvious, but some of the details are often forgotten. Even as a student, when you start working with clients, you must have a process for gathering the information you need.
When it comes to getting the correct information, I focus on the five W’s: who, what, why, when, and where.
Once I get those answers, I begin using design thinking to strategize.
My clients generally are looking to improve existing solutions rather than starting from scratch. I must know what is and isn’t working with the current program.
Remember that although the process might be easy for you, it might be new for your client.
Here are what I consider to be the five stages of the process.
Understand the problem. Once you identify the client’s issue, the next step is to listen to the client in full without making judgments. You need to really understand the pain points and why the current application isn’t working. Listen fully, ask questions, and try to empathize with the client’s issues.
Research and ideation. Do your own research. It’s essential to conduct field research to better understand the client’s requirements. One of my projects was to help farmers secure loans directly from the Indian government, rather than go through loan service agencies and banks. The farmers were frustrated over how long it took to get loans. While doing my research, I was shocked by the high fees the agencies charged to process the necessary paperwork.
I wouldn’t have learned about that from just reading research papers. You have to explore your client’s pain points.
Next, start brainstorming. Consider how you can improve the current model. Maybe you should conduct research to find other products that might solve the problem. Also consider redesigning the current version. Let yourself think of as many ideas as possible, then review them with your client and request feedback.
That can give you a clear idea of what the client likes about the options, and it can help you better direct the rest of your research and ideation.
Technology research and prototyping. By this stage, you’ve created a short list of ideas to address your client’s pain points. Next, research all the technologies you need to use. If you need training, use learning platforms such as Coursera, EdX, the IEEE Learning Network, Udacity, and Udemy.
Once you identify and learn the technology needed, it’s time to create the first prototype.
Test and improve. Test the prototype, gather feedback from your client while you take meticulous notes, and then revise it according to the feedback.
That helps you understand what improvements are needed and helps identify gaps in your model. It gets you closer to the client’s requirements. Use the information to refine the design and build the product.
It is important to note that this stage might go through multiple iterations. You might have to continue to improve the results until the design works for the client. Refer back often to your original notes on the pain points to ensure you haven’t forgotten anything in the final design.
Protect your intellectual property. Many students and young professionals skip the important step of safeguarding their ideas, such as by copyrighting them, publishing a paper, or filing a patent. I have seen many students present their ideas at hackathons and competitions and assume that the cash prizes they win are enough to list on a résumé. They really should protect their ideas.
After speaking at more than 1,000 IEEE workshops and other events in more than 25 countries, I’m concerned that students aren’t using their technical knowledge to its fullest potential. To learn more about how to use your time and skills as a new engineer, view my YouTube channel.
Don’t wait for an opportunity to knock on your door. Create your own opportunities by participating in IEEE technical and nontechnical events and getting involved with the organization’s student service-learning program, EPICS in IEEE.
The participation, volunteering, and networking (PVN) model of IEEE, a term I coined, works.
This year’s top semiconductor stories were mostly about the long and twisting trips a technology takes from idea (or even raw material) to commercial deployment. I’ve been at IEEE Spectrum long enough to have seen some of the early days of things that became commercial only this year.
In chip-making, that includes the production of the next evolution of transistor design—nanosheet transistors—and the arrival of nanoimprint lithography. In optoelectronics, it was the commercialization of optical fiber links that go directly into the processor package.
Of course, there were also great new technologies recently born, like growing diamond inside ICs to cool them. But there were also, unfortunately, developments that are getting in the way of moving technologies from the laboratory to the semiconductor fab.
Still, if anything, the year’s best semiconductor stories showed that technology is full of fascinating tales.
It seems one of our readers' favorite things was this cool idea. Perhaps you read it while chilling out with a print copy of Spectrum, or maybe on your phone while icing a sore knee. (Okay. I'll stop.) Stanford professor Srabanti Chowdhury explained how her team has come up with a way to grow diamond inside ICs, mere nanometers from heat-generating transistors. The result was radio devices that ran more than 50 degrees Celsius cooler, and a pathway to integrate the highly heat-conductive material in 3D chips. The article was part of a special report on the problem of heat in computing that includes an article on cooling chips with lasers, among other great reads.
This one had a little bit of everything. It’s the story of how ASML figured out a key unknown in the development of one of the most crucial (and craziest) contraptions in technology today, the light source for extreme ultraviolet lithography. But it’s also a sweet story of a man and his grandfather—but with supernovas, atomic bomb blasts, high-powered lasers, and a cameo by computer pioneer John von Neumann.
In past years, we've reported plenty about advances in making individual 2D transistors work well. But in April we delivered a story of some 2D semiconductor integration heroics. Researchers in China managed to integrate nearly 6,000 molybdenum disulfide devices to make a RISC-V processor. Amazingly, despite using just laboratory-level manufacturing, the chip's creators achieved a 99.7 percent yield of good transistors.
Our Japan correspondent, John Boyd, described an exciting potential competitor to EUV lithography. Canon announced that it had sold the first nanoimprint lithography system for chip making. Instead of carrying the chip’s features as a pattern of light, this machine literally stamps them onto the silicon. It’s a technology that’s been decades in the making. In fact, one of my first reporting trips for IEEE Spectrum was to visit a startup using nanoimprint lithography to make specialized optics. I got in a minor car accident on my way there and never got to see the tech in person. But if you want a look, there’s one in Austin, Texas.
The U.S. CHIPS and Science Act promised to be transformational—not just for chip manufacturing, but for providing R&D and infrastructure that would help close the dreaded lab-to-fab gap that captures and kills so many interesting ideas. The main vehicle for that R&D and infrastructure was the National Semiconductor Technology Center, a legally mandated, US $7.4 billion program to be administered by a public-private partnership. But the Commerce Department ended that partnership, called Natcast, in late summer. The vitriol with which it was done shocked many chip experts. Now Commerce has killed another CHIPS Act center, the SMART USA Institute, which was dedicated to digital twins for chip manufacturing.
The idea of bringing speedy, low-power optical interconnects all the way to the processor has fired the imagination of engineers for years. But high cost, low reliability, and serious engineering issues have kept it from happening. This year we saw the first hint that it was really coming. Broadcom and Nvidia—separately—developed optical transceivers integrated in the same package as network switch chips, which sling data from server rack to server rack inside data centers.
TSMC and Intel have begun manufacturing new types of transistors, called nanosheet or gate-all-around transistors. We got the first look at what this means for shrinking the next generation of logic chips when both companies reported details of SRAM memory built with them. Amazingly, the two companies' memory cells came out exactly the same size, right down to the nanometer. Even more amazingly, Synopsys designed a cell using the previous generation of transistors that hit that density as well, though it didn't perform nearly as well.

Charging an EV at home doesn’t seem like an inconvenience—until you find yourself dragging a cord around a garage or down a rainy driveway, then unplugging and coiling it back up every time you drive the kids to school or run an errand. For elderly or disabled drivers, those bulky cords can be a physical challenge.
As it was for smartphones years ago, wireless EV charging has been the dream. But there’s a difference of nearly four orders of magnitude between the roughly 14 watt-hours of a typical smartphone battery and that of a large EV. That’s what makes the wireless charging on the 108-kilowatt-hour pack in the forthcoming Porsche Cayenne Electric so notable.
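A quick back-of-the-envelope check of that comparison, using only the figures above:

```python
# Scale check: a 108-kWh EV pack versus a ~14-Wh phone battery.
import math

ratio = 108_000 / 14                                      # both in watt-hours
print(f"{ratio:.0f}x, about 10^{math.log10(ratio):.1f}")  # ~7714x, or 10^3.9
```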
To offer the first inductive charger on a production car, Porsche had to overcome both technical and practical challenges—such as how to protect a beloved housecat prowling below your car. The German automaker demonstrated the system at September’s IAA Mobility show in Munich.
With its 800-volt architecture, the Cayenne Electric can charge at up to 400 kW at a public DC station, enough to fill its pack from 10 to 80 percent in about 16 minutes. The wireless system delivers about 11 kW for Level 2 charging at home, where Porsche says three out of four of its customers do nearly all their fill-ups. Pull the Cayenne into a garage and align it over a floor-mounted plate, and the SUV will charge from 10 to 80 percent in about 7.5 hours. No plugs, tangled cords, or dirty hands. Porsche will offer a single-phase, 48-ampere version for the United States after buyers see their first Cayennes in mid-2026, and a three-phase, 16-A system in Europe.
The concept of inductive charging has been around for more than a century. Two coils of copper wire are positioned near one another. A current flowing through one coil creates a magnetic field, which induces voltage in the second coil.
In the Porsche system, the floor-mounted pad, 78 centimeters wide, plugs into the home's electrical panel. Inside the pad, which weighs 50 kilograms, grid electricity (at 60 hertz in the United States, 50 Hz in most of the rest of the world) is converted to DC and then to high-frequency AC at 2,000 V. The resulting 85-kilohertz magnetic field extends from the pad to the Cayenne, where the energy is converted back to DC to charge the battery.
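Systems like this are typically resonant, with a capacitor tuned to the coil so the tank rings at the operating frequency. Porsche doesn't publish its coil parameters, so the sketch below assumes a series-resonant tank and an invented 50-microhenry coil just to show the arithmetic.

```python
# Back-of-envelope resonance math; the coil inductance is an assumption,
# since Porsche doesn't publish its coil parameters.
import math

f = 85e3   # operating frequency from the article, in hertz
L = 50e-6  # ASSUMED transmit-coil inductance: 50 microhenries

# Series-resonant tank: f = 1 / (2 * pi * sqrt(L * C)), solved for C.
C = 1 / ((2 * math.pi * f) ** 2 * L)
print(f"Resonating capacitance: {C * 1e9:.1f} nF")  # about 70 nF here
```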
The waterproof pad can also be placed outdoors, and the company says it’s unaffected by leaves, snow, and the like. In fact, the air-cooled pad can get warm enough to melt any snow, reaching temperatures as high as 50 °C.
The Cayenne’s onboard charging hardware mounts between its front electric motor and battery. The 15-kg induction unit wires directly into the battery.
In most EVs, plug-in (conductive) AC charging tops out at around 95 percent efficiency. Porsche says its wireless system delivers 90 percent efficiency, despite an air gap of roughly 12 to 18 cm between the pad and vehicle.
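A quick sanity check of those numbers, using only figures quoted in this article:

```python
# Back-of-envelope check: a 10-to-80 percent top-up of the 108-kWh pack
# at 11 kW delivered, comparing 90 percent wireless efficiency with the
# roughly 95 percent of plug-in AC charging.
pack_kwh = 108.0
delivered = pack_kwh * (0.80 - 0.10)  # 75.6 kWh into the battery
hours = delivered / 11.0              # ~6.9 h, near Porsche's 7.5-hour figure

grid_wireless = delivered / 0.90      # grid energy at 90% efficiency
grid_plug = delivered / 0.95          # grid energy at 95% efficiency
print(f"{hours:.1f} h; extra grid energy: {grid_wireless - grid_plug:.1f} kWh")
```

The efficiency gap works out to roughly 4 kWh of extra grid energy per session.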
“We’re super proud that we’re just below conductive AC in charging efficiency,” says Simon Schulze, Porsche’s product manager for charging hardware. Porsche also beats inductive phone chargers, which typically max out at about 70 percent efficiency, Schulze says.
When the car gets within 7.5 meters of the charging pad, the Cayenne’s screen-based parking-assist system turns on automatically. Then comes a kind of video game that requires the driver to align a pair of green circles on-screen, one representing the car, the other the pad. It’s like a digital version of the tennis ball some people hang in their garage to gauge parking distance. There’s ample wiggle room, with tolerances of 20 cm left to right, and 15 cm fore and aft. “You can’t miss it,” according to Schulze.
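Hypothetically, the go/no-go check behind those on-screen circles reduces to a few lines; only the tolerance figures below come from Porsche.

```python
# Hypothetical alignment check; only the tolerances come from the article.
def aligned(offset_lr_cm: float, offset_fa_cm: float) -> bool:
    """True if the car sits close enough over the pad to start charging."""
    return abs(offset_lr_cm) <= 20.0 and abs(offset_fa_cm) <= 15.0

print(aligned(12.0, -8.0))  # True: within tolerance, charging can begin
print(aligned(25.0, 0.0))   # False: too far off to the side
```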
Induction loops detect any objects between the charging plate and the vehicle; such objects, if they’re metal, could heat up dangerously. Radar sensors detect any living things near the pad, and will halt the charging if necessary. People can walk near the car or hop aboard without affecting a charging session.
Christian Holler, Porsche’s head of charging systems, says the system conforms to International Commission on Non-Ionizing Radiation Protection standards for electromagnetic radiation. The field remains below 15 microteslas, so it’s safe for people with pacemakers, Porsche insists. And the aforementioned cat wouldn’t be harmed even if it strayed into the magnetic field, though “its metal collar might get warm,” Schulze says.
The Porsche system’s 90 percent efficiency is impressive but not record-setting. Last year, Oak Ridge National Laboratory (ORNL) transferred 270 kW to a Porsche Taycan with 95 percent efficiency, boosting its state of charge by 50 percent in 10 minutes. That world-record wireless rate relied on polyphase windings for coils, part of a U.S. Department of Energy project that was backed by Volkswagen, Porsche’s parent company.
That effort, Holler says, spawned a Ph.D. paper from VW engineer Andrew Foote. Yet the project had different goals from the one that led to the Cayenne charging system. ORNL was focused on maximum power transfer, regardless of cost, production feasibility, or reliability, he says.
By contrast, designing a system for showroom cars “requires a completely different level of quality and processes,” Holler says.
Cayenne buyers in Europe will pay around €7,000 (roughly US $8,100) for the optional charger. Porsche has yet to price it for the United States.
Loren McDonald, chief executive of Chargeonomics, an EV-charging analysis firm, says wireless charging “is clearly the future,” with use cases such as driverless robotaxis, curbside charging, or at any site “where charging cables might be an annoyance or even a safety issue.”
But for now, inductive charging’s costly, low-volume status will limit it to niche models and high-income adopters, McDonald says. Public adoption will be critical “so that drivers can convenience-charge throughout their driving day—which then increases the benefits of spending more money on the system.”
Porsche acknowledges that issue; the system conforms to wireless standards set by the Society of Automotive Engineers so that other automakers might help popularize the technology.
“We didn’t want this to be proprietary, a Porsche-only solution,” Schulze says. “We only benefit if other brands use it.”