2026-02-23 21:00:03

AI has driven an explosion of new number formats—the ways in which numbers are represented digitally. Engineers are looking at every possible way to save computation time and energy, including shortening the number of bits used to represent data. But what works for AI doesn’t necessarily work for scientific computing, be it for computational physics, biology, fluid dynamics, or engineering simulations. IEEE Spectrum spoke with Laslo Hunhold, who recently joined Barcelona-based Openchip as an AI engineer, about his efforts to develop a bespoke number format for scientific computing.
Laslo Hunhold is a senior AI accelerator engineer at Barcelona-based startup Openchip. He recently completed a Ph.D. in computer science at the University of Cologne, in Germany.
What makes number formats interesting to you?
Laslo Hunhold: I don’t know another example of a field that so few are interested in but has such a high impact. If you make a number format that’s 10 percent more [energy] efficient, it can translate to all applications being 10 percent more efficient, and you can save a lot of energy.
Why are there so many new number formats?
Hunhold: For decades, computer users had it really easy. They could just buy new systems every few years, and they would have performance benefits for free. But this hasn’t been the case for the last 10 years. In computers, you have a certain number of bits used to represent a single number, and for years the default was 64 bits. And for AI, companies noticed that they don’t need 64 bits for each number. So they had a strong incentive to go down to 16, 8, or even 2 bits [to save energy]. The problem is, the dominating standard for representing numbers in 64 bits is not well designed for lower bit counts. So in the AI field, they came up with new formats which are more tailored toward AI.
Why does AI need different number formats than scientific computing?
Hunhold: Scientific computing needs high dynamic range: You need very large numbers, or very small numbers, and very high accuracy in both cases. The 64-bit standard has an excessive dynamic range, and it is many more bits than you need most of the time. It’s different with AI. The numbers usually follow a specific distribution, and you don’t need as much accuracy.
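The tradeoff Hunhold describes can be made concrete with the standard IEEE 754 formats available in NumPy (takums and posits are not in NumPy; this sketch only shows how dynamic range and precision shrink as bit width drops):

```python
import numpy as np

# Dynamic range and precision of IEEE 754 binary formats
# at successively smaller bit widths.
for dtype in (np.float64, np.float32, np.float16):
    info = np.finfo(dtype)
    print(f"{info.bits:2d} bits: largest ~ {info.max:.3g}, "
          f"smallest normal ~ {info.tiny:.3g}, "
          f"~{info.precision} decimal digits")
```

At 16 bits the largest representable value is only 65,504 with about 3 decimal digits of precision, which illustrates why formats designed by truncating the 64-bit standard lose the range scientific codes depend on.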
What makes a number format “good”?
Hunhold: You have infinite numbers but only finite bit representations. So you need to decide how you assign numbers. The most important part is to represent numbers that you’re actually going to use. Because if you represent a number that you don’t use, you’ve wasted a representation. The simplest thing to look at is the dynamic range. The next is distribution: How do you assign your bits to certain values? Do you have a uniform distribution, or something else? There are infinite possibilities.
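Non-uniform assignment is visible even in ordinary IEEE 754 doubles, which pack representable values densely near zero. A quick NumPy sketch (not tied to any particular new format):

```python
import numpy as np

# np.spacing(x) returns the gap between x and the next
# representable double. The gap grows with magnitude, so
# representations cluster near zero rather than uniformly.
for x in (1.0, 1e6, 1e12):
    print(f"gap next to {x:g}: {np.spacing(x):g}")
```

Formats like posits and takums make a different choice about where that density goes, which is exactly the design question Hunhold raises.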
What motivated you to introduce the takum number format?
Hunhold: Takums are based on posits. In posits, the numbers that get used more frequently can be represented with more density. But posits don’t work for scientific computing, and this is a huge issue. They have a high density for [numbers close to one], which is great for AI, but the density falls off sharply once you look at larger or smaller values. People have been proposing dozens of number formats in the last few years, but takums are the only number format that’s actually tailored for scientific computing. I surveyed the dynamic range of values used in scientific computations across all these fields and designed takums such that when you take away bits, you don’t reduce that dynamic range.
This article appears in the March 2026 print issue as “Laslo Hunhold.”
2026-02-23 19:00:02

AI is revolutionizing the cybersecurity landscape. From accelerating threat detection to enabling real-time automated responses, artificial intelligence is reshaping how organizations defend against increasingly sophisticated attacks. But with these advancements come new and complex risks—AI systems themselves can be exploited, manipulated, or biased, creating fresh vulnerabilities.
In this session, we’ll explore how AI is being applied in real-world cybersecurity scenarios—from anomaly detection and behavioral analytics to predictive threat modeling. We’ll also confront the challenges that come with it, including adversarial AI, data bias, and the ethical dilemmas of autonomous decision-making.
Looking ahead, we’ll examine the future of intelligent cyber defense and what it takes to stay ahead of evolving threats. Join us to learn how to harness AI responsibly and effectively—balancing innovation with security, and automation with accountability.
2026-02-23 17:00:03

Social media is going the way of alcohol, gambling, and other social sins: societies are deciding it’s no longer kids’ stuff. Lawmakers point to compulsive use, exposure to harmful content, and mounting concerns about adolescent mental health. So, many propose to set a minimum age, usually 13 or 16.
In cases when regulators demand real enforcement rather than symbolic rules, platforms run into a basic technical problem. The only way to prove that someone is old enough to use a site is to collect personal data about who they are. And the only way to prove that you checked is to keep the data indefinitely. Age-restriction laws push platforms toward intrusive verification systems that often directly conflict with modern data-privacy law.
This is the age-verification trap. Strong enforcement of age rules undermines data privacy.
Most age-restriction laws follow a familiar pattern. They set a minimum age and require platforms to take “reasonable steps” or “effective measures” to prevent underage access. What these laws rarely spell out is how platforms are supposed to tell who is actually over the line. At the technical level, companies have only two tools.
The first is identity-based verification. Companies ask users to upload a government ID, link a digital identity, or provide documents that prove their age. Yet in many jurisdictions, 16-year-olds do not have IDs. In others, IDs exist but are not digital, not widely held, or not trustworthy. Storing copies of identity documents also creates security and misuse risks.
The second option is inference. Platforms try to guess age based on behavior, device signals, or biometric analysis, most commonly facial age estimation from selfies or videos. This avoids formal ID collection, but it replaces certainty with probability and error.
In practice, companies combine both. Self-declared ages are backed by inference systems. When confidence drops, or regulators ask for proof of effort, inference escalates to ID checks. What starts as a light-touch checkpoint turns into layered verification that follows users over time.
This pattern is already visible on major platforms.
Meta has deployed facial age estimation on Instagram in multiple markets, using video-selfie checks through third-party partners. When the system flags users as possibly underage, it prompts them to record a short selfie video. An AI system estimates their age and, if it decides they are under the threshold, restricts or locks the account. Appeals often trigger additional checks, and misclassifications are common.
TikTok has confirmed that it also scans public videos to infer users’ ages. Google and YouTube rely heavily on behavioral signals tied to viewing history and account activity to infer age, then ask for government ID or a credit card when the system is unsure. A credit card functions as a proxy for adulthood, even though it says nothing about who is actually using the account. The Roblox games site, which recently launched a new age-estimate system, is already suffering from users selling child-aged accounts to adult predators seeking entry to age-restricted areas, Wired reports.
For a typical user, age is no longer a one-time declaration. It becomes a recurring test. A new phone, a change in behavior, or a false signal can trigger another check. Passing once does not end the process.
These systems fail in predictable ways.
False positives are common. Platforms misidentify as minors adults who have youthful faces, share family devices, or show otherwise unusual usage patterns. They lock accounts, sometimes for days. False negatives also persist. Teenagers learn quickly how to evade checks by borrowing IDs, cycling accounts, or using VPNs.
The appeal process itself creates new privacy risks. Platforms must store biometric data, ID images, and verification logs long enough to defend their decisions to regulators. So if an adult who is tired of submitting selfies to verify their age finally uploads an ID, the system must now secure that stored ID. Each retained record becomes a potential breach target.
Scale that experience across millions of users, and you bake the privacy risk into how platforms work.
This is where emerging age-restriction policy collides with existing privacy law.
Modern data-protection regimes all rest on similar ideas: collect only what you need, use it only for a defined purpose, and keep it only as long as necessary.
Age enforcement undermines all three.
To prove they are following age verification rules, platforms must log verification attempts, retain evidence, and monitor users over time. When regulators or courts ask whether a platform took reasonable steps, “we collected less data” is rarely persuasive. For companies, defending themselves against accusations of neglecting to properly verify age supersedes defending themselves against accusations of inappropriate data collection.
That prioritization is not an explicit choice by voters or policymakers, but instead a reaction to enforcement pressure and to how companies perceive their litigation risk.
Outside wealthy democracies, the tradeoff is even starker.
Brazil’s Statute of the Child and Adolescent (ECA in Portuguese) imposes strong child-protection duties online, while its data protection law restricts data collection and processing. Now providers operating in Brazil must adopt effective age-verification mechanisms and can no longer rely on self-declaration alone for high-risk services. Yet they also face uneven identity infrastructure and widespread device sharing. To compensate, they rely more heavily on facial estimation and third-party verification vendors.
In Nigeria, many users lack formal IDs. Digital service providers fill the gap with behavioral analysis, biometric inference, and offshore verification services, often with limited oversight. Audit logs grow, data flows expand, and the practical ability of users to understand or contest how companies infer their age shrinks accordingly. Where identity systems are weak, companies do not protect privacy. They bypass it.
The paradox is clear. In countries with less administrative capacity, age enforcement often produces more surveillance, not less, because inference fills the void of missing documents.
Some policymakers assume that vague standards preserve flexibility. In the U.K., then–Digital Secretary Michelle Donelan argued in 2023 that requiring certain online safety outcomes without specifying the means would avoid mandating particular technologies. Experience suggests the opposite.
When disputes reach regulators or courts, the question is simple: can minors still access the platform easily or not? If the answer is yes, authorities tell companies to do more. Over time, “reasonable steps” become more invasive.
Repeated facial scans, escalating ID checks, and long-term logging become the norm. Platforms that collect less data start to look reckless by comparison. Privacy-preserving designs lose out to defensible ones.
This pattern is familiar from other domains, including online sales-tax enforcement. After courts settled that large platforms had an obligation to collect and remit sales taxes, companies began continuously tracking and storing transaction destinations and customer location signals. That tracking is not abusive, but once enforcement requires proof over time, companies build systems to log, retain, and correlate more data. Age verification is moving the same way. What begins as a one-time check becomes an ongoing evidentiary system, with pressure to monitor, retain, and justify user-level data.
None of this is an argument against protecting children online. It is an argument against pretending there is no tradeoff.
Some observers present privacy-preserving age proofs involving a third party, such as the government, as a solution, but they inherit the same structural flaw: many users who are legally old enough to use a platform do not have government ID. In countries where the minimum age for social media is lower than the age at which ID is issued, platforms face a choice between excluding lawful users and monitoring everyone. Right now, companies are making that choice quietly, building systems and normalizing behavior that protect them from the greater legal risks. Age-restriction laws are not just about kids and screens. They are reshaping how identity, privacy, and access work on the Internet for everyone.
The age-verification trap is not a glitch. It is what you get when regulators treat age enforcement as mandatory and privacy as optional.
2026-02-22 21:00:02

The first time she tried to seduce me,
(atoms falling in a vacuum)
she asked about blackberries—
(every mass exerts some gravity)
Did I know their season, where they grow?
(galvanometers, gravimeters)
I could answer both easily—
(tools to measure small attractions)
down the dirt road in September.
(devices that report, don’t interfere)
She eagerly went there with me,
(variations in readings occur)
We ate more berries than we kept.
(electron exchange may explain this)
The sweet dark juice painted our lips.
(equilibrium then entropy)
2026-02-21 22:00:02

Data centers for AI are turning the world of power generation on its head. The grid doesn’t have nearly enough capacity to supply the energy demanded by the data centers now being built. And traditional transmission and distribution networks aren’t efficient enough to take full advantage of the power that is available. According to the U.S. Energy Information Administration (EIA), annual transmission and distribution losses average about 5 percent; the rate is much higher in some other parts of the world. Hence, hyperscalers such as Amazon Web Services, Google Cloud, and Microsoft Azure are investigating every avenue to gain more power and raise efficiency.
Microsoft, for example, is extolling the potential virtues of high-temperature superconductors (HTS) as a replacement for copper wiring. According to the company, HTS can improve energy efficiency by reducing transmission losses, increasing the resiliency of electrical grids, and limiting the impact of data centers on communities by reducing the amount of space required to move power.
“Because superconductors take up less space to move large amounts of power, they could help us build cleaner, more compact systems,” Alastair Speirs, the general manager of global infrastructure at Microsoft, wrote in a blog post.
Copper is a good conductor, but current encounters resistance as it moves along the line. This generates heat, lowers efficiency, and restricts how much current can be moved. HTS cabling largely eliminates this resistance, as it’s made of superconducting materials cooled to cryogenic temperatures. (Despite the name, high-temperature superconductors still rely on frigid temperatures—albeit significantly warmer than those required by traditional superconductors.)
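A back-of-the-envelope sketch shows why that resistance matters. The numbers below are hypothetical, chosen only to illustrate the scaling (they are not Veir’s or Microsoft’s figures):

```python
# Resistive loss in a conventional conductor: P_loss = I^2 * R.
# Both values are made up purely for illustration.
current_a = 2000.0       # current carried by the run, in amps
resistance_ohm = 0.05    # total resistance of the copper run, in ohms

loss_w = current_a ** 2 * resistance_ohm
print(f"Copper run dissipates {loss_w / 1000:.0f} kW as heat")

# At operating temperature, an HTS run has near-zero DC resistance,
# so the same current produces essentially no resistive heat.
```

Because loss grows with the square of the current, the penalty for pushing more power through copper rises steeply, which is exactly the regime dense AI data centers operate in.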
The resulting cables are smaller and lighter than copper wiring, don’t lower voltage as they transmit current, and don’t produce heat. This fits nicely into the needs of AI data centers that are trying to cram massive electrical loads into a tiny footprint. Fewer substations would also be needed. According to Speirs, next-gen superconducting transmission lines deliver capacity that is an order of magnitude higher than conventional lines at the same voltage level.
Microsoft is working with partners on the advancement of this technology including an investment of US $75 million into Veir, a superconducting power technology developer. Veir’s conductors use HTS tape, most commonly based on a class of materials known as rare-earth barium copper oxide (REBCO). REBCO is a ceramic superconducting layer deposited as a thin film on a metal substrate, then engineered into a rugged conductor that can be assembled into power cables.
“The key distinction from copper or aluminum is that, at operating temperature, the superconducting layer carries current with almost no electrical resistance, enabling very high current density in a much more compact form factor,” says Tim Heidel, Veir’s CEO and co-founder.
Ruslan Nagimov, the principal infrastructure engineer for Cloud Operations and Innovation at Microsoft, stands near the world’s first HTS-powered rack prototype. Microsoft
HTS cables still operate at cryogenic temperatures, so cooling must be integrated into the power delivery system design. Veir maintains a low operating temperature using a closed-loop liquid nitrogen system: The nitrogen circulates through the length of the cable, exits at the far end, is re-cooled, and then recirculated back to the start.
“Liquid nitrogen is a plentiful, low cost, safe material used in numerous critical commercial and industrial applications at enormous scale,” says Heidel. “We are leveraging the experience and standards for working with liquid nitrogen proven in other industries to design stable data center solutions designed for continuous operation, with monitoring and controls that fit critical infrastructure expectations rather than lab conditions.”
HTS cable cooling can be done either within the data center or externally. Heidel favors the latter, as that minimizes footprint and operational complexity indoors. Liquid nitrogen lines are fed into the facility to serve the superconductors, which deliver power to where it’s needed, and the cooling system is managed like any other facility subsystem.
Rare-earth materials, cooling loops, cryogenic temperatures—all of this adds considerably to costs. Thus, HTS isn’t going to replace copper in the vast majority of applications. Heidel says the economics are most compelling where power delivery is constrained by space, weight, voltage drop, and heat.
“In those cases, the value shows up at the system level: smaller footprints, reduced resistive losses, and more flexibility in how you route power,” says Heidel. “As the technology scales, costs should improve through higher-volume HTS tape manufacturing and better yields, and also through standardization of the surrounding system hardware, installation practices, and operating playbooks that reduce design complexity and deployment risk.”
AI data centers are becoming the perfect proving ground for this approach. Hyperscalers are willing to spend to develop higher-efficiency systems. They can balance spending on development against the revenue they might make by delivering AI services broadly.
“HTS manufacturing has matured—particularly on the tape side—which improves cost and supply availability,” says Husam Alissa, Microsoft’s director of systems technology. “Our focus currently is on validating and derisking this technology with our partners with focus on systems design and integration.”
2026-02-21 03:00:02

IEEE has enhanced its standing as a trusted, neutral authority on the role of technology in climate change mitigation and adaptation. Last year it became the first technical association to be invited to a U.N. Conference of the Parties on Climate Change.
IEEE representatives participated in several sessions at COP30, held from 11 to 20 November in Belém, Brazil. More than 56,000 delegates attended, including policymakers, technologists, and representatives from industry, finance, and development agencies.
Following the conference, IEEE helped host the selective International Symposium on Achieving a Sustainable Climate. The International Telecommunication Union and IEEE hosted ISASC on 16 and 17 December at ITU’s headquarters in Geneva. Among the more than 100 people who attended were U.N. agency representatives, diplomats, senior leaders from academia, and experts from government, industry, nongovernmental organizations, and standards development bodies.
Power and energy expert Saifur Rahman, the 2023 IEEE president, led IEEE’s delegation at both events. Rahman is the immediate past chair of IEEE’s Technology for a Sustainable Climate Matrix Organization, which coordinates, communicates, and amplifies the organization’s efforts.
IEEE first attended a COP in 2021.
“Over successive COPs, IEEE’s role has evolved from contributing individual technical sessions to being recognized as a trusted partner in climate action,” Rahman noted in a summary of COP30. “There is [a] growing demand for engineering insight, not just to discuss technologies but [also] to help design pathways for deployment, capacity-building, and long-term resilience.”
Joining Rahman at COP30 were IEEE Fellow Claudio Canizares and IEEE Member Filipe Emídio Tôrres.
Canizares is a professor of electrical and computer engineering at the University of Waterloo, in Ontario, Canada, and the executive director of the university’s sustainable energy institute.
Tôrres chairs the IEEE Centro-Norte Brasil Section (Brazil Chapter). An entrepreneur and a former professor, he is pursuing a Ph.D. in biomedical engineering at the University of Brasilia. He also represented the IEEE Young Professionals group while attending the conference.
In the Engineering for Climate Resilience: Water Planning, Energy Transition, Biodiversity session, Rahman showed a video from his 2024 visit to Shennongjia, China, where he monitored a clean energy project designed to protect endangered snub-nosed monkeys from human encroachment. The project integrates renewable energy, which helps preserve the forest and its wildlife.
Rahman also chaired a session at the Sustainable Development Goal Pavilion on balancing decarbonization efforts between industrialized and emerging economies.
Additionally, he participated in a joint panel discussion hosted by IEEE and the World Federation of Engineering Organizations on engineering strategies for climate resilience, including energy transition and biodiversity.
Rahman, Canizares, and Tôrres took part in a session on clean-tech solutions for a sustainable climate, hosted by the International Youth Nuclear Congress. The topics included fossil fuel–free electricity for communications in remote areas and affordable electricity solutions for off-grid areas.
The three also joined several panels organized by the IYNC that addressed climate resilience, career pathways in sustainability, and a mentoring program.
The IYNC hosted the Voices of Transition: Including Pathways to a Clean Energy Future session, for which Tôrres and Rahman were panelists. They discussed the need to include underrepresented and marginalized groups, which often get overlooked in projects that convert communities to renewable energy.
Rahman, Canizares, and Tôrres visited the COP Village, where they met several of the 5,000 Indigenous leaders participating in the conference and discussed potential partnerships and collaborations. Climate change has made the land where the Indigenous people live more susceptible to severe droughts and wildfires, particularly in the Amazon region.
Rahman and Tôrres took a field trip to the Federal University of Pará, where they met several faculty members and students and toured the LASSE engineering lab.
Tôrres, who says representing IEEE at COP30 was transformative, wrote a detailed report about the event.
“The experience reaffirmed my belief that engineering and technology, when combined with respect for cultural diversity, can play a critical role in shaping a more sustainable and equitable world,” he wrote. “It highlighted the importance of combining cutting-edge technological solutions with Indigenous wisdom and cultural knowledge to address the climate crisis.”
Rahman and Canizares give an overview of their COP30 experiences in an IEEE webinar.
“IEEE has a place at the table,” Rahman says in the video. “We want to showcase outside our comfort zone what IEEE can do. We go to all these global events so that our name becomes a familiar term. We are the first technical association ever to go to COP and talk about engineering.”
Canizares added that IEEE is now collaborating closely with the United Nations.
“This is an important interaction. And I think, moving forward, IEEE will become more relevant, particularly in the context of technology deployment,” he said. “As governments start technology deployments, they will see IEEE as a provider of solutions.”
Rahman was the general chair of the ISASC event, which focused on the delivery and deployment of clean energy. Among the presenters were IEEE members including Canizares, Paulina Chan, Surekha Deshmukh, Ashutosh Dutta, Tariq Durrani, Samina Husain, Bruce Kraemer, Bruno Meyer, Carlo Alberto Nucci, and Seizo Onoe.
Sessions were organized around six themes: energy transition, information and communication technology, financing, case studies, technical standards, and public-private collaborations. A detailed report includes the discussions, insights, and opportunities identified throughout ISASC.
Here are some key takeaways.
As part of ISASC, IEEE presented a technology assessment tool prototype. The web-based platform is designed to help policymakers, practitioners, and investors compare technology options against climate goals.
The tool can run a comparative analysis of sustainable climate technologies and integrate publicly available, expert-validated data.
The ISASC report concluded that by connecting engineering expertise with real-world deployment challenges, IEEE is working to translate global climate goals into measurable actions.
The discussions highlighted that the path forward lies less in inventing new technologies and more in aligning systems to deliver ones that already exist.
Summaries of COP30 and ISASC are available on the IEEE Technology for a Sustainable Climate website.