

See the Sky Like Never Before With a DIY Eyepiece

2026-01-04 21:00:01



When it comes to viewing nebulae, galaxies, and other deep-sky objects, amateur astronomers on a budget have had two options. They can look through a telescope with the naked eye and perceive these spectacular objects as faint smudges that don’t even begin to capture their majesty, or they can capture long-exposure images with astrocameras and display the results on a view screen or computer, which robs the stargazing experience of its immediacy.

Stand-alone telescope eyepieces with active light amplification do exist for real-time viewing, but commercial products are pricey, costing hundreds to thousands of dollars. I wanted something I could use for the public-astronomy observation nights that I organize in my community. So I decided to build a low-cost DIY amplifying eyepiece that would make it easier for visitors to observe deep-sky objects without requiring a large financial investment on my part.

I quickly realized there was already an industry replete with hardware for handling low-light conditions—the security-camera industry. Faced with the challenge of monitoring areas with a variety of lighting, often using cameras spread out over a large facility, makers of closed-circuit television (CCTV) cameras created a video standard that uses digital sensors to capture images but then transmits them as HD-resolution analog signals over coaxial cables. By using this Analog High Definition (AHD) transmission standard, you can attach new cameras to preexisting long cable runs and still get a high-quality image.

The primary components of the amplifying eyepiece. A CMOS image-sensor module from a security camera [top left], a USB capture card [bottom left], and an OLED viewfinder [right] process the analog video data. James Provost

While I didn’t need the long-distance capability of these cameras, I was very interested in their low price and ability to handle dim conditions. The business end of these cameras is a module that integrates a CMOS image sensor with supporting electronics. After some research, I settled on a module that combined a 2-megapixel Sony IMX307 sensor with a supporting NVP2441 chipset.

The key factor was choosing a sensor-chipset combination that supports something called Starlight or Sens-Up mode. This makes the camera more sensitive to light than the human eye, albeit at the cost of a little speed. Images are created by integrating approximately 1.2 seconds of exposure time on the sensor. That might make for choppy security footage, but it’s not noticeable when making observations of nebulae and other astronomical objects (unless of course something really weird happens in the sky!).
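As a rough illustration of what that integration does (this is not the module’s firmware, just a software analogue), the short Python sketch below sums about 1.2 seconds’ worth of frames from any camera OpenCV can open; the device index and frame rate are assumptions.

# Software emulation of Sens-Up-style frame integration (illustrative only).
# Assumes an OpenCV-readable camera at index 0 running at roughly 30 frames per second.
import cv2
import numpy as np

CAMERA_INDEX = 0          # assumed device index
FRAMES_TO_STACK = 36      # about 1.2 s at an assumed 30 frames per second

cap = cv2.VideoCapture(CAMERA_INDEX)
accumulator = None

for _ in range(FRAMES_TO_STACK):
    ok, frame = cap.read()
    if not ok:
        break
    frame = frame.astype(np.float32)
    accumulator = frame if accumulator is None else accumulator + frame

cap.release()

if accumulator is not None:
    # Scale the summed exposure back to an 8-bit image for display or saving.
    stacked = cv2.normalize(accumulator, None, 0, 255, cv2.NORM_MINMAX)
    cv2.imwrite("stacked_exposure.png", stacked.astype(np.uint8))

The result approximates a longer exposure, at the cost of the same motion blur that Sens-Up mode trades away.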

From Astronomy to Security Cameras and Back Again

The existence of Sens-Up mode is actually part of the technical heritage of digital imaging sensors. CMOS sensors were developed as a successor to charge-coupled devices (CCDs), which were eagerly embraced by the astronomical community following their introduction in the 1970s, replacing long-exposure photographic plates. However, the ability to take exposure frames as long as a second is rarely something that CCTV cameras are designed for: It can be more of a drawback than a feature, leading to blurred images of moving objects or people.

As a result, this capability is rarely mentioned in the product descriptions, and so finding the right module was the most challenging part: I had to buy three different camera modules before finally landing on one that worked.

The output from the camera module is passed to a digital viewfinder, which displays both the video and control menus generated by the module. These menus are navigated using a four-way, press-to-select joystick that connects to a dedicated header on the module.

The output of the camera is also passed to a capture card that converts the analog signal to digital and provides a USB-C interface, which allows images to be seen and saved using a smartphone. All the electronics can be powered via battery for complete stand-alone operation or from a USB cable attached to the capture card.
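If your capture card enumerates as a standard UVC webcam (many inexpensive ones do, though yours may differ), a few lines of Python and OpenCV are enough to preview and snapshot the feed on a laptop; the device index and resolution below are assumptions rather than values from this build.

# Preview and save frames from a UVC-class USB capture card (illustrative sketch).
import cv2

cap = cv2.VideoCapture(1)                   # assumed: capture card is device index 1
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)     # assumed 1080p analog HD signal
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Amplifying eyepiece", frame)
    key = cv2.waitKey(1) & 0xFF
    if key == ord("s"):                     # press 's' to save a snapshot
        cv2.imwrite("snapshot.png", frame)
    elif key == ord("q"):                   # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()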

The analog HD camera module can be controlled directly using a joystick to navigate onscreen menus. Power can be provided externally via a USB-C connector on the capture card or from an optional battery pack. James Provost

The components fit in an enclosure I made from 3D-printed parts, designed to match the 32-millimeter diameter of most telescope eyepieces for easy mounting. The whole thing cost less than US $250.

Testing Out the Amplifying Eyepiece

I took my new amplifying eyepiece out with my Celestron C11 telescope to give it a try. Soon I had in my viewfinder the Dumbbell Nebula, also known as Messier 27 or M27, which is normally quite hard to see. It was significantly brighter than in a naked-eye observation. Certainly the difference wasn’t as marked as with a commercial rig that has noise-reducing cooling for the sensor electronics. But it was still an enormous improvement, and at a fraction of the cost.

The Orion Nebula, some 1,340 light-years away. Jordan Blanchard

The amplifier is also more versatile: You can remove it from the telescope, and with a 2.8-mm HD lens fitted to the camera-module sensor, you can use it as a night-vision camera. That’s handy when trying to operate in dark outdoor conditions on starry nights!

For the future, I’d like to upgrade the USB-C capture module to one that can handle the sensor’s digital output directly, rather than just the analog signal. This would give a noticeable boost in resolution when recording or streaming to a phone or computer. Beyond that, I’m interested in finding another low-cost camera module with a longer exposure, and refining the 3D-printed housing so it’s easier to build and adapt to other observing setups. That way the eyepiece will stay affordable, but people can still push it toward more serious electronically assisted astronomy.

CES 2026 Preview: E-ink Smartphone, Allergen Detector, and More

2026-01-03 22:00:01



In a few days, Las Vegas will be inundated with engineers, executives, investors, and members of the press—including me—for the annual Consumer Electronics Show, one of the largest tech events of the year.

If you can dream it, there’s a good chance it’ll be on display at CES 2026 (though admittedly, much of this tech won’t necessarily make it to the mainstream). There will be a range of AI toys, AI notetakers, and “AI companions,” exoskeletons and humanoid robots, and health tech to track your hormones, brain activity, and... bathroom activity.

This year’s event will have keynote addresses from the CEOs of tech giants including AMD and Lenovo, and thousands of booths from companies spanning legacy brands to brand new startups.

I’m excited to stumble across unexpected new tech while wandering the show floor. But as I prepare for this year’s event, here are some of the devices that have already caught my eye.

Headphones that can read your brainwaves

Electroencephalography, or EEG, has been used in healthcare for decades to monitor neural activity. It usually involves a person wearing a whole helmet of electrodes, but scaled-down versions of the tech are now being integrated into consumer devices and may soon be ready for everyday users.

Several neurotech companies using EEG will be at CES this year. For instance, Neurable (a company we’ve had on our radar for years) will be there with its over-ear EEG headphones, which help users hone their focus and are now available for preorder. Naox will also bring its in-ear EEG tech to consumer-oriented earbuds. And Elemind, another company we’ve covered, aims to help you sleep with its headband. With wearables already monitoring vital signs, sleep, and activity, 2026 may be the year our brainwaves join the list of biosignals we can track on a daily basis.

A toothbrush to sniff out health issues

Sonic toothbrush company Y-Brush is introducing dental tech that analyzes another biomarker: smelly breath. The toothbrush itself is Y-shaped and looks almost like a dental retainer on a stick, with bristles surrounding the teeth. The latest version, the Y-Brush Halo, integrates a gas sensor called SmartNose to analyze breath biomarkers. The company says this allows the toothbrush to detect more than 300 health conditions, including early-stage diabetes and liver disorders.

Automatic massage roller to soothe sore muscles

RheoFit’s A1 massage roller basically looks like what would happen if a foam roller and a massage chair had a baby. The device automatically rolls itself down your spine and offers two replaceable surfaces: one harder material that mimics a masseuse’s knuckles, and a gentler option that’s meant to feel more like an open palm.

This will be the second year the RheoFit A1 is on display. To find out whether it’s worth the US $449 price tag, I’ll have to try it out on site.

Minimalist e-ink smartphone to cut out distractions

Just before last year’s CES, several devices were awarded a new certification meant to reward tech designed to be less distracting. While most exhibitors will keep competing for our attention, a few are tapping into a desire for calm and clarity.

Minimalist tech company Mudita, for instance, will be showing its e-ink smartphone, which began shipping in 2025. The phone is designed to reduce screen time with its black and white, paper-like display, and the operating system is designed to run without Google. Like other not-as-smart phones (such as the Wisephone or Light Phone), the Mudita Kompakt offers the essentials—messaging, maps, camera, and so forth—without constant notifications.

Food allergen detector to avoid anaphylaxis

Some new tech surprises users with an experience they didn’t know they wanted. Others aim to offer the solutions you’ve dreamt about. For me, French startup Allergen Alert falls into the second category.

The startup, one of the listed exhibitors at an early press event on 4 January, is developing a portable system to test food allergens in real time. I’ve been eating gluten-free for most of my life (not by choice), and I know how easily allergens can sneak into a dish with just a sprinkle of flour, or a dash of soy sauce—especially when you’re traveling or eating out. For many people with severe allergies, a device like this could be a lifesaver.

Jacob’s Ladder

2026-01-03 21:00:01



I know now how the sparks can climb,
in broadening arcs of ions—
the heat they grow inside themselves
like some permission or belief.

But at ten, it seemed mystical;
their frown, glowing, then invisible.
Gone. Save the odor of ozone.
I was young and scared and alone.

But the buzz and brightness began
anew in darker shades of blue. Then
electrons leaping spoke to me,
not in words, but in dignity:

how they escaped the box where they
were born. Joined in a plasma haze,
they rose unafraid. So it seemed.
I imagined them as sunbeams,

then as disrupting solar flares—
distant but, in time, reaching here
as unseen bursts to recombine,
smaller parts of the grand design.

Video Friday: Watch Scuttle Evolve

2026-01-03 02:00:02



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

I always love seeing robots progress from research projects to commercial products.

[ Ground Control Robotics ]

Well, this has to be one of the most “watch a robot do this task entirely through the magic of jump cuts” videos I’ve ever seen.

[ UBTECH ]

Very satisfying sound on this one.

[ Pudu Robotics ]

Welcome to the AgileX Robotics Data Collection Facility—where real robots build the foundation for universal embodied intelligence. Our core mission? Enable large-scale data sharing and reuse across dual-arm teleoperation robots of diverse morphologies, breaking down data silos that slow down AI progress.

[ AgileX ]

I’m not sure how much thought was put into this, but giving a service robot an explicit cat face could be a good way of moderating expectations on its behavior and interactivity.

[ Pudu Robotics ]

UBTECH says they have built 1,000 of their Walker S2 humanoid robots, over 500 of which are “delivered & working.” I would very much like to know what “working” means in this context.

[ UBTECH ]

Every story has its beginning, and ours started in 2023—a year defined by the unknown. Let technology return to passion; let trials catalyze evolution. Embracing growth, embarking on a new journey. We’ll see you at the next stop.

Please, please hire someone to do some HRI (human-robot interaction) design.

[ PNDbotics ]

This Engineer Builds Bespoke Accordions and Autonomous Car Systems

2026-01-02 21:00:02



When Sergey Antonovich rediscovered a childhood passion for music, he found an unexpected application for his skills as an embedded systems engineer: building bespoke digital accordions.

Antonovich admits the accordion isn’t the coolest instrument. It was chosen for him by his mother when he was 8, and he quickly lost interest as a teenager. While growing up close to Moscow, his adolescent passions were instead channeled into electronics and tinkering with gadgets in after-school classes. This led to a career working on environmental-monitoring devices, sensors for commercial drones, and most recently, sensor systems at autonomous-vehicle developer Avride’s R&D hub in Austin, Texas.

Sergey Antonovich

Employer: Avride

Occupation: Embedded systems developer

Education: Master’s degree in engineering physics, Moscow Engineering Physics Institute

But when Antonovich picked his accordion back up as an adult, he discovered both latent musical skills and a newfound appreciation for the instrument. Like any good tinkerer, he had some ideas about how he could improve it, and he soon began using his electronics knowledge to build custom devices.

And Antonovich says he’s found a surprising harmony between his day job and his hobby. Whether you’re ensuring that an autonomous vehicle spots obstacles in the road in time or translating a musician’s nimble finger work into a melodious tune, you need to rapidly process digital signals from the underlying hardware.

“Both systems, self-driving cars and accordions, are real-time embedded systems,” says Antonovich. “A self-driving car is more complicated because it contains many more components, but the principles are more or less equal.”

Electronics Trumps Music

Antonovich grew up in Chekhov, a small town outside Moscow, and says he had a pretty ordinary childhood. His father passed away when he was only 1, so he was brought up by his mother, who worked in the printing industry, and his grandmother, a school principal who taught Russian.

At 8 he was enrolled in a local music school where he learned the fundamentals of music theory and the accordion. He was a dutiful student, he says, but never felt much passion for the instrument his mother picked for him and stopped playing when he was about 15.

Sergey Antonovich shows off the digital instruments he makes in his free time. With one lightweight instrument, he becomes a one-man band. Sergey Antonovich

That was also when Antonovich had his first encounter with the world of electronics. He started attending after-school classes where he was taught to solder and build simple electronic systems. Antonovich quickly caught the bug and was soon knocking together digital doorbells, code locks, and basic radio receivers in his spare time.

His family encouraged him to enroll at a technical secondary school, which taught engineering skills alongside the standard curriculum. When it came to picking a university, he decided he wanted a grounding in physics, so he enrolled at the Moscow Engineering Physics Institute in 2004, choosing a program that taught a combination of hardware, software, and digital-signal processing.

Antonovich originally planned to become a software developer but quickly fell in love with hardware. “When you develop software, there is a level of abstraction between you and the thing itself,” he says. “But when you work with hardware, you understand how this particular thing actually works.”

Embedding Into a Career

Toward the end of his degree studies, in 2009, Antonovich started working for Moscow-based Ecosfera, a company focused on environmental and labor-safety measurement devices. He continued working for the company after graduating in 2010, designing hardware and software to measure conditions like temperature, humidity, and wind speed to ensure safe workplaces.

It was a niche field, but one with strict regulatory requirements, and he had to shepherd his devices through rigorous certification procedures, the first major accomplishment of his career. From there, he worked for a variety of companies on different embedded and Internet of Things systems, including ATMs, medical devices, sensors for commercial drones, and digital price tags. Then in 2021 he interviewed at internet company Yandex, which operates Russia’s most popular search engine, to work on its autonomous vehicle program.

“I remember I was approaching the office entrance and I saw a car which was driving itself,” Antonovich says. “You see it on YouTube, but it’s not such an inspiring experience. It’s really inspiring when you see it live.”

He got the job and started work as a software engineer developing vehicle sensor systems and testing infrastructure. A corporate restructuring saw Yandex’s autonomous vehicle division spun off as a new company called Avride. Antonovich worked for the company in Israel for about a year, then in 2024 moved to its new headquarters in Austin.

Antonovich says he works at Avride primarily on the data that feed the vehicle’s perception algorithms, including output from radar and lidar. Both kinds of sensors have strengths and weaknesses—radar has long range but low resolution, while lidar is great at picking out shapes but only up to a certain distance—so the perception system combines their data. Antonovich’s job is to build the diagnostic systems that ensure these sensors work in perfect synchrony and deliver data within tight time limits.
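Antonovich doesn’t detail Avride’s diagnostics, but the basic idea of a cross-sensor timing check can be sketched in a few lines of Python: compare the latest radar and lidar timestamps against a tolerance and flag any drift. The tolerances, names, and message format here are hypothetical.

# Hypothetical cross-sensor timing check, not Avride's code.
# Flags when radar and lidar frames drift apart or arrive late.
from dataclasses import dataclass

MAX_SKEW_S = 0.050      # assumed tolerance between sensors, in seconds
MAX_AGE_S = 0.200       # assumed maximum age of the newest frame

@dataclass
class SensorFrame:
    sensor: str
    timestamp: float    # seconds, on a clock shared by all sensors

def check_sync(radar: SensorFrame, lidar: SensorFrame, now: float) -> list[str]:
    """Return human-readable fault descriptions (empty list if healthy)."""
    faults = []
    skew = abs(radar.timestamp - lidar.timestamp)
    if skew > MAX_SKEW_S:
        faults.append(f"radar/lidar skew {skew * 1000:.1f} ms exceeds limit")
    for frame in (radar, lidar):
        age = now - frame.timestamp
        if age > MAX_AGE_S:
            faults.append(f"{frame.sensor} frame is {age * 1000:.1f} ms old")
    return faults

# Example: a lidar frame lagging the radar frame by 80 ms triggers a fault.
print(check_sync(SensorFrame("radar", 10.000), SensorFrame("lidar", 9.920), now=10.010))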

In his day job, Antonovich works on the sensor systems for self-driving cars. Sergey Antonovich

Moving to the United States has been a positive change for Antonovich. On a professional front, the country’s soft-touch regulatory approach toward autonomous vehicles has allowed the company to make rapid progress on its technology. But he says the move has also helped him indulge his tinkering instincts in his spare time.

“As a maker, I would say [the United States] is a paradise,” he says. “Electronic components are very accessible. You just order them and they arrive very quickly and everything just works.” Antonovich has taken full advantage of this to dive into his other passion—building musical instruments.

A Musical Reprise

In 2017, when he was still living in Russia, Antonovich noticed a new generation of digital accordions emerging, and it sparked his curiosity. “I thought, why not try to modify my own [acoustic] accordion?” he says.

He dusted off his instrument and was gratified to find that he could still play and read sheet music. So, he tried to tackle some of the problems that beset digital accordions. Commercially available instruments are typically large and heavy, rely on bulky external modules to add musical accompaniment such as a drum beat, and connect to amplifiers with wires that restrict the performer’s movement.

“I decided that maybe I can build a self-contained device,” he says. Starting with an acoustic accordion as a base, he added a synthesizer, installed internal microphones to capture acoustic sounds that could then be blended with digital ones, and integrated wireless transmitters that could free performers from cables and let them move about the stage freely.

Surprisingly, Antonovich found a lot of overlap with his work on self-driving cars—in particular, the need to manage latency along the signal-processing chain. To provide a seamless experience to the player, a digital accordion needs to rapidly route input from dozens of buttons and keys on two separate keyboards to the synthesizer, which has its own processing delay.

“Your main task as a developer is to keep latency as low as possible,” he says. “A high quality system should produce sound in less than 10 milliseconds, and if you come over this threshold it’s very uncomfortable to play.”
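To get a sense of why that 10-millisecond budget is tight, here is a toy latency probe (an illustration, not Antonovich’s firmware) that timestamps a simulated key press, runs a stand-in processing stage, and reports whether the end-to-end delay stays under the threshold.

# Toy key-to-sound latency probe; purely illustrative.
import time

LATENCY_BUDGET_S = 0.010   # the 10-millisecond threshold mentioned above

def synthesize(note: int) -> None:
    # Stand-in for scanning the keyboard matrix and triggering the synthesizer.
    time.sleep(0.002)      # pretend the processing chain takes about 2 ms

def measure_latency(note: int) -> float:
    pressed_at = time.perf_counter()   # moment the key event is captured
    synthesize(note)
    return time.perf_counter() - pressed_at

latency = measure_latency(note=60)     # MIDI note 60 is middle C
status = "OK" if latency < LATENCY_BUDGET_S else "TOO SLOW"
print(f"key-to-sound latency: {latency * 1000:.2f} ms ({status})")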

Antonovich now has a growing menagerie of both hybrid acoustic-digital and fully digital accordions. But while he’s built accordions for friends, he’s in no hurry to turn his hobby into a business. “Making them a commercial product will turn my curiosity to necessity,” he says. “When you do something for a living, you do it because you have to and not because you choose to.”

Tech to Track in 2026

2026-01-01 23:00:02



Every September as we plan our January tech forecast issue, IEEE Spectrum’s editors survey their beats and seek out promising projects that could solve seemingly intractable problems or transform entire industries.

Often these projects fly under the radar of the popular technology press, which these days seems more interested in the personalities driving Big Tech companies than in the technology itself. We go our own way here, getting out into the field to bring you news of the hidden gems that genuinely—as the IEEE motto goes—advance technology for the benefit of humanity.

A look back at the last 20 years of January issues reveals that while we’ve certainly covered our share of huge tech projects, like the James Webb Space Telescope, many of the stories touch on subjects most people would have otherwise missed.

Last January, Senior Associate Editor Emily Waltz reported on startups that are piloting ocean-based carbon capture. This issue, she’s back with another CO2-centric story, this time focused on grid-scale storage, which is poised to blow up—literally. Waltz traveled to Sardinia to check out Milan-based Energy Dome’s “bubble battery,” which can store up to 200 megawatt-hours by compressing and decompressing pure carbon dioxide inside an inflatable dome.

This kind of modular, easy-to-deploy energy storage could be especially useful for AI data centers, says Senior Editor Samuel K. Moore, who curated this issue and wrote about gravity energy storage back in January 2021.


“When we think about energy storage, our minds usually go to grid-scale batteries,” Moore says. “Yet these bubbles, which are in many ways more capable than batteries, will be sprouting up all over the place, often in association with computing infrastructure.”

For his story in this issue, Moore dove into the competition between two startups that are developing radio-based cables to replace conventional copper cables and fiber optics in data centers. These radio systems can connect processors 10 to 20 meters apart using a third of the power of optical-fiber cables and at a third of the cost. The next step is to integrate the radio connections directly with GPUs, to ease cooling burdens and help data centers and the AI models running on them continue to scale up.

Big bubbles could help with grid-scale storage; tiny bubbles can liquefy cancer tumors, as Greg Uyeno found when reporting on HistoSonics’ ultrasound treatment. Feared for its aggressive nature and extremely low survival rate, pancreatic cancer kills almost half a million people per year worldwide. HistoSonics uses noninvasive, focused ultrasound to create cavitation bubbles that destroy tumors without dangerously heating surrounding tissue. This year, the company is concluding kidney trials as well as launching pancreatic cancer trials.

Over the last two decades, Spectrum has regularly covered the rise of drones. In 2018, for instance, we reported that the startup Zipline would deploy autonomous drones to deliver blood and medical supplies in rural Rwanda. Today, Zipline has a market cap of about US $4 billion and operates in several African countries, Japan, and the United States, having completed almost 2 million drone deliveries. In this issue, journalist Robb Mandelbaum takes us inside the Wildfire XPrize competition, aimed at providing another life-saving service: dousing wildfires before they grow out of control. Zipline succeeded because it could make deliveries to remote locations much faster than land vehicles. This year’s XPrize teams plan to detect and suppress fires faster than conventional firefighting methods.

In addition to these emerging technologies, we’ve packed this issue with a dozen others, including Porsche’s wireless home charger for EVs, the world’s first electric air taxi service, neutral-atom quantum computers, interoperable mesh networks, and robotic baseball umpires. Let’s see which of this year’s picks make it to the big leagues.