2026-01-09 05:37:44
When Meta first announced its display-enabled smart glasses last year, it teased a handwriting feature that lets users send messages by tracing letters with their hands. Now, the company is starting to roll it out, with people enrolled in its early access program getting it first.
I got a chance to try the feature at CES and it made me want to start wearing my Meta Ray-Ban Display glasses more often. When I reviewed the glasses last year, I wrote about how one of my favorite things about the neural band is that it reduced my reliance on voice commands. I've always felt a bit self-conscious speaking to my glasses in public.
Up to now, replying to messages on the display glasses has still generally required voice dictation or generic preset replies. But handwriting means that you can finally send custom messages and replies somewhat discreetly.
Sitting at a table wearing the Meta Ray-Ban Display glasses and neural band, I was able to quickly write a message just by drawing the letters on the table in front of me. It wasn't perfect — it misread a capital "I" as an "H" — but it was surprisingly intuitive. I was able to quickly trace out a short sentence and even correct a typo (a swipe from left to right will let you add a space, while a swipe from right to left deletes the last character).
Alongside handwriting, Meta also announced a new teleprompter feature. Copy and paste a bunch of text — it supports up to 16,000 characters (roughly a half-hour's worth of speech) — and you can beam it onto the glasses' display.
If you've ever used a teleprompter, you'll find Meta's version works a bit differently: the text doesn't automatically scroll while you speak. Instead, the text is displayed on individual cards you manually swipe through. The company told me it originally tested a scrolling version, but that in early tests, people said they preferred to be in control of when the words appeared in front of them.
Teleprompter is starting to roll out now, though Meta says it could take some time before everyone is able to access it.
The updates are among the first major additions Meta has made to its display glasses since launching them late last year and a sign that, like its other smart glasses, the company plans to keep them fresh with new features. Elsewhere at CES, the company announced some interesting new plans for the device's neural band and said that it was delaying a planned international rollout of the device.
This article originally appeared on Engadget at https://www.engadget.com/wearables/handwriting-is-my-new-favorite-way-to-text-with-the-meta-ray-ban-display-glasses-213744708.html?src=rss

2026-01-09 05:26:08
While wave upon wave of smartglasses and face-based wearables crash on the shores of CES, traditional glasses really haven’t changed much over the hundreds of years we’ve been using them. The last innovation, arguably, was progressive multifocals, which blended near- and far-vision prescriptions into a single lens — and that was back in the 1950s. It makes sense that autofocusing glasses maker IXI thinks it’s time to modernize them.
After recently announcing a 22-gram (0.7-ounce) prototype frame, the startup is here in Las Vegas to show off working prototypes of its lenses, a key component of its autofocus glasses, which could be a game-changer.
IXI’s glasses are designed for age-related farsightedness (presbyopia), a condition that affects many, if not most, people over 45. They combine cameraless eye tracking with liquid crystal lenses that automatically activate when the glasses detect the user’s focus shifting. This means that, instead of having two separate prescriptions, as in multifocal or bifocal lenses, IXI’s lenses automatically switch between each prescription. Crucially — like most modern smartglasses — the frames themselves are lightweight and look like just another pair of normal glasses.
With a row of prototype frames and lenses laid out in front of him, CEO and co-founder Niko Eiden explained the technology, which can be separated into two parts. First, the IXI glasses track the movement of your eyes using a system of LEDs and photodiodes, dotted around the edges of where the lenses sit. The LEDs bounce invisible infrared light off the eyes and then measure the reflection, detecting the subtle movements of your eye and how both eyes converge when focusing on something close.
Using infrared with just a "handful of analog channels" takes far less power than the millions of pixels and 60-times-per-second processing required by camera-based systems. IXI’s system not only tracks eye movements, but also blinking and gaze direction, while consuming only 4 milliwatts of power.
Most of the technology, including memory, sensors, driving electronics and eye tracker, is in the front frame of the glasses and part of the arms closest to the hinge. The IXI prototype apparently uses batteries similar in size to those found in AirPods, which gives some sense of the size and weight of the tech being used. The charging port is integrated into the glasses’ left arm hinge. Naturally, this does mean they can’t be worn while charging. IXI says that a single charge should cover a whole day’s usage.
The prototype frames I saw this week appeared to be roughly the same weight as my traditional chunky specs. And while these are early iterations, IXI’s first frames wouldn’t look out of place in a lineup of spectacle options.
The team has also refined the nose pieces and glasses arms to accommodate different face shapes. Apparently, when testing expanded from Finland to the UK, British faces were “...different.” A little harsh when talking to me, a Brit.
Eiden pulled out some prototype lenses, made up of layers of liquid crystal and a transparent ITO (indium tin oxide) conductive layer. This combination is still incredibly thin, and it was amazing to watch the layers switch almost instantly into a prescription lens. It seemed almost magical. As they’re so thin, they can be easily integrated into lenses with existing prescriptions. The system can also provide cylindrical correction for astigmatism.
Autofocus lenses could eliminate the need for multiple pairs of glasses, such as bifocals and progressives. Even if the glasses were to run out of power, they’d still function as a pair of traditional specs with your standard prescription, just lacking the near-vision boost. IXI’s sensor sensitivity can also offer insight into other health conditions, detect dry eyes, estimate attentiveness and, by tracking where you’re looking, even monitor posture and neck movement. According to Eiden, blink rate changes with focus, daydreaming and anxiety, and all of that generates data that can be shown in the companion app.
Hypothetically, the product could even adapt prescriptions dynamically, going beyond the simple vision correction of Gen 1. For example, it could offer stronger corrections as your eyes get fatigued through the day.
IXI appears to be putting the pieces in place to make these glasses a reality. It still needs to obtain the necessary medical certifications to sell its glasses and get all the production pieces in place. It’s already partnered with Swiss lens-maker Optiswiss for manufacturing. Eiden says the final product will be positioned as a high-end luxury glasses option, selling through existing opticians. The company hopes to finally launch its first pair sometime next year.
This article originally appeared on Engadget at https://www.engadget.com/wearables/ixis-autofocusing-lenses-multifocal-glasses-ces-2026-212608427.html?src=rss

2026-01-09 05:07:37
After years of testing its humanoid robot (and forcing it to dance), Boston Dynamics' Atlas is entering production. The robotics company said at CES 2026 that the final product version of the robot is being built now, and the first companies that will receive deployments are Hyundai, Boston Dynamics' majority shareholder, and Google DeepMind, the firm's newly minted AI partner.
This final enterprise version of Atlas "can perform a wide array of industrial tasks," according to Boston Dynamics, and is specifically designed with consistency and reliability in mind. Atlas can work autonomously, via a teleoperator or with "a tablet steering interface," and the robot is both strong and durable. Boston Dynamics says Atlas has a reach of up to 7.5 feet, the ability to lift 110 pounds and can operate at temperatures ranging from minus 4 to 104 degrees Fahrenheit. "This is the best robot we have ever built," Boston Dynamics CEO Robert Playter said in the Atlas announcement. "Atlas is going to revolutionize the way industry works, and it marks the first step toward a long-term goal we have dreamed about since we were children."
Boston Dynamics has been publicly demoing its work on humanoid robots since at least 2011, when it first debuted Atlas as a DARPA project. Since then, the robot has gone through multiple prototypes and revisions, most notably switching from a hydraulic design to an all-electric design in 2024. Later that year, Boston Dynamics demonstrated the robot's ability to manipulate car parts, which appears to be one of the first ways Atlas will be put to work.
Hyundai plans to use Atlas in its car plants in 2028, focused on tasks like parts sequencing. In 2030, the car maker hopes to have the robot's responsibilities "extend to component assembly, and over time, Atlas will also take on tasks involving repetitive motions, heavy loads, and other complex operations," Hyundai says. Google DeepMind, meanwhile, is receiving Atlas robots so it can work on integrating its Gemini Robotics AI foundation models into Boston Dynamics' system.
This article originally appeared on Engadget at https://www.engadget.com/big-tech/boston-dynamics-unveils-production-ready-version-of-atlas-robot-at-ces-2026-234047882.html?src=rss

2026-01-09 04:53:15
Last year, Razer showed off Project Ava, a digital assistant that lived inside your computer to help adjust settings or provide gaming tips. Now, at CES 2026, the company’s AI companion platform has gotten a major glow-up while moving into some new digs.
Now, in lieu of being constrained entirely to your PC’s screen, Razer has given Project Ava a real home in the form of a small tube that can display a 5.5-inch animated hologram of the AI’s avatar. You’ll still need to connect it to your computer via USB-C to provide Ava with the power and data it needs. However, all of your companion’s other components are built into its abode, including dual far-field mics so you can talk to it, a down-firing full-range speaker so it can talk and an HD camera with an ambient light sensor so the AI can see and react to its surroundings.
But perhaps the biggest upgrade to the project is that instead of just Ava, who Razer describes as “a calm, reliable source of energy to help you keep things clear, efficient, and always on point,” there are three or four new personas (depending on how we’re counting) joining the roster. Kira looks like a TikTok e-girl decked out in a frilly outfit complete with Razer neon green accents, while Zane is her edgy masculine alternative who kind of reminds me of the Giga Chad meme, but with extra snake tattoos. Then there’s Sao, who appears to be directly inspired by iconic Japanese salary woman Saori Araki. Finally, there’s an avatar made in the likeness of Faker (Lee Sang-hyeok), the most successful and well-known League of Legends player of all time and one of Razer's sponsored esports athletes.

The idea now is that instead of being trapped inside your computer, Ava or one of Razer’s other personas can sit on your desk and be your companion for everything. They can remind you of upcoming events, respond to questions or even comment on your outfit using the device’s built-in camera. That said, if you need some privacy, the device’s mics can be muted and the company says it’s planning on putting a physical camera shutter on final retail models. Of course, Ava or any of the other avatars can still hang out while you game and give you advice. During my demo, Kira helped pick out a loadout in Battlefield 6 based on user criteria and even provided pros and cons for some of the game’s other equipment options.

Unfortunately, while I did get to see Kira and Zane talk, dance and sway in their little bottles, Sao and Faker weren’t quite ready to make their holographic debuts. But according to Razer, that’s sort of by design, as Project Ava is very much a work in progress. Currently, the avatars’ responses are generated by xAI’s Grok (yikes!), but the platform was created as a sort of open-source project that will support other models like Gemini or ChatGPT.
Down the line, Razer is hoping to add the ability for users to create their own unique avatars and companions based on their input or inspiration from real-world objects. Meanwhile, for an avatar like Faker’s, because he’s an actual person, Razer wants additional time to make the AI companion helpful with topics like real-time League of Legends coaching.

That said, while some folks might find Project Ava a bit weird or unnerving, it actually feels pretty tame (almost cute even) in an era where people are already marrying their AI partners. And if you’re the kind of person who prefers digital companions over flesh-and-blood alternatives (you know, people), I guess it’s kind of nice to have a more tangible representation of your electronic waifus and husbandos.

Sadly, Razer has not provided official pricing for Project Ava’s holographic peripheral, though a representative said that it will be in the same ballpark as the company’s other peripherals. I’m estimating a final cost of around $200. Reservations for Project Ava are currently live with a $20 deposit before official shipments begin sometime in the second half of 2026.
This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/razer-put-a-waifu-in-a-bottle-at-ces-2026-205315908.html?src=rss

2026-01-09 04:45:00
YouTube introduced some new filters to its advanced search tools today. Possibly the most exciting change is that Shorts are now listed as a content type, so the three-minute-or-less videos can be excluded from your search results.
This is a welcome update for any of us who have been on the hunt for a long-form explainer only to wade through dozens of ten-second clips before finding anything close to our goal. Especially with the addition of even more AI slop last year thanks to the Google Veo 3 engine, an option to exclude Shorts may look even more appealing.
The other updates include a pair of renamed features within advanced search. The "Sort By" menu will now be called "Prioritize." Likewise, the "View Count" option has been renamed to "Popularity"; this will allow YouTube's algorithms to account for other metrics, such as watch time, to gauge how much other users are engaging with a particular video. A pair of former filter options has also been removed; there will no longer be choices to search for "Upload Date - Last Hour" and "Sort by Rating."
This article originally appeared on Engadget at https://www.engadget.com/entertainment/youtube/youtube-will-let-you-exclude-shorts-from-search-results-204500097.html?src=rss

2026-01-09 04:31:04
Fender Audio may have announced its new headphones and speakers right before CES, but Las Vegas afforded us the first opportunity to see the brand’s new lineup in person. Fender Audio is a brand from Riffsound, which is designing and making the new devices after licensing the Fender name. It’s been a while since the guitar and amplifier company made any general-use speakers of its own, and this new arrangement is similar to what Zound was doing with Marshall for a spell.
Logistics out of the way, let’s get down to what the Mix and Elie are like in the flesh. First, the Mix headphones offer a modular construction that allows you to replace nearly every piece as needed. The ear cups detach from the headband and the ear pads are replaceable. You can also swap out the battery, thanks to an easy-to-access slot behind one ear pad. And on the other side, a USB-C dongle for wireless lossless audio is stowed for safekeeping (wired lossless audio over USB-C is also available).

Fender Audio kept the controls simple on the Mix, opting for a single joystick for volume and playback changes. The joystick also serves as the power and pairing control as the only other button cycles through active noise cancellation (ANC) modes. In terms of sound, the Mix will satisfy listeners who crave deep bass, and vocals cut through clearly. In my brief demo, I would’ve liked more mid-range, but I’ll wait until I get a review unit for a full assessment there. I should mention the other standout feature is battery life: the Mix will offer up to 52 hours of use with ANC enabled (up to 100 hours with it off).
Then there are the Elie speakers. Both offer a similar set of features, which includes two wireless inputs for microphones (the company is working on its own model) and a combination XLR and 1/4-inch input for instruments. The Elie 06 is the smaller unit, housing a tweeter, full-range driver and subwoofer with 60 watts of output. The larger Elie 12 doubles all of that, serving as a more robust but still very portable option.

Both Elie units can be used in a single configuration or as a stereo pair. You can also connect up to 100 of the speakers via a Multi mode. Fender Audio has done a nice job here of checking all of the usual Bluetooth speaker boxes while offering something unique in terms of all those inputs. It’s like the company combined “regular” portable speakers with larger party boxes, offering something for customers who don’t want a massive device or any of the flashing lights.
Of course, none of these specs would matter if the company hadn’t nailed the sound quality. While I’ll wait until I can spend several hours with some review units before I make any final judgment on these, I can tell you that both Elie speakers made a great first impression. There’s ample bass in the tuning for both, but obviously the larger Elie 12 offers more thump. Both units also provide excellent vocal clarity and nice details in the highs, as I made sure to select test tracks with lots of subtle sounds — like Béla Fleck’s banjo tunes.

Fender Audio says the arrival of the entire new lineup is imminent. Both the headphones and the Elie 06 will cost $299 and the Elie 12 is $399.
This article originally appeared on Engadget at https://www.engadget.com/audio/hands-on-with-fender-audios-headphones-and-speakers-at-ces-2026-203104561.html?src=rss