2026-01-15 02:00:41

For those born with certain types of congenital deafness, the cochlear implant has been a positive and enabling technology. It uses electronics to step in as a replacement for the biological ear that doesn’t quite function properly, and provides a useful, if imperfect, sense of hearing to its users.
New research has promised another potential solution for some sufferers of congenital deafness. Instead of a supportive device, a gene therapy is used to enable the biological ear to function more as it should. The result is that patients get their sense of hearing not from a prosthetic, but from their own ears.

There are a number of causes of congenital deafness, each of which presents in its own way. In the case of OTOF-related hearing loss, it comes down to a genetic change affecting a single critical protein. The otoferlin gene is responsible for making the protein of the same name, and this protein is critical for normal, functional hearing in humans. It enables the communication of signals between the inner hair cells in the ear and the auditory nerve, which conducts these signals to the brain. However, in patients with a condition called autosomal recessive deafness 9, a non-functional variant of the otoferlin gene prevents the normal production of this protein. Without the proper protein available, the auditory nerve fails to receive the proper signals from the hair cells in the ear, and the result is profound deafness.
The typical treatment for this type of congenital hearing loss is a cochlear implant. This is an electronic device that uses a microphone to pick up sound, and then translates it into electrical signals which are sent to electrodes embedded in the cochlea. These simulate the signals that would normally come from the ear itself, and provide a very useful sense of hearing to the user. However, quality and fidelity are strictly limited compared to a fully-functional human ear, and they do come with other drawbacks, as is common with many prosthetic devices.
The better understanding that we now have of OTOF-related hearing loss presented an opportunity. If it were possible to get the right protein where it needed to be, it might be possible to enable hearing in what are otherwise properly-formed ears.

The treatment to do that job is called DB-OTO. It’s a virus-based gene therapy which is able to deliver a working version of the OTOF gene. It uses a non-pathogenic virus to carry the proper genetic code that produces the otoferlin protein. However, it’s no good if this gene is expressed in just any context. Thus, it’s paired with a special DNA sequence called a Myo15 promoter which ensures the gene is only expressed in cochlear hair cells that would normally express the otoferlin protein. Treatment involves delivering the viral gene therapy to one or both ears through a surgical procedure using a similar approach to implanting cochlear devices.

An early trial provided DB-OTO treatment to twelve patients, ranging in age from ten months to sixteen years. Eleven of the twelve patients developed improved hearing within weeks of treatment. Nine improved to the point of having viable natural hearing and no longer requiring cochlear implants.
Six trial participants could perceive soft speech, and three could hear whispers, indicating a normal level of hearing sensitivity. Notably, the hearing improvements were persistent, and three patients in the study showed signs of speech development. The company behind the work, Regeneron, is also eager to take what it has learned and potentially apply it to other genetic causes of hearing loss.
DB-OTO remains an experimental treatment for now, but regulatory approvals are being pursued for its further use. It could yet prove to be a viable and effective treatment for a wide range of patients affected by this genetic issue. It’s just one of a number of emerging treatments that use viruses to deliver helpful genetic material when a patient’s own genes don’t quite function as desired.
2026-01-15 00:30:00

The ESP32-P4 is the new hotness on the microcontroller market. With RISC-V architecture and two cores running at 400 MHz, to ears of a certain vintage it sounds more like the heart of a Unix workstation than a traditional MCU. Time’s a funny thing like that. [DynaMight] was looking for an excuse to play with this powerful new system on a chip, so he put together what he calls the GB300-P4: a commercial handheld game console with an Espressif brain transplant.
Older ESP32 chips weren’t quite up to 16-bit emulation, but that hadn’t stopped people from trying; the RetroGo project by [ducalex] already has SNES and Genesis/Mega Drive emulation modes, along with all the 8-bit systems you could ask for. But the higher-tech consoles can run a bit slow in emulation on other ESP32 chips. [DynaMight] wanted to see if the P4 performed better, and to no one’s surprise, it did.
If the build quality on this handheld looks suspiciously professional, that’s because it is: [DynaMight] started with a GB300, a commercial emulator platform. Since the ESP32-P4 is replacing a MIPS chip clocked at 914 MHz in the original — which sounds even more like the heart of a Unix workstation, come to think of it — the machine probably doesn’t have better performance than it did from the factory, unless its code was terribly un-optimized. In this case, performance was not the point. The point was to have a handheld running RetroGo on this specific chip, which the project has evidently accomplished with flying colours. If you’ve got a GB300 you’d rather put an “Espressif Inside” sticker on, the project is on GitHub. Otherwise you can check out the demo video below. (DOOM starts at 1:29, because of course it runs DOOM.)
The last P4 project we featured was a Quadra emulator; we expect to see a lot of projects with this chip in the new year, and they’re not all going to be retrocomputer-related, we’re sure. If you’re cooking up something using the new ESP32, or know someone who is, you know what to do.
2026-01-14 23:00:29

If you search the Internet for “Clone Wars,” you’ll get a lot of Star Wars-related pages. But the original Clone Wars took place a long time ago in a galaxy much nearer to ours, and it has a lot to do with the computer you are probably using right now to read this. (Well, unless it is a Mac, something ARM-based, or an old retro-rig. I did say probably!)
IBM is a name that, for many years, was synonymous with computers, especially big mainframe computers. However, it didn’t start out that way. IBM originally made mechanical calculators and tabulating machines. That changed in 1952 with the IBM 701, IBM’s first computer that you’d recognize as a computer.
If you weren’t there, it is hard to understand how IBM dominated the computer market in the 1960s and 1970s. Sure, there were others like Univac, Honeywell, and Burroughs. But especially in the United States, IBM was the biggest fish in the pond. At one point, the computer market’s estimated worth was a bit more than $11 billion, and IBM’s five biggest competitors accounted for about $2 billion, with almost all of the rest going to IBM.
So it was somewhat surprising that IBM didn’t roll out the personal computer first, or at least very early. Even companies that made “small” computers for the day, like Digital Equipment Corporation or Data General, weren’t really expecting the truly personal computer. That push came from companies no one had heard of at the time, like MITS, SWTP, IMSAI, and Commodore.
The story — and this is another story — goes that IBM spun up a team to make the IBM PC, expecting it to sell very little and use up some old keyboards previously earmarked for a failed word processor project. Instead, when the IBM PC showed up in 1981, it was a surprise hit. By 1983, there was the “XT” which was a PC with some extras, including a hard drive. In 1984, the “AT” showed up with a (gasp!) 16-bit 80286.
The personal computer market had been healthy but small. Now the PC was selling huge volumes, perhaps thanks to commercials like the one below, and decimating other companies in the market. Naturally, others wanted a piece of the pie.
Anyone could make a PC-like computer, because IBM had used off-the-shelf parts for nearly everything. There were two things that really set the PC/XT/AT family apart. First, there was a bus for plugging in cards with video outputs, serial ports, memory, and other peripherals. You could start a fine business just making add-on cards, and IBM gave you all the details. This wasn’t unlike the S-100 bus created by the Altair, but the volume of PC-class machines far outstripped the S-100 market very quickly.
In reality, there were two buses. The PC/XT had an 8-bit bus, later named the ISA bus. The AT added an extra connector for the extra bits. You could plug an 8-bit card into part of a 16-bit slot. You probably couldn’t plug a 16-bit card into an 8-bit slot, though, unless it was made to work that way.
The other thing you needed to create a working PC was the BIOS — a ROM chip that handled starting the system with all the I/O devices set up and loading an operating system: MS-DOS, CP/M-86, or, later, OS/2.

IBM didn’t think the PC would amount to much so they didn’t do anything to hide or protect the bus, in contrast to Apple, which had patents on key parts of its computer. They did, however, have a copyright on the BIOS. In theory, creating a clone IBM PC would require the design of an Intel-CPU motherboard with memory and I/O devices at the right addresses, a compatible bus, and a compatible BIOS chip.
But IBM gave the world enough documentation to write software for the machine and to make plug-in cards. So, figuring out the other side of it wasn’t particularly difficult. Probably the first clone maker was Columbia Data Products in 1982, although they were perceived to have compatibility and quality issues. (They are still around as a software company.)
Eagle Computer was another early player that originally made CP/M computers. Their computers were not exact clones, but they were the first to use a true 16-bit CPU and the first to have hard drives. There were some compatibility issues with Eagle versus a “true” PC. You can hear their unusual story in the video below.

One of the first companies to find real success cloning the PC was Compaq Computers, formed by some former Texas Instruments employees who were, at first, going to open Mexican restaurants, but decided computers would be better. Unlike some future clone makers, Compaq was dedicated to building better computers, not cheaper.
Compaq’s first entry into the market was a “luggable” (think of a laptop with a real CRT in a suitcase that only ran when plugged into the wall; see the video below). They reportedly spent $1,000,000 to duplicate the IBM BIOS without peeking inside (which would have caused legal problems). However, it is possible that some clone makers simply copied the IBM BIOS directly or indirectly. This was particularly easy because IBM included the BIOS source code in an appendix of the PC’s technical reference manual.
Between 1982 and 1983, Compaq, Columbia Data Products, Eagle Computers, Leading Edge, and Kaypro all threw their hats into the ring. Part of what made this sustainable over the long term was Phoenix Technologies.
Phoenix was a software producer that realized the value of having a non-IBM BIOS. They put together a team to study the BIOS using only public documentation. They produced a specification and handed it to another programmer. That programmer then produced a “clean room” piece of code that did the same things as the BIOS.

This was important because, inevitably, IBM sued Phoenix but lost, as Phoenix was able to provide credible documentation that it hadn’t copied IBM’s code. They were ready to license their BIOS in 1984, and companies like Hewlett-Packard, Tandy, and AT&T were happy to pay the $290,000 license fee. That fee also included insurance from The Hartford to indemnify against any copyright-infringement lawsuits.
Clones were attractive because they were often far cheaper than a “real” PC. They would also often feature innovations. For example, almost all clones had a “turbo” mode to increase the clock speed a little. Many had ports or other features as standard that a PC had to pay extra for (and consume card slots). Compaq, Columbia, and Kaypro made luggable PCs. In addition, supply didn’t always match demand. Dealers often could sell more PCs than they could get in stock, and the clones offered them a way to close more business.
Not all clone makers got everything right. It wasn’t unusual for an off-brand machine to have interrupt handling or timers that differed from a genuine IBM machine. Another favorite place to err involved AT/PC compatibility.
In a base-model IBM PC, the address bus only went from A0 to A19. So if you hit address (hex) FFFFF+1, it would wrap around to 00000. Memory being at a premium, apparently, some programs depended on that behavior.
With the AT, there were more address lines. Rather than breaking backward compatibility, those machines have an “A20 gate.” By default, the A20 line is disabled; you must enable it to use it. However, there were several variations in how that worked.
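To make the wraparound concrete, here’s a minimal C sketch — our own illustration, not anything from period documentation; the function name and the FFFF:0010 example are just chosen for clarity — of how a real-mode segment:offset pair maps to a physical address with and without A20:

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode physical address: (segment << 4) + offset, which can reach
 * up to 0x10FFEF. On a 20-address-line machine (or with A20 gated off),
 * bit 20 is simply dropped and the access wraps around to low memory. */
static uint32_t phys_addr(uint16_t seg, uint16_t off, int a20_enabled)
{
    uint32_t linear = ((uint32_t)seg << 4) + off;
    if (!a20_enabled)
        linear &= 0xFFFFF;   /* mask off A20: the classic wraparound */
    return linear;
}

int main(void)
{
    /* FFFF:0010 sits just past the 1 MB boundary. */
    printf("A20 off: %05lX\n", (unsigned long)phys_addr(0xFFFF, 0x0010, 0)); /* 00000  */
    printf("A20 on:  %05lX\n", (unsigned long)phys_addr(0xFFFF, 0x0010, 1)); /* 100000 */
    return 0;
}
```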
Intel, for example, had the InBoard/386 that let you plug a 386 into a PC or AT to upgrade it. However, the InBoard A20 gating differed from that of a real AT. Most people never noticed. Software that used the BIOS still worked because the InBoard’s BIOS knew the correct procedure. Most software didn’t care either way. But there was always that one program that would need a fix.
The original AT used some extra logic in the keyboard controller to handle the gate. When CPUs started using cache, the A20 gating was moved into the CPU for many generations. However, around 2013, most CPUs finally gave up on gating A20.
The point is that there were many subtle features on a real IBM computer, and the clone makers didn’t always get it right. If you read ads from those days, they often tout how compatible they are.
IBM started a series of legal battles against… well… everybody. Compaq, Corona Data Systems, Handwell, Phoenix, AMD, and anyone who managed to put anything on the market that competed with “Big Blue” (one of IBM’s nicknames).
IBM didn’t win anything significant, although most companies settled out of court. After that, clone makers simply used the Phoenix BIOS, which was provably “clean.” So IBM decided to take a different approach.
In 1987, IBM decided they should have paid more attention to the PC design, so they redid it as the PS/2. IBM spent a lot of money telling people how much better the PS/2 was. They had really thought about it this time. So scrap those awful PCs and buy a PS/2 instead.
Of course, the PS/2 wasn’t compatible with anything. It was made to run OS/2. It used the MCA bus, which was incompatible with the ISA bus, and didn’t have many cards available. All of it, of course, was expensive. This time, clone makers had to pay a license fee to IBM to use the new bus, so no more cheap cards, either.
You probably don’t need a business degree to predict how that turned out. The market yawned and continued buying PC “clones” which were now the only game in town if you wanted a PC/XT/AT-style machine, especially since Compaq beat IBM to market with an 80386 PC by about a year.
Not all software was compatible with all clones. But most software would run on anything and, as clones got more prevalent, software got smarter about what to expect. At about the same time, people were thinking more about buying applications and less about the computer they ran on, a trend that had started even earlier, but was continuing to grow. Ordinary people didn’t care what was in the computer as long as it ran their spreadsheet, or accounting program, or whatever it was they were using.
Dozens of companies made something that resembled a PC, including big names like Olivetti, Zenith, Hewlett-Packard, Texas Instruments, Digital Equipment Corporation, and Tandy. Then there were the companies you might remember for other reasons, like Sanyo or TeleVideo. There were also many that simply came and went with little name recognition. Michael Dell started PC Limited in 1984 in his college dorm room, and by 1985, he was selling an $800 turbo PC. A few years later, the name changed to Dell, and now it is a giant in the industry.
It is interesting to play “what if” with this time in history. If IBM had not opened their architecture, they might have made more money. Or, they might have sold 1,000 PCs and lost interest. Then we’d all be using something different. Microsoft retaining the right to sell MS-DOS to other people was also a key enabler.
IBM stayed in the laptop business (ThinkPad) until they sold it to Lenovo in 2005. They would also sell Lenovo their x86 server business in 2014.
Things have changed, of course. There hasn’t been an ISA card slot on a motherboard in ages. Boot processes are more complex, and there are many BIOS options. Don’t even get us started on EMS and XMS. But at the core, your PC-compatible computer still wakes up and follows the same steps as an old school PC to get started. Like the Ship of Theseus, is it still an “IBM-compatible PC?” If it matters, we think the answer is yes.
If you want to relive those days, we recently saw some new machines sporting 8088s and 80386s. Or, there’s always emulation.
2026-01-14 20:00:34

Getting a significant energy return from tokamak-based nuclear fusion reactors depends in large part on plasma density, but increasing said density is tricky, as beyond a certain point the plasma transitions back from the much more stable high-confinement mode (H-mode) into L-mode. Recently Chinese researchers have reported that they managed to increase the plasma density in the EAST tokamak beyond the empirical upper limit known as the Greenwald Density Limit (GDL).
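For reference, the Greenwald limit is an empirical scaling rather than a hard theoretical bound; it is usually written as (our summary of the standard formula, not taken from the EAST paper):

$$ n_G = \frac{I_p}{\pi a^2} $$

with the line-averaged electron density $n_G$ in units of $10^{20}\,\mathrm{m^{-3}}$, the plasma current $I_p$ in MA, and the minor radius $a$ in meters.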
We covered this aspect of nuclear fusion reactors in great detail last year, noting the importance of plasma edge stability, as instability there causes tokamak wall erosion as well as loss of energy. The EAST tokamak (HT-7U) is a superconducting tokamak that was upgraded and resumed operations in 2014, featuring a 1.85 meter major radius and 7.5 MW of heating power. As with any tokamak, plasma and edge stability are major concerns, even in H-mode, requiring constant intervention.

In the recent EAST findings, the real news appears to be more confirmation of the plasma-wall self-organization (PWSO) theory, which postulates that one cause of plasma wall (edge) instability is the interaction between plasma dynamics and wall conditions through impurity radiation. By using electron cyclotron resonance heating (ECRH) and/or pre-filled gas pressure, this impurity level might be reduced, enabling higher densities and thus exceeding the empirical GDL.
What’s interesting is that the paper also compares EAST and the Wendelstein 7-X (W7-X) stellarator, making the argument that tokamaks can operate in a way that’s more similar to stellarators, though W7-X of course shares the advantages of every current stellarator, such as having no real GDL and no need to deal with H- and L-mode transitions. It’s therefore not surprising that W7-X is so far the most efficient fusion reactor, having achieved the highest triple product.
2026-01-14 17:00:06

Drawing tablets have been a favorite input device of artists since their inception in the 1980s. If you have ever used a drawing tablet of this nature, you may have wondered how it works, and whether you could make one yourself. Well, wonder no longer, as [Yukidama] has demonstrated an open source electromagnetic resonance (EMR) drawing tablet build!
The principle of simple EMR tablets is quite straightforward. A coil in the tablet oscillates at frequencies from around 400 kHz to 600 kHz. This induces a current inside a coil within the pen at its resonant frequency. This, in turn, results in a voltage spike within the tablet around the pen’s resonant frequency. For pressure sensing, a simple circuit within the pen can shift its resonant frequency, which is likewise picked up by the tablet. The tablet’s input buttons work in similar ways!
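As a quick aside from us (this detail isn’t spelled out in [Yukidama]’s write-up): the pen is essentially an LC tank, whose resonant frequency is $f_0 = \frac{1}{2\pi\sqrt{LC}}$, so anything in the pen that nudges the inductance or capacitance, such as a pressure-dependent capacitor at the nib, shifts $f_0$ and shows up to the tablet as a change in pressure.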
But this is merely one-dimensional. To sample two dimensions, two arrays of coils are needed: one to sample the horizontal axis, and one the vertical. The driver circuit simply sweeps over each array and samples every coil, at whatever speed the driver can achieve.
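In firmware, that scan loop might look something like the following C sketch. To be clear, this is our own hypothetical illustration of the approach, not [Yukidama]’s actual driver code; the coil counts and the excite_coil()/read_amplitude() placeholders are made up for clarity:

```c
#include <stdint.h>
#include <stdio.h>

#define X_COILS 32   /* hypothetical coil counts */
#define Y_COILS 24

/* Placeholder hardware access. In real firmware these would drive one
 * coil near the pen's resonant frequency and then measure the echo
 * amplitude once the drive stops; here we fake a pen near coils 10/7
 * so the sketch runs standalone. */
static void excite_coil(int axis, int index, uint32_t freq_hz)
{
    (void)axis; (void)index; (void)freq_hz;
}

static uint16_t read_amplitude(int axis, int index)
{
    int pen = (axis == 0) ? 10 : 7;
    int d = (index > pen) ? (index - pen) : (pen - index);
    return (d < 4) ? (uint16_t)(1000 - 250 * d) : 0;
}

/* Sweep one axis: ping every coil and return the index with the
 * strongest response, i.e. the coil closest to the pen. */
static int scan_axis(int axis, int n_coils, uint32_t freq_hz)
{
    int best = 0;
    uint16_t best_amp = 0;
    for (int i = 0; i < n_coils; i++) {
        excite_coil(axis, i, freq_hz);
        uint16_t amp = read_amplitude(axis, i);
        if (amp > best_amp) {
            best_amp = amp;
            best = i;
        }
    }
    return best;
}

int main(void)
{
    /* One full position sample: scan X, then Y. A real driver would also
     * interpolate between neighbouring coils for sub-coil resolution and
     * track the response peak in frequency to decode pressure. */
    int x = scan_axis(0, X_COILS, 500000);  /* ~500 kHz, mid-band */
    int y = scan_axis(1, Y_COILS, 500000);
    printf("pen over coil X=%d, Y=%d\n", x, y);
    return 0;
}
```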
Finally, [Yukidama] refined the driver board into a version designed to drive a flexible circuit containing the coils. This then sits behind the screen of a Panasonic RZ series laptop, turning the device into a rather effective drawing tablet!
If tablets aren’t your style, check out this drawing pen.
Thanks [anfractuosity] for the tip!
2026-01-14 14:00:36

[The 8-Bit Guy] tells us how 8-bit Atari computers work.
The first Atari came out in 1977; it was originally called the Atari Video Computer System. It was followed two years later, in 1979, by the Atari 400 and Atari 800. The Atari 800 had a music synthesizer, bit-mapped graphics, and sprites, which compared favorably to the capabilities of the other systems of the day, the so-called Trinity of 1977: the Apple II, Commodore PET, and TRS-80. [The 8-Bit Guy] says the only real competition in terms of features came from the TI-99/4, which was released around the same time.
The main way to load software into the early Atari 400 and 800 computers was to plug in cartridges. The Atari 400 supported one cartridge and the Atari 800 supported two. The built-in keyboards were pretty terrible by today’s standards, but as [The 8-Bit Guy] points out, there weren’t really any expectations around keyboards back in the late 1970s because everything was new and not many precedents had been set.
[The 8-Bit Guy] goes into the hardware that was used, how the video system works, how the audio system works, and what peripheral hardware was supported, including cassette drives and floppy disk drives. He briefly covers all ten of Atari’s 8-bit systems, from 1979 through 1992.
If you’re interested in Atari nostalgia you might like to read Electromechanical Atari Is A Steampunk Meccano Masterpiece or Randomly Generating Atari Games.