2026-02-03 17:00:13

Originally released for the Sony PlayStation in 1998, Resident Evil 2 came on two CDs and used 1.2 GB in total. Of this, full-motion video (FMV) cutscenes took up most of the space, as was rather common for PlayStation games. This posed a bit of a challenge when the game was ported to the Nintendo 64, with its paltry 64 MB of cartridge-based storage. Somehow the developers managed to do the impossible and retain the FMVs, as detailed in a recent video by [LorD of Nerds]. Toggle the English subtitles if German isn’t among your installed natural language parsers.
Instead of dropping the FMVs and replacing them with static screens, the developers went for a technological solution. The N64’s rather beefy hardware made it possible to apply video compression that massively reduced the storage requirements, but this required repurposing that hardware for tasks it was never designed for.
The people behind this feat were developers at Angel Studios, who had twelve months to make it work. Ultimately they achieved a compression ratio of 165:1, with the decompression handled in software and the Reality Signal Processor (RSP) – normally part of the graphics pipeline – repurposed for audio tasks and things like upscaling.

In the video you can see side-by-side comparisons of the PS and N64 RE2 cutscenes, with differences clearly visible, but not necessarily for the worse. Uncompressed, the game’s roughly fifteen minutes of FMVs at 320×160 pixels and 24 bits per pixel would take up about 4 GB. For the PS this was solved with some video compression and a dedicated hardware video decoder, since its relatively weak hardware needed all the help it could get.
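That 4 GB figure checks out with a bit of napkin math, assuming a 30 fps frame rate (our assumption; the video may use a slightly different figure):

$$320 \times 160\;\text{px} \times 3\;\tfrac{\text{B}}{\text{px}} \times 30\;\tfrac{\text{frames}}{\text{s}} \times 900\;\text{s} \approx 4.1 \times 10^{9}\;\text{B} \approx 4\;\text{GB}$$

Squeezed into the 24 MB that was actually available, that works out to a ratio right in the neighborhood of the quoted 165:1.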
On the N64 port, however, only 24 MB was left on the 64 MB cartridge after the game’s code and in-game assets had been allocated. The first trick was chroma subsampling, which exploits the human eye being far more sensitive to brightness than to color. One complication was that the N64 didn’t implement color clamping, so brightness had to be multiplied rather than simply added before the result was passed on to the video hardware in RGB format.
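To get a feel for what that involves, here’s a minimal sketch of the standard BT.601 YCbCr-to-RGB conversion, with the 4:2:0 subsampling savings noted in the comments. This illustrates the general technique, not the port’s actual decoder, and the explicit clamp is precisely the step the N64’s hardware wouldn’t do for them:

```c
#include <stdint.h>

/* With 4:2:0 chroma subsampling, a 2x2 block of pixels shares one
 * Cb/Cr pair: a 320x160 frame shrinks from 153,600 bytes of RGB to
 * 76,800 bytes (51,200 Y + 12,800 Cb + 12,800 Cr) before any other
 * compression is even applied. */

static uint8_t clamp(int v) { return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v; }

/* Standard BT.601 fixed-point conversion back to RGB. The clamp()
 * calls keep overflows in range in software, since the N64's video
 * hardware didn't do color clamping on its own. */
static void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                         uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = y - 16, d = cb - 128, e = cr - 128;
    *r = clamp((298 * c + 409 * e + 128) >> 8);
    *g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp((298 * c + 516 * d + 128) >> 8);
}
```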
Very helpful here was that the N64 relied heavily on DMA transfers, allowing the framebuffer to be filled without a lot of marshaling that would have tanked performance. In addition to this, the RSP ran custom microcode to handle upscaling and interpolation between frames – about half of the original frames were dropped and reconstructed by interpolation – on top of its audio duties. All of this helped reduce the FMVs to fit in 24 MB rather than many hundreds of megabytes.
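Frame interpolation at its simplest is just a weighted blend of the neighboring frames. Here’s a toy version of the idea in plain C – the actual RSP microcode runs this kind of math across its vector unit, and the straight 50/50 blend is our simplification:

```c
#include <stddef.h>
#include <stdint.h>

/* Reconstruct a dropped frame as the average of its two neighbors.
 * A 50/50 blend is the simplest possible interpolation; the port's
 * RSP microcode presumably did something along these lines, many
 * samples at a time, using the RSP's vector instructions. */
static void interpolate_frame(const uint8_t *prev, const uint8_t *next,
                              uint8_t *out, size_t len)
{
    for (size_t i = 0; i < len; i++)
        out[i] = (uint8_t)(((unsigned)prev[i] + next[i] + 1) / 2);
}
```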
On the audio side of things the Angel Studios developers caught a break, as the Factor 5 developers – famous for Star Wars titles on the N64 – had already done the heavy lifting with their MusyX audio tools. These enable sample-based playback, saving a lot of memory for music, while speech got very aggressive compression.
The video also argues that the N64 version is actually superior to the PS version, thanks to its superior Z-buffering and anti-aliasing features, as well as additions such as randomized items. The programmable RSP is probably the real star of the N64, preceding as it did the programmable pipelines that would later arrive on PC video cards like NVIDIA’s GeForce series.
2026-02-03 14:00:17

Over on YouTube you can see [Yang-Hui He] present a talk at The Royal Institution titled Mathematics: The rise of the machines.
In this one-hour presentation [Yang-Hui He] explains how AI is driving progress in pure mathematics. He says that right now AI is poised to change the very nature of how mathematics is done. He is part of a community of hundreds of mathematicians pursuing the use of AI for research purposes.
[Yang-Hui He] traces the genesis of the term “artificial intelligence” to a research proposal from J. McCarthy, M.L. Minsky, N. Rochester, and C.E. Shannon dated August 31, 1955. He says that his mantra has become: connectionism leads to emergence, and goes on to explain what he means by that, then follows up with the universal approximation theorems.
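For reference, the best known of those results (Cybenko’s 1989 theorem) says that a single hidden layer of sigmoidal units can already approximate any continuous function on a compact domain: for any continuous f on [0,1]^n and any ε > 0, there exist N, αᵢ, wᵢ, and bᵢ such that

$$\left| f(x) - \sum_{i=1}^{N} \alpha_i\, \sigma\!\left(w_i^{\top} x + b_i\right) \right| < \varepsilon \qquad \text{for all } x \in [0,1]^n.$$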
He goes on to enumerate some of the key moments in AI: Descartes’s bête-machine, 1637; Lovelace’s speculation, 1842; Turing test, 1950; Dartmouth conference, 1956; Rosenblatt’s Perceptron, 1957; Hopfield’s network, 1982; Hinton’s Boltzmann machine, 1984; IBM’s Deep Blue, 1997; and DeepMind’s AlphaGo, 2016.
He continues with some navel-gazing about what mathematics is, and what artificial intelligence is. He considers how we do mathematics as bottom-up, top-down, or meta-mathematics. He mentions one of his earliest papers on the subject, Machine-learning the string landscape (PDF), and his books The Calabi–Yau Landscape: From Geometry, to Physics, to Machine Learning and Machine Learning in Pure Mathematics and Theoretical Physics.
He goes on to explain Mathlib and the Xena Project. He discusses Machine-Assisted Proof by Terence Tao (PDF) and goes on to talk more about the history of mathematics, and particularly experimental mathematics. All in all a very interesting talk, if you can find a spare hour!
In conclusion: Has AI solved any major open conjecture? No. Is AI beginning to help advance mathematical discovery? Yes. Has AI changed the speaker’s day-to-day research routine? Yes and no.
If you’re interested in more fun math articles be sure to check out Digital Paint Mixing Has Been Greatly Improved With 1930s Math and Painted Over But Not Forgotten: Restoring Lost Paintings With Radiation And Mathematics.
2026-02-03 11:00:27

The KDE desktop’s new login manager (PLM) in the upcoming Plasma 6.6 will mark the first time that KDE requires the underlying OS to use systemd, at least if one wishes for the full KDE experience. This has upset the FreeBSD community in particular, but it will also affect Linux distros that do not use systemd. The focus of the KDE team is clear, as stated in the referenced Reddit thread, where a KDE developer replies that the goal is to rely on systemd for more tasks in the future; PLM is just the first step.
In the eyes of KDE, it seems, OSes that do not use systemd are ‘niche’ and not worth supporting, with the niche Linux distros that would be cut out including everything from Gentoo to Alpine Linux and Slackware. Regardless of your stance on systemd’s merits or lack thereof, it seems quite drastic for one of the major desktop environments across Linux and the BSDs to suddenly make this decision.
It also raises the question of to what extent this is related to the push towards a distroless and similarly more integrated, singular version of Linux as an operating system. Although there are still many other DEs that will happily run for the foreseeable future on your flavor of GNU/Linux or BSD – regardless of whether you prefer a System V or OpenRC init-style environment – this might be one of the most controversial divides since systemd was first introduced.
Top image: KDE Plasma 6.4.5. (Credit: Michio.kawaii, Wikimedia)
2026-02-03 08:00:29

[XYZAiden]’s concept for a flexible robotic gripper might be a few years old, but if anything it’s even more accessible now than when he first prototyped it. It uses only a single motor and requires no complex mechanical assembly, and nowadays 3D printing with flexible filament has only gotten easier and more reliable.
The four-armed gripper you see here prints as a single piece and is cable-driven, with a single metal-geared servo powering the assembly. Each arm has a nylon string threaded through it, so when the servo turns it pulls each string, which in turn makes each arm curl inward, closing the grip. Because of the way the gripper is made, releasing only requires relaxing the cables; an arm’s natural state is to fall open.
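Driving a gripper like this is about as simple as servo control gets: one channel, mapping “how closed you want it” to a pulse width. Here’s a minimal sketch, where the 1000–2000 µs range and the direction of travel are assumptions rather than details from the build:

```c
#include <stdio.h>

/* Map a grip fraction (0.0 = fully open, 1.0 = fully closed) to a
 * standard hobby-servo pulse width. Any servo library or PWM
 * peripheral can take the result from here. */
static unsigned grip_pulse_us(double grip)
{
    if (grip < 0.0) grip = 0.0;
    if (grip > 1.0) grip = 1.0;
    return 1000u + (unsigned)(grip * 1000.0);
}

int main(void)
{
    /* Sweep the grip closed in steps and show the pulse widths. */
    for (double g = 0.0; g <= 1.0; g += 0.25)
        printf("grip %.2f -> %u us\n", g, grip_pulse_us(g));
    return 0;
}
```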
The main downside is that the servo and cables are working at a mechanical disadvantage, so the grip won’t be particularly strong. But for lightweight, irregular objects, this could be a feature rather than a bug.
The biggest advantage is that it’s extremely low-cost, and simple to both build and use. If one has access to a 3D printer and can make a servo rotate, raiding a junk bin could probably yield everything else.
DIY robotic gripper designs come in all sorts of variations. For example, this “jamming” bean-bag style gripper does an amazing, high-strength job of latching onto irregular objects without squashing them in the process. And here’s one built around grippy measuring tape, capable of surprising dexterity.
2026-02-03 05:00:21

Hackers have been building their own basic oscilloscopes out of inexpensive MCUs and cheap LCD screens for some years now, but microcontrollers have recently become fast enough to actually make such ‘scopes useful. [NJJ], for example, used a pair of Raspberry Pi Picos to build Picotronix, an extensible combined oscilloscope and logic analyzer.
This isn’t an open-source project, but it is quite well-documented, and the general design logic and workings of the device are freely available. The main board holds two Picos, one for data sampling and one to handle control, display, and external communication. The control unit is made out of stacked PCBs surrounded by a 3D-printed housing; the pinout diagrams printed on the back panel are a helpful touch. One interesting technique was to use a trimmed length of clear 3D printer filament as a light pipe for an indicator LED.
Even the protocol used to communicate between the Picos is documented; the datagrams are rather reminiscent of Ethernet frames, and can originate either from one of the Picos or from a host computer. This lets the control board operate as an automatic testing station reporting data over a wireless or USB-connected network. The display module is therefore optional hardware, and a variety of other boards (called picoPods) can be connected to the Picotronix control board. These include a faster ADC, adapters for various analog input spans, a differential analog input probe, a 12-bit logic state analyzer, and a DAC for signal generation.
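The documentation spells out the actual frame layout; we haven’t reproduced it here, but a datagram “reminiscent of Ethernet” plausibly looks something like this sketch, with all field names and sizes being our guesses rather than the documented Picotronix protocol:

```c
#include <stdint.h>

/* Hypothetical Picotronix-style datagram, loosely modelled on an
 * Ethernet frame: addressed, typed, variable-length payload with a
 * trailing integrity check. The real documented layout may differ. */
struct datagram {
    uint8_t  dst;          /* destination node: a Pico or the host    */
    uint8_t  src;          /* originating node                        */
    uint8_t  type;         /* e.g. sample data, trigger config, ack   */
    uint8_t  len;          /* number of payload bytes that follow     */
    uint8_t  payload[255]; /* command arguments or captured samples   */
    uint16_t crc;          /* checksum over the header and payload    */
};
```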
If this project inspired you to make your own, we’ve also seen other Pico-based oscilloscopes before, including one that used a phone for the display.
2026-02-03 03:30:00

[Dave] over at Usagi Electric has a mystery on his hands in the form of a computer. He picked up a Motorola 68000-based machine at a local swap meet: a few boards, a backplane, and a power supply. The only information provided was the machine’s original purpose: gas station pump control.
The computer in question is an embedded system. It uses a VME backplane, and all the cards are of the 3U variety. The 68k and its associated support chips are on one card, memory is on another, and a third card contains four serial ports. The software lives across three EPROM chips. Time for a bit of reverse engineering!
[Dave] quickly dumped the ROMs and looked for strings. Since the 68k is a big-endian machine, some byte swapping was required to make things human-readable. Once swapped, huge tables of readable strings revealed themselves, including an OS version. The computer runs pSOS, an older 68k-based real-time operating system – exactly what one would expect of a machine from the ’80s.
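Byte swapping a dump like this is a one-liner in spirit; something like the following throwaway program (our own sketch, not [Dave]’s tooling) turns a pair-swapped dump back into readable big-endian 68k data:

```c
#include <stdio.h>

/* Swap every pair of bytes in a ROM dump so that big-endian 68k
 * words read correctly on a little-endian host.
 * Usage: swab in.bin out.bin */
int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s in.bin out.bin\n", argv[0]);
        return 1;
    }
    FILE *in = fopen(argv[1], "rb"), *out = fopen(argv[2], "wb");
    if (!in || !out) { perror("fopen"); return 1; }
    int a, b;
    while ((a = fgetc(in)) != EOF && (b = fgetc(in)) != EOF) {
        fputc(b, out);  /* write the pair in reversed order */
        fputc(a, out);
    }
    fclose(in);
    fclose(out);
    return 0;
}
```

On a Unix box, `dd conv=swab` does the same job without writing any code.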
The next step was to give it some power and see if the gas station computer would pump once again. The LEDs lit up, and a repeating signal showed up on one of the serial ports. The serial connections on this machine are RS-485 – not common on home computers, but used quite a bit in industrial embedded systems. Unfortunately, the machine wouldn’t respond to commands sent from a terminal, and the communication protocol remains a mystery.
Since the video went up, though, several people have provided a wealth of information in the vintage-micros channel over on [Dave’s] Usagi Electric Discord.
Gas pumps are a bit of a departure from [Dave’s] usual minicomputer work. We’re no strangers to embedded systems here though.