> manufacturing replacement parts (prohibitively expensive)
Some are more than others. PALs and ASICs can be reimplemented in CPLDs and FPGAs. At some point it becomes Theseus' computer, but, if the goal is to preserve the boat, that works. If you want to preserve the wood planks, then you need to keep it powered down in a controlled environment for future generations to apply atomic-resolution sensing eventually.
> For example, you can't emulate different displays properly. As anyone who enjoyed the Vectrex's beautiful vector graphics will agree, you simply cannot emulate this on a modern display. The bright phosphor coating with the smooth analog beam is not reproducible on something like an LCD.
Not right now, but, soon-ish, we'll be able to do that very well with HDR displays. I already can't see pixels or jaggies on my LCDs. With proper computation, one could simulate the analog aspects of CRT tubes - it's a lot for GPUs to do now, but in a decade or so I assume a mobile GPU would be able to do that without breaking a sweat.
Not long ago I was thinking about a CRT+deflection replacement: something that would take the analog signals a CRT normally receives (from an adapter on the neck, where the pins vary a lot, plus the inputs to the deflection coils), maybe with an extra power supply input to power the electronics, and spit out an HDMI signal on the other end.
This should be possible with modern flat panels, to the point where the image is hard to distinguish from the original in all but the most extreme cases (think X-Y displays and Tektronix DVSTs).
Curvature is an issue, but flat and Trinitron CRTs should be trivial.
That really depends on what you're trying to emulate about a display. You can see artifacts from how the electron beam on a CRT paints the image by holding your hand in front of the screen, fingers spread, and shaking your fingers back and forth. Emulating that well might take a ~10,000 fps display, which I doubt anyone is ever going to produce.
I suspect even most hardcore retrocomputer hobbyists care most about emulating the parts of a display that actually came up in use of the machine. If I were an eccentric billionaire who wanted a replica of the Mona Lisa to hang on my wall and enjoy regularly without the inconvenience of weekly flights to France, I'd care much more about my money going into making a product that got the visible details of the canvas right (https://hyper-resolution.org/view.html?pointer=0.055,0.021&i...) than spoofed the proper results if I carbon-dated it or something. I think the same concept applies here. I don't really care as much about an emulator failing only a clever but (to an observer) comically specific physics-based test for authenticity, if I get everything I'd notice while using the computer correct at a fraction of the price. In the context of preservation, just knowing some other (far richer than me) person or org is keeping a single-digit number of the actual artifacts maintained for future reference is good enough for me.
I'm not sure I follow. CRTs draw the image to the screen in a fundamentally different way than modern displays, due to how the electron beam moves sequentially left to right, top to bottom. This analog process, happening at a 15 or 25 kHz horizontal scan rate, is what gives authentic arcade machines their look and feel. Same for old computer terminals. My understanding is that to reproduce this effect on a modern display, you'd need an extremely high refresh rate. To properly replicate this requires some pretty low-level aspects of the system to be addressed. Hardware limitations are bound by the laws of physics, after all.
Beyond just the aesthetics, there are practical reasons why this is important, whether it be light gun idiosyncrasies or how the game "feels," which can affect timing and such for competitive players. There's a lot more to preserving the look, feel, and compatibility of displays for old computer systems than most realize, and the rabbit hole can go quite deep on this one.
> there are practical reasons why [how the electron gun works is] important, whether it be light gun idiosyncrasies or how the game "feels,"
This is always interesting to discuss because there are so many factors at play! To put it in less than a zillion words:
The way a game "feels" in this context is essentially a function of input latency. The old-style "chasing the beam" hardware, plus a CRT display, equals something very close to a true zero lag environment.
In an ideal emulation situation, you could theoretically recreate something close to a zero-lag analog environment (in terms of latency) without necessarily simulating the path of the electron beam itself.
Although, as the linked article implies, there are a lot of bits in the emulation stack that would need to be optimized for low latency. High refresh rate displays get you part of the way there "for free."
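To put rough numbers on the "for free" part (just arithmetic; input polling, emulation work, and panel response times all come on top of this):

    # Worst-case wait for the next refresh: the lag a freshly rendered frame can
    # pick up simply because the display only starts scanning out at fixed intervals.
    for refresh_hz in (60, 120, 240, 360):
        print(f"{refresh_hz:3d} Hz: up to {1000 / refresh_hz:.1f} ms waiting for the next frame")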
Sure, and even many games don't particularly benefit from it. However, it's a really remarkable thing to play e.g. Mega Man or Smash Bros. in a true lag-free environment.
Perhaps. One issue I foresee is the way CRTs glow. The phosphor doesn't light up or dim immediately the way an LED does, so there's some amount of fade in/out that happens on a CRT as the beam moves across the screen. I imagine this could be difficult or impossible to reproduce with a traditional OLED screen. Some old games rely on this, along with the slow refresh rates, to create a sort of dithering/aliasing effect.
Phosphor decay is not terribly difficult to simulate to an acceptable degree. Doing it at the pixel level is pretty easy, doing it at the phosphor level is computationally harder but not much more complicated.
The larger issue w.r.t. this specific quirk of CRTs is that we're running out of human beings that are familiar with what this is "supposed" to look like, and actually care.
I'm not aware of any cases where it's been emulated in any acceptable manner. I can't be bothered to do the math myself, but I imagine doing this well would be beyond the capabilities of modern displays (probably needing a refresh rate in the thousands of Hz). Maybe some special FPGA-based controller with an OLED, like was suggested above, could make it possible. I'm not sure.
Each individual phosphor dot on a CRT is not terribly tricky to emulate.
The brightness at any given moment is a fairly simple decay function based on how long it's been since you lit it up with the electron gun. On top of that, you would typically want to apply some level of bloom to simulate the way light is diffused by the glass. Sure, you've got a few million dots to simulate, but this is also an embarrassingly parallel problem.
Now of course, admittedly, you're only simulating that phosphor glow decay at the refresh rate of your monitor -- 60 Hz, 144 Hz, 240 Hz, whatever -- instead of the effectively continuous decay you'd get in real life. However, I don't think that is a practical issue.
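Here's a minimal sketch of what that per-dot simulation could look like (illustrative only; the exponential decay constant, the bloom weight, and the use of NumPy/SciPy are my own assumptions, not anything from the thread):

    import numpy as np
    from scipy.ndimage import gaussian_filter  # assumed available for the bloom pass

    DECAY_TAU = 0.002  # assumed phosphor persistence time constant (~2 ms); real phosphors vary

    def phosphor_frame(hit_time, hit_intensity, now):
        """Brightness of every dot as a simple decay since the beam last excited it.

        hit_time[y, x]      -- time (seconds) the beam last hit that dot
        hit_intensity[y, x] -- how hard it was driven (0..1)
        """
        age = now - hit_time
        return np.clip(hit_intensity * np.exp(-age / DECAY_TAU), 0.0, 1.0)

    def with_bloom(frame, sigma=1.5, amount=0.3):
        """Crude stand-in for light diffusing through the glass: blur and add back."""
        return np.clip(frame + amount * gaussian_filter(frame, sigma), 0.0, 1.0)

Evaluated once per output refresh for every dot, this is exactly the kind of work GPUs are good at.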
You're clearly thinking of factors I'm not and I'm genuinely interested. To my mind, the visual aspects of CRTs are pretty easy to simulate, but not the near-zero lag.
The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the front of the screen is itself emitting it. And in vector graphics you don't have pixels at all; the light shines quite beautifully in a way I don't think is possible at all with backlit displays.
> The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the front of the screen is itself emitting it.
It would need to redraw the whole screen to account for the phosphor decay. To do that with line resolution and an NTSC signal, you'd have to redraw it roughly 15,000 times per second (60 fields of about 250 lines each). You'd draw the current line at full brightness and "decay" the rest of the frame according to the phosphor persistence. Since there is some quantization, you could reduce the frequency of decays as a line gets older.
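A rough sketch of that update loop (illustrative Python; the visible line count, the exponential persistence curve, and the ~2 ms time constant are assumptions, not measurements):

    import numpy as np

    LINES = 250                       # visible NTSC lines, roughly
    FIELD_RATE = 60                   # fields per second
    LINE_RATE = LINES * FIELD_RATE    # ~15,000 whole-screen redraws per second
    LINE_PERIOD = 1.0 / LINE_RATE

    # Persistence curve sampled once per scanline period (assumed exponential, ~2 ms tau).
    DECAY_LUT = np.exp(-np.arange(LINES) * LINE_PERIOD / 0.002)

    def redraw(painted, current_line):
        # painted[y] holds the pixel values last written on row y; each row is shown
        # at a brightness set by how many scanline periods ago the beam drew it.
        # Rows that have decayed below one quantization step of the output no longer
        # change, so they don't actually need to be touched on every redraw.
        ages = (current_line - np.arange(LINES)) % LINES
        return painted * DECAY_LUT[ages][:, None]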
On the subject of very weird displays, I remember an HP oscilloscope that had a monochrome CRT and a polarizer in front cycling between the R, G, and B components on every refresh cycle. Overall the screen resembled a DLP projection: you'd see the separate color frames when your eyes moved, but a stable color image when your gaze rested on one part of the screen. A very neat way of producing crazy small color pixels on a 7"-ish CRT.
And yes, that device cost about the same as my house back then (2002).
I'll give you an example from the LCM. They had a PLATO terminal with its original plasma flat panel display. I'd been reading about PLATO for years and had even run it in emulation but I'd never seen actual hardware before visiting the LCM.
The experience on the original terminal was way different from emulation. The way the screen worked and the tactile feel of the keyboard were the core of the experience. Sitting at an actual terminal really changed my understanding of the system because it gave me a physical context that software emulation could not provide. You'd be hard pressed to emulate the eye-melting nature of the original plasma display or the stiffness of the keyboard.
The physical experience is a huge part of the overall thing. I have a C64 Maxi and it's absolutely amazing, exquisitely close to the original (but with an HDMI output and USB ports)
> I'd care much more about my money going into making a product that got the visible details of the canvas right than spoofed the proper results if I carbon-dated it or something
You've inadvertently highlighted one of the challenges of preservation: identifying which aspects matter.
Does fooling a carbon dating test matter? This is purely subjective, but for most people surely not.
But interestingly you've linked to an ultra high resolution image viewer that lets the viewer drill down into a nearly microscopic view of the painting. If a person doesn't know much about art, they might think that if you could take something like this and hang it on your wall, it would be a pretty damn good replica of the real thing. It would certainly be cool, I have to admit. Hell, I'd love it on my wall.
And yet, it's utterly different than the real thing. Paintings in real life are three dimensional. Van Gogh in particular is one who used thick gobs of paint. Each fraction of a micron of the painting has height and its own reflective properties which interact with the light in the room as you walk around and observe it.
> if I get everything I'd notice while using the computer correct at a fraction of the price.
Well, that's the thing. It's certainly up to the individual whether or not they give a crap about any particular detail.
If you don't care about how oil paintings actually look in real life, or what video games actually looked and felt like, and you choose to brand all of the things you don't understand or don't care about as "comical", then... well, more power to you. That's your choice.
> Not right now, but, soon-ish, we'll be able to do that very well with HDR displays [...] flat and Trinitron CRTs should be trivial.
Visually I think we're really close in most of the ways that matter, with advanced shaders like CRT-Royale.
However, there's an entire additional dimension missing from this discussion so far - latency. When paired with original game hardware up until the 16-bit era or so, CRTs offer close to a true zero latency experience.
It's not possible to recreate this with modern tech. When we add up everything in the stack (display, input drivers, everything) we're looking at over 100ms of lag.
We're not totally without hope. As the (ancient!) article notes, higher refresh rate displays reduce many of these latencies proportionally. And for non-action games, latency doesn't matter too much in the first place.
> At some point it becomes Theseus' computer, but, if the goal is to preserve the boat, [CPLDs and FPGAs] work
> However, there's an entire additional dimension missing from this discussion so far - latency. When paired with original game hardware up until the 16-bit era or so, CRTs offer close to a true zero latency experience.
It's hard to even match the input latency and general responsiveness of an NES hooked to a CRT TV with a composite cable with modern hardware, let alone something more-integrated.
My usual test is "is it very, very hard to get past Piston Honda on Punch Out?" Often, with an initial, naive set-up, it's nearly impossible. Get it dialed in a little and he becomes easy (if you spent like a billion hours playing that game as a kid, anyway). But with many display + computer + controller combos it's just impossible to get it close enough to right, no matter how you tune it.
That's my test because it's really easy to tell if the latency is bad. If it is, I'll find myself falling off things constantly in Mario too, but there it's harder to tell whether I'm playing poorly or the system's the problem. The NES is hard enough; add just a little latency and it's hellish.
Latency is one dimension. Another one is peripheral compatibility. Devices such as light pens and light guns are incompatible with LCDs. These peripherals depend on the timing information encoded by the raster scan of CRTs.
This means classic light gun games such as Duck Hunt for the NES are impossible to play without a CRT.
Doesn't Duck Hunt use an entire frame of black, then white? You don't need raster scan emulation for that, just very low latency.
Edit: Apparently there is a 15 kHz filter inside the light gun. So it's not really about beam accuracy or brightness, it's about pulsing at the right frequency.
LCDs aren't capable of pulsing at 15 kHz. The twisting/untwisting of the liquid crystals is an electromechanical process and very slow (compared to a racing electron beam). Even though the fastest gaming LCD monitors claim a 360 Hz refresh rate, they can't get anywhere near even the implied 2.8 ms response time when going from black to white (0 to 100% brightness). Of course, the monitor manufacturers go to great lengths to avoid talking about it, so the whole industry is flooded with marketing material to distract from the issue.
Yeah, I know. But you don't need the LCD to pulse, you need the backlight to pulse.
An LCD might still have issues switching fast enough, but an HDR OLED tuned to 15 kHz PWM might be able to handle it, if it were designed with minimum latency in mind, of course. Most screens buffer a full frame, and that won't work. But playing Duck Hunt doesn't require timing the actual beam as it sweeps across the screen. You just need to be displaying rows with a buffer that's no more than a few rows high, and have the rows flicker. Also, many third-party controllers don't care that much about the flicker.
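For a sense of scale on that "few rows" buffer, here's the added latency at the standard NTSC line rate (just arithmetic; the buffer depths are arbitrary examples):

    LINE_RATE_HZ = 15_734                  # NTSC horizontal scan rate
    LINE_PERIOD_US = 1e6 / LINE_RATE_HZ    # ~63.6 microseconds per scanline

    for rows_buffered in (1, 4, 16):
        print(f"{rows_buffered:2d}-row buffer adds ~{rows_buffered * LINE_PERIOD_US:.0f} us of latency")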
Oh, with OLED you could probably design it to mimic the raster scan of a CRT perfectly, cascading the row and column signals along at 15 kHz. The issue is, who is going to build that? I don't think Duck Hunt is too high on the priority list for OLED panel makers.
The really sad thing is that some day all of the CRTs will be dead and all of the expertise to build them too. The tooling and factories are already gone, so it's unlikely new CRTs will ever be built, unless some Ben Krasnow-esque super-hobbyist gets really passionate about it.
> Oh, with OLED you could probably design it to mimic the raster scan of a CRT perfectly, cascading the row and column signals along at 15 kHz.
You could, but it wouldn't have the same brightness as it sweeps, so I don't know if that's good enough to trick a light gun by itself.
But I still think you shouldn't discount LCD. If you can get an LCD to switch a good fraction of the way in half a millisecond, and use the right backlight, you could make Duck Hunt work.
> The issue is, who is going to build that? I don't think Duck Hunt is too high on the priority list for OLED panel makers.
Tricking the frequency filter might be too much effort, but the low latency of scanning each row out as it arrives might actually get some effort behind it. Marketing loves low latency.
The low latency claimed by LCD marketers concerns narrow grey-to-grey transitions. Black to white remains as slow as ever.
The other issue is all of the other sources of latency along the pipeline. The NES emits a composite video signal directly from its video chip, the PPU. This composite signal travels along a coax cable into the back of the TV, where the analogue circuitry splits it into signals driving the luminance and colour, synchronized to the horizontal and vertical retrace. The whole process happens in less time than it takes to convert that signal into digital before it could even be sent to an LCD.
That is, before our LCD display even receives a frame it’s already on the screen of the CRT. The NES is explicitly designed around the NTSC timing structure, with the rendering in the PPU happening “just in time” to be sent to the CRT. There is no place in the NES to buffer a frame.
> The whole process happens in less time than it takes to convert that signal into digital before it could even be sent to an LCD.
While that's true, doing the conversion doesn't need to add more than a microsecond of latency.
> That is, before our LCD display even receives a frame it’s already on the screen of the CRT. The NES is explicitly designed around the NTSC timing structure, with the rendering in the PPU happening “just in time” to be sent to the CRT. There is no place in the NES to buffer a frame.
An LCD doesn't have to buffer a frame either. I believe there are models that don't. It can display as it receives, limited by the crystal speed which is still an order of magnitude improvement.
Current consumer LCDs, sure, but there's no reason a high-refresh-rate LCD couldn't emulate the flying spot of a CRT, and thus be compatible with light pens and light guns.
In order for that to work, it'd need to be able to switch individual pixels in sequence, one at a time. The display panel would need to be designed for this - current panels aren't - but as long as a screen position could switch to 100% in about 250 ns, a sensor could tell precisely which pixel it's looking at.
Liquid crystals cannot switch from 0 to 100% in less than 10 ms, never mind 250 ns. They're electromechanical devices that need to physically twist/untwist to affect the polarization of light.
Contrast that with a CRT, which uses a 25 kV acceleration voltage to drive an electron beam up to roughly 30% of the speed of light (it takes about 3.3 nanoseconds to travel the foot from the back of the tube to the screen), which then strikes a phosphor that glows due to its valence electrons falling from excited states (which takes a few nanoseconds).
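For reference, the rough numbers behind both figures (the 256-pixel horizontal resolution is just an assumed example, and the tube depth is taken as one foot per the comment above):

    # Per-pixel dwell time on an NTSC scanline, assuming 256 visible pixels across:
    ACTIVE_LINE_US = 52.6                          # visible portion of one scanline
    pixel_time_ns = ACTIVE_LINE_US * 1000 / 256    # ~205 ns, hence the ~250 ns target

    # Electron beam flight time across ~1 ft at ~30% of the speed of light:
    C = 3.0e8                                      # m/s
    flight_time_ns = 0.3048 / (0.30 * C) * 1e9     # ~3.4 ns

    print(f"~{pixel_time_ns:.0f} ns per pixel, ~{flight_time_ns:.1f} ns beam flight")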
> ...then you need to keep it powered down in a controlled environment
Mold and water damage can be mitigated with environmental controls, but even then you're going to have decay issues because so many components just break down over time. Many plastics become brittle and disintegrate with or without exposure to UV. Ten-year-old devices have their soft-touch rubber turning into a gooey, sticky, melting mess. Electrolytic capacitors and batteries leaking are commonly known, but lesser-known issues occur too. The MFM drive testing that Adrian's Digital Basement did recently comes to mind: 5 of 5 drives he tested were dead. One was a drive he'd validated a year prior and stored correctly.