It’s really amazing how many fallback layers there are in a CD to work around errors.
There’s a first Reed-Solomon layer that can theoretically correct 2 errors per 32-byte frame, but decoders typically only fix 1 error and use the remaining redundancy for detection, flagging suspect bytes as erasures (instead of correcting them) so that the second Reed-Solomon layer, helped by interleaving, can correct far more.
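A toy way to see why the erasure trick pays off, sketched in Python with the third-party reedsolo package (I'm going from memory on its API, so treat RSCodec and the erase_pos argument as assumptions): a code with 4 parity bytes, like CIRC's C2, is only guaranteed to correct 2 errors at unknown positions, but up to 4 bytes whose positions are already flagged.

    from reedsolo import RSCodec, ReedSolomonError

    rsc = RSCodec(4)                      # 4 parity bytes, like CIRC's (28,24) C2 code
    frame = rsc.encode(bytes(range(24)))  # 24 data bytes -> 28-byte codeword

    def msg(result):  # newer reedsolo versions return a (msg, msg+ecc, errata) tuple
        return result[0] if isinstance(result, tuple) else result

    # 2 errors at unknown positions: within the guaranteed budget (4 parity / 2).
    bad = bytearray(frame); bad[3] ^= 0xFF; bad[17] ^= 0xFF
    assert msg(rsc.decode(bytes(bad))) == bytes(range(24))

    # 4 errors at unknown positions: beyond the budget (fails, or may miscorrect).
    for i in (8, 20): bad[i] ^= 0xFF
    try:
        print("miscorrected?", msg(rsc.decode(bytes(bad))) != bytes(range(24)))
    except ReedSolomonError:
        print("4 unknown errors: decode fails")

    # The same 4 corruptions ARE fixable when their positions are known (erasures):
    assert msg(rsc.decode(bytes(bad), erase_pos=[3, 8, 17, 20])) == bytes(range(24))
    print("4 erasures: decode succeeds")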
If both Reed-Solomon layers fail, the audio samples are swizzled in a way that lets you interpolate between neighboring samples; and if that fails too, there was yet another mechanism (whose details I forget) as a final effort.
It reminds me a bit of spacecraft, where there’s so much redundancy and flexibility to make things still work in case of a catastrophic failure.
I still feel awe at the compact disc. When I was a teenager in the mid-to-late 90s, scanning microscopic pits with a freaking laser beam and storing hundreds of megabytes on one piece of media still seemed like a big deal, real cutting-edge stuff. And yet that format had been invented well over a decade before.
I’m also surprised that, if the CD was already invented by the early 1980s and widely sold commercially for music, it took until the early 1990s for the CD-ROM to catch on in computing. Was the bottleneck CPUs, RAM, and graphics cards, which would not have been able to handle the amount of data a CD-ROM could store?
CPU I/O access speed and memory transfer delay, in a world before DMA was commonplace.
There’s a reason that game consoles of the era had dedicated “Audio Processing Units” that rendered sound directly from sequencer instructions + a small set of loaded samples: trying to feed “one large continuous sample” (i.e. a raw PCM stream) through the CPU to a DAC wouldn’t have left any time for doing actual logic. If you compressed the audio, decompressing it would eat the whole CPU too.
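A quick back-of-envelope in Python, using the 16 MHz 386SX from the MPC level 1 spec quoted further down as the reference CPU:

    # Red Book audio: 44,100 stereo frames/s, 2 channels, 16-bit samples.
    bytes_per_sec = 44_100 * 2 * 2          # 176,400 B/s, sustained, forever
    cpu_hz = 16_000_000                     # MPC level 1's 16 MHz 386SX
    print(bytes_per_sec)                    # 176400
    print(round(cpu_hz / bytes_per_sec))    # ~91 CPU cycles per byte moved
    # ~91 cycles per byte just to fetch and copy to the sound hardware, before
    # a single cycle of game logic -- and a 386SX needs several cycles per
    # memory access over its 16-bit bus.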
Sure, CD-ROMs weren’t usually continuous streams of data the way audio CDs were; but the kind of stuff that couldn’t usefully fit on a floppy — or a set of floppies installed onto a hard drive! — was usually asset files that were individually large in the same way PCM audio was individually large, and that would still choke either CPUs or RAM. No point in large image files you can’t even display.
There were a few consumer use-cases for random access to individually large files — mostly public databases, of the kind we’d expect to find online today. But people who really needed these either used them “online” even back then (by dialing in directly to mainframe database services) or, if they were a more industrial consumer, got their distributions on tape! These were the first things to be ported over to CD-ROM when CD-ROM drives did start becoming available.
The whole “multimedia” hype took a bit longer, only really taking off once high-quality images and low-quality video could actually be displayed on a PC (requiring PCs to first achieve display resolutions higher than 640x480; color depths higher than 256-color; at least 2MB of VRAM; and CPUs fast enough for MPEG-1 decoding of at least small 100x100 video clips).
Speaking of the processing needed just to pass the PCM data around being a massive fraction of the overall processing power: you'll notice a lot of early CD-ROM games completely sidestep this and use the CD drive to read actual CD audio off the disc, having it decode and play back the audio directly to the sound output device instead of going through the CPU. Essentially the computer would just tell the CD drive "play track 11" when you got to the 11th level of the game.
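On Windows that literally looked like a couple of command strings sent to the MCI "cdaudio" device in winmm.dll (still present today). A minimal Python sketch via ctypes; the command strings follow the documented MCI syntax as I recall it, so treat the exact details as an assumption:

    import ctypes  # Windows-only: talks to winmm.dll's MCI interface

    mci = ctypes.windll.winmm.mciSendStringW

    def cmd(s):
        buf = ctypes.create_unicode_buffer(128)
        if mci(s, buf, 128, None):           # nonzero return = MCI error code
            raise OSError("MCI command failed: " + s)
        return buf.value

    cmd("open cdaudio")
    cmd("set cdaudio time format tmsf")  # track:minute:second:frame addressing
    cmd("play cdaudio from 11 to 12")    # the drive decodes; the CPU just asked
    # ... run the game loop; the audio streams straight off the disc ...
    cmd("close cdaudio")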
And that was a decade after the creation of the audio CD; normal PCs (or gaming PCs!) still needed a lot of processing power just to handle that data stream.
That, and a mismatch between the huge capacity of a CD, and typical software back then.
Remember, popular games like Doom fit on a couple of 3.5" floppies. 3D graphics was just arriving. PCs with the CPU to process many MBs of game data (or decode full-screen video) were often business machines with limited graphics capabilities. Home computers like the Amiga or Atari ST were in a similar price range to a CD drive. Hard disks were like 1 or 2 GB.
CD writers were much more expensive, so out of most people's reach as a backup medium.
Only when prices came down and PCs gained in CPU, storage & graphics (and software ballooned with that) did it make sense to add a CD drive.
I do kind of miss the days where you'd buy a magazine with a cover CD and spend weeks playing 100s of shareware games. Then buy the full box for the best of those. Or try a Linux distro without needing to download 100s of MBs over a 33.6k modem.
And it's a shame those mini-CDs (similar to GameCube discs) weren't more popular. I always liked those. Almost impossible to find these days (whether used for music, software, or as -R/RW media).
> That, and a mismatch between the huge capacity of a CD, and typical software back then. Remember, popular games like Doom fitted on a couple of 3.5" floppies. 3D graphics was just coming.
I think this is why encyclopedias were one of the first "killer applications" for CD-ROMs. Wikipedia didn't yet exist, paper encyclopedia sets cost several hundred dollars at least, and floppy disks weren't really big enough for it. Having a whole encyclopedia on one or two CDs was a game changer.
The game changer was the multimedia encyclopedia. You can fit a lot of text on a compressed 3.5" floppy, but digital sound and video were the main selling point.
The first CD-ROM connected to a computer that I ever remember was the Grolier's (Compton's?) encyclopedia we had in the school library. It was a DOS program that ran on an IBM 286, with an external caddy-loading CD-ROM drive, and then a laser printer. It was probably an expensive and amazing setup at the time.
I remember thinking those cd caddies were so cool in concept, but after actually trying out a drive that needed the caddies, it was just so cumbersome.
The first MPC minimum standard, set in 1991, was:
16 MHz 386SX CPU
2 MB RAM
30 MB hard disk
256-color, 640×480 VGA video card
1× (single speed) CD-ROM drive using no more than 40% of CPU to read, with < 1 second seek time
Sound card (Creative Sound Blaster recommended as closest available to standard at the time[2]) outputting 22 kHz, 8-bit sound; and inputting 11 kHz, 8-bit sound
Windows 3.0 with Multimedia Extensions.
Though for a CD game you really wanted something in the range of a DX4/100 with 16MB RAM, which surpasses the MPC-II specs.
Also, 4 incompatible CD interfaces on the sound cards didn't bode well.
I got my first x86 PC in August 1994. It came with a 2X CD-ROM drive with the brand new ATAPI / IDE interface. I wanted to install Linux from CD but it was too new to have Linux support so I ended up using 10 floppies to install each Slackware disk set, back to the computer lab for the next set, and repeat.
Pressing CDs was expensive, which worked for the audio mass market, but not for the PC niche market. That limited the appeal of buying an (expensive) CD drive, and they weren't even using a standard protocol (so: lots of issues).
When CD creation became cheaper and the market grew, the economics started to work out in the mid-90s, until the late-90s saw computer mags with cover-CDs (a signal that everybody in that market had CD drives, and that pressing CDs was cheap enough for one-off low-count productions like that).
My first CD-ROM drive used a printer parallel port. It worked well enough to run my first CD-ROM software purchase, Mad Dog McCree (1993). My PC was a microchannel PS/2 and it was difficult finding reasonably priced accessories.
I think some people older than me may remember it differently, but I'm slightly older than you, and I'd actually say that for people who could afford computers there weren't that many years between getting their first CD-ROM-equipped computer and their first CD audio player.
It was within a year or two in my family, and my father was in the computer/internet industry by the mid-80s so he had access to stuff pretty early. I think we got our first CD audio player in the family in 1990 and a 486 with a CD-ROM drive in late 1992.
To me, CD-RW was the thing that changed everything... yeah sure, I could burn a CD some time before that too, but every file, every video, song, etc. literally cost money in blank CDs, and that was relatively expensive.
CD-RW? 2MB file? Meh, just burn it, lend it to a friend, get it back, erase it, 50MB file, other friend, get some other random stuff back, etc.
Then flash storage became cheap and CDs slowly died (well, there were Iomega Zips and other stuff in between, but not really that popular).
I had a Zip disk for all my work in college. It's completely impractical for anything these days, but it was so much fun. Having 100 MB in your bag in the era of floppies made you feel like a king.
There were two dead-end technologies of the 1990s that combined magnetism and lasers in different ways.
The first was the "floptical" disk, which worked basically like a floppy but used a laser to track grooves on the disc, improving head alignment and letting them pack the tracks in denser. (Strictly, that was the Floptical drive and the later LS-120 SuperDisk; the Zip disk itself actually used magnetically embedded servo data rather than a laser.)
There were also "magneto-optical" discs that used a laser to heat a spot on the disc, allowing recorded spots much smaller than the magnetic head. They could read out finer detail too, because the laser could read the magnetic fields via the Kerr effect (the magnetization slightly rotates the polarization of the reflected light).
Some examples of those were the MO drive on the NeXT cube and Sony's MiniDiscs for music, which I think are fun to collect. But the laser burned out on my portable (the one that can write tracks with metadata from my PC), so I've got to either find another one with a working laser or settle for recording on my big decks without metadata.
The first time I heard sampled digital audio on a computer was on a Sun 3 workstation circa 1990, which had a high-end 68k microprocessor. It was a clip from "Layla" by Derek and the Dominos, and it got somebody in the computer lab to ask, "Hey man, is that Freedom Rock?"
The thing was that computer progress was frozen in midair for a time in the early 1980s. Computers like the TRS-80, Apple ][, C-64 and game consoles like the Atari 2600 all outlasted the projections of their makers, because all of the parts, from the software to the CPU to the video system (e.g. the CPU clock was usually slaved to the video refresh of NTSC television), were tightly coupled; making any substantial improvement meant dumping everything and starting over, particularly with all-new software, because you didn't have real operating systems back then.
It was by the late 1980s, when the PC AT clones were established that you could really buy a computer one year that was a lot better than last year's model because architectures had advanced to the point where they could do that and keep compatibility.
So processing PCM audio went from impossible to difficult to easy to trivial and video was not far behind.
The free ride of clock rates going up and power consumption going down didn't stop until 2005, but each die shrink kept resulting in cheaper transistors until... just about now. It's not accidental at all (nor simply greed) that enthusiasts are complaining NVIDIA 40-series GPUs aren't more cost-effective than 30-series GPUs: the economic engine of Moore's Law is dead, and international competition and dirigisme might deliver a generation or two more than the free market would have on its own. We've had a long run of getting better performance through more cores, wider SIMD, specialized architectures, etc., but it's going to take something big, like true 3-D integration, semiconductors past silicon, or superconducting logic gates, for performance to improve... or maybe the revolution will be in "doing more with less".
I think it was just too much data for (cheap) digital tech at the time; the fact that LaserDisc game makers chose analogue LaserDisc over something digital on a CD is an indication.
Yes, that, but also that it was easier and cheaper to master and control an analogue signal than to push that much video and sound data digitally through a computer.
You could store and play back video from a CD... it looked like crap though. The best computers could handle back then was MPEG-1, and it was really, really blocky at CD bit rates. A double whammy of lack of storage and lack of processing power.
Things finally got good enough with DVD, which had 7x-13x the storage, and with MPEG-2 available.
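For what it's worth, the bit-rate ceiling and the 7x-13x figure are both just arithmetic (nominal sizes; Python for legibility):

    # 1x CD transfer rate, and why MPEG-1 on CD was stuck so blocky:
    bits_per_sec = 150 * 1024 * 8      # 1x data rate: 150 KiB/s
    print(bits_per_sec / 1e6)          # ~1.23 Mbit/s total; VideoCD gave video ~1.15
    # DVD's "7x-13x the storage", sizes in GB (CD ~0.65-0.7):
    cd_gb = 0.65
    print(round(4.7 / cd_gb, 1))       # single-layer DVD: ~7.2x
    print(round(8.5 / cd_gb, 1))       # dual-layer DVD: ~13.1x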
> The best computers could handle back then was MPEG-1 video…
Fun fact: MPEG-1 decode needed enough horsepower that software decode implementations shipped several years after popular-for-CD-ROM video formats like Cinepak.
The timeline: Cinepak shipped as part of QuickTime in 1992. Myst (a very popular CD-ROM title which used Cinepak) shipped in 1993. Apple shipped their MPEG-1 QuickTime extension (PowerPC-only) in 1997.
I don't remember which specific audio CD ripper software it was, but one of my most favoritist features ever was the "Try really hard" checkbox. This was used if you had a disc which was hard to read, from scratches or dirt or whatever. I never looked into exactly what it did, but I had assumed that a normal read just read, while "try really hard" would make fuller use of the error correction, the extra math causing the slower reads.
To this day, I've been known to write functions called tryReallyHard() even if I haven't made a UI option explicitly for it, yet.
CD rippers have progressed a lot. Now most audio CDs' checksums are stored in public databases, and you can rip a CD in a 100% bit-perfect way; if a single bit is off, your checksum won't match the hash other people got in the database.
And CD rippers do check the online DB automatically.
Many people do archive their CD collection in a bit-perfect way nowadays, and you know when your rip is perfect (and it usually is).
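The flow is simple enough to sketch in a few lines of Python. Real rippers use AccurateRip's offset-aware checksum and a real online database; the plain SHA-1 and the little dict standing in for the DB here are simplifications:

    import hashlib

    def rip_checksum(pcm: bytes) -> str:
        # stand-in for AccurateRip's actual checksum algorithm
        return hashlib.sha1(pcm).hexdigest()

    # hypothetical database entry: (disc id, track number) -> known-good hash
    public_db = {("hypothetical-disc-id", 11): rip_checksum(b"\x00" * 2352)}

    my_rip = b"\x00" * 2352  # one sector's worth of ripped audio (silence)
    if rip_checksum(my_rip) == public_db[("hypothetical-disc-id", 11)]:
        print("bit-perfect rip")
    else:
        print("mismatch: re-rip needed")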
I don't think many computer CD drives have ever exposed the error correction to the computer. It's usually handled entirely internally, as the link shows on the last page. What that program was probably doing was trying a bunch of times to read the same area from a bunch of different starting points to see if it could tease out the data with a little luck.
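Conceptually, that strategy is something like the sketch below (Python; read_sector_once() is a hypothetical stand-in for the drive, not any real API):

    import random
    from collections import Counter

    def read_sector_once(n):
        # hypothetical flaky drive: each byte comes back corrupted 10% of the time
        good = bytes([n % 256]) * 16
        return bytes(b ^ 0xFF if random.random() < 0.1 else b for b in good)

    def read_sector_hard(n, attempts=15):
        # re-read the same sector many times, take a byte-wise majority vote
        reads = [read_sector_once(n) for _ in range(attempts)]
        return bytes(Counter(col).most_common(1)[0][0] for col in zip(*reads))

    print(read_sector_hard(42).hex())  # almost always the clean sector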
> math caused the slower reads.
The error correction was ALWAYS being done, and that math was done on a dedicated circuit.
cdparanoia, still the standard Linux command-line application for CD ripping, took its name from the fact that it goes heavy on error-correcting. However, among the filesharing community the Windows-only application Exact Audio Copy always enjoyed a stronger reputation for accuracy, and several major private torrent trackers and DC++ hubs required EAC to have been used for rips, which sadly prevented Linux-only users from contributing to the scene.
Those may be apps that use error correction, but they're not the apps I was thinking of. I don't run Linux desktops. It was most likely a Mac program, but at that time I was still forced to use Windows for some work tasks. I'm thinking it was Nero Burning ROM.
I still love CDs. I started buying them around 1990-1991. I started with MP3 in 1996 but kept buying CDs. I think I got an iPod in 2003, I digitized all my CDs by 2004-2005 and I did go through a period from about 2005-2013 where I bought almost all my music as digital files.
I never got rid of my CDs though, and in recent years I started listening to them again, just like vinyl LPs. There's still something fun about listening to them, and they always sound great and work really well. I've never had a CD get damaged enough to hear any artifacts. It's still a good way to support artists you like, and prices have inverted to the point that CDs are affordable and vinyl LPs are quite ridiculous a lot of the time.
I love my CD collection, also been collecting since 1989/90, a new one arrived in the mail today and I buy used on eBay frequently. It's a musical guide to my life since then. I also went through the MP3/streaming route but it's not the same experience at all. I like the ceremony.
And NXP. Though it would have been nice if Philips would have evolved to be the "European iPhone" company instead of selling lights, shavers and food blenders.
They also license the brand name to other manufacturers, making all kinds of dollar-store tat. The first thing that clued me into this was reading the back of a set of AirPod-like wireless headphones.
The picture on the last page makes a great illustration of how dense electronics integration has become over the years --- all of that, and more, is now possible in a single IC.
Well, it is certainly a prototype. In early consumer CD players the whole CD decoding datapath was split into a chip with the analog part (the servo amplifier, which also produces an error signal containing the read data) and a digital chip, with its associated DRAM, that does the rest of the decoding. It is somewhat surprising how empty the single-layer PCB in these early CD players is. Obviously, this level of integration was required for the technology to be commercially viable. The CD roughly coincides with the arrival of cost-effective VLSI ASICs, which made this possible.
On the other hand, modern CMOS logic runs so fast and can be so densely integrated that the 15-year-old SAF784x apparently implements the “complex analog RF” part of a CD player using DSP techniques. And the process it is made in is apparently mature enough to accommodate various analog functions, including high-powered ones, so you end up with a complete CD/MP3 player on one ASIC (if you can fit what you want to do into 64kB of ARM7TDMI code and can tolerate the sound quality and EMC implications of the integrated audio output DAC).
> It reminds me a bit of spacecraft, where there’s so much redundancy and flexibility to make things still work in case of a catastrophic failure.
The Galileo spacecraft comes to mind, where they reprogrammed the whole communication mechanism (adding compression etc) after the main antenna failed. (https://en.m.wikipedia.org/wiki/Galileo_project#High_gain_an...)