Beyond CD Quality and True Color

I realized about a year ago that today’s smartphones are nearly feature complete, at least in terms of what we’ve asked of them at the platform level. Still, I want to mention a few areas for improvement.

I’m old enough to remember when the first audio CDs were introduced in the 1980s, and how much cleaner they sounded, and how much easier they were to use, than vinyl records or cassettes.

What’s embarrassing is that, 35 years after Philips and Sony defined the original CD-DA format, we’re still using the same old 16-bit samples at a 44.1 kHz sampling rate for digital audio. We should really be using 24-bit, 48 kHz audio at minimum (ideally, 32-bit floating-point samples). I’m not sure that 96 kHz DACs are necessary, but they’d be nice.
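To make the difference concrete, here’s a minimal sketch in C (the helper name is mine, not from any particular audio API) of how the 16-bit integer samples on a CD map into the 32-bit floating-point representation I’d rather see used end to end:

```c
#include <stdint.h>
#include <stddef.h>

/* Convert 16-bit signed PCM samples (CD quality) to 32-bit float samples
 * in roughly the range [-1.0, 1.0). Floating-point samples leave headroom
 * for mixing and processing without the clipping and quantization noise
 * you accumulate when every intermediate stage is forced back to 16 bits. */
static void pcm16_to_float(const int16_t *in, float *out, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        out[i] = (float)in[i] / 32768.0f;   /* 32768 = 2^15, full scale */
    }
}
```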

I’m not a fan of MP3, or even Ogg Vorbis. The oldest audio codec I respect is AAC: its encoders have kept improving over time with better psychoacoustic models, while the resulting files still play everywhere, and I’m about 98% satisfied with the 256 kbps AAC files that Apple Music serves. I’m also optimistic about Opus, the audio codec that Google paired with VP9 in the 2013 update to WebM, since it’s designed to work well for low-latency, low-bitrate applications (like VoIP) as well as for music.

The other area that I think has needed improvement for years is color depth. We have such great pixel density and color accuracy in today’s LCD and AMOLED screens that it seems like we can’t possibly improve them, right?

But we’re still limiting ourselves to the same 24-bit color depth that high-end graphics workstations called “true color” in the 1980s, to distinguish themselves from PC displays that could show far fewer than 24-bit color’s 16.7 million (2^24) colors. The label was only ever “true” relative to the 256-color palettes you got on most PCs (even the Amiga could only display up to 4,096 colors with its original chipset).

If you consider brightness alone (and our eyes have evolved to be far more sensitive to brightness than to color), 24-bit color really gives you only 256 shades of gray, because a neutral gray uses the same 8-bit value in each of the three channels. So I don’t think we can improve smartphone displays much further without breaking through the 24-bit color barrier in the OS and in apps, not to mention in the camera’s image processing pipeline.
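Here’s a small, self-contained sketch (plain C, my own example, not from any graphics library) that illustrates why that matters: quantizing a smooth gray ramp to 8 bits yields only 256 distinct levels, which is exactly where the banding you see in skies and shadow gradients comes from, while 10 bits gives four times as many steps:

```c
#include <stdio.h>
#include <math.h>

/* Quantize a smooth 0..1 luminance ramp at a given bit depth and count
 * the distinct output levels. An 8-bit channel can only produce 256
 * gray levels no matter how finely the source gradient varies. */
static int count_levels(int bits, int samples)
{
    int max = (1 << bits) - 1;
    int last = -1, levels = 0;
    for (int i = 0; i < samples; i++) {
        int q = (int)lround((double)i / (samples - 1) * max);
        if (q != last) { levels++; last = q; }
    }
    return levels;
}

int main(void)
{
    printf("8-bit ramp:  %d levels\n", count_levels(8, 100000));   /* 256  */
    printf("10-bit ramp: %d levels\n", count_levels(10, 100000));  /* 1024 */
    return 0;
}
```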

One good thing about smartphones is that they do all of their graphics compositing on the GPU, and desktop OpenGL already has extensions that support 30-bit color (10 bits each for R, G, and B) with a 2-bit alpha channel. Beyond that, we’ll have to break through the traditional 32 bits-per-pixel barrier (8 bits each for R, G, and B, plus an optional 8-bit alpha channel), which will be awkward for memory alignment.
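For illustration, here’s roughly what that 10-10-10-2 packing looks like in C. The bit layout below follows the “2_10_10_10_REV” style used by OpenGL (red in the low bits, alpha in the top two); other APIs order the channels differently, and the helper name is my own:

```c
#include <stdint.h>

/* Pack one pixel into a 32-bit word with 10 bits each for R, G, B and
 * 2 bits for alpha. Everything still fits in the familiar 32 bits per
 * pixel, which is why this format is the easy first step past 24-bit
 * color; going to 16 bits per channel breaks that alignment. */
static uint32_t pack_rgb10_a2(uint32_t r, uint32_t g, uint32_t b, uint32_t a)
{
    return (r & 0x3FFu)
         | ((g & 0x3FFu) << 10)
         | ((b & 0x3FFu) << 20)
         | ((a & 0x3u)   << 30);
}
```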

We’re also going to need a better file format than JPEG. Fortunately, JPEG 2000 already supports the higher bit depths we need, and the open-source OpenJPEG implementation looks fairly good. But because of inertia, we settle for the older, inferior format. How many of you had never heard of JPEG 2000 until just now?

Google started promoting WebP as an improvement over JPEG, but independent studies have suggested that it isn’t actually much of an improvement. The reason JPEG 2000 hasn’t been very successful is that it’s much more CPU intensive to encode/decode than JPEG, which has been highly optimized on every platform and CPU architecture by now.

I think there’s something interesting about how our industry, outside of the specialized domains of music production and the graphic arts, has allowed our world to stay digitized at the same resolutions and sample depths that engineers standardized on nearly 40 years ago, out of expedience. Let’s break the “CD quality” and “true color” barriers together!

I'm a software engineer in the Los Angeles area specializing in mobile applications and embedded systems.
