Sunday, July 19, 2009

Bionic eyes to go with your MacBook

So I was just informed that my MacBook probably doesn't have millions of colors. I know, shocking. What amuses me is that the guy repeatedly admits to not being able to tell the difference with his own eyes, so he tries to confirm with Apple over the phone instead, and fails to get a real answer.

There are two messages here:

  1. Apple sucks
  2. Millions of colors matter

I agree with the first point. Apple is evil and it's getting worse.

The second point is bullshit. He justifies it by saying that, as a designer, color is important to him. However, the whole post implies that he is not actually able to see the difference, which is why he is calling Apple.

If he can tell the difference, why does he need to call/write/confirm? If it makes his design work suffer, can't he tell the difference while he's working?

Furthermore, the people he is designing for probably can't see the difference either: they don't buy first-class displays and they are not professional designers.

I find this lack of perspective a little sad.

jrockway conjectured that this is because bit depth is a measurable spec that you can compare, like LCD response times. People pay lots of money for 1 ms response times (in principle 1000 FPS), whereas displays with 5 ms response times (200 FPS) are cheaper. I don't think either of these solves the wagon wheel effect (there is still aliasing happening, and anything over 30 frames per second is smooth enough for non-pathological cases). And yet people still pay good money for this, even when their video card has a maximum refresh rate of 120 Hz.
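To put rough numbers on that, here's a back-of-the-envelope sketch (plain arithmetic, not a measurement of any real panel; the function name is just for illustration):

```python
# Back-of-the-envelope: an advertised response time puts a ceiling on how many
# distinct frames a panel could physically show per second, and the video
# card's refresh rate caps what reaches the screen anyway.
def max_fps_from_response_time(response_time_ms):
    return 1000.0 / response_time_ms

for ms in (1, 5):
    print(f"{ms} ms response time -> at most {max_fps_from_response_time(ms):.0f} FPS")

# With a video card that tops out at 120 Hz, the extra headroom is wasted:
print("frames that actually reach the screen:", min(max_fps_from_response_time(1), 120))
```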

Compare this with a much more meaningful value like the dynamic range of a display (how deep the black is and how bright the white is), which matters a lot more. We don't really have a standard way of measuring that.

What we have instead is vendor-made-up values like the contrast ratio, which you can "improve" by jacking up the output of the backlight while giving your display shitty-looking blacks. I bet my MacBook's display really sucks in terms of dynamic range, but guess what, I can't tell. This is despite the fact that top-of-the-line scanners give slightly over 4 f-stop equivalents of range, whereas the human eye can see about 60.
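For what it's worth, the relationship between a contrast ratio and photographic stops is just a logarithm; here is a small illustrative sketch (the ratios are arbitrary examples, not measurements of any real display):

```python
import math

# Each photographic stop doubles the luminance, so the number of stops spanned
# by a display is log2(white level / black level), i.e. log2 of the contrast ratio.
def stops_from_contrast(contrast_ratio):
    return math.log2(contrast_ratio)

for ratio in (400, 1000, 10000):
    print(f"{ratio}:1 contrast ratio ~ {stops_from_contrast(ratio):.1f} stops")

# Jacking up the backlight raises the white *and* the black level, so the ratio
# (and hence the stops) can stay the same while the blacks look worse.
```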

Even though most people will easily be able to tell the difference between a display with a high dynamic range and one without, things like dust, glare and ambient light make a much bigger difference. Blended HDR images give the same dramatic effect while faking dynamic range, and we're able to enjoy them just fine in non-optimal conditions. Is it really worth all that extra money?

IMHO there are much better reasons to hate Apple, like the iPhone jail, the awful App Store policies, DRM support, and backwards-incompatible changes (10.5's language switching is still driving me nuts after all these months). As a long-time Mac user (since 3rd grade), for the last 10 years I've been feeling more and more like I'm being shafted by Apple, to the point where I know I want to switch (probably to Ubuntu), but I'm just too lazy at the moment.

I agree that their policy of treating their customers like children is annoying and negative, but we have bigger fish to fry.

So lastly, if you agree and you're a programmer, please keep this in mind the next time you are benchmarking or optimizing your code: decide whether it makes an actual difference, or whether the numbers just look better on paper =)

6 comments:

Ed said...

Back in the day, I took a computer graphics course which had us do our work on Silicon Graphics workstations. Supposedly, according to the TA, these things had 48-bit color (these were CRTs, so we talk about all the bits together, rather than per primary color) and monitors to match. One day, I was playing around with my program's color output, making it do gradual shifts in color, and I noticed color steps. A bit of analysis showed that it was effectively displaying only 18-bit color. I mentioned this to people in the lab, and nobody else could see the color steps. Later on, someone mentioned to the instructor that I'd been bad-mouthing his computers, and he admitted that the monitor on the computer I'd been using was inferior to exactly that degree, but he'd never found anyone else who could see it before. Most of the rest of the monitors could only handle 21-bit color, but none of us could distinguish that, nor could we identify which of the other computers had the one monitor which could supposedly actually handle 24-bit color correctly. None of the monitors was better than 24-bit, because none of the computer graphics instructors at the school could distinguish the color steps, and monitors which supposedly could display more than that were ridiculously expensive.

2^18 is about 260k colors. 2^21 is about 2M colors. When I use the ColorSync utility and slowly shift through the colors on one band, it looks like many of the shades are using dithering (yeah, I can see pixels, and I'm one of those pathological people who get headaches from 60Hz monitor refresh rates. Fortunately for me, I don't need much more, and even 60Hz takes a while to get to me). For a real test, I'd like to do my own color shift lines, but I'm kind of out of graphics now.
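A minimal sketch of such a color-shift test (assuming Python with the Pillow imaging library installed; "gradient.png" is just an arbitrary output file name):

```python
# Draw a slow grayscale gradient and look for visible steps (banding).
from PIL import Image

width, height = 1024, 200
img = Image.new("RGB", (width, height))
pixels = img.load()

for x in range(width):
    # Map the full width onto 0..255, so each shade spans about 4 pixels.
    shade = round(x * 255 / (width - 1))
    for y in range(height):
        pixels[x, y] = (shade, shade, shade)

img.save("gradient.png")
# On a true 8-bit panel the strip should look smooth; on a lower-depth panel
# you may see distinct bands, or the dithering used to hide them.
```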

Of course, I have a PowerBook G4, so it's not quite the same hardware.

Does this matter to me? I mean, I *can* tell the difference.

No. It really doesn't. 260k colors is a lot. I was happy with 16, back in the day. In fact, I actually chose to spend several years in computer labs with black and white monitors, because they had a 72Hz refresh rate, and the color computer labs (which had 64k colors, btw) had 60Hz refresh rates.

SamV said...

Yeah, it's all a bit silly really.

Then there are colours which your monitor will never be able to display - e.g. fluorescent pink. It will never be quite as pink as the real object; highly saturated colours which are not the same hue as your phosphors can never be perfectly reproduced. There's a visual demonstration of this at Colour Triangle on Wikipedia.

Some methods of determining the "gamut volume" will arrive at a figure of under 30,000 as the maximum number of visible distinct colours. Others, 1.6 million. A large number of those will be outside the sRGB space.

Ed said...

For what it's worth, sometimes optimizing does make a huge difference. But the key there is, the first step is to get it working. If it's desktop code, the second step is to determine if the users think it's too slow - if not, you're done. Otherwise, the next step is to benchmark it, to find out exactly how well it does perform. Only after doing all that do you... profile it.

After you've profiled it (and you've done all the stuff before profiling), you can finally look at optimizing it. Not before. Otherwise, you don't really have a basis to know what effect you're really having.
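A minimal sketch of that benchmark-then-profile order, using only Python's standard library (handle_request is a made-up stand-in for the real code, not anything from the post):

```python
import cProfile
import timeit

def handle_request():
    # Stand-in for whatever the real code actually does.
    return sum(i * i for i in range(10_000))

# Step 1: benchmark. Is it actually too slow for the people using it?
elapsed = timeit.timeit(handle_request, number=1000)
print(f"1000 calls took {elapsed:.3f} s total")

# Step 2: only if the answer is "yes", profile to find where the time goes,
# and only then start optimizing the hot spots the profiler points at.
cProfile.run("handle_request()", sort="cumulative")
```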

I've had to optimize code by reverting someone's 'optimization changes' before. That's sad. (No, the original code was not 'optimal', but it was good enough, and a quick revert through your revision control system is much quicker than tuning code, so it met the goal and used as little of my time as possible. Although I'll admit, when the site usage increased, I did need to go fix it for real. Still, the pre-'optimization' code was easier to fix, too.)

Anonymous said...

> If he can tell the difference why does he need to call/write/confirm? If it makes his design work suffer, can't he tell the difference while he's working?

There are several ways in which this might matter:
- maybe he can't tell the difference, but others can (as Ed says).
- maybe he can tell the difference, but only under certain conditions (lighting, not being tired...) that are not trivial to set up.
- maybe the imperfection introduces a bias that carries over and is more visible in other media.

When you buy a TV, common wisdom is that the cheaper ones deliberately have their color settings misconfigured, and the more expensive ones are set correctly or to look good with a particular demo loop. If a device has all the colors it's "supposed" to, it's presumably easier to benchmark it against other devices.

chorny said...

All cheap TN TFT displays have 6 bits per color. 16M colors is about as real as those 5 ms (5 ms is the minimum; the maximum can be as large as 20-30 ms), but AFAIK there is some mechanism to compensate for the lack of colors.
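The mechanism is most likely temporal dithering, a.k.a. frame rate control (FRC): the panel alternates a pixel between its two nearest 6-bit levels so that, averaged over a few frames, it approximates the 8-bit value. A toy sketch of the idea (frc_frames is a made-up illustration, not how any driver actually implements it):

```python
# A 6-bit panel only has levels 0, 4, 8, ... in 8-bit terms. By showing the
# higher neighbouring level on some frames and the lower one on the rest, the
# time-averaged brightness lands near the requested 8-bit value.
def frc_frames(target_8bit, n_frames=4):
    low = (target_8bit // 4) * 4        # nearest representable level below
    high = min(low + 4, 252)            # next representable level above
    n_high = round((target_8bit - low) / 4 * n_frames)
    return [high] * n_high + [low] * (n_frames - n_high)

frames = frc_frames(130)
print(frames, "-> perceived level ~", sum(frames) / len(frames))
```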

Robin Smidsrød said...

As chorny said, most displays out there are TN LCDs with 6-bit color. Luckily I'm sitting in front of an 8-bit display that doesn't cost an arm and a leg, the HP LP2475w, and when calibrated with a hardware device it is insanely better than what I had before (a Hansol 700Fs).

Most of the Apple displays are (unfortunately) TN displays, so I don't understand why graphic designers rave so much about Apple laptops. The monitors are just as shitty as everything else out there.

Read this article to understand what types of LCDs are out there.

Personally, I'm just waiting for an Adobe RGB (or, for goodness' sake, ProPhoto RGB) compatible OLED screen, sometime in the future... :)