The human eye is most sensitive to luminance, which is why 4.5 MHz of the
television signal is devoted to it. The color information is transmitted in
two bands of 1.5 MHz and 0.5 MHz, and almost all televisions filter the color
to 0.5 MHz in the receiver anyway.
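A quick back-of-the-envelope check on those figures (the numbers are the ones stated in this mail, not quoted from the NTSC spec):

```python
# Bandwidth split as described above: luminance gets 4.5 MHz, while the
# two chroma bands together get only 2.0 MHz as transmitted.
luma_mhz = 4.5
chroma_mhz = 1.5 + 0.5  # the two colour-difference bands, as transmitted

ratio = chroma_mhz / luma_mhz
print(f"chroma/luma bandwidth ratio: {ratio:.2f}")  # prints: chroma/luma bandwidth ratio: 0.44
```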
Any system using 10 bits per channel for RGB is wasting bandwidth; even ten
bits for the total signal is a bit high. You can dig up my ICASSP paper for
some results on compression using properties of human vision (it's
The CIE has done a lot of work in this area, but be warned it is an
extremely complicated subject.
On 4/4/02 3:01 AM, Tom Motoyoshi Kalland at tmk_at_earthling.net wrote:
> The human eye's ability to distinguish colours depends on the colours,
> though. For instance, our eyes suck at distinguishing blue gradients. In most
> cases 24-bit should be enough, but there are cases in which one needs even
> higher accuracy. Some years back they used 30-bit colour depth (10 bits per
> channel) for colour vision research; they might use even more now.
> At 12:40 04.04.2002 +0200, you wrote:
>> I heard the human eye can in fact distinguish something like two million
>> colors. Whether this is necessary is another question.
>> What's not in question is that 262,144 colors need 2 bits more per pixel
>> than 65,536 colors, and that's a potential 12.5% increase in packet fees
>> for the same size of picture (not counting compression here; for JPEGs it
>> wouldn't make a difference at all, AFAIK). A subtle opportunity to raise
>> prices (more difficult with products like milk and butter, where at least
>> in Germany there's a law on how much empty space you may package)! The
>> same is true for an increase in screen format - if widely adopted.
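The bit arithmetic in that quoted message checks out; a minimal sketch (the "packet fee" framing is the quoted author's illustration, not a real pricing scheme):

```python
import math

def bits_for_colors(n_colors: int) -> int:
    """Smallest whole number of bits that can index n_colors distinct colors."""
    return math.ceil(math.log2(n_colors))

low = bits_for_colors(65_536)    # 2**16 colors -> 16 bits per pixel
high = bits_for_colors(262_144)  # 2**18 colors -> 18 bits per pixel
extra = high - low               # 2 extra bits
increase = extra / low           # relative growth in raw (uncompressed) data
print(low, high, extra, f"{increase:.1%}")  # prints: 16 18 2 12.5%
```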
Received on Thu Apr 4 20:25:31 2002