High Def Forum

HDRI and DeepColor

hd-Dude
05-03-2007, 11:07 AM
Hi all!

I'm new to the forum, so I apologize if this new thread doesn't belong here.
While reading articles about DeepColor, I came across some interesting new technologies worth reading about and discussing.

Here are some links as a starter:

http://www.dolby.com/promo/hdr/technology.html
http://www.trellis-mgmt.com
http://www.mpi-inf.mpg.de/resources/hdr/hdrmpeg/

Some are pretty technical articles, some are more commercial.
Is this stuff going to hit the market anytime soon, in your opinion?

Lee Stewart
05-03-2007, 07:00 PM
If you are interested in Deep Color, here are the "official" specs on it from the HDMI Org.:

http://www.hdmi.org/pdf/HDMI_Insert_FINAL_8-30-06.pdf

hd-Dude
05-03-2007, 09:25 PM
Thanks, I already knew the HDMI site.
I was more interested in how DeepColor is actually accomplished.
The Trellis website says that a movie is interpolated and filtered in order to reach DeepColor.
This makes some sense, since movies are 24 bits/pixel on DVD, HD-DVD and Blu-Ray. (or not ??)

Although I couldn't verify this.

BobY
05-04-2007, 08:24 AM
Ideally Deep Color will be implemented by encoding the real colors with more bits, but until the capture equipment actually does that, interpolation will be used. That won't make the colors more accurate, but it will certainly reduce color banding/quantization errors.

I suspect the reason the Component Video connection on my DVD player has less obvious banding than the HDMI connection is that HDMI is limited to 8 bits/color, while the Component Video connection uses 10-bit DACs and ADCs for 10 bits/color. The extra bits are interpolated, but that creates smoother shading.
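As a rough illustration of that idea (purely a sketch; real players and scalers do something more sophisticated), scaling an 8-bit ramp onto a 10-bit code range and low-pass filtering it creates in-between codes and smoother steps:

```python
import numpy as np

# An 8-bit ramp where each code repeats for 8 samples: a "banded" gradient.
eight_bit = np.repeat(np.arange(256), 8)

# Place the same values on a 10-bit code range (0..1020); still only 256 distinct codes.
ten_bit = eight_bit * 4

# Simple moving-average filter: band edges blend, producing intermediate 10-bit codes.
kernel = np.ones(9) / 9.0
smoothed = np.round(np.convolve(ten_bit, kernel, mode="same"))

print("distinct codes before filtering:", len(np.unique(ten_bit)))    # 256
print("distinct codes after filtering: ", len(np.unique(smoothed)))   # more than 256
```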

Be forewarned that there are no current, or even announced (at any price), HDMI 1.3 displays that use all 16 bits/color supported by Deep Color. Most are 10 bits/color, which is certainly better than 8 bits/color, but hardly "Deep" Color--more "Not quite as shallow" Color.

Lee Stewart
05-04-2007, 08:29 AM
Boby,

Look at this article about DC and please explain to me what "dithering" is. It seems to blend the colors better with the 8 bit system that we currently use.

http://easyhdtv.blogspot.com/2007/02/color-banding-on-hdtv-display.html

paulc
05-04-2007, 08:31 AM
Actually, it's 8 bits/pixel. 24 bit refers to all 3 channels (red, green, blue, RGB).

"Interpolated" sounds odd, it could be that is a "phony" way to say your transfer has "Deep Color" as opposed to digitizing the original film in 16 bit/channel.

Of course, one very interesting question is exactly what color depth has been used in HD transfers to date? Did they actually start making such transfers at higher bit depth than current technology can handle, essentially preparing for the future?

Another very interesting question is what bit depth current display panels can actually show; there is some serious doubt about that.

So even if everything works the way it should, the final question is whether or not most folks can actually see a difference. Only an opinion, but I'd say most would NOT be able to see any visible difference. I'd love to see some double-blind tests, with age factored into the data (IMO, a 25 year old with particularly good color discrimination MAY see it, but my guess is most over 30 will not really be able to tell).

Lee Stewart
05-04-2007, 09:31 AM
So even if everything works the way it should, the final question is whether or not most folks can actually see a difference. Only an opinion, but I'd say most would NOT be able to see any visible difference. I'd love to see some double-blind tests, with age factored into the data (IMO, a 25 year old with particularly good color discrimination MAY see it, but my guess is most over 30 will not really be able to tell).

Here is my read on Deep Color . . .

Back in the 1990's, when the USA was working on an HDTV system, it was using the Japanese MUSE system as a guideline (some tried to re-invent the wheel and were just too ahead of their time).

Due to the limits of technology, we were able to increase the resolution of the B & W signal . . . but not the color signal.

Deep Color is a fix for that "problem" (or, as I like to call it . . . HDTV's dirty little secret).

If the color depth and "resolution" matched the B & W resolution . . . IMO a bunch of problems disappear and the image should look "much better", as long as the display is able to handle the signal properly (no gimmicks).

It should easily be seen by anyone who can distinguish the difference between NTSC and HDTV. DC "should" be that much of a difference (I am talking about the full-blown, top-of-the-chart DC - 48 bits).

BobY
05-04-2007, 09:31 AM
Lee-

Dithering is a common method for reducing quantization error in digitized signals, basically adding low-level random changes to each sample. What often happens with digitizing is, due to the quantizing limits and hysteresis of the Analog-to-Digital converters or the math limits of the digital processing, obvious patterns develop in the data (like the color banding they show in your link). By randomizing the samples slightly around the original value (dithering), the patterns are broken up and become harder to discern.
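A quick sketch of the idea (illustrative only, not how any particular converter does it): quantize a smooth ramp with and without a little noise added first, and compare how wide the flat "bands" end up:

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 1920)          # a smooth gradient, like shading on a wall

def quantize(signal, bits):
    """Round the signal onto 2**bits evenly spaced levels."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

step = 1.0 / (2 ** 8 - 1)
plain = quantize(ramp, 8)                                                  # straight 8-bit quantization
dithered = quantize(ramp + np.random.uniform(-step / 2, step / 2, ramp.size), 8)

def longest_run(x):
    """Longest run of consecutive identical samples, i.e. the widest visible band."""
    best = count = 1
    for a, b in zip(x[:-1], x[1:]):
        count = count + 1 if a == b else 1
        best = max(best, count)
    return best

print("widest band, no dither:  ", longest_run(plain))      # long flat steps
print("widest band, with dither:", longest_run(dithered))   # steps broken up into noise
```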

In the early days of CD's, a lot of recordings had music that decayed away into annoying low-level pitched tones as the volume trailed off. By dithering the data, the music trailed off into low level noise, which was far preferable to pitched tones as people are very sensitive to pitch and not very sensitive to random noise.

PaulC-

I should have phrased that better. 8 bits/"color component" would have been a better way of expressing it (by which I meant each separate color component--i.e. R, G and B of the RGB space).

8 bits/color component is 24 bits/pixel, although the color information is not stored per pixel. Deep Color supports 16 bits/color component or 48 bits/pixel.

BobY
05-04-2007, 09:35 AM
Lee-

The main reason color resolution is more restricted than luminance resolution is that the eye is much more sensitive to changes in brightness than changes in color. DVD and HD exploit this fact to reduce bandwidth/storage requirements by not encoding color information for every pixel. It was a conscious choice to keep price down, not a technological limitation per se.
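As a rough sketch of that saving (assuming the common 4:2:0 scheme, where one pair of chroma samples is shared by each 2x2 block of pixels):

```python
# Raw size of one 1920x1080 frame, 8 bits per sample.
width, height = 1920, 1080
pixels = width * height

full_bits = pixels * 3 * 8                            # a color value stored for every pixel (4:4:4)
subsampled_bits = pixels * 8 + 2 * (pixels // 4) * 8  # Y per pixel, Cb and Cr per 2x2 block (4:2:0)

print("no subsampling: %.1f MB" % (full_bits / 8 / 1e6))        # ~6.2 MB
print("4:2:0:          %.1f MB" % (subsampled_bits / 8 / 1e6))  # ~3.1 MB, half the raw data
```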

Lee Stewart
05-04-2007, 11:16 AM
Lee-

The main reason color resolution is more restricted than luminance resolution is that the eye is much more sensitive to changes in brightness than changes in color. DVD and HD exploit this fact to reduce bandwidth/storage requirements by not encoding color information for every pixel. It was a conscious choice to keep price down, not a technological limitation per se.

In the 1990's the only HD ever discussed was OTA, as that was going to be the initial "phase" of HDTV. The problem, as we know, was to squeeze the HD signal into a 6 MHz channel.

So could we have gone to 16-bit color back then? How much would the signal have increased in "size" to make this change?

With the current compression technology at the time, could we have had 16 bit color?

And of course . . at what cost?

Where is the cost located? Camera? Recording Equipment? Transmission Equipment? Display?

BobY
05-04-2007, 11:29 AM
One can view the 6 MHz bandwidth restriction as a technological limitation, but from an engineering viewpoint there was no technological limit on the bandwidth; it was an arbitrary limit imposed by the FCC to keep digital HD within the existing requirements of analog.

The cost is in storage. Cameras capture all the color information, then it is compressed. Recording/storage equipment would need higher capacity to store the increased amount of data. Transmission equipment would need the increased bandwidth, which they weren't allowed to have although it was technically possible. Displays don't care, but for CRTs, the more bits/color, the more bits the D/A converters need. Many fixed-pixel displays are incapable of resolving that much color information due to physical limits in their ability to respond to small changes in information--but mostly it would be a cost issue.
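To put a rough number on Lee's question above (raw, uncompressed rates only; compression changes the picture a lot), doubling the bits per component simply doubles the raw data rate:

```python
# Raw data rate for a 1920x1080, 30 frame/s RGB signal at 8 vs. 16 bits per component.
width, height, fps, components = 1920, 1080, 30, 3

for bits in (8, 16):
    rate = width * height * fps * components * bits     # bits per second, uncompressed
    print("%2d bits/component: %.2f Gbit/s raw" % (bits, rate / 1e9))
```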

Lee Stewart
05-04-2007, 11:33 AM
One can view the 6 MHz bandwidth restriction as a technological limitation, but from an engineering viewpoint there was no technological limit on the bandwidth; it was an arbitrary limit imposed by the FCC to keep digital HD within the existing requirements of analog.

The cost is in storage. Cameras capture all the color information, then it is compressed. Recording/storage equipment would need higher capacity to store the increased amount of data. Transmission equipment would need the increased bandwidth, which they weren't allowed to have although it was technically possible. Displays don't care, but for CRTs, the more bits/color, the more bits the D/A converters need. Many fixed-pixel displays are incapable of resolving that much color information due to physical limits in their ability to respond to small changes in information--but mostly it would be a cost issue.

Thank you for your post. It does make a lot of sense.

So BobY, let me ask you a question . . . .

Will Deep Color make a difference in the image we see?

BobY
05-04-2007, 11:42 AM
Oh yeah! Even at 10 bits/component.

I used this analogy before. Suppose you take a video of a red ball with a light shining on one side. While 24-bit color may reproduce "millions" of colors, you have only 8 bits for each color component, so there is only 8 bits of "red" for our ball. The video image will render the ball in only 256 shades of red. It will not appear smoothly shaded, but divided up into bands of different shades of red, rather like a beach ball.

If we switch to 30-bit color, we now have 10 bits/color component and we can render 1024 shades of red--a big improvement. Each color band will be 1/4 as wide and there will now be 4 times as many shades of red to blend with.
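A tiny sketch of that ball analogy (numbers only, purely illustrative): quantize a smooth red gradient at 8 and at 10 bits and compare how many shades survive and how wide each band is:

```python
import numpy as np

gradient = np.linspace(0.0, 1.0, 4096)      # the smoothly lit side of the red ball

def shades_and_band_width(signal, bits):
    """Quantize to 2**bits levels; return the shade count and average band width."""
    levels = 2 ** bits - 1
    quantized = np.round(signal * levels).astype(int)
    shades = len(np.unique(quantized))
    return shades, signal.size / shades

for bits in (8, 10):
    shades, band = shades_and_band_width(gradient, bits)
    print("%2d bits: %4d shades of red, average band width %.0f samples" % (bits, shades, band))
```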

hd-Dude
05-04-2007, 08:34 PM
Actually, it's 8 bits/pixel. 24 bit refers to all 3 channels (red, green, blue, RGB).

"Interpolated" sounds odd; it could be that it's a "phony" way to say your transfer has "Deep Color", as opposed to digitizing the original film at 16 bits/channel.

Of course, one very interesting question is exactly what color depth has been used in HD transfers to date? Did they actually start making such transfers at higher bit depth than current technology can handle, essentially preparing for the future?

Another very interesting question is what bit depth current display panels can actually show; there is some serious doubt about that.

So even if everything works the way it should, the final question is whether or not most folks can actually see a difference. Only an opinion, but I'd say most would NOT be able to see any visible difference. I'd love to see some double-blind tests, with age factored into the data (IMO, a 25 year old with particularly good color discrimination MAY see it, but my guess is most over 30 will not really be able to tell).

Well, yes, interpolated means that the "source" wasn't higher than 24 bits, so it's been interpolated up to 48 bits.
The real trouble with most HDTV sets is when they're huge...that's when you begin to notice MPEG "blocks" or bands if your DVD player doesn't do deblocking or filtering.
I think the idea of having the movie itself at 48bits on the DVD actually makes sense.
You bought the newest HDMI 1.3 TV set, spent lots on the HD-DVD or Blu-Ray player...and then you can't see, say, the "film grain" in your movie the way you could on the good old 100-bucks CRT television, because it's been "smoothed" away to get rid of MPEG blockiness ???
Well, crap! :)
The movie itself should be at a higher color depth!
And, yes, I'm pretty sure 99% of movies are shot at far more than 24 bits/pixel. Movie cameras usually do 12 bits for each RGB component, and Panavision cameras I think do 16 bits, that is 48 bits/pixel.

hd-Dude
05-04-2007, 09:11 PM
As a side note, since I was also reading that HDMI 1.3 introduces resolutions up to 2560x1600 (WQXGA) pixels...this is way beyond what any HD-DVD or Blu-Ray disc could possibly carry with common compression technologies, IMHO.
A regular DVD takes around 2.5 Gbytes for the main movie, plus sound, plus extras and so on.
Now, take a movie at that size - WQXGA - and with DeepColor (48 bits/pixel), and that's roughly 24 times the space needed for the old DVD movie.... that is, let's say, 60 Gbytes for the video *only* (no audio, no extras, no nothing).
So, unless some of these new technologies come around, we won't see movies "stored" in DeepColor...which is bad for our "money-spent-on-HD" I think.
We'll keep getting the same old "mosaic" and "wobbling" MPEG side-effects as with common DVDs.
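For what it's worth, a quick back-of-the-envelope check on that factor of roughly 24 (raw pixel counts only, assuming a 720x480 DVD frame at 24 bits/pixel and ignoring audio and compression):

```python
dvd_bits   = 720 * 480 * 24      # one NTSC DVD frame
wqxga_bits = 2560 * 1600 * 48    # one WQXGA frame at 48 bits/pixel

ratio = wqxga_bits / dvd_bits
print("raw size ratio: %.1fx" % ratio)                        # ~23.7x
print("2.5 GB of video scaled up: ~%.0f GB" % (2.5 * ratio))  # ~59 GB
```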

BobY
05-04-2007, 09:33 PM
One doesn't need to increase the spatial resolution in order to increase the color depth.

HDMI 1.3 is an attempt to set a standard for current use and use well into the future. Systems will take advantage of its capabilities to various degrees, but will be better than what we currently have, regardless.

The trick is not to fall for the marketing ploy that wants you to think that just because a product has HDMI 1.3, it will have Deep Color or any of the other enhancements...

hd-Dude
05-05-2007, 06:19 AM
One doesn't need to increase the spatial resolution in order to increase the color depth.

HDMI 1.3 is an attempt to set a standard for current use and use well into the future. Systems will take advantage of its capabilities to various degrees, but will be better than what we currently have, regardless.

The trick is not to fall for the marketing ploy that wants you to think that just because a product has HDMI 1.3, it will have Deep Color or any of the other enhancements...

No, I didn't mean that *they have to* increase the resolution...only *in case* they do.
They just presented a 100-inch LCD TV at NAB - I think. At that size you have around 20 pixels/inch of physical resolution - horizontally.
In that case you might want to have more pixels per square inch...that is probably why they included higher resolutions in HDMI 1.3.
I usually don't fall for marketing shout-outs, but that is why I'm interested in new solutions...which most of the time are "niche solutions", not very well known to the general public.
And it's interesting to discuss this with people who share my interests. :)

gekke henkie
05-06-2007, 01:35 PM
The trick is not to fall for the marketing ploy that wants you to think that just because a product has HDMI 1.3, it will have Deep Color or any of the other enhancements...

I have read on AVS that neither HD-DVDs nor Blu-ray Discs support DeepColor at all, or at least, no such discs exist at this time. Did I misunderstand :what: ?

hd-Dude
05-06-2007, 05:39 PM
Well, that's easily true, since you need to have the movie itself "encoded" on disc at more than 24 bits/pixel in order to send it at a higher bit depth.
To achieve this you need an MPEG format allowing for more than 24 bits/pixel of color depth...which doesn't exist yet, to my knowledge at least.
Otherwise you could have the player interpolate the data to a higher color depth, but, as said, that would just be "fake" DeepColor...it would have nothing to do with the higher precision the movie was *originally shot* at (or post-produced, anyway).

That is why those things look interesting to me, because as of now, I haven't heard of any such capability around - in terms of the stored movie on disk - although DeepColor on TV's is already out.... which sounds very fake to me, if the source isn't at "DeepColor" depth.

BobY
05-06-2007, 06:07 PM
Well, let's be realistic. HDMI 1.3 with Deep Color just came out--we shouldn't expect any true Deep Color content for quite a while.

It's more expensive and uses more storage; nobody owns a display that will respond to it yet; even the newly announced displays with Deep Color are only 10-bit; and even if they accepted 16-bit, current LCD and Plasma panels are physically incapable of displaying that many colors, and nobody is going to build CRTs with 16-bit video DACs.

One would have to be crazy at this point to develop Deep Color content and we probably won't see much until the installed base of Deep Color compatible displays is significant.

hd-Dude
05-07-2007, 05:56 AM
Well, let's be realistic. HDMI 1.3 with Deep Color just came out--we shouldn't expect any true Deep Color content for quite a while.

It's more expensive and uses more storage; nobody owns a display that will respond to it yet; even the newly announced displays with Deep Color are only 10-bit; and even if they accepted 16-bit, current LCD and Plasma panels are physically incapable of displaying that many colors, and nobody is going to build CRTs with 16-bit video DACs.

One would have to be crazy at this point to develop Deep Color content and we probably won't see much until the installed base of Deep Color compatible displays is significant.


Of course, agreed on all points.
It is also obvious that manufacturers usually employ those new technologies for future products...as soon as the market is mature enough.
Right now, having DeepColor content would be crazy...but in 1 or 2 years it makes sense. All technology markets move way faster than others when it comes to introducing new products or services. (we've seen enough of this "speed" lately with HD-TV, HD-DVD and LCD displays ;) )

hd-Dude
05-08-2007, 07:41 PM
even the newly announced displays with Deep Color are only 10-bit; and even if they accepted 16-bit, current LCD and Plasma panels are physically incapable of displaying that many colors...


About 16-bit LCDs: if you checked out the BrightSide (Dolby) stuff in the past, it's going to be pretty easy to have 16-bit LCDs....in fact they already exist - the DR37-P LCD display shown on the Dolby page is actually 16-bit capable - but not "mass-marketed" yet.
BrightSide's LCD works with IMLED - Individually Modulated LED - arrays. Basically they have thousands of LEDs as a backlighting source, forming a low-resolution "luma" image behind a standard LCD color panel.
As a result, since they can individually modulate the IMLED array with 16-bit precision, you have 24+16 bits -> 40 bits/pixel. If you couple IMLED with a 10-bit modern LCD panel you get 30+16 bits -> 46 bits/pixel.... pretty much there... :)
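A toy model of how the two stages combine (purely illustrative; the real BrightSide/Dolby processing is far more involved): the light reaching the viewer is roughly the LED backlight level behind a pixel times the transmittance of the LCD cell in front of it.

```python
import numpy as np

# 10-bit LCD panel transmittance per pixel, and a coarse 16-bit LED backlight grid
# (toy sizes: one LED behind each 40x40 block of pixels).
panel = np.random.randint(0, 1024, size=(1080, 1920)) / 1023.0
leds  = np.random.randint(0, 65536, size=(27, 48)) / 65535.0

# Spread each LED's level over the pixels it sits behind (nearest-neighbour upsampling).
backlight = np.kron(leds, np.ones((40, 40)))

displayed = backlight * panel          # effective luminance seen at each pixel
print(displayed.shape)                 # (1080, 1920)
print("upper bound on distinct luminance values per pixel:", 1024 * 65536)
```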

BobY
05-08-2007, 08:00 PM
They will certainly be better, but the LED's only vary luma, not chroma, so they still won't be able to achieve the variety of hues necessary for true Deep Color, but they will produce a broader range of shades of the hues they *are* able to achieve.

hd-Dude
05-08-2007, 08:38 PM
They will certainly be better, but the LED's only vary luma, not chroma, so they still won't be able to achieve the variety of hues necessary for true Deep Color, but they will produce a broader range of shades of the hues they *are* able to achieve.

Definitely, and strictly speaking of DeepColor, yes.
Although such subtle variations of chroma aren't perceived by the viewer - that is also why JPEG and MPEG work so well - while subtle variations in luma are critical in order to perceive "smooth" pictures.
It would definitely be interesting to compare a picture at full DeepColor with one produced with 30-bit chroma and 16-bit luma.