HDRI and DeepColor

Old 05-03-2007, 11:07 AM   #1  
A couch and an HDTV to go please.
Thread Starter
 

Join Date: May 2007
Posts: 14
Default HDRI and DeepColor

Hi all!

I'm new to the forum, so I apologize if this new thread doesn't belong here.
Reading articles about Deep Color, I came across some interesting new technologies worth reading about and discussing.

Here are some links as a starter:

http://www.dolby.com/promo/hdr/technology.html
http://www.trellis-mgmt.com
http://www.mpi-inf.mpg.de/resources/hdr/hdrmpeg/

Some of these are pretty technical articles, some are more commercial.
Is this stuff going to hit the market anytime soon, in your opinion?
hd-Dude is offline
Old 05-03-2007, 07:00 PM   #2  
Muscle Cars Forever!
 
Lee Stewart's Avatar
 

Join Date: Jan 2007
Location: Albuquerque, NM
Posts: 47,283
Default

If you are interested in Deep Color, here are the "official" specs on it from the HDMI Org.:

http://www.hdmi.org/pdf/HDMI_Insert_FINAL_8-30-06.pdf
Lee Stewart is offline
Old 05-03-2007, 09:25 PM   #3  
A couch and an HDTV to go please.
Thread Starter
 

Join Date: May 2007
Posts: 14
Default

Thanks, I already knew the HDMI site.
I was more interested in how Deep Color is actually accomplished.
The Trellis website says that a movie is interpolated and filtered in order to reach Deep Color.
This makes some sense, since movies are 24 bits/pixel on DVD, HD-DVD and Blu-ray (or not?).

I couldn't verify this, though.
hd-Dude is offline
Old 05-04-2007, 08:24 AM   #4  
What's all this, then?...
 
BobY's Avatar
 

Join Date: Jan 2006
Posts: 6,197
Default

Ideally, Deep Color will be implemented by encoding the real colors with more bits, but until the capture equipment actually does that, interpolation will be used. That won't make the colors more accurate, but it will certainly reduce color banding/quantization errors.

I suspect the reason the Component Video connection on my DVD player shows less obvious banding than the HDMI connection is that HDMI is limited to 8 bits/color, while the Component Video path uses 10-bit DACs and ADCs for 10 bits/color. The extra bits are interpolated, but that makes for smoother shading.
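The interpolation described above can be sketched minimally. This is an assumed illustration (bit replication, one common way to widen a sample without new information) rather than what any particular player actually does; real hardware may interpolate between neighboring pixels instead:

```python
# Sketch of simple bit-depth expansion from 8 to 10 bits per component.
# "Bit replication" scales an 8-bit code into the full 10-bit range;
# no new color information is created, but shading steps get finer.

def expand_8_to_10(value_8bit: int) -> int:
    """Map an 8-bit code (0-255) onto the 10-bit range (0-1023)."""
    return (value_8bit << 2) | (value_8bit >> 6)

print(expand_8_to_10(0))    # 0    -- black stays black
print(expand_8_to_10(255))  # 1023 -- full scale maps to full scale
```

The shift-and-OR trick covers the whole 10-bit range exactly, which plain zero-padding (`value << 2`) would not.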

Be forewarned that there are no current, or even announced (at any price), HDMI 1.3 displays that use all 16 bits/color supported by Deep Color. Most are 10 bits/color, which is certainly better than 8 bits/color, but hardly "Deep" Color--more "not quite as shallow" Color.
BobY is offline
Old 05-04-2007, 08:29 AM   #5  
Muscle Cars Forever!
 
Lee Stewart's Avatar
 

Join Date: Jan 2007
Location: Albuquerque, NM
Posts: 47,283
Default

BobY,

Look at this article about DC and please explain to me what "dithering" is. It seems to blend the colors better within the 8-bit system that we currently use.

http://easyhdtv.blogspot.com/2007/02...v-display.html
Lee Stewart is offline
Old 05-04-2007, 08:31 AM   #6  
Very Grizzled Vet of 1 yr
 
paulc's Avatar
 

Join Date: May 2006
Location: New York City
Posts: 1,764
Default

Actually, it's 8 bits/channel; "24-bit" refers to all 3 channels together (red, green, blue--RGB).

"Interpolated" sounds odd; it could be a "phony" way to say your transfer has "Deep Color," as opposed to digitizing the original film at 16 bits/channel.

Of course, one very interesting question is exactly what color depth has been used in HD transfers to date. Did they actually start making such transfers at a higher bit depth than current technology can handle, essentially preparing for the future?

Another interesting question: there is serious doubt about current display panels and what bit depth they can actually show.

So even if everything works the way it should, the final question is whether or not most folks can actually see a difference. This is only an opinion, but I'd say most would NOT be able to see any visible difference. I'd love to see some double-blind tests with age data factored in (IMO, a 25-year-old with particularly good color discrimination MAY see it, but my guess is most people over 30 will not really be able to tell).
paulc is offline
Old 05-04-2007, 09:31 AM   #7  
Muscle Cars Forever!
 
Lee Stewart's Avatar
 

Join Date: Jan 2007
Location: Albuquerque, NM
Posts: 47,283
Default

Quote:
Originally Posted by paulc View Post
So even if everything works the way it should, the final question is whether or not most folks can actually see a difference. This is only an opinion, but I'd say most would NOT be able to see any visible difference. I'd love to see some double-blind tests with age data factored in (IMO, a 25-year-old with particularly good color discrimination MAY see it, but my guess is most people over 30 will not really be able to tell).
Here is my read on Deep Color . . .

Back in the 1990s, when the USA was working on an HDTV system, it was using the Japanese MUSE system as a guideline (some tried to re-invent the wheel and were just too far ahead of their time).

Due to the limits of technology, we were able to increase the resolution of the B & W signal . . . but not the color signal.

Deep Color is a fix for that "problem" (or as I like to call it . . . HDTV's dirty little secret).

If the color depth and "resolution" matched the B & W resolution, IMO a bunch of problems would disappear and the image should look "much better," as long as the display is able to handle the signal properly (no gimmicks).

It should easily be seen by anyone who can distinguish the difference between NTSC and HDTV. DC "should" be that much of a difference (I am talking about the full-blown, top-of-the-chart DC--48 bits).

Last edited by Lee Stewart; 05-04-2007 at 09:34 AM..
Lee Stewart is offline
Old 05-04-2007, 09:31 AM   #8  
What's all this, then?...
 
BobY's Avatar
 

Join Date: Jan 2006
Posts: 6,197
Default

Lee-

Dithering is a common method for reducing quantization error in digitized signals: basically, adding low-level random changes to each sample. What often happens with digitizing is that, due to the quantizing limits and hysteresis of the analog-to-digital converters, or the math limits of the digital processing, obvious patterns develop in the data (like the color banding they show in your link). By randomizing the samples slightly around the original value (dithering), the patterns are broken up and become harder to discern.

In the early days of CDs, a lot of recordings had music that decayed away into annoying low-level pitched tones as the volume trailed off. By dithering the data, the music trailed off into low-level noise instead, which was far preferable: people are very sensitive to pitch and not very sensitive to random noise.
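A minimal sketch of the dithering idea described above (the function names and the 8-level ramp are purely illustrative, not from any real codec):

```python
import random

def quantize(value, levels):
    """Snap a 0.0-1.0 sample onto a coarse grid of `levels` steps."""
    return round(value * (levels - 1))

def quantize_dithered(value, levels):
    """Add roughly half a step of random noise before quantizing.
    Structured banding turns into unstructured, less visible noise."""
    noise = (random.random() - 0.5) / (levels - 1)
    return quantize(min(1.0, max(0.0, value + noise)), levels)

# A smooth 100-sample ramp quantized to 8 levels collapses into flat bands;
# the dithered version varies from sample to sample, breaking up band edges.
ramp = [i / 99 for i in range(100)]
banded = [quantize(v, 8) for v in ramp]
dithered = [quantize_dithered(v, 8) for v in ramp]
```

In `banded`, long runs of identical values form visible "bands"; in `dithered`, neighboring samples jitter between adjacent levels, which the eye averages out.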

PaulC-

I should have phrased that better. 8 bits/"color component" would have been a better way of expressing it (by which I meant each separate color component--i.e. R, G and B of the RGB space).

8 bits/color component is 24 bits/pixel, although the color information is not stored per pixel. Deep Color supports 16 bits/color component or 48 bits/pixel.
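The bits-per-component vs. bits-per-pixel bookkeeping used throughout this thread is plain arithmetic; a quick sketch (function names are just for illustration):

```python
def bits_per_pixel(bits_per_component, components=3):
    """Total bits per pixel across the R, G and B components."""
    return bits_per_component * components

def shades(bits_per_component):
    """Distinct values one color component can take."""
    return 2 ** bits_per_component

print(bits_per_pixel(8))    # 24 -- ordinary "24-bit color"
print(bits_per_pixel(10))   # 30 -- the common Deep Color mode
print(bits_per_pixel(16))   # 48 -- the Deep Color maximum
print(shades(10))           # 1024 shades per component
```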
BobY is offline
Old 05-04-2007, 09:35 AM   #9  
What's all this, then?...
 
BobY's Avatar
 

Join Date: Jan 2006
Posts: 6,197
Default

Lee-

The main reason color resolution is more restricted than luminance resolution is that the eye is much more sensitive to changes in brightness than to changes in color. DVD and HD exploit this fact to reduce bandwidth/storage requirements by not encoding color information for every pixel. It was a conscious choice to keep price down, not a technological limitation per se.
BobY is offline
Old 05-04-2007, 11:16 AM   #10  
Muscle Cars Forever!
 
Lee Stewart's Avatar
 

Join Date: Jan 2007
Location: Albuquerque, NM
Posts: 47,283
Default

Quote:
Originally Posted by BobY View Post
Lee-

The main reason color resolution is more restricted than luminance resolution is that the eye is much more sensitive to changes in brightness than to changes in color. DVD and HD exploit this fact to reduce bandwidth/storage requirements by not encoding color information for every pixel. It was a conscious choice to keep price down, not a technological limitation per se.
In the 1990s, the only HD ever discussed was OTA, as that was going to be the initial "phase" of HDTV. The problem, as we know, was to squeeze the HD signal into a 6 MHz channel.

So could we have gone to 16-bit color back then? How much would the signal have increased in "size" to make this change?

With the compression technology of the time, could we have had 16-bit color?

And of course . . at what cost?

Where is the cost located? Camera? Recording Equipment? Transmission Equipment? Display?
Lee Stewart is offline
Old 05-04-2007, 11:29 AM   #11  
What's all this, then?...
 
BobY's Avatar
 

Join Date: Jan 2006
Posts: 6,197
Default

One can view the 6 MHz bandwidth restriction as a technological limitation, but from an engineering viewpoint there was no technological limit on the bandwidth; it was an arbitrary limit imposed by the FCC to keep digital HD within the existing requirements of analog.

The cost is in storage. Cameras capture all the color information, then it is compressed. Recording/storage equipment would need higher capacity to store the increased amount of data. Transmission equipment would need the increased bandwidth, which broadcasters weren't allowed to have although it was technically possible. Displays don't care, but for CRTs, the more bits/color, the more bits the D/A converters need. Many fixed-pixel displays are incapable of resolving that much color information due to physical limits in their ability to respond to small changes in information--but mostly it would be a cost issue.
BobY is offline
Old 05-04-2007, 11:33 AM   #12  
Muscle Cars Forever!
 
Lee Stewart's Avatar
 

Join Date: Jan 2007
Location: Albuquerque, NM
Posts: 47,283
Default

Quote:
Originally Posted by BobY View Post
One can view the 6 MHz bandwidth restriction as a technological limitation, but from an engineering viewpoint there was no technological limit on the bandwidth; it was an arbitrary limit imposed by the FCC to keep digital HD within the existing requirements of analog.

The cost is in storage. Cameras capture all the color information, then it is compressed. Recording/storage equipment would need higher capacity to store the increased amount of data. Transmission equipment would need the increased bandwidth, which broadcasters weren't allowed to have although it was technically possible. Displays don't care, but for CRTs, the more bits/color, the more bits the D/A converters need. Many fixed-pixel displays are incapable of resolving that much color information due to physical limits in their ability to respond to small changes in information--but mostly it would be a cost issue.
Thank you for your post. It does make a lot of sense.

So BobY, let me ask you a question . . . .

Will Deep Color make a difference in the image we see?
Lee Stewart is offline
Old 05-04-2007, 11:42 AM   #13  
What's all this, then?...
 
BobY's Avatar
 

Join Date: Jan 2006
Posts: 6,197
Default

Oh yeah! Even at 10 bits/component.

I used this analogy before. Suppose you take a video of a red ball with a light shining on one side. While 24-bit color may reproduce "millions" of colors, you have only 8 bits for each color component, so there are only 8 bits of "red" for our ball. The video image will render the ball in only 256 shades of red. It will not appear smoothly shaded, but divided up into bands of different shades of red, rather like a beach ball.

If we switch to 30-bit color, we have 10 bits/color component and can render 1024 shades of red--a big improvement. Each color band will be 1/4 as wide, and there will be 4 times as many shades of red to blend with.
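The red-ball numbers check out. A small sketch (illustrative only) that quantizes a smooth brightness ramp--standing in for the lit side of the ball--and counts the distinct shades that survive:

```python
def count_bands(bits, samples=100000):
    """Quantize a smooth 0..1 brightness ramp at `bits` per component
    and count how many distinct shades (bands) come out."""
    levels = 2 ** bits
    ramp = (i / (samples - 1) for i in range(samples))
    return len({round(v * (levels - 1)) for v in ramp})

print(count_bands(8))    # 256 shades of red -- visible banding on the ball
print(count_bands(10))   # 1024 shades -- each band 1/4 as wide
```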
BobY is offline
Old 05-04-2007, 08:34 PM   #14  
A couch and an HDTV to go please.
Thread Starter
 

Join Date: May 2007
Posts: 14
Default

Quote:
Originally Posted by paulc View Post
Actually, it's 8 bits/channel; "24-bit" refers to all 3 channels together (red, green, blue--RGB).

"Interpolated" sounds odd; it could be a "phony" way to say your transfer has "Deep Color," as opposed to digitizing the original film at 16 bits/channel.

Of course, one very interesting question is exactly what color depth has been used in HD transfers to date. Did they actually start making such transfers at a higher bit depth than current technology can handle, essentially preparing for the future?

Another interesting question: there is serious doubt about current display panels and what bit depth they can actually show.

So even if everything works the way it should, the final question is whether or not most folks can actually see a difference. This is only an opinion, but I'd say most would NOT be able to see any visible difference. I'd love to see some double-blind tests with age data factored in (IMO, a 25-year-old with particularly good color discrimination MAY see it, but my guess is most people over 30 will not really be able to tell).
Well, yes: "interpolated" means that the "source" wasn't higher than 24 bits, so it's been interpolated up to 48 bits.
The real trouble with most HDTV sets comes when they're huge... that's when you begin to notice MPEG "blocks" or bands if your DVD player doesn't do deblocking or filtering.
I think the idea of having the movie itself at 48 bits on the disc actually makes sense.
You bought the newest HDMI 1.3 TV set, spent lots on the HD-DVD or Blu-ray player... and then you can't see, say, the "film grain" in your movie as you could on the good old 100-buck CRT television, because it's been "smoothed" away to get rid of MPEG blockiness???
Well, crap!
The movie itself should be at a higher color depth!
And, yes, I'm pretty sure 99% of movies are shot at far more than 24 bits/pixel. Digital movie cameras usually capture 12 bits for each RGB component; Panavision cameras, I think, do 16 bits, that is 48 bits/pixel.
hd-Dude is offline
Old 05-04-2007, 09:11 PM   #15  
A couch and an HDTV to go please.
Thread Starter
 

Join Date: May 2007
Posts: 14
Default

As a side note: I was also reading that HDMI 1.3 introduces resolutions up to 2560x1600 (WQXGA) pixels... this is way beyond what any HD-DVD or Blu-ray disc could possibly carry with common compression technologies, IMHO.
A regular DVD takes around 2.5 GB for the main movie, plus sound, plus extras and so on.
Now take a movie at that size--WQXGA--with Deep Color (48 bits/pixel): that's roughly 24 times the space needed for the old DVD movie... that is, let's say, 60 GB for the video *only* (no audio, no extras, nothing).
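The arithmetic roughly holds; a back-of-the-envelope sketch (the 2.5 GB main-movie figure is from the post above, and applying a raw pixel-count ratio to a compressed baseline is only a rough approximation):

```python
# Back-of-the-envelope: how much bigger is WQXGA at 48 bits/pixel than a
# standard-definition DVD frame at 24 bits/pixel?
dvd_pixels   = 720 * 480        # SD DVD frame
wqxga_pixels = 2560 * 1600      # HDMI 1.3's maximum resolution (WQXGA)
ratio = (wqxga_pixels * 48) / (dvd_pixels * 24)
print(round(ratio, 1))          # 23.7 -- the "roughly 24 times" above

movie_gb = 2.5 * ratio          # scale the ~2.5 GB main-movie figure
print(round(movie_gb))          # 59 -- close to the 60 GB estimate
```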
So, unless some of these new technologies come around, we won't see movies "stored" in Deep Color... which is bad for our "money-spent-on-HD," I think.
We'll keep getting the same old "mosaic" and "wobbling" MPEG side effects as with common DVDs.
hd-Dude is offline