High Def Forum

HDTV Picture settings / color temp.

mikerossi
02-14-2006, 07:50 PM
Hello,
I am new to High Def Forum and I just have a few questions about the picture settings. I did some reading, and I heard that keeping the picture setting high can reduce the life of your TV. Is this true? If so, by how much can it reduce the life of my TV, and why does the picture look best with it up so high? Also, I noticed that colors look a lot better with the color temp. down. Does this affect the life of the TV in any way, and why is it that the colors appear darker as I go up in temperature?
Thanks!

Mike Wolf
02-23-2006, 01:41 PM
As a professional installer, I recommend the 6500K temp setting, which is usually the WARM setting. I also recommend the SOUND & VISION Home Theater Tune-Up DVD. Check out www.soundandvisionmag.com

Porcupine
02-23-2006, 03:11 PM
I absolutely hate the Warm settings on most TVs. Everything looks so red. I always use Cool.

However, I would like to know the theory or reason why one should always use the 6500k settings. This seems to be the generally accepted thing. But I've yet to see a convincing theoretical argument explaining it.

I suppose the goal should be to reproduce the original image before it was caught on camera or film. Is 6500k the correct setting for doing that?

If a convincing argument is given to me I might change my viewing habits. I've done so in the past for things like Sharpness (used to turn it all the way up, now I understand what it really is and try to set it "neutral", wherever that happens to be).

mikerossi, different TVs are different. The actual menu number of the Picture setting on your TV doesn't really mean anything. Some TVs are still running very high with the Picture setting at zero, and some are still fairly low at maximum. What Picture level you should use depends on your set and how it is set up.

d6500k
02-23-2006, 11:48 PM
However, I would like to know the theory or reason why one should always use the 6500k settings. This seems to be the generally accepted thing. But I've yet to see a convincing theoretical argument explaining it.

Ok...

Filmmakers use monitors calibrated to D65, a white point on a graph called a CIE chart. 6500K is a color temperature that, when attained at D65, matches the color of the sky at midday on an overcast day.

Since filmmakers use this as a standard... we should too, if accuracy is paramount.

Displays that tout settings as Warm, Cool, or Standard, or that actually quote a color temp like 6500K, 7300K or 5500K (b/w), are almost NEVER correct when measured. Nothing more than pollihollowthermofuzz advertising. Just as you have found that edge enhancement (sharpness) is a folly, getting the color temperature right is just as important, if not more so.

You mention that Warm is "too red". Probably. "Cool", being your preference, might be running over 15000k. The only way to get it correct is to measure the output as it is and modify it through the use of controls (usually in the service menu, though manufacturers are putting them in user menus more and more) to give you ONE setting that delivers exactly what the director intended.

Only measured results count. When you view this ISF calibrated display, you will have the same experience as if you were seeing the film in a top quality cinema. You'll just be able to pause the show to grab a cold one or take a .....
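
If you're curious how a measurement turns into a number, here is a rough sketch: a meter reports a CIE 1931 chromaticity (x, y), and a published approximation such as McCamy's formula converts that into a correlated color temperature. Real calibration software does considerably more than this, and the second test point below is just a made-up bluish white for illustration.

```python
# Sketch: converting a measured CIE 1931 chromaticity (x, y) into a
# correlated color temperature with McCamy's approximation. Real
# calibration packages use the full CIE machinery; this only shows the idea.
def mccamy_cct(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(mccamy_cct(0.3127, 0.3290)))  # the D65 white point -> about 6500 K
print(round(mccamy_cct(0.2831, 0.2971)))  # a hypothetical bluish "Cool" white -> about 9200 K
```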

Good viewing,

Doug k

godson
02-24-2006, 12:58 AM
lmao porcupine... I used to do the same stuff with the sharpness setting. Actually, it wasn't till I got the CRT that I noticed it couldn't be right that way, and I have since also adopted a more "neutral" setting on all my sets. I also agree about the warm setting; way too red, at least on my sets.

Porcupine
02-24-2006, 04:08 PM
Filmmakers use monitors calibrated to D65, a white point on a graph called a CIE chart. 6500K is a color temperature that, when attained at D65, matches the color of the sky at midday on an overcast day.

Since filmmakers use this as a standard... we should too, if accuracy is paramount.

Displays that tout settings as Warm, Cool, or Standard, or that actually quote a color temp like 6500K, 7300K or 5500K (b/w), are almost NEVER correct when measured.

You mention that Warm is "too red". Probably. "Cool", being your preference, might be running over 15000k. The only way to get it correct is to measure...

Thanks for the response, d6500k.

I've always known that what my TV/monitor labels as "Warm", 6500k, or whatever is generally not perfectly correct. For now that's not my main concern. My main concern is whether I should try to watch at something roughly like 6500k, or set things to 15000k Cold like I seem to prefer and have been doing so far.

First an observation... I have a Dell 20" Trinitron monitor that lets me manually set R, G, B values for "color temperature". It also comes with 3 preset RGB values. There is "5000k", which according to rough memory went something like 100 Red, 66 Green, 15 Blue... "6500k", which was roughly 100 Red, 77 Green, 45 Blue, and "9300k", which was roughly 100 Red, 93 Green, 78 Blue. The color setting I ended up using, after watching my monitor for ages and experimenting with what I liked best, was 100 Red, 100 Green, 100 Blue. I do not know what the "color temperature" of that setting is, but it is probably near 15000k like you say. To me, only on that setting do whites look white. Everything else looks reddish. Other things about me: I never use incandescent light bulbs; I hate them. I try to avoid the Sun whenever possible because I hate its yellowy light (although it is much whiter than incandescent bulbs). I am puzzled because the Sun is often claimed to be "white" light, and the same is said of incandescent bulbs (it says white on the box or bulb), but they look super yellow to me. I always try to use fluorescent lights whenever possible because only those look white to me. I am also very fond of blacklights and even have one for my room, although I never use it. ^_^;

I am paranoid. Things other people call white (incandescent bulbs, the Sun) do not look white to me. My Dell computer monitor says 6500k is heavily red-shifted in the RGB values. Of course, the RGB values do not necessarily mean anything, as I do not know what physical quantities or units they represent (or try to represent). But they could still mean something, especially since I find it hard to believe it is just a "coincidence" that only 100 Red, 100 Green, 100 Blue makes white look white to me.

You say that film makers review their own films with monitors calibrated to 6500k. I fully believe that. So I have no doubt that I should be watching my Revenge of the Sith at 6500k because the most important thing is to watch the image as it was originally intended.

But the only DVD movies in my collection are Star Wars and Lord of the Rings. :) But I've got about 200+ anime DVDs, lol. And lots of videogames. That's my primary viewing material. For videogames/anime, I'm not sure the same rules apply. They may not use the same production methods. Nowadays most anime is created digitally, though, so I suppose I should use the same color temp on my TV as they had on their monitors. But can I be assured they are using 6500k? Maybe those weird Japanese people are like me and using 15000k. :) What about videogames? I think in those cases they might often just choose colors based on "computer numbers." We want white game menu text? OK, 255 R, 255 G, 255 B. Then it's up to me, the viewer, to make sure that looks white to me. And in my opinion it looks white at 15000k. Is there something wrong with my logic as applied to these types of sources?

Anyway, I should do what is correct, not what I feel, so I'll continue to look more into this...

BobY
02-24-2006, 07:38 PM
Actually long term exposure to video games causes a spectrum shift in the optic nerve.

I've heard it can be corrected by staring at a blank green wall for several hours a day.

Let me know if it works...

;)

d6500k
02-25-2006, 12:35 AM
To me, only on that setting do whites look white.

then you are riding the good wave....

I am paranoid. Things other people call white (incandescent bulbs, the Sun) do not look white to me

Next time you are up and ready at noon on an overcast day, go outside and look at the sky. Depending on the depth of the cloud cover you will either see white, or varying levels of gray. D65. It will not be red, blue or chartreuse (sp?), but white to gray with the absence of tint. That is why filmmakers use D65. It is a constant.
BTW, if you look at the sun on a clear day, it is yellow; that is, until your eyeballs flame out, then usually quite dark. Just kidding you there, buddy! :)
Actually long term exposure to video games causes a spectrum shift in the optic nerve.

Of that, I have never had any doubt! Cracked me up and it is late!


Good Viewing Everyone!

Doug k

Porcupine
02-27-2006, 03:23 PM
After a lot of websearching, I think I finally understand what 6500 K is, and whether it truly makes whites look white. Thanks to everyone who has been contributing to this thread. Your information has been a lot of help to me. Before I go about explaining everything, first a disclaimer: I've gathered a lot of accurate facts, but relating them to TV language required some of my own interpretations, so feel free to comment on or disagree with anything I say. Another note: I've got a master's degree in physics, so I do know about certain things, although I only have a beginner's knowledge of most of the useful stuff, like astronomy and electronics.

First off, incandescent light bulbs. Supposedly they put out "white" light; however, from experience I'm sure everyone can agree that this is simply bad terminology. In reality they put out a very yellowish light. So throw them in the garbage and let's skip right to the Sun, which puts out a whiter light.

Is the light from the Sun actually white? This issue is complicated by the fact that there are many misconceptions and inconsistent terminologies regarding this issue. I will try to clear them up. First, some quotes:

"WHY IS THE SKY BLUE?

The blue color of the sky is due to Rayleigh scattering. As light moves through the atmosphere, most of the longer wavelengths pass straight through. Little of the red, orange and yellow light is affected by the air.

However, much of the shorter wavelength light is absorbed by the gas molecules. The absorbed blue light is then radiated in different directions. It gets scattered all around the sky. Whichever direction you look, some of this scattered blue light reaches you. Since you see the blue light from everywhere overhead, the sky looks blue.

THE BLACK SKY AND WHITE SUN

On Earth, the sun appears yellow. If you were out in space, or on the moon, the sun would look white. In space, there is no atmosphere to scatter the sun's light. On Earth, some of the shorter wavelength light (the blues and violets) are removed from the direct rays of the sun by scattering. The remaining colors together appear yellow.

WHY IS THE SUNSET RED?

As the sun begins to set, the light must travel farther through the atmosphere before it gets to you. More of the light is reflected and scattered. As less reaches you directly, the sun appears less bright. The color of the sun itself appears to change, first to orange and then to red. This is because even more of the short wavelength blues and greens are now scattered. Only the longer wavelengths are left in the direct beam that reaches your eyes.

The sky around the setting sun may take on many colors. The most spectacular shows occur when the air contains many small particles of dust or water. These particles reflect light in all directions. Then, as some of the light heads towards you, different amounts of the shorter wavelength colors are scattered out. You see the longer wavelengths, and the sky appears red, pink or orange."

Porcupine
02-27-2006, 03:58 PM
So now one mystery has been solved. The Sun appears yellow if you are brave enough to look at it, but in reality it is much whiter than it appears. The clouds that d6500k referred to, we can assume, are truly white objects being illuminated by the "white" light from the Sun. Therefore, the color of the overcast clouds at noon should be the same as the "white" color of the Sun as you would see it from space.

I did go outside and looked at the clouds yesterday (by luck, good viewing conditions arose) and I felt that those clouds looked pretty white, but still with a slight tint towards the yellow. What a stubborn bastard I am! :) Anyway, that was before I did all my research so you can be sure that was my pure, unpolluted opinion. Continuing on...

Now the question is, is the REAL color of the "white" Sun actually white? The Sun's "temperature" can be measured in a variety of ways, but 5840 K is the quoted value I found that seems most reliable.

What exactly does it mean for the Sun to have a "temperature" of 5840 K? In actuality the Sun is a stellar body which has many layers and a fairly complicated structure. The actual temperature inside the Sun varies from around 15 million Kelvin at the core, to around 6000 K at the surface, and back up into the millions further out in the Sun's "atmosphere". However, for many purposes we can model (or "pretend") that the Sun is what is called a perfect blackbody. Some quotes again:

"A blackbody is a theoretical object, which is both a perfect absorber and emitter of radiation. The radiation given off by a blackbody occurs in a wide range or spectrum of wavelengths."

Now for my own words. By radiation they mean electromagnetic radiation, i.e. light. The graph of the intensity of the radiation versus its wavelength, as emitted by a blackbody, is what is called a "blackbody radiation curve." This curve looks different depending on the temperature the blackbody is at. At 0 Kelvin (absolute zero) the blackbody radiation curve is just a flat line, zero everywhere. Therefore a blackbody at 0 K absorbs all light and emits no light; it appears black. That is why it is called a blackbody.

However, at various "temperatures" a blackbody will emit light with a certain predefined distribution of wavelengths and intensities. The "blackbody radiation curve" will have some shape to it. The light that the Sun outputs is almost exactly the same as a blackbody with a "temperature" of 5840 K. Therefore, if you want to know what kind of light the Sun is giving out, you need only calculate or look up the blackbody curve corresponding to a 5840 K blackbody. This curve is easily calculated according to an equation called the Planck Radiation Law.
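
For reference, here is a minimal sketch of that Planck Radiation Law, evaluated at a few visible wavelengths for a blackbody at the Sun's 5840 K (the constants are the standard physical ones; the three wavelengths are just arbitrary stand-ins for blue, green, and red):

```python
# Minimal sketch of the Planck Radiation Law: spectral radiance of a
# blackbody as a function of wavelength and temperature.
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_nm, temp_k):
    """Spectral radiance B(lambda, T) in W / (sr * m^3)."""
    lam = wavelength_nm * 1e-9
    return (2.0 * H * C**2 / lam**5) / (math.exp(H * C / (lam * K * temp_k)) - 1.0)

# Rough blue, green, and red wavelengths for a 5840 K (solar) blackbody:
for nm in (450, 550, 650):
    print(nm, "nm:", planck(nm, 5840))
```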

Porcupine
02-27-2006, 04:24 PM
White is not a pure color. That is to say, no single wavelength of light looks white. White is a superposition of an equal balance of all visible colors. Or, equivalently to our eyes, an equal superposition of Red, Green, and Blue light could be defined as "white." However, as you may have figured, the definition of what is white leaves some slight room for subjectivity, which is where all our TV questions and arguments originate from.

If you look at the blackbody curve at 5840 K, it is an uneven distribution of all colors, not just Red, Green, and Blue. But it comes very close to being as if it were an equal superposition of Red, Green, and Blue, in a sense. So the light from the Sun looks quite white. But is it actually a completely unbiased white? There is a way to calculate this but it would take me some time and I don't know how to find a simple result to explain it. So I will just quote the following astronomy chart instead:

"Stellar Spectral Types

Stars can be classified by their surface temperatures.... The standard classes are:

Temperature
O 30,000 - 60,000 K Blue stars
B 10,000 - 30,000 K Blue-white stars
A 7,500 - 10,000 K White stars
F 6,000 - 7,500 K Yellow-white stars
G 5,000 - 6,000 K Yellow stars (like the Sun)
K 3,500 - 5,000K Yellow-orange stars
M < 3,500 K Red stars

The commonly used mnemonic for the sequence of these classifications is 'Oh Be A Fine Girl, Kiss Me'. "

I've left it up to the astronomers to analyze the blackbody curves for me and declare whether the final color balance is exactly white, or yellow-shifted, or blue-shifted. Notice that the Sun at 5840 K is definitively classified as Yellow.

Again, the Sun looks yellow to our eyes, but is actually whiter in reality. Nevertheless, even the real "whitish" color of the Sun is still called yellow by astronomers, because when they analyze the blackbody curve they conclude that the color balance is still very slightly shifted towards the yellow. An astronomer would look at d6500k's clouds and say they are definitely yellow-shifted (even though they may look white to most people, although to me they do look very slightly yellow).
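
Here is a rough numerical way to see that shift (a sketch only: it sums a blackbody curve over three crude wavelength bands standing in for blue, green, and red; the band edges are my own arbitrary choice and this is not real CIE colorimetry):

```python
# Rough check, not real colorimetry: sum a blackbody curve over crude
# blue/green/red wavelength bands and compare the shares at different
# temperatures. The band edges are arbitrary choices for illustration.
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam_m, temp_k):
    return (2.0 * H * C**2 / lam_m**5) / (math.exp(H * C / (lam_m * K * temp_k)) - 1.0)

def band_energy(lo_nm, hi_nm, temp_k, steps=200):
    """Midpoint sum of spectral radiance across a wavelength band."""
    step = (hi_nm - lo_nm) / steps
    total = 0.0
    for i in range(steps):
        nm = lo_nm + (i + 0.5) * step
        total += planck(nm * 1e-9, temp_k) * step * 1e-9
    return total

for temp in (5840, 6500, 9300):
    b = band_energy(400, 490, temp)
    g = band_energy(490, 580, temp)
    r = band_energy(580, 700, temp)
    s = r + g + b
    print(temp, "K  R:G:B shares =", round(r/s, 2), round(g/s, 2), round(b/s, 2))
# The red share falls and the blue share climbs as the temperature goes up,
# the same trend the spectral-class table above shows.
```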

TVs and computer monitors (when calibrated correctly) usually support the following settings: 5500 K or Warm, which would mimic the sort of lighting conditions you get from the Sun, though a teensy bit yellower....6500 K or Standard, which is actually a tiny bit whiter than even the white light of the Sun or d6500k's clouds....and 9300 K or Cool, which according to that chart above is actually True White. By being True White, what that means in this case is that the intensity of the Red, Green, and Blue coming off your TV for a white image (encoded with equal intensities of Red, Green, and Blue) is unaltered and actually displays with equal intensities of Red, Green, and Blue.

This result is consistent with what I saw on my Dell 20" Trinitron computer monitor. At 9300 K it gave me values for R,G, and B boosts, and they were roughly equal (still a tiny tiny tiny bit red-shifted).

It is possible that either my monitor is off by a bit or the astronomers are off by a bit. If I am to believe the astronomers, then 9300 K is the true color temperature setting that makes white truly white. If I am to believe my monitor, then 12000 K or 15000 K, or something like that, is true white. Either way, that's far different from what ISF calibrators and the movie industry want to call white.

So we indeed have two competing standards, both of which have strong arguments supporting them. Should we choose 6500 K "Standard", or roughly 10000 K "Cool" ?

mikerossi
02-27-2006, 04:28 PM
Thanks for the help and replies everybody. Mike Wolf, I will be sure to purchase a TV calibration DVD. Thanks again.

Porcupine
02-27-2006, 04:50 PM
So to sum up, what is meant by a TV's "color temperature" and how does it relate to the white balance? I *believe* that when the various color parameters inside a TV (there are a lot... Color, Tint, and many more Service Menu parameters) are correctly set up (which may require ISF calibration), then a given color temperature setting puts the original image (the original RGB data as it is stored on the source and transmitted to your TV) through a sort of color filter. It will try to mimic the look of those source images being illuminated by a blackbody of that color temperature. This is my belief. I do not know if it is correct. It seems reasonable, though.
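
To make that idea concrete, here is a toy sketch (my own illustration, not how any real TV is implemented): treat each color-temperature preset as a fixed gain on each channel, using the Dell monitor preset numbers quoted earlier as stand-ins.

```python
# Toy illustration only: model a color temperature preset as fixed per-channel
# gains. Real displays adjust gun drive/cutoff and real chromatic adaptation
# uses the CIE system; the gain triplets below echo the Dell presets above.
PRESET_GAINS = {
    "5000K": (1.00, 0.66, 0.15),
    "6500K": (1.00, 0.77, 0.45),
    "9300K": (1.00, 0.93, 0.78),
}

def apply_preset(rgb, preset):
    gains = PRESET_GAINS[preset]
    return tuple(min(255, round(c * g)) for c, g in zip(rgb, gains))

stored_white = (255, 255, 255)   # "white" as encoded in the source
for name in PRESET_GAINS:
    print(name, apply_preset(stored_white, name))
# The warmer the preset, the more the displayed "white" leans toward red/yellow.
```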

So, if you could set your TV's color temperature to 5840 K then a theoretical white sheet of paper stored as data ends up on your TV screen looking like that same white sheet of paper, as if it existed in real life being illuminated by the Sun.

Now this poses several interesting questions. What is the goal of image reproduction? Another factor to be considered is how the camera that captures the original image works. I will just assume that the camera captures the original image exactly as it looks if you were standing there in real life. Now this poses a problem. In real life, a white sheet of paper would already be illuminated by the Sun, therefore appearing very slightly yellow. The camera should have captured that, and the slightly yellow paper should have been stored in the source. The correct thing for the TV to do is then to display that source image without any additional filtering (you don't want two Suns); therefore Cool is the right choice.

Perhaps that is why Cool is often the default setting for Sports modes on TVs.

Regarding movies, in theory Cool should be the correct setting as well. But d6500k said that many movie makers view their own movies, as they are filming them, through monitors calibrated to 6500 K. And they try to rig up their lighting conditions such that their movie looks the way they want it to, on a 6500 K monitor. Therefore you should also set your TV to 6500 K for watching these movies.

However, that depends on the particular movie I would think. Not everyone might make their movies that way, especially in foreign countries or low budget films. Cool might be the correct setting for many of those.

What about videogames or cartoons? Cool is often the default setting, and Cool is probably what you should use. However, in this case, there was no real life image to begin with, so a white sheet of paper (or text) is stored as pure pure white. Even if you did use 6500 K Standard, that wouldn't be so bad. It would be as if you made your videogame images come to life, then looked at everything in the Sun. That's okay too I guess. That would still be a color filtering though, and not an accurate representation of the original data. The question is simply what effect you want to achieve.

Porcupine
02-27-2006, 05:15 PM
Some other side notes now. Fluorescent lights do appear to me to be pure white. Do they have a color temperature? Technically they don't. The reason is that fluorescent lights physically behave nothing like blackbodies. They don't put out a continuous spectrum of light wavelengths. Instead they only put out a very select few wavelengths. However, this balance appears very white to our eyes, which only have 3 kinds of receptors (Red, Green, and Blue). I like fluorescent lights and think they give better color balance than the Sun; that is why I always try to look at things under fluorescent lighting. However, there is a pitfall to them which is well known among painters and modelers. Sometimes, certain colors appear totally weird under fluorescent lighting. For example, if you are unlucky, an object that is a particular shade of orange could suddenly turn black (or other weird colors) under fluorescent lighting, etc. This is because they don't put out a continuous spectrum of colors. Anyway, that's a slightly different issue but I just wanted to mention it. :) It has to do with the way colors add and subtract and whether or not the colors of an object are true colors of that wavelength or superpositions of other colors that appear to be that color.

Incandescent lights are probably more like blackbodies than fluorescents, but they put out a very yellow light. And even blackbodies are not perfect lighting sources. An ideal lighting source puts out equal-intensity light at all visible wavelengths, which blackbodies don't even come close to. But no such thing exists in real life, so the Sun is the best thing we have, I guess.

This is all crazy side stuff now, not really related to TVs, sorry. :)

Um, oh yeah, I do know that SOME videogame makers try to make their games such that they are correctly viewed at 6500 K. I believe all Namco games do this, because the Namco logo is always pretty Red when you look at it in magazines and posters, whether viewed under fluorescent lighting or the nearly-white Sun. But it looks slightly pink at 9300 K, and super pink at 15000 K. It looks the correct shade of red in the games themselves at 6500 K (or at least what my uncalibrated TV and monitors think is 6500 K). So play your Namco games at 6500 K if you want to be right. :) But for any other games all bets are off; Cool is probably more likely to be what they intended. :)

Anyway so my end conclusion is that both 6500 K and Cool are two competing standards and there are very good reasons for using both. A different choice may be more correct depending on what viewing material you are currently watching. But Cool is way more genuine than people give it credit for, which is what I've always suspected. A lot of people say 6500 K is invariably the right way to go, but I say they are wrong.

Criticism or thoughts on my arguments are very welcome.

d6500k
02-27-2006, 06:56 PM
Good Stuff!

Porcupine, one of the reasons for a true measured calibration is so you may have more than one "mode" accurately displayed. Many of my clients "prefer" (right/wrong notwithstanding) a higher k temp, especially for casual sports viewing or gaming. Television manufacturers understand this and provide various presets to comply with the wishes of prospective buyers.

Getting back to accuracy. Without measuring the actual display, there is absolutely no way of telling for sure just what temperature "cool" is. It may say in the device nomenclature that cool represents a k temp of 9500, but experience has taught me that it may well be something entirely different, quite literally varying by thousands of degrees either way. That is the true basis of "standards" and why the NTSC, along with SMPTE, adopted the D65 point. They could have, and maybe someday will, adopt a higher (cooler) standard. I've no problem with that whatsoever. My ambition, and the ISF standard, is to get your display to actually hit what it claims: when it says it is outputting D65, it actually is outputting D65, or whatever other preset you desire.

Have fun, you have worked hard on this so go watch a good movie! You deserve it!

Doug k

scgalena
03-02-2006, 10:33 PM
SMPTE says that to reproduce 6500K a display should use a ratio of 30% red, 59% green and 11% blue. Back when those SMPTE standards were written, the CRT's red phosphor efficiency was much less than today's and had a very noticeable orange cast rather than the more accurate red tint we see today. Even then, manufacturers didn't necessarily stick to standards and routinely added extra blue drive to balance the phosphor-induced yellow in the face tones. You old timers will remember the orange reds.
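
For what it's worth, those percentages are the same weighting used to form luma in standard-definition video (Rec. 601): Y' = 0.299 R' + 0.587 G' + 0.114 B'. A quick sketch applying it to 75% color bars (the 8-bit values below are the usual approximations):

```python
# The 30/59/11 split is the Rec. 601 luma weighting. Applying it to 75%
# color bars shows why the bars are ordered the way they are.
BARS = {
    "white":   (191, 191, 191),
    "yellow":  (191, 191, 0),
    "cyan":    (0, 191, 191),
    "green":   (0, 191, 0),
    "magenta": (191, 0, 191),
    "red":     (191, 0, 0),
    "blue":    (0, 0, 191),
}

def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

for name, rgb in BARS.items():
    print(f"{name:8s} Y' = {luma(*rgb):6.1f}")
# Descending luma order (white, yellow, cyan, green, magenta, red, blue) is
# exactly the left-to-right order of a standard color bar pattern.
```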

Porcupine
03-03-2006, 04:34 PM
d6500k, so ISF calibrators will also calibrate your "Cool" to an actual 9300 K (or whatever it should be), as well as your 6500 K to an actual 6500 K, right? That's good to know, thanks.

scgalena, do you have a link to an online source that references the 30% to 59% to 11% RGB ratio you mentioned? I'm curious to see what those numbers represent. Especially curious to me is why the green value is so high...

By the way, I found the following picture of the 6500 K blackbody curve from an astronomy website:

http://www.cur.org/publications/aire_raire/images/oregon4.jpg

Eyeballing that, it is hard to say whether that is blue or red-shifted. It peaks out in the blue, and the red parts are pretty low. However, also of critical importance is how wide the red parts are (in the X-direction) and how wide the blue parts, and green parts are. Because all the "red parts" add up to Red (since your eye only has 3 types of color detectors) you have to in some sense integrate the area under the curve to get your overall Red intensity value. Anyway, the astronomers say that curve is red-shifted slightly, if you believe them. :)

I still need to find out more things, especially regarding how things work on the source side...do cameras really capture the real-life image exactly as it appears in real-life, or do they automatically put that image through a color filter such that the image is properly viewed on a 6500 K monitor, or...etc etc...

Also I've been looking through websites, and some of them have said that in Japan the most common TV standard is 9300 K. I don't know if they are right though.

scgalena
03-03-2006, 05:26 PM
Did some searching around and found a site which mentions the 30, 59, 11 ratio. That color ratio is also what you would find at the prism splitter and RGB color separation filters behind the lens in a broadcast camera if it was pointed at a 6500K illuminated white card.
http://www.epanorama.net/links/videosignal.html

SMPTE bulletin board. You must be a member to view standards documents.
http://www.smpte.org/message_boards/index.cfm?postid=8&topicid=2

scgalena
03-04-2006, 11:25 AM
I still need to find out more things, especially regarding how things work on the source side...do cameras really capture the real-life image exactly as it appears in real-life, or do they automatically put that image through a color filter such that the image is properly viewed on a 6500 K monitor, or...etc etc...


Yes, the more sophisticated TV cameras and lenses have built-in diascopes and an auto set-up function which will automatically do a scan of an internal test image. While scanning, the computer will do a series of corrections and optimize such things as 6500K white balance, gamma, black level and flare compensation and save them to memory. With ENG cameras the operator, on the other hand, simply shoots a white card and pushes a button to store, much like a home video camera.
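
As a sketch of the simplest version of that "shoot a white card, push a button" step (a toy example with made-up pixel values; real cameras do this in hardware along with gamma, flare, and matrix corrections):

```python
# Toy white-balance sketch: average the RGB over the white card and derive
# per-channel gains that make the card read neutral. Pixel values are made up.
def white_balance_gains(card_pixels):
    n = len(card_pixels)
    avg_r = sum(p[0] for p in card_pixels) / n
    avg_g = sum(p[1] for p in card_pixels) / n
    avg_b = sum(p[2] for p in card_pixels) / n
    # Normalize to green, the usual reference channel.
    return (avg_g / avg_r, 1.0, avg_g / avg_b)

# A white card shot under warm (yellowish) light: red reads high, blue reads low.
card = [(210, 180, 140), (208, 182, 139), (212, 179, 142)]
print([round(g, 2) for g in white_balance_gains(card)])
# Result: cut red a little, boost blue, leave green alone.
```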

A more precise "ISF" type calibration can be made using a test chart such as the ChromaDuMonde from DSC Labs http://www.dsclabs.com/chromadumonde.htm . By watching scope patterns from charts such as these you are also able to adjust the matrix and saturation as well as the above mentioned settings.

For outdoor sporting events charts are usually just not practical for keeping white balance in line. This is one area where color shading or painting multiple cameras to look good and match each other is more an art than a science. One of the worst scenarios for always changing color temperatures is a late afternoon baseball game that lasts through sunset and wraps-up under the lights. Mix in a little of the bluish shadows from the stadium walls and some thick clouds that always seem to start rolling through on cue and you have some very busy video operators!

Porcupine
03-06-2006, 05:12 PM
Hmm, I took a look at that link you gave and it was not useful for this issue, oh well. All it mentioned was that ratio, but it gave no measurement units or explanation of what those numbers really represent.

But if good cameras do automatically calibrate themselves for 6500 K like you just said, then all the details shouldn't matter and if we set our displays for 6500 K the original image should get reconstructed. So even sports or any material produced in the USA where 6500 K is the standard should probably be viewed at 6500 K, if all that is right.

Of course, in my case, where almost everything I watch is foreign from Japan, what setting I should watch at remains in question so I'll continue to think about this issue and experiment with settings as I view different shows and games. Thanks for the help and input.

scgalena
03-06-2006, 08:22 PM
Hmm, I took a look at that link you gave and it was not useful for this issue, oh well. All it mentioned was that ratio, but it gave no measurement units or explanation of what those numbers really represent.


Sorry, try this link. This one has a good illustration of the 30, 59, 11 ratio of RGB light levels in the luminance portion of a color bar image.

http://nemesis.lonestar.org/reference/internet/web/color/charts/bars.html

Porcupine
03-07-2006, 10:21 PM
Hmm, I'm sorry for being stupid and/or blind, but on this latest page you linked, all I see is a lot of nice-looking color bars and numbers... but none of those numbers match your quoted 30, 59, 11 RGB values. They are totally different numbers. I don't see how this applies. Maybe I just didn't look at the right section...?