High Def Forum

Why today's best HDTVs aren't worth buying

Cass
01-27-2007, 09:22 PM
Why today's best HDTVs aren't worth buying (http://tech.msn.com/products/article.aspx?cp-documentid=2168611&page=1)

dabig25
01-27-2007, 11:16 PM
Funny how David Katzmaier is reviewing all these TVs and saying great things about some of them. Now he's saying there's not much worth buying????

bobc1215
01-28-2007, 07:57 AM
If you always waited for the next technical improvement, you would never buy anything. At some point you need to just pull the trigger.

BobY
01-29-2007, 12:51 PM
I can practically guarantee that the first displays with HDMI 1.3 still won't support deep color. Oh, they'll receive the information, but they will scale it to the color capabilities of the panel, which won't likely be "billions of colors" for typical consumer-grade LCD or Plasma panels (although Plasmas should have an easier time getting there than LCDs).
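To make that concrete, here's a minimal Python sketch of the kind of truncation I mean--a 12-bit-per-channel deep color input collapsing to an 8-bit panel. The names and the simple right-shift are illustrative assumptions, not any real set's firmware:

```python
# Illustrative sketch (an assumption, not real firmware): a display
# accepts a 12-bit deep color sample but truncates it to its 8-bit
# panel depth, so 4096 input levels collapse to 256.

def scale_to_panel(sample_12bit: int, panel_bits: int = 8) -> int:
    """Drop the precision the panel can't show."""
    return sample_12bit >> (12 - panel_bits)

# Two inputs differing only in the low 4 bits land on the same panel
# level, so the extra color information never reaches the screen.
print(scale_to_panel(0b100000001111))  # 128
print(scale_to_panel(0b100000000000))  # 128
```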

RedSIinPA
01-30-2007, 08:31 AM
Electronics are like cars: no matter what you spend on your stuff, someone's always better/faster/getting all the chicks.

If you like what you see and can afford it, no magazine article will convince me it's not "worth it." Hell, a good part of the fun is the research and the buy. Some people treat buying electronics like they're picking a doctor for life. It's just video and sound. Buy the best you can afford when the time is right.

fryet
01-30-2007, 11:54 AM
I am more worried about my receiver having HDMI 1.3 than my TV. There is no source that can make use of the extra colors available in HDMI 1.3; however, some audio specs are waiting for HDMI 1.3 before they can be used fully.

The article does appear to be pretty poor. Last I checked, there aren't any HDMI 1.3 TV sets on the market.

Emil
01-30-2007, 12:20 PM
Question: where does component fit in? Isn't it analog? Is the limitation bandwidth and display color resolution, then? I'm assuming it does get converted into a digital signal at some point.
Thanks

RedSIinPA
01-30-2007, 12:29 PM
If I were buying equipment right now, I'd really want to research whether HDMI 1.3 is the solution everyone's waiting for. This technology changes faster than anything I've seen. HDMI = source of angst.

Yes, component is an analog signal, with a max output of 1080i I believe.

georgesp
01-30-2007, 10:02 PM
I am ex-broadcast, retired, still doing video work for the Pac-10 and UCLA. Sure, HDMI 1.3 will be better, but when you get to 72 years old and you have seen television go from black and white, to color, and now to digital, you want to see the best you can see now. I might be dead the next day. When we went from black/white to color I was working in broadcast; 1955, I think, is when the station I was working for went to color, and it did not take long for us to go out and get the first color sets we could get our hands on. I have two high-def sets today. One is also my computer monitor, a 37" Sceptre, 1080p. The other is a Westinghouse, 1080p, which I did not want--they were practically giving it away, as it had no HDMI input, just 2 DVI inputs, component, VGA, etc. I was under duress from the wife to buy it. I have DirecTV, the HD-20-600 with the 5-LNB dish, and I get locals in high def.

I have a question on Dish. A friend has no DVR and wants to upgrade. Is it better for them to get a new receiver with a DVR, or buy a TiVo? Dish told them TiVo would not work with Dish. I know that's bull. Thanks, George

RedSIinPA
01-31-2007, 07:48 AM
Buying a newer TiVo can get pretty pricey. Dish should offer a DVR for a cheap monthly rental rate.

Anyways, back on topic, I stand by what most of us are saying: "worth buying" is a very subjective term, based a lot on what you currently have, what free cash you have, and what kind of deal you're getting, so the premise of this thread, imho, doesn't hold water.

herkyjerk
02-01-2007, 09:31 AM
If you always waited for the next technical improvement, you would never buy anything. At some point you need to just pull the trigger.

Whether I'm buying an HDTV or a computer, I have one staunch policy: once I buy, I refuse to keep following the product I bought (like in sales ads & stuff). You will only depress yourself. The computer or TV you buy today will be a dinosaur in a year. This guy's article is stupid. I love HDTV & couldn't care less what I might be missing if I buy now. In 2 years I'll throw these in the gutter & go buy 2 more 'cause I got ca$h & he don't. :thumbsup:

HDohioTV
02-01-2007, 10:15 AM
I was under duress from the wife to buy it.

Now THAT is a 1st!!

Anyway, yeah, this dude is a flake. Shilling HDTVs by day and telling consumers not to buy at night. I bought my 37-inch LCD in December '05 for $1,099. You can buy the same spec/brand TV today for $899, but you know what? In that time I've enjoyed over a year of HDTV programming, including two Super Bowls, two BCS games for my beloved Buckeyes (and one complete--undefeated--regular season), March Madness in HD, MLB in HD, the 2006 World Cup in HD (won't be back till 2010), and a helluva lotta high-def movies on cable. Now, I COULD have stuck with SDTV through 2006 and saved $200, but I think not; I would've missed out on all that great high-def programming. That extra $200 was money well spent IMO. You only live once, and if you work hard every day you've gotta treat yourself from time to time. I'm HD all the way now, and if my 720p TV broke down today, I'd look at 1080p sets tomorrow (but would most likely buy a 720p replacement set).

HDTV forever! ohio

Rob052067
02-01-2007, 10:59 AM
Just like having a 1080p TV when all the broadcasts are only 720p or 1080i, what good is a TV with HDMI 1.3 today when most HD STBs still have only HDMI 1.0 connections (or no HDMI at all)?

BobY
02-01-2007, 11:05 AM
A 1080p display will convert 1080i to 1080p, whether it accepts a 1080p input or not.

HD DVD and Blu-ray disc players are already available with 1080p outputs. A 1080p input is not meant for broadcast signals--there may never be any 1080p broadcasts...
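For what it's worth, the simplest form of that 1080i-to-1080p conversion is a "weave": the set interleaves the two 540-line fields into one 1080-line frame. This Python sketch shows only that naive case; actual displays use motion-adaptive deinterlacing, so take it as illustration, not a description of any real product:

```python
# Naive "weave" deinterlace (illustration only): interleave the two
# 540-line fields of a 1080i frame into one 1080-line progressive frame.

def weave(top_field, bottom_field):
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even scanlines: 0, 2, 4, ...
        frame.append(bottom_line)  # odd scanlines: 1, 3, 5, ...
    return frame

top = [f"line {2 * i}" for i in range(540)]
bottom = [f"line {2 * i + 1}" for i in range(540)]
assert len(weave(top, bottom)) == 1080  # one full progressive frame
```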

I WUV HD
02-01-2007, 06:44 PM
I am more worried about my receiver having HDMI 1.3 than my TV. There is no source that can make use of the extra colors available in HDMI 1.3; however, some audio specs are waiting for HDMI 1.3 before they can be used fully.

I'm curious - which technology is waiting for HDMI 1.3 so that the audio capabilities can be used fully? I hope you're not referring to HD DVD or Blu-ray, since industry insiders from both camps have gone on record as saying the preferred method of processing audio from the discs will be: decode and mix in-player, output as PCM to the receiver. This method requires only HDMI 1.1.

Was there another technology you were referring to?

pappylap
02-01-2007, 10:55 PM
I bought my 37-inch LCD in December '05 for $1,099.

Hey HDohio, I own the same set, bought online in July. Love the set. Not many Olevia owners here, so it's nice to see one. I have one question about my set: the ATSC tuner locks up when watching OTA HD. I live a long way from the stations, approx. 50 miles; I wonder if that is causing the problem?... Ever experience this with your set? :confused:

GoBirds
02-03-2007, 09:03 PM
That's why I wouldn't be caught dead spending $7K on a freakin' TV. My eyes aren't 20/20 anyway. At 35, I have to wear my glasses to enjoy my 1080i no-namer, and that suits my purpose just fine. I say keep your money or spend it on some other fun gadget!

paulc
02-04-2007, 11:31 AM
From what I've read, HDMI 1.3 promises a wider color gamut and the ability to handle uncompressed multi-channel audio.

Color bit depth was a very real issue when we were moving from 8- to 16- to 24-bit color. It has to do with the ability to display very smooth gradients (i.e., little to no banding). BUT that issue had substantially more impact on static images than on moving ones. Nowadays I doubt most people could accurately pick an HDMI 1.3 "Deep Color" image out from a plain old HDMI 1.1 transmitted one. On the audio side, the issue seems to be uncompressed audio. Again, even with $10,000-plus audio gear, the odds of correctly picking out the uncompressed track would be very, very small.

I think it's mostly specsmanship. In many areas of technology you CAN measure differences with test gear while it translates 0% into what your eyes and ears actually see/hear.
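For anyone curious, here is the raw arithmetic behind the "billions of colors" claim--just the bit depths HDMI 1.3 advertises, run through plain Python:

```python
# Levels per channel and total colors for standard 24-bit color vs. the
# deeper modes HDMI 1.3 allows (30, 36, and 48 bits per pixel).

for bits_per_channel in (8, 10, 12, 16):
    levels = 2 ** bits_per_channel
    print(f"{3 * bits_per_channel}-bit color: "
          f"{levels} levels/channel, {levels ** 3:,} total colors")

# 24-bit color: 256 levels/channel, 16,777,216 total colors
# 30-bit color: 1024 levels/channel, 1,073,741,824 total colors
# 36-bit color: 4096 levels/channel, 68,719,476,736 total colors
# 48-bit color: 65536 levels/channel, 281,474,976,710,656 total colors
```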

BobY
02-04-2007, 04:19 PM
Color banding is a real issue. You're right, it won't show up in many scenes, but when it does, it's annoying.

24 bits sounds like a lot for color--millions of colors, after all. But suppose the object on the screen is a red ball lit from one side. With 24-bit color you only have 8 bits of red, and that means 256 shades of red. The red ball looks like a striped beach ball, with clearly visible transitions from one shade to another, rather than the smooth gradient of the real image.

It can show up pretty obviously in CGI animation. Banding is very noticeable on the "Finding Nemo" DVD--look particularly at the scenes inside the anemone when Nemo's dad is trying to wake him for school.
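Here's a quick Python sketch of that arithmetic--quantizing a smooth red ramp across a 1920-pixel scanline to 8 bits. The ramp itself is made up for illustration:

```python
# Made-up illustration: a smooth red gradient across one 1920-pixel
# scanline, quantized to the 256 levels an 8-bit channel can hold.

WIDTH = 1920
ideal = [x / (WIDTH - 1) for x in range(WIDTH)]  # smooth 0.0..1.0 ramp
quantized = [round(v * 255) for v in ideal]      # snap to 8-bit levels

distinct = len(set(quantized))
print(distinct)          # 256 levels for 1920 pixels...
print(WIDTH / distinct)  # ...so each visible "stripe" is 7.5 pixels wide
```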

rbinck
02-04-2007, 04:52 PM
Color banding is a real issue. You're right, it won't show up in many scenes, but when it does, it's annoying.

24 bits sounds like a lot for color--millions of colors, after all. But suppose the object on the screen is a red ball lit from one side. With 24-bit color you only have 8 bits of red, and that means 256 shades of red. The red ball looks like a striped beach ball, with clearly visible transitions from one shade to another, rather than the smooth gradient of the real image.

It can show up pretty obviously in CGI animation. Banding is very noticeable on the "Finding Nemo" DVD--look particularly at the scenes inside the anemone when Nemo's dad is trying to wake him for school.

Although I won't argue the greater points of that post, the 8 bits will give you 256 different luminance levels of red, not shades. I would not venture a guess as to how many shades of red a 16-million-color palette might have, but I know it is a lot more than 256. Mixing a bit of blue and/or green in with the red will still result in a red shade, so in the final analysis there are more than 8 bits to define shades of red.
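A rough count backs that up. Using an arbitrary test of my own (red more than double the other two channels--not a colorimetric definition), Python puts the number of "reds" in a 24-bit palette well past a million:

```python
# Arbitrary illustrative test: call a 24-bit color "red" when the red
# channel is more than double both green and blue, then count them.
# For each red value r, the g (or b) values satisfying r > 2*g number
# (r + 1) // 2, and g and b are chosen independently.

reddish = sum(((r + 1) // 2) ** 2 for r in range(256))
print(reddish)  # 1,398,144 -- far more than 256
```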

BobY
02-04-2007, 05:09 PM
Sorry, I was imprecise, but it is a matter of semantics and context...

My Webster's "New World Dictionary" defines "shade" in the context of color as "a degree of darkness of a color; gradation of a color with reference to its mixture with black", not with other colors. When you start adding other colors, I consider them more properly to be "hues".

But your description is more precise and not open to misinterpretation, particularly in the context of video, so I'll be happy to use it.

rbinck
02-04-2007, 10:15 PM
Yep, it is more complex, I suppose. If you consider pink (light red) a shade of red, which I do, then to get pink you have to add equal amounts of green and blue--something like R=255, B=233, G=233, which only lightens the red to pink. I suppose there is also an argument for pink being a hue, but then all the colors near white (light blue, light green) would be hues too. Not that this comes up too often, however.

paulc
02-05-2007, 10:25 AM
Thanks for stepping up, rbinck. I probably would have responded much as you did; I have dealt with still imagery for far longer than HD issues!

AND on reflection, I was slightly too harsh on "deep color." Correct about the banding, but I didn't consider something I already knew: the "real" benefit of shooting RAW (still) is its 48-bit color space. NOT because you get "better colors" or "less banding," but (IMO) because of the grayscale levels. You have a better ability to pull detail from shadows than with a 24-bit image. Not a HUGE difference--we're talking fairly subtle stuff.
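To put rough numbers on that shadow point, here's a back-of-the-envelope Python sketch. It assumes linear encoding (roughly how RAW sensor data is stored), where each stop down halves the code values available:

```python
# Back-of-the-envelope sketch, assuming linear encoding: count the code
# values available inside one photographic stop, N stops below full scale.

def levels_in_stop(bits, stops_below_peak):
    full_scale = 2 ** bits - 1
    upper = full_scale / 2 ** (stops_below_peak - 1)
    lower = full_scale / 2 ** stops_below_peak
    return upper - lower

# Ten stops down from peak (deep shadow):
print(levels_in_stop(8, 10))   # ~0.25 code values: the detail is gone
print(levels_in_stop(16, 10))  # ~64 code values: room to pull detail up
```

Which is why the difference shows up when you lift shadows and stays subtle everywhere else.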