High Def Forum

Monster (marketing) Cable HDMI 1.3 vs. off-brand! A test bench from a visual perspective

jmgator
11-16-2007, 01:12 PM
Hey everyone,
I know the big question with the latest and greatest HDMI picture/sound quality is: do you really need to spend a large sum of money on that cool Monster cable that is "ultra fast," produces 6.68 Gbps+ speeds, and has 24k gold plating, while the sales associate pushes you hard to buy Monster because nothing is as good as a Monster cable? Well, let's test that theory.

To answer that question, I purchased cables from the following brands: Monster and monoprice.com.

Monster Ultra 800 HDMI at 6.68 Gbps
http://www.bestbuy.com/site/olspage.jsp?skuId=8473476&st=monster+ultra&lp=4&type=product&cp=1&id=1188558641997

Monoprice.com - 1.3a HDMI cable
http://www.monoprice.com/products/product.asp?c_id=102&cp_id=10240&cs_id=1024008&p_id=3952&seq=1&format=2

OK, so we have a $130.00+ cable vs. a $6.00 cable from a company that most people haven't heard of. You would think the Monster cable should produce 5x the picture and sound at that price point, right?

The verdict.......I will let you know tonight or tomorrow.:thumbsup:
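For context on those Gbps numbers: HDMI's TMDS link sends each 8-bit color value as 10 bits on each of three data channels, so the aggregate data rate works out to 30x the pixel clock. A back-of-the-envelope sketch in Python (standard CEA-861 pixel clocks; note that 6.68 Gbps is Monster's marketing spec, well above what any of these formats actually requires):

```python
# CEA-861 pixel clocks for common HD video formats, in MHz
PIXEL_CLOCK_MHZ = {
    "720p/60": 74.25,
    "1080i (30 frames/s)": 74.25,
    "1080p/24": 74.25,
    "1080p/60": 148.50,
}

def tmds_gbps(pixel_clock_mhz, channels=3, bits_per_pixel=10):
    """Aggregate TMDS data rate: each 8-bit value travels as 10 bits per channel."""
    return pixel_clock_mhz * 1e6 * channels * bits_per_pixel / 1e9

for fmt, clk in PIXEL_CLOCK_MHZ.items():
    print(f"{fmt}: {tmds_gbps(clk):.2f} Gbps")  # 1080p/60 works out to ~4.46 Gbps
```

So even 1080p/60 needs only about 4.46 Gbps; either cable's rated bandwidth covers it.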

Lee Harvey
11-16-2007, 03:01 PM
Five bucks says that the Monoprice cable produces a better picture. I have purchased expensive HDMI cables and Monoprice cables, and the Monoprice ones do as well as or better than the expensive cables. Monoprice uses 24 Ga or larger wire for each conductor in the cable. Most of the other guys use 32 Ga or smaller wire. A basic rule of thumb is that you can pass more current and more bandwidth with a larger conductor than with a smaller one.

Loves2Watch
11-16-2007, 04:31 PM
Five bucks says that the Monoprice cable produces a better picture. I have purchased expensive HDMI cables and Monoprice cables, and the Monoprice ones do as well as or better than the expensive cables. Monoprice uses 24 Ga or larger wire for each conductor in the cable. Most of the other guys use 32 Ga or smaller wire. A basic rule of thumb is that you can pass more current and more bandwidth with a larger conductor than with a smaller one.

With digital however (i.e., the signal sent over an HDMI cable), the information is encoded differently... At its lowest level, it's nothing but a string of bits... In other words, each signal is either ON or OFF... It doesn't care if a particular timeslice is 4.323 volts or 4.927 volts... It's just ON... The "pass more current and bandwidth" issue doesn't come into play. I work as a broadcast engineer. I live and breathe digital and analog signals every day. So yes, you could say I'm qualified to give the answer to this question...

That answer is, "No, an expensive HDMI cable will make NO difference in the quality of your picture OR sound"
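The on-or-off behavior described above can be sketched numerically. This is a toy model (ideal 0/1 levels, Gaussian noise, a single slicing threshold), not real TMDS signaling, but it shows why digital links tend to look perfect until the noise margin is gone, then fail abruptly:

```python
import random

def bit_error_rate(n_bits, amplitude, noise_sigma, threshold=0.5, seed=0):
    """Send random bits as 0/amplitude levels, add Gaussian noise,
    slice at the threshold, and return the fraction decoded wrong."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        b = rng.randrange(2)
        received = b * amplitude + rng.gauss(0.0, noise_sigma)
        errors += (received > threshold) != bool(b)
    return errors / n_bits

# Full-strength signal: huge margin, zero errors. As attenuation eats the
# margin, errors appear suddenly: the "digital cliff."
for amplitude in (1.0, 0.7, 0.55, 0.5):
    print(f"amplitude {amplitude}: BER {bit_error_rate(10_000, amplitude, 0.02):.3f}")
```

Between "plenty of margin" and "no margin," there is no gradual softening of the picture, just working vs. broken.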

Scottnot
11-16-2007, 06:41 PM
Monoprice uses 24 Ga or larger wire for each conductor in the cable. Most of the other guys use 32 or smaller wire.
Funny, the Monoprice cable cited in this thread is 28 Ga (smaller than 24).
And, I doubt that anyone is using 32 Ga for HDMI cables.
Let's not be starting urban myths.

jmgator
11-16-2007, 09:22 PM
With digital however (i.e., the signal sent over an HDMI cable), the information is encoded differently... At its lowest level, it's nothing but a string of bits... In other words, each signal is either ON or OFF... It doesn't care if a particular timeslice is 4.323 volts or 4.927 volts... It's just ON... The "pass more current and bandwidth" issue doesn't come into play. I work as a broadcast engineer. I live and breathe digital and analog signals every day. So yes, you could say I'm qualified to give the answer to this question...

That answer is, "No, an expensive HDMI cable will make NO difference in the quality of your picture OR sound"



I completely agree. It has now been made official. Those expensive Monster HDMI cables are not worth the money. Monoprice.com should be the new standard for our highdef members.:thumbsup:

drock912001
11-17-2007, 02:28 AM
I completely agree. It has now been made official. Those expensive Monster HDMI cables are not worth the money. Monoprice.com should be the new standard for our highdef members.:thumbsup:
Probably so! But Monster pays heavily for marketing, promotion, and hype. Even Radio Shack carries the Monster brand and pushes it over their own. I had a manager at the Shack tell me about all these patents and awards Monster has won, and he even went on to say they have done innovative things that no other cable manufacturer has. I had to shut him up and tell him that with digital it's either good or bad; there is no in-between. I know from experience on that one.

Scottnot
11-17-2007, 08:57 AM
Most simply, HDMI was developed for two primary reasons:
First, to keep all things digital.
Second, to "simplify" wiring of home entertainment, by reducing it to a "single" wire rather than a multitude of wires.

It achieved this goal quite well, with a few notable limitations:
1) the complex technical requirements of the transmitter-receiver interface, which have been "mostly" overcome by manufacturers.
2) limitations on the distance over which HDMI can be easily implemented.

Why the distance limitation? Simply, to reduce the wire size required by the use of a very small "coaxial bundle". The smaller coax becomes, the greater the attenuation and the lower the bandwidth vs distance.

Attenuation can only be solved by increasing the size of the center conductor - note that the "standard" HDMI cables use 28 awg wire while the "high performance" HDMI cables use 24 or 22 awg wire. Going above 22 awg gets us back into the realm of bundled RG58/59/6 size of cables.

Bandwidth can be partially solved by maintaining strict manufacturing controls of the cables, the connectors and the manufacturing controls of cable/connector attachment. Most cable manufacturers have done a pretty good job of this as well.

So, let's not be misled:

With digital however, (i.e. the signal sent over an HDMI cable), the information is encoded differently... At it's lowest level, it's nothing but a string of bits... In other words, each signal is either ON or OFF... It doesn't care if a particular timeslice is 4.323 volts or 4.927 volts... It's just ON...
At the simplest level, this would be correct; however, let's not forget that the TMDS electrical specification defines much lower voltages than this and requires very tight voltage tolerances (less than a 1.0 V swing) in very tight time slices, resulting in an eye diagram mask with rather strict tolerance requirements. It isn't "simply" ON or OFF . . . . it is EITHER ON or OFF, and the trouble is when it should be ON, but it's OFF!! This CAN and DOES happen with HDMI.

The pass more current and bandwidth issue doesn't come into play.
Which is exactly why the "pass more current and bandwidth issue" DOES come into play. Let's not forget that HDMI cables are essentially bundles of coaxial cables designed to carry high speed data.

As for current: Mr. Ohm tells us that E=IR, which means there will be a voltage drop (signal attenuation) on any cable relative to the current and the resistance of the cable. Now, TMDS specifies a very restrictive voltage swing at the receiver - if the attenuation is too great - no signal! That's why 22 awg cables are better than 28 awg cables.

As for bandwidth: A cable with insufficient bandwidth, or of such a length as to limit the bandwidth, will simply fail to pass the full digital bit stream and again - no signal!

Now, I do not work as a broadcast engineer. I do not live and breathe digital and analog signals every day. So perhaps, you could say I'm not qualified to give the answer to this question...but a little research goes a long way.

That answer, however, remains almost the same: "No, an expensive HDMI cable does not guarantee improvement in the quality of your picture OR sound."
However, ANY HDMI cable that will pass the signal over the distance required is as good for the task as any other cable that will do the same thing.
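The attenuation argument above can be put in rough numbers. Here is a sketch of DC copper resistance by AWG size; it is only a first-order illustration, since at TMDS frequencies skin effect makes real attenuation even more sensitive to conductor size:

```python
import math

RHO_COPPER = 1.68e-8  # copper resistivity, ohm*m at 20 C

def awg_diameter_m(gauge):
    """Standard AWG diameter formula: 0.127 mm x 92^((36 - gauge)/39)."""
    return 0.127e-3 * 92 ** ((36 - gauge) / 39)

def resistance_per_m(gauge):
    """DC resistance of a solid copper conductor, in ohms per metre."""
    radius = awg_diameter_m(gauge) / 2
    return RHO_COPPER / (math.pi * radius * radius)

for gauge in (28, 24, 22):
    print(f"{gauge} AWG: {resistance_per_m(gauge):.3f} ohm/m")
```

Going from 28 AWG to 22 AWG cuts the conductor resistance by roughly a factor of four, which is exactly the "larger wire, less voltage drop" point being made.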

Loves2Watch
11-17-2007, 11:32 PM
It isn't "simply" ON or OFF . . . . it is EITHER ON or OFF, and the trouble is when it should be ON, but it's OFF!! This CAN and DOES happen with HDMI.

Which is exactly what I said.

Loves2Watch
11-17-2007, 11:47 PM
Which is exactly why the "pass more current and bandwidth issue" DOES come into play. Let's not forget that HDMI cables are essentially bundles of coaxial cables designed to carry high speed data.

As for current: Mr. Ohm tells us that E=IR, which means there will be a voltage drop (signal attenuation) on any cable relative to the current and the resistance of the cable. Now, TMDS specifies a very restrictive voltage swing at the receiver - if the attenuation is too great - no signal! That's why 22 awg cables are better than 28 awg cables.

As for bandwidth: A cable with insufficient bandwidth, or of such a length as to limit the bandwidth, will simply fail to pass the full digital bit stream and again - no signal!



In the context the bandwidth issue was discussed, it makes no difference. Either the signal is passed or it is not. A larger gauge wire will make no difference in the quality of the digital HDMI signal. Alright Bill?

Scottnot
11-18-2007, 07:51 AM
A larger gauge wire will make no difference in the quality of the digital HDMI signal. Alright Bill?
Semantics here; unclear what you might mean by "quality of the digital HDMI signal".
The statement was that "A cable with insufficient bandwidth, or of such a length as to limit the bandwidth, will simply fail to pass the full digital bit stream".
And indeed, cables designed and manufactured with larger gauge wire can/should/will exhibit higher bandwidth than cables designed and manufactured with smaller gauge wire.
This (bandwidth) is ONE of the reasons why some cables work quite well with a 720p/60 signal or a 1080i/30 signal but fail to pass a 1080p/60 signal; or why a particular brand of cable can pass a 1080p/60 signal over 6', but fail to pass the same signal over a 25' run. Bottom line is - bandwidth does matter and wire size does matter.
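The distance/gauge point above can be sketched with a toy scaling model. This is illustrative only (arbitrary units; real HDMI loss budgets also involve dielectric loss, connectors, and receiver equalization): skin-effect conductor loss grows roughly with length and the square root of frequency, and shrinks with conductor diameter.

```python
import math

def relative_loss(length_ft, conductor_diameter_mm, freq_ghz=1.0):
    """Toy model: skin-effect conductor loss grows with length and sqrt(frequency)
    and shrinks with conductor diameter. Arbitrary units, for comparisons only."""
    return length_ft * math.sqrt(freq_ghz) / conductor_diameter_mm

D28, D22 = 0.32, 0.64  # approximate 28 AWG and 22 AWG diameters, in mm
baseline = relative_loss(6, D28)           # a short run of thin cable
print(relative_loss(25, D28) / baseline)   # same cable at 25 ft: ~4.2x the loss
print(relative_loss(25, D22) / baseline)   # 22 AWG at 25 ft: ~2.1x the loss
```

In this crude picture, a 25' run of the same thin cable has over four times the loss of a 6' run, which is consistent with cables that pass 1080p/60 at 6' but fail at 25'.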

Loves2Watch
11-18-2007, 08:34 AM
You are missing the point, but oh well.

daleb
11-18-2007, 05:06 PM
Introducing larger gauge wire does not automatically improve bandwidth and transmission along an HDMI cable.
In general that's true for the longest runs, but other considerations do not make it a flat rule for every HDMI cable.

In fact, higher frequencies can be hindered with larger gauge conductors due to greater capacitance between adjacent conductors.

There are many variables a manufacturer has to consider. But the bottom line is they have to meet specific requirements for HDMI transmission for any one cable, 'return loss' and controlling impedance are two critical design considerations.
One design may use different gauge conductors and based on other design criteria still meet the same requirements.

Scottnot
11-18-2007, 08:05 PM
Introducing larger gauge wire does not automatically improve bandwidth and transmission along an HDMI cable.
All things being equal . . . yes it does.

In general that's true for the longest runs, but other considerations do not make it a flat rule for every HDMI cable.
Long runs for HDMI are >30 feet, I would say, certainly "other considerations" such as manufacturing tolerances, etc. make a difference.

In fact, higher frequencies can be hindered with larger gauge conductors due to greater capacitance between adjacent conductors.
Simply not true. (all other "normal" factors being equal)

There are many variables a manufacturer has to consider. But the bottom line is they have to meet specific requirements for HDMI transmission for any one cable, 'return loss' and controlling impedance are two critical design considerations.
One design may use different gauge conductors and based on other design criteria still meet the same requirements.
duh . . .

Oh, I believe the "technical term" that you really wanted to use to blow everyone's mind with your erudite knowledge was "characteristic impedance". Perhaps it's time to go back to your Blue Jeans reference material and brush up.

daleb
11-18-2007, 10:42 PM
All things being equal . . . yes it does.

No, because capacitance could indeed offset the bandwidth advantage of a larger cable at higher frequency.
You know, larger conductors = more capacitance?


Long runs for HDMI are >30 feet, I would say, certainly "other considerations" such as manufacturing tolerances, etc. make a difference.
Simply not true. (all other "normal" factors being equal)


Obviously by talking about differences, I was not talking about 'other things being equal'.


Oh, I believe the "technical term" that you really wanted to use to blow everyone's mind with your erudite knowledge was "characteristic impedance". Perhaps it's time to go back to your Blue Jeans reference material and brush up.

No, Impedance is what I wanted to use.
And within the context of my sentence, it is entirely appropriate.

Besides, BJC did not invent the idea or the terms, they are common in any engineering text. But then why would I expect you to know that?

http://www.quabbin.com/tech_briefs/tech5.html

Since you never provide any evidence to back your statements and refuse to recognize, learn from, or share anyone else's opinion, experience, or knowledge, your credibility suffers accordingly, as witnessed by many others' comments (in both your namesakes).

There are so many factors involved in designing a cable for specific applications, that saying larger conductors are always preferred for HDMI for longer runs is simply naive. In actuality, the smallest cross-section conductor that will do the job is preferred. And that has a lot more to do with design than with less copper = less cost.

Scottnot
11-19-2007, 12:57 PM
No, because capacitance could indeed offset the bandwidth advantage of a larger cable at higher frequency.
You know, larger conductors = more capacitance?
Actually, no, I do not know that, at least not in the realm of coaxial cable. Again, to demonstrate that "all things being equal" is a pretty important criterion: following your oversimplified logic, an RG11 cable (using 14 AWG wire) ought to have higher capacitance and lower bandwidth than RG59 cable (using 20 AWG wire). Well, as it turns out, when the center conductor size is changed, it is also necessary to change the diameter of the outer conductor (or the dielectric material, or both) to maintain the correct characteristic impedance of the cable. And guess what? The capacitance stays (almost) exactly the same, approximately 21 pF/ft, whether the cable is RG59, RG6 or RG11 and whether the center conductor is 20 AWG, 18 AWG or 14 AWG, but the bandwidth and attenuation characteristics are readily improved with the larger inner conductor. The same science applies to HDMI cables.
If you need references to be convinced of these well-known scientific facts (not opinions), you might start at wiki/coax, read it, and then follow a few links to the real world. All of the facts, formulas and technical information are there.
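The constant-capacitance point follows directly from the standard coax formulas. A quick sketch (assuming a solid polyethylene dielectric with relative permittivity of about 2.25; real RG-type cables vary slightly): if the D/d ratio is held so that the characteristic impedance stays fixed, the capacitance per foot is fixed too, regardless of absolute conductor size.

```python
import math

EPS0 = 8.854e-12  # permittivity of free space, F/m

def coax_capacitance_pf_per_ft(ratio_D_over_d, eps_r):
    """Capacitance per unit length of a coaxial line: C = 2*pi*eps / ln(D/d).
    Depends only on the D/d ratio and the dielectric, not absolute size."""
    c_per_m = 2 * math.pi * EPS0 * eps_r / math.log(ratio_D_over_d)
    return c_per_m * 0.3048 * 1e12  # convert F/m -> pF/ft

def coax_impedance_ohms(ratio_D_over_d, eps_r):
    """Characteristic impedance: Z0 = (60 / sqrt(eps_r)) * ln(D/d)."""
    return 60 / math.sqrt(eps_r) * math.log(ratio_D_over_d)

# Solid-PE dielectric (eps_r ~ 2.25); pick D/d so that Z0 = 75 ohms:
ratio = math.exp(75 * math.sqrt(2.25) / 60)
print(coax_impedance_ohms(ratio, 2.25))        # 75 ohms by construction
print(coax_capacitance_pf_per_ft(ratio, 2.25))  # ~20 pF/ft, whatever the size
```

The capacitance lands near the quoted ~21 pF/ft figure for any conductor diameter, as long as the geometry scales to hold the impedance.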

No, Impedance is what I wanted to use.
And within the context of my sentence, it is entirely appropriate.

Besides, BJC did not invent the idea or the terms, they are common in any engineering text. But then why would I expect you to know that?
It could very well be that "Impedance is what (you) wanted to use", however you erroneously used the non-term "controlling impedance", and that non-term, even in the context of your sentence was not only inappropriate, but had no meaning. After all, what you DID say was
. . 'return loss' and controlling impedance are two critical design considerations . . .
. . . and there just is no such thing as "controlling impedance"

Since you never provide any evidence to back your statements . . .
huh??

. . . and refuse to recognize, learn or share anyone else's opinion, experience, or knowledge, . . .
huh?? again.
As for opinion - correct, opinion doesn't much count in technical discussion, and indeed, I am quick to dismiss simple unsupported opinions.
As for experience - some people benefit from it and some don't.
As for knowledge - well, there was a time when everyone KNEW that the world was flat.

There are so many factors involved in designing a cable for specific applications, that saying larger conductors are always preferred for HDMI for longer runs is simply naive.
Well, since I never made any such statement, I won't worry about being naive. Please don't misquote, it's annoying.

In actuality, the smallest cross-section conductor that will do the job is preferred.
. . . took the words right out of my mouth, and I don't believe I have ever said otherwise.
In fact, I recall that I have previously stated in my Post #7 that " . . . ANY HDMI cable that will pass the signal over the distance required is as good for the task as any other cable that will do the same thing."

And that has a lot more to do with design than with less copper = less cost.
Well, now we're back to square one and the issue that the OP brought to this forum. Frankly, I think the "less copper = less cost" issue is somewhat bogus: even a Monoprice 22 AWG/50' cable sells for twice as much as the equivalent 26 AWG cable ($31 vs $15), yet the total added copper is somewhere in the range of 3-5 ounces which, at $3/lb, is about $1 of added material cost. So what's happening? Simple: the "higher quality" or "higher performance" cables can be, and are, sold at a premium, resulting in better margins for the seller. Nothing wrong with that; it makes good business sense, and indeed, Monster is one of the leaders at this marketing strategy, while stores like Monoprice use a different business model and take less advantage of this particular marketing tool.

jmgator
11-19-2007, 02:42 PM
This thread was never opened up to be an in-depth analysis, but only a visual test for an average consumer to show that there isn't a difference between the $130.00 Monster cable and the $6.00 monoprice.com cable. Let's keep it simple for the members out there and help everyone save money.:D

Scottnot
11-19-2007, 03:16 PM
Amen!

Thanks for the experiment.

Can you advise what you actually "pushed through" the cables? Was it 1080p/24?

fmw
11-19-2007, 04:27 PM
Frequency? What frequency? This is a digital transmission line. It sends digital bits.

daleb
11-19-2007, 08:13 PM
Actually, no, I do not know that, at least not in the realm of coaxial cable.

It applies to any conductor, used in coax or otherwise.
Of course it's frequency dependent, meaning at audio frequencies capacitance of a larger conductor in a speaker wire for example, would not be detrimental. At much higher frequencies it certainly can be.



Again, to demonstrate that "all things being equal" is a pretty important criterion: following your oversimplified logic, an RG11 cable (using 14 AWG wire) ought to have higher capacitance and lower bandwidth than RG59 cable (using 20 AWG wire). Well, as it turns out, when the center conductor size is changed, it is also necessary to change the diameter of the outer conductor (or the dielectric material, or both) to maintain the correct characteristic impedance of the cable. And guess what? The capacitance stays (almost) exactly the same, approximately 21 pF/ft, whether the cable is RG59, RG6 or RG11 and whether the center conductor is 20 AWG, 18 AWG or 14 AWG, but the bandwidth and attenuation characteristics are readily improved with the larger inner conductor. The same science applies to HDMI cables.

No surprise there. And the kind of dielectric used is critical, as is the spacing between shield and center conductor, to maintain a specific impedance.


If you need references to be convinced of these well-known scientific facts (not opinions), you might start at wiki/coax, read it, and then follow a few links to the real world. All of the facts, formulas and technical information are there.
I think you could well benefit from the references; I have bookshelves full. But I hope you keep reading, you are showing improvement.


It could very well be that "Impedance is what (you) wanted to use", however you erroneously used the non-term "controlling impedance", and that non-term, even in the context of your sentence was not only inappropriate, but had no meaning. After all, what you DID say was
. . . and there just is no such thing as "controlling impedance"


If you read my post in context I was referring to impedance having to be controlled (along with return loss) etc.
Controlling impedance is not a kind of impedance. I will attribute that to interpretation or poor wording on my part.


As for opinion - correct, opinion doesn't much count in technical discussion, and indeed, I am quick to dismiss simple unsupported opinions.

Coming from you I find that somewhat ironic. At least up to now, where there is evidence of improvement on your part!



............. I won't worry about being naive. Please don't misquote, it's annoying.


Your attempt at humor is appreciated, almost as much as the irony.

jmgator
11-19-2007, 08:48 PM
Amen!

Thanks for the experiment.

Can you advise what you actually "pushed through" the cables? Was it 1080p/24?


We tested Batman Begins at 1080p/24 fps with the setup you will find in my signature, switching between a monoprice.com HDMI 1.3a cable and a Monster Ultra Series 800 HDMI 6.68 Gbps cable. There was absolutely no difference in the visual image and sound reproduction. Save your money on cables and buy more movies!:thumbsup:

Scottnot
11-19-2007, 09:05 PM
We tested Batman Begins at 1080p/24 fps with the setup you will find in my signature, switching between a monoprice.com HDMI 1.3a cable and a Monster Ultra Series 800 HDMI 6.68 Gbps cable. There was absolutely no difference in the visual image and sound reproduction.
Thanks for the additional info. The results are not surprising.

Save your money on cables and buy more movies! .:thumbsup:
Ha, ha, not to worry, all of my HDMI and/or component cables have come from monoprice.

Loves2Watch
11-19-2007, 09:55 PM
Glad to hear that. It just proves what many here have been saying all along.

daleb
11-20-2007, 06:09 PM
Glad to hear that. It just proves what many here have been saying all along.

Nice avatar! :)

Loves2Watch
11-20-2007, 06:45 PM
Nice avatar! :)

Thanks.

xjae14x
11-21-2007, 12:21 AM
We tested Batman Begins at 1080p/24 fps with the setup you will find in my signature, switching between a monoprice.com HDMI 1.3a cable and a Monster Ultra Series 800 HDMI 6.68 Gbps cable. There was absolutely no difference in the visual image and sound reproduction. Save your money on cables and buy more movies!:thumbsup:

Sorry!!! Let me apologize in advance, but I have an off-topic question that hopefully can be answered. I have not been able to find confirmation that the Onkyo receivers (605 and/or 705, as used in the test) can pass 1080p/24 fps... I have only been able to find ONE A/V receiver that indicates it can do this, but it's in the area of $1,500 and thus way out of my budget.

Oh and just to say something on topic...I am running a Samsung BD1400 player hooked up to a 71 series Sammy LCD via the HDMI 1.3 Cat 2 cables from Monoprice and the picture quality is outstanding!! I got into an argument a few days ago at CC when the salesman was offended that I would in no way even consider the $200 HDMI cable he was trying to sell me.

jmgator
11-23-2007, 08:05 PM
Sorry!!! Let me apologize in advance, but I have an off-topic question that hopefully can be answered. I have not been able to find confirmation that the Onkyo receivers (605 and/or 705, as used in the test) can pass 1080p/24 fps... I have only been able to find ONE A/V receiver that indicates it can do this, but it's in the area of $1,500 and thus way out of my budget.

Oh and just to say something on topic...I am running a Samsung BD1400 player hooked up to a 71 series Sammy LCD via the HDMI 1.3 Cat 2 cables from Monoprice and the picture quality is outstanding!! I got into an argument a few days ago at CC when the salesman was offended that I would in no way even consider the $200 HDMI cable he was trying to sell me.

Yes, the 705 can, but I cannot speak for the 605 (never tested). Don't quote me on this... I believe the receivers are all the same: the 605, 705, and 805 can all support a bandwidth of 1080p/60 with audio processing. We watched the movie in Dolby TrueHD, utilizing the 5 Gbps bandwidth it offers. Hope this helps.

By the way, I had the Samsung BDP1400 player but took it back. I didn't like the quirky playback features and the Profile 1.0 compliance issue. I love my Toshiba :yippee:

rivy1992
11-25-2007, 10:42 AM
Dear High Def Gurus,

I greet you all with the utmost respect as I have racked my brain for 48 hours now trying to resolve a problem with no success at all!!

I purchased a new Toshiba 42", 1080p LCD HDTV Friday morning and have been trying all weekend to hook up the Motorola HD-DVR (DCT6412 III) cable box provided by Comcast Cable to the new television using an HDMI cable. No matter what I try, the TV tells me that there is no signal present when I set the TV's input to HDMI 1 (the recommended connection location per the manufacturer, which is unlocked).

Comcast has been here twice and can't figure it out (they tried 3 different cable boxes and none of them could transmit the signal to the TV)... the manufacturer can't figure it out, I have returned the TV twice (this is the 3rd model I have brought home)... I have returned the HDMI cable twice (third set of MONSTER, 4.65 Gbps), and still no signal through the HDMI 1 port...

Can anyone please help me? :banghead:

SteKbierr
11-30-2007, 04:30 AM
Nothing wrong with using component cables in my book! (ummm nice rich colors!)

Scottnot
11-30-2007, 09:42 AM
Nothing wrong with using component cables in my book! (ummm nice rich colors!)
In theory, and all else being equal, you should see no difference.
The "nice rich colors" could easily be the difference between calibration settings of the component vs the HDMI input on your TV.

Joe_news
11-30-2007, 09:20 PM
You should have tried using the M1000!

pappylap
11-30-2007, 11:07 PM
Dear High Def Gurus,

I greet you all with the utmost respect as I have racked my brain for 48 hours now trying to resolve a problem with no success at all!!

I purchased a new Toshiba 42", 1080p LCD HDTV Friday morning and have been trying all weekend to hook up the Motorola HD-DVR (DCT6412 III) cable box provided by Comcast Cable to the new television using an HDMI cable. No matter what I try, the TV tells me that there is no signal present when I set the TV's input to HDMI 1 (the recommended connection location per the manufacturer, which is unlocked).

Comcast has been here twice and can't figure it out (they tried 3 different cable boxes and none of them could transmit the signal to the TV)... the manufacturer can't figure it out, I have returned the TV twice (this is the 3rd model I have brought home)... I have returned the HDMI cable twice (third set of MONSTER, 4.65 Gbps), and still no signal through the HDMI 1 port...

Can anyone please help me? :banghead:

First of all, eliminate the possibility that it's your HDMI input: hook up another source via HDMI and see if you get the signal... In another thread some guy was told that Comcast did not support HDMI; that may be it... Sorry if this hijacked the thread, but I wanted to try and help this poor guy...

http://www.highdefforum.com/showpost.php?p=426520&postcount=1

BoSoxMole
12-01-2007, 09:31 PM
Well, what did you find out jmgator?

huy30
12-03-2007, 03:02 PM
First of all, eliminate the possibility that it's your HDMI input: hook up another source via HDMI and see if you get the signal... In another thread some guy was told that Comcast did not support HDMI; that may be it... Sorry if this hijacked the thread, but I wanted to try and help this poor guy...

http://www.highdefforum.com/showpost.php?p=426520&postcount=1

The problem is he's using Monster cable? :what:



I still have friends that want to spend money on Monster cables.

joevberg
12-14-2007, 01:51 AM
Hi everyone, I posted these 2 posts on another forum:

Ok, explain this one. I too hate the ridiculous prices of Monster cables. I resent the fact that the sales people who just made a pretty dime off me buying expensive equipment try to make me feel like I won't get my money's worth unless I spend a lot more on the best cables. I have read about analog vs. digital, and that as long as the signal is there, that's all that matters. So I have been happy using cheap cables, even the one that came in the box with my DirecTV HR-20.

Until, that is, the other day, when I picked up a Blu-ray player as an open-box buy. They wouldn't honor my 12% coupon because the player wasn't at full price, so I used the coupon on a Monster HDMI cable after arguing with the rep that they're full of crap. She said just bring it back if you don't like it, and by the way, you get a free movie with it. Fair enough. I was too tired to re-run all the cables that night, so instead of hooking up the Blu-ray player, I just replaced the HDMI cable on the HR-20 with the Monster (both only 4').

The next day my wife asked me what I did to the TV (she understands nothing about what is hooked up to it). I asked why, what's wrong? She said, "Nothing, it just looks better, like the stuff in the stores." I said it already did; she said, "No, it never looked this crisp and clear, and the colors are brighter." I finally took the time to watch some TV, and I agree with her. I even tried another cheap HDMI cable just in case the original was a bad one. That is the best test possible: my own eyes.

Now, I am not saying the price is worth the better picture, because the picture is still great with any HDMI or component cables. The only true test would be having the exact same setup side by side running the same content. It's like wine: the expensive bottle is not necessarily better than a cheaper bottle, but you've got to try both, and if you like the cheaper one, or can't tell the difference, or just want the buzz, then definitely save the money. Sorry for such a long post, but I had to share my honest findings.

And my next post:

Ok, I'm back. I may have found my own answer. And if I am right then I should delete my original post, but I'll leave it in case my new theory is wrong.

Could it be that the cheaper cables were older and non-v1.3a? And the new Monster Ultra 600 is v1.3a, which = more bandwidth? If so, then a new Monoprice v1.3a would be just as good, and I can take this Monster back and whip the biach with it.

Thanks for your time folks.

jmgator
12-14-2007, 07:21 AM
That's correct, you could've obtained a really cheap cable. I did the test bench with monoprice.com cables because of their reputation and price. Look at my signature to find your answer. Take back the cable unless you want it for looks... even though you won't be seeing it behind your TV, LoL. I tested the 800 and there was no difference.:hithere:

rwdavis2
12-14-2007, 10:26 AM
Hi everyone, I posted these 2 posts on another forum:

I finally took the time to watch some TV, and I agree with her. I even tried another cheap HDMI cable just in case the original was a bad one. That is the best test possible: my own eyes. Now, I am not saying the price is worth the better picture, because it is still great with any HDMI or component cables.

I'm curious about this opinion. I thought that the only possible difference a cable could make with digital signals was by dropping bits. I don't know much about how an HDMI signal is sent, but how would dropped bits show up on the TV? It's not as if, for example, a 5-volt '1' looks better than a 4-volt '1', is it?
Thanks,
BD
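One way to picture it, as a toy sketch that ignores TMDS encoding and how real receivers conceal errors: a '1' that still slices as a '1' looks identical to a stronger '1', but a bit that flips inside a pixel's 8-bit channel value can swing that channel by anywhere from 1 to 128 levels. That is why marginal HDMI links show scattered "sparkles" rather than a uniformly softer picture.

```python
def flip_bit(channel_value, bit):
    """Flip one bit of an 8-bit color channel value (0-255)."""
    return channel_value ^ (1 << bit)

gray = 0x80  # a mid-gray channel value, 128
print(flip_bit(gray, 7))  # corrupting the MSB: 128 -> 0, gray pixel goes black
print(flip_bit(gray, 0))  # corrupting the LSB: 128 -> 129, invisible change
```

So errors are either invisible or glaring single-pixel glitches; there is no mechanism by which a marginal cable makes the whole image subtly "less crisp."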

joevberg
12-14-2007, 10:45 AM
That's correct, you could've obtained a really cheap cable. I did the test bench with monoprice cables because of their reputation and price. Look at my signature to find your answer. Take back the cable unless you want it for looks... even though you won't be seeing it behind your TV, LoL. I tested the 800 and there was no difference.:hithere:

You missed my point. As I said I have always agreed with you and stuck to cheaper cables, until recently I have noticed a difference. I know my post was long, but read the whole thing, not just the last part.

daleb
12-14-2007, 11:50 AM
You missed my point. As I said I have always agreed with you and stuck to cheaper cables, until recently I have noticed a difference. I know my post was long, but read the whole thing, not just the last part.

I also swore up and down that a cheaper (AR, I think) S-Video cable 'looked better' than a Monster.
Same length of cable, but the Monster was about as thick as my thumb, and its inflexibility would have it popping out of the jack behind my TV set every few days.
To add to the problem, the pin retention of the AR cable was much better. It just grabbed better when inserting it and felt much more solid.
That does not explain why I could still SEE a difference in the picture, as small as it was.

In general, the biggest difference between any two cables is the workmanship and quality of materials. How much shielding, etc. might explain some attributes, but probably none that would be visible to the naked eye. Beyond that, there should be no difference in how they work. So here is one case where the cheaper cable 'appeared' better, but certainly fit better, regardless.