HDMI audio vs Optical Audio

Old 10-27-2007, 08:43 PM   #1
What is HD?
 

Join Date: Jun 2007
Posts: 3
Default HDMI audio vs Optical Audio

What is considered the highest quality for audio output between HDMI and the optical out? I'm using HDMI for my Blu-ray player but was wondering if using the optical out would enhance the sound (not that it sounds bad, but I'm always looking for the best).

Are there people who set up their systems using HDMI for the video but optical for the sound? It seems more practical to just use HDMI and avoid an extra cable, but maybe the split setup is common.

Thanks for any info on this.
dustyjnz is offline   Reply With Quote
Old 10-27-2007, 08:46 PM   #2
Blu-ray is Betamax 2.0
 

Join Date: Oct 2007
Posts: 494
Default

it's the same
heyman421 is offline   Reply With Quote
Old 10-28-2007, 06:02 AM   #3
fmw
High Definition is the definition of life.
 
fmw's Avatar
 

Join Date: Jun 2007
Posts: 1,033
Default

They transmit data, not sound. The cables can't change the value of the data. They can only transmit it.
fmw is offline   Reply With Quote
Old 10-28-2007, 06:50 AM   #4
High Definition is the definition of life.
 

Join Date: Mar 2005
Posts: 1,993
Default

The optical (S/PDIF) bit rate limit is about 1.5 Mbps, while HDMI will carry a lot more (depending on the HDMI spec; HDMI 1.3 is up to 10.2 Gbps total). This is why you need HDMI for Dolby TrueHD, DTS-HD, and SACD. DD+ is converted to DTS if you use an optical cable, so your receiver will show DTS when you play DD+ through optical.
If your receiver has HDMI, the best connection is HDMI.
__________________
Panasonic 60ST60/Pioneer BDP-51FD /Toshiba HD DVD,
DIRECTV HMC,Harmony Remote One
Apple TV2, Yamaha RXV1700
Def. Tech. Mythos III, V & Gems, Super Cube 6000
Power Conditioner: Belkin PF40
Panasonic TC-P46ST30/OPPO BDP-93
ISF Calibration
APPLE TV2,Harmony Remote One,
Polk Audio Soundbar SDA.
Belkin Power Conditioner
iserum is offline   Reply With Quote
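As a quick sanity check on those bandwidth numbers, here is a rough back-of-the-envelope sketch (my own illustrative figures, not taken from the post above): even plain 5.1 PCM needs several times what the optical connection can pass as a bitstream.

Code:

# Rough bandwidth sanity check (approximate, illustrative figures only).
# S/PDIF optical/coax tops out around 1.5 Mbps for a compressed bitstream,
# while HDMI sets aside tens of Mbps for audio alongside the video.

def pcm_bitrate_mbps(channels, bit_depth, sample_rate_hz):
    """Raw bit rate of uncompressed PCM audio in megabits per second."""
    return channels * bit_depth * sample_rate_hz / 1_000_000

SPDIF_LIMIT_MBPS = 1.5   # approximate ceiling for a compressed bitstream
lossless_5_1 = pcm_bitrate_mbps(channels=6, bit_depth=24, sample_rate_hz=48_000)

print(f"5.1 PCM at 24-bit/48 kHz: {lossless_5_1:.1f} Mbps")
print(f"Fits through optical?    {lossless_5_1 <= SPDIF_LIMIT_MBPS}")

Even before any lossless compression is applied, a 5.1 PCM track works out to roughly 6.9 Mbps, which is why TrueHD and DTS-HD MA have to travel over HDMI, either bitstreamed or decoded in the player and sent as multichannel PCM.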
Old 10-28-2007, 08:37 AM   #5
Former Super Moderator
 
Loves2Watch's Avatar
 

Join Date: Aug 2007
Location: In Flux
Posts: 20,284
Default

Commentary: Specs vs. Reality
Fri Oct 26, 2007 at 04:22 PM ET
By Joshua Zyber

One of the inevitable side effects of the High Definition revolution is that the advanced video and audio technology used in the Blu-ray and HD DVD formats tends to bring out the know-it-all tech geek in home theater fans. Sometimes this can be a great benefit, when knowledgeable users band together to analyze specific technical deficiencies that have occurred and share their feedback with the parties responsible, hopefully leading to improvements in the future. We've seen some of this at various points during the format war. Early Blu-ray releases such as 'The Fifth Element' exhibited obvious visual deficiencies due to weak source materials and poor digital compression encoding. Likewise, HD DVD catalog titles from Universal have been hit-or-miss in quality, many of them recycled from dated and problematic video masters (like 'In Good Company', with its ghastly edge enhancement artifacts). Reviews published on this site and others were negative, and buyers voiced their displeasure to the studios, eventually resulting in improved mastering on subsequent releases. 'The Fifth Element' was even remastered in significantly better quality as a direct result of owner feedback. That wouldn't have happened had no one spoken up about it.

Generally speaking, the High Definition studios, knowing the intense scrutiny their work is placed under, have maintained a much higher standard of quality on recent releases (with some notable exceptions, of course). Just imagine what might have happened had the public been apathetic and merely accepted whatever shoddy treatment they were handed. In this case, the voice of the people resulted in a better end product for everyone to enjoy.

Unfortunately, the above example is a best case scenario. On the flip side of that coin, we have countless cases of agenda-driven individuals attempting to use a partial understanding of technical matters as a bludgeon in arguments supposedly "proving" the superiority of one format over the other. Anyone who's spent time browsing home theater discussion forums has suffered through an endless string of debates about how the HD DVD format "sucks" because its discs can only store 30 GB of content, while Blu-ray discs can store up to 50 GB, and therefore must be amazingly superior. Never mind that HD DVD has time and again proven capable of delivering exceptional picture and sound quality, plus copious bonus material, easily equaling even the best available on Blu-ray. At the same time, there are others who point to the occasional Blu-ray encoded with MPEG-2 compression as being "unacceptable", even though MPEG-2 can certainly achieve excellent results when given enough room to breathe (witness 'Black Hawk Down'). To some people, the actual quality presented to them is irrelevant if they don't like the sound of the specs on paper.

This "specs above all else" mentality has reared its ugly head again recently with the release of 'Transformers' on HD DVD, a title that delivers stunning video and audio, as well as a number of innovative interactive features. What could possibly be the problem here? Well, the soundtrack is only encoded in Dolby Digital Plus format, not a lossless codec such as Dolby TrueHD or an uncompressed one like PCM. In his review of the disc for this site, our Peter Bracke gave the DD+ track a perfect "5" for audio quality and said of it that, "Directionality, imaging, accuracy of localized effects, and the sheer depth of the soundfield are all fantastic stuff." Nonetheless, in the minds of many, this disc is a huge failure, and its soundtrack a pathetic disgrace for not including a TrueHD or PCM option.

I should mention at this point that at least one working Hollywood sound mixer has voiced his opinion that, when played back on his professional dubbing stage, well-mastered Dolby Digital Plus soundtracks encoded at the high 1509 kb/s bit rate that Paramount uses can be audibly transparent to the studio masters, when tested on movies that he mixed himself and would presumably know better than anyone else. But what use is the informed opinion of an expert in the field when it's easier to just point to the specs list on the back of a disc's packaging to make conclusive statements about matters of quality? In the forum on this site, a number of readers have made proclamations such as, "Compressed audio is just not acceptable these days" and "Whether you can tell the difference or not is irrelevant."

The disc's audio being indistinguishable from its studio master is "irrelevant"? Even with just a Dolby Digital Plus track, the 'Transformers' disc rated the highest score for audio quality that we can give. What more could we demand from it? It's absolutely terrific, but it's just not absolutely terrific enough if the packaging doesn't have a listing for TrueHD or PCM, even when it's likely impossible for human ears to tell the difference? What kind of argument is that?

The lossy compressed audio formats offered by Dolby and DTS use perceptual encoding techniques to filter out data from the studio masters in order to conserve disc space. The intent of perceptual encoding is that the data removed should consist mainly of either frequencies beyond the range of human hearing or frequencies that would normally be masked by other frequencies in the track anyway. With the most heavily compressed formats, including basic Dolby Digital and DTS (the standards on regular DVD), often additional frequencies within the range of hearing are affected, and this has resulted in much variability in sound quality. However, Dolby Digital Plus, especially the 1509 kb/s variety found on a disc like the 'Transformers' HD DVD, uses much more efficient encoding techniques at a very high bit rate. The people who actually make these movie soundtracks have found it pretty impressive, and yet average home listeners seem to believe with absolute certainty that the home theater speakers in their living rooms would be capable of resolving with precision the mathematical difference between a high bit rate Dolby Digital Plus track and a lossless one, and that their golden audiophile ears would also be capable of discerning it. Personally, I would like to put these people to a properly-controlled blind test, where all of the audio levels have been carefully matched to the same volume, and then see how well their hearing fares.
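For what it's worth, the level-matched blind test the author describes is easy to sketch in code. The snippet below is purely illustrative: the "tracks" are synthetic tones and the listener is simulated as a guesser, which is what a real listener would effectively be if the lossy encode were truly transparent.

Code:

# A minimal sketch of the level-matched, randomized blind trial described
# above. Everything here is a stand-in: the two "tracks" are synthetic sine
# waves, and the listener is simulated as a pure guesser (the case where the
# lossy encode really is audibly transparent). A real test would play the
# hidden clip to a person and record their answer instead.
import random
import numpy as np

def match_rms(reference, other):
    """Scale 'other' so its RMS loudness matches 'reference'."""
    gain = np.sqrt(np.mean(reference ** 2) / np.mean(other ** 2))
    return other * gain

rng = random.Random(0)
t = np.linspace(0.0, 1.0, 48_000)
lossless = 0.50 * np.sin(2 * np.pi * 440 * t)   # stand-in for the studio master
lossy = 0.45 * np.sin(2 * np.pi * 440 * t)      # stand-in decode, slightly quieter
lossy = match_rms(lossless, lossy)              # remove loudness as a giveaway

trials, correct = 16, 0
for _ in range(trials):
    hidden = rng.choice(["lossless", "lossy"])           # randomized presentation
    clip = lossless if hidden == "lossless" else lossy   # what would be played back
    guess = rng.choice(["lossless", "lossy"])            # simulated guessing listener
    correct += (guess == hidden)

print(f"{correct}/{trials} correct; about {trials // 2} expected from chance alone")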

I would not claim that all DD+ tracks are flawless or transparent to their masters; it does take some effort to encode them properly. But to dismiss the format out of hand simply because the soundtrack isn't labeled as lossless or uncompressed demonstrates an ignorance of the technology being used. If the audio codec alone were the only important criterion in sound quality, how could it be that a disc like 'Dinosaur' with a 48 kHz / 24-bit PCM 5.1 track would sound so underwhelming? With specs like those, why isn't that disc a spectacular audio showcase? Somehow I doubt you'll find too many critical listeners who would ever claim that 'Dinosaur' sounds better than 'Transformers', but based on the specs, shouldn't it? Perhaps it's time we all realize that there's more to quality than the specs can tell us.

Yet we see the same thinking applied to matters of video. How many more arguments must there be about the different video compression codecs? Proponents on one side proclaim the infallible superiority of VC-1 above all other options, while those opposed insist that VC-1 is garbage and only AVC MPEG-4 is any good. Both camps attempt to prove their point by capturing screen shots on their computers, which they run through Photoshop to crop, zoom, filter, and distort in all manner of convoluted ways in order to locate individual errant pixels, completely invisible to the naked eye in the normal course of movie watching, and heartily declare their victory in the debate.

The truth of the matter is that all video compression codecs have the same purpose, to accurately represent the source using a fraction of the storage space. In the hands of a good operator, both VC-1 and AVC are more than capable of achieving this goal. Even the dated MPEG-2 codec has been known to deliver excellent results (owners of the now-defunct D-Theater tape format sure didn't seem to have any problem with it). There are plenty of examples of "reference quality" transfers using any of the above, from 'King Kong' (VC-1) to 'Final Fantasy' (AVC) to 'Kingdom of Heaven' (MPEG-2). In all cases, the skill of the compressionist and the quality of the work is more important than the codec used to get there.

It's also more important than the bit rate. As far as I'm concerned, Sony's decision to incorporate a bit rate meter in their PS3 Blu-ray player is one of the worst things to have ever happened to the home theater hobby. Because of that one seemingly-innocuous and frequently-inaccurate data display, now just about anyone, no matter how technologically ignorant, can believe themselves to be experts in the field of video reproduction, based on nothing more than whether their bit rate meters read a high number or a low one -- as if that number were even relevant. The whole point of video compression is to squeeze a High Definition picture into as little space as possible. A compressionist who's maintained a high-quality picture with a low bit rate has done an excellent job, but that's a point lost on most consumers, who assume that a good picture needs a high bit rate, regardless of what they actually see on their TV screens. The bit rate alone is a meaningless statistic and says nothing about the quality of the compression work. It is equally possible to create a lousy video image with a high bit rate, or a great image with a low bit rate, depending on the complexity of the content and how well the work is done. I found it extremely amusing to read complaints about the low bit rate used on 'TMNT', a disc with a razor sharp and amazingly detailed picture that some owners nonetheless decried as "soft" against the evidence their own eyes gave them, for no reason other than an ill-founded assumption that the picture would have been even sharper if the bit rate meter spiked a little higher. How would they know? Have they compared it against the studio master?
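Comparing a disc against the studio master is exactly what an objective full-reference metric does, and it says more than a bit rate meter can. A minimal sketch (PSNR is my own choice of metric here, and the frames are synthetic stand-ins rather than real decoder output):

Code:

# Sketch: score a decoded frame against the source frame with PSNR, a simple
# full-reference fidelity metric. The frames here are synthetic stand-ins;
# real ones would come from the master and from a VC-1/AVC/MPEG-2 decode.
import numpy as np

def psnr(source, decoded, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the source."""
    mse = np.mean((source.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")   # bit-identical to the source
    return 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
source = rng.integers(0, 256, size=(1080, 1920), dtype=np.uint8)
noise = rng.integers(-2, 3, size=source.shape)   # mild compression-like error
decoded = np.clip(source.astype(np.int16) + noise, 0, 255).astype(np.uint8)

print(f"PSNR vs. source: {psnr(source, decoded):.1f} dB")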

This misconception has reached such heights of absurdity that certain viewers have started petitions demanding that Warner Bros. stop using the same video encodes on HD DVD and Blu-ray, and instead "maximize" the bit rates on their Blu-ray releases if the extra disc space is available. But for what purpose? Video compression doesn't work on a linear scale. Using advanced codecs like VC-1 and AVC, there are diminishing returns above a certain point, and throwing more bits at a picture that doesn't require them accomplishes nothing more than to make the meter number go up. As time goes on, compression tools and techniques become more efficient, requiring even less space to achieve visual transparency to the original master. Warner Bros. has many times over demonstrated outstanding results within the 30 GB limit of HD DVD, even on very long films such as the 'Troy: Director's Cut', a movie that runs 3 1/2 hours and yet fits comfortably on a 30 GB disc with beautiful picture quality, despite also squeezing in a lossless Dolby TrueHD audio track and a bunch of supplements. So what if the Blu-ray edition has an extra 20 GB of space available? Are we watching the movie or watching the bit rate meter? If there were no bit rate meter, would anyone have a legitimate basis to complain?
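The disc-budget arithmetic behind that 'Troy' example is easy to rough out. The numbers below are illustrative assumptions (overhead, audio bit rate), not figures from the actual disc:

Code:

# Back-of-the-envelope disc budget: what average video bit rate is left on a
# 30 GB disc for a roughly 3.5-hour film once a lossless audio track and some
# overhead are set aside? All of these inputs are illustrative assumptions.
DISC_GB = 30            # HD DVD dual-layer capacity
OVERHEAD_GB = 3.0       # assumed menus, extras, filesystem, etc.
RUNTIME_S = 3.5 * 3600  # about 3 1/2 hours
AUDIO_MBPS = 4.0        # assumed average rate for a Dolby TrueHD track

available_mb = (DISC_GB - OVERHEAD_GB) * 8_000   # decimal GB -> megabits
audio_mb = AUDIO_MBPS * RUNTIME_S
video_mbps = (available_mb - audio_mb) / RUNTIME_S

print(f"Average video bit rate available: {video_mbps:.1f} Mbps")   # ~13 Mbps

Under those assumptions the video ends up with an average in the low teens of Mbps, which is roughly where a 'Troy'-length encode would have to sit; the article's point is that a skilled compression job at that rate can still look beautiful, and extra disc space beyond it mostly just makes the meter read higher.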

Back when they were supporting both High-Def formats, Paramount actually did what these users are demanding. They authored every movie separately for HD DVD and Blu-ray, each maximized to its format's potential. And what were the results? The same movie looked visibly identical on the bit rate maximized Blu-ray as it did on the lower bit rate HD DVD. Once again, the quality of the compression trumped other considerations regarding tech specs or bit rate.

Don't get me wrong, I'm not trying to imply that all HD DVDs and Blu-rays are perfect now. Video artifacts do occur, and the studios have been known to rest on their laurels and allow shoddy work to slip through. Sometimes disc space really does strain the limits of what a studio wants to include on a High-Def title. It's important to scrutinize their results, lest we return to a state where the original 'Fifth Element' Blu-ray is considered acceptable. But it's equally important to understand what we're actually looking at. Many times, the "artifacts" picked apart by viewers have nothing to do with video compression or encoding whatsoever, but rather are issues found in the source, such as natural film grain, which isn't a flaw at all. Yes, a soft picture can be the result of poor compression or excessive filtering, but it can also be the result of soft focus photography. A heavily-grainy image could be overcompressed, or it could be stylistically intentional. Not every movie is photographed to look exactly the same as every other, and even within a film certain shots or scenes may look different than others. We must understand what a movie is supposed to look like before we can judge how well a video disc reproduces it. Being moderately proficient at manipulating still images in Photoshop does not necessarily qualify someone as an expert in the art of filmmaking.

I'm not suggesting that viewers should relax their standards or accept substandard quality as "good enough" when it's really not, but the technical specs alone simply do not tell the whole story, and over-emphasizing them is a matter of misplaced priorities. We should judge these discs by the actual quality they deliver, not by misleading statistics like the bit rate or the specs listing on the packaging. Surely, that can't be too much to ask.
__________________
It's always time for pie
Live everyday as if it was your last and plan on living forever...
Loves2Watch is offline   Reply With Quote
Old 04-09-2008, 03:30 AM   #6
What is HD?
 

Join Date: Jan 2008
Posts: 3
Default

Great post Loves2watch. I couldn't agree more.
Art in Heaven is offline   Reply With Quote
Old 04-09-2008, 04:48 PM   #7
Member
 
BIslander's Avatar
 

Join Date: Jan 2008
Location: Bainbridge Island, WA
Posts: 1,627
Default

Quote:
Originally Posted by dustyjnz View Post
What is considered the highest quality for audio output between HDMI and the optical out? I using HDMI for my blu-ray but was wondering if using the optical out would enhance the sound (not that it sounds bad, but always looking for the best).

Are there people who setup their systems using the HDMI for the video but then the optical as the sound? seem more practical to just use the HDMI to avoid one less cable but maybe it is common.
To the OP (dating back to 2007) - HDMI can handle multichannel PCM and can bitstream the new high bit rate codecs including DD+, Dolby TrueHD, DTS-HD HRA, and DTS-HD MA. Optical is limited to two channels of PCM and to legacy multichannel codecs such as DD and DTS. HDMI is the only way to get lossless digital audio.

To Loves2Watch - Personally, I prefer lossless audio. Lossy DD and DTS are encoded at high bit rates on Blu-ray and they sound quite a bit better to me than the lower bit rate versions commonly used on DVD. But, I find lossless audio to be another step up - clearer and more detailed. The DD+ sound track on Transformers is spectacular and I have no idea whether a lossless track would be an improvement. I do agree with the author that higher bit rates do not necessarily mean better sound. But, in my personal experience, I almost always prefer a lossless track to its lossy cousins.
BIslander is offline   Reply With Quote
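BIslander's rundown of what each connection can carry is essentially a small lookup table. Here it is as a sketch in code, reflecting the claims in that post (illustrative, not exhaustive):

Code:

# Which digital connection can carry which soundtrack format, per the summary
# above. Illustrative and not exhaustive; actual behavior (bitstream vs. PCM)
# also depends on the player and receiver.
CARRIES = {
    "optical/coax (S/PDIF)": {"2.0 PCM", "Dolby Digital", "DTS"},
    "HDMI": {"2.0 PCM", "multichannel PCM", "Dolby Digital", "DTS",
             "DD+", "Dolby TrueHD", "DTS-HD HRA", "DTS-HD MA"},
}

def supports(connection, fmt):
    return fmt in CARRIES.get(connection, set())

print(supports("optical/coax (S/PDIF)", "Dolby TrueHD"))  # False - lossless needs HDMI
print(supports("HDMI", "Dolby TrueHD"))                    # True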
Old 10-27-2012, 08:57 PM   #8
Audiophiles Anonymous
 

Join Date: Oct 2012
Posts: 1
Default The short and skinny

Quote:
Originally Posted by iserum View Post
Optical bit rate limit is 1.5 Mbps, HDMI will carry a lot more (depend on the HDMI spec., HDMI 1.3 is upto 10.2 Gbps) this is why you need HDMI for Dolby Ture HD and DTS HD and SACD, DD+ is converted to DTS if you use optical cable your receiver will show DTS when you play DD+ through optical cable.
if one has Receiver with HDMI, the best connection is HDMI.
LOL, my bad, LOL, OK had to get that out. I see and hear this all the time and it makes me laugh. The limit is always going up, for one. Second, the bit rate on HDMI is so high because over 80% of it is used for the VIDEO. C'mon, do your homework, people! Thirdly, man that spelling is bad, TrueHD, Master Audio, etc. have been out since BEFORE HDMI hit the scene. Hmm, how is that possible? Because you could play those audio formats with either an Optical or Coaxial DIGITAL input. What! Amazing, right? And also think about this: do you really know how much bandwidth, a.k.a. data transfer capability, you need in order to play those HD audio signals? kHz? MHz? GHz? You only need megahertz to play the audio from today's Blu-ray discs. Ta-da! Sorry, I'm not trying to be mean, I just want people to be properly informed. Also, on a similar note, ever see a nice CD player that costs around three to four hundred dollars? And wonder how in the world they could charge so much for it? Do some homework and find out. I GUARANTEE you can hear the difference. I'll give ya a clue too: it has to do with the audio decoder chip inside. Check out the specs on its bandwidth rating.
E_r1c is offline   Reply With Quote
Old 10-28-2012, 08:46 AM   #9
Direct TV Fresh Meat
 

Join Date: Feb 2007
Posts: 2,346
Default

Thanks for opening a 5-year-old thread that is no longer relevant for your first post!
__________________
This post is not to be construed as a slam on plasma TV's nor an endorsement of any other TV technology.
DoctorCAD is offline   Reply With Quote
Old 10-28-2012, 10:03 AM   #10
pissoffe
 

Join Date: Mar 2012
Location: right here
Posts: 826
Default

Quote:
Originally Posted by heyman421 View Post
it's the same
no, HDMI is capable of lossless formats, optical is not.
quad4.0 is offline   Reply With Quote
Old 10-28-2012, 10:12 AM   #11
pissoffe
 

Join Date: Mar 2012
Location: right here
Posts: 826
Default

Quote:
Originally Posted by E_r1c View Post
LOL, my bad, LOL, OK had to get that out. I see and hear this all the time and it makes me laugh. The limit is always going up for one. Second, the bit rate on HDMI is so high because over 80% of it is used for the VIDEO. C'mon do your home work people! Thirdly, man that spelling is bad, True HD, Master Audio etc. has been out since BEFORE HDMI hit the scene. Hmm how is that possible? Because you could play those audio formats with either an Optical or Coaxial DIGITAL input. What! Amazing right? And also think about this, do you really know how much bandwidth, a.k.a. data transfer capability, you need in order to play those HD audio signals? KHZ? MHZ? GHZ? You only need Megahertz to play the audio from todays Blu-ray dvds. Ta-da! Sorry, I'm not trying to be mean, I just want people to be properly informed. Also on a similar note, ever see a nice CD player that costs around three to four hundred dollars? And wonder how in the world they could charge so much for it? Do some home work and find out. I GUARANTEE you can hear the difference. I'll give ya a clue too, it has to do with the audio decoder chip inside. Check out the specs on its bandwidth rating
Quote:
True HD, Master Audio etc. has been out since BEFORE HDMI hit the scene. Hmm how is that possible?
Sorry, lossless is NOT and was never available with S/PDIF. It WAS doable with ANALOG 5.1, just like I use now, as were DVD-Audio and SACD via the analog ports. Optical/coax will never be able to carry lossless.
quad4.0 is offline   Reply With Quote
Old 10-28-2012, 10:45 AM   #12
You know who you are...
 
elwaylite's Avatar
 

Join Date: Aug 2006
Location: Lower Alabama
Posts: 1,368
Default

On blu-ray, of course I use HDMI for lossless. I've used optical and digital coax from my HDDVR before, and IMO, the sound is not as good for DD 5.1 as the HDMI feed.

It's pretty obvious to me.
__________________
Video: VT50 / S5100
Audio: 3313ci / RF-82 II (2) / RC-62 II / VTF-15H
HDTV: Dish / Hopper & Joey
elwaylite is offline   Reply With Quote
Old 10-28-2012, 02:53 PM   #13
I bleed for HD
 

Join Date: Nov 2007
Location: Ohio
Posts: 11,272
Default

Quote:
Originally Posted by E_r1c View Post
LOL, my bad, LOL, OK had to get that out. I see and hear this all the time and it makes me laugh. The limit is always going up for one. Second, the bit rate on HDMI is so high because over 80% of it is used for the VIDEO. C'mon do your home work people! Thirdly, man that spelling is bad, True HD, Master Audio etc. has been out since BEFORE HDMI hit the scene. Hmm how is that possible? Because you could play those audio formats with either an Optical or Coaxial DIGITAL input. What! Amazing right? And also think about this, do you really know how much bandwidth, a.k.a. data transfer capability, you need in order to play those HD audio signals? KHZ? MHZ? GHZ? You only need Megahertz to play the audio from todays Blu-ray dvds. Ta-da! Sorry, I'm not trying to be mean, I just want people to be properly informed. Also on a similar note, ever see a nice CD player that costs around three to four hundred dollars? And wonder how in the world they could charge so much for it? Do some home work and find out. I GUARANTEE you can hear the difference. I'll give ya a clue too, it has to do with the audio decoder chip inside. Check out the specs on its bandwidth rating
In addition to reviving a 5-year-old thread, several of the statements quoted above are just plain old FALSE. The limit is the limit; it is not constantly increasing, and you most certainly CANNOT play Dolby TrueHD and DTS-HD MA with optical or digital coax. The bit rate is indeed too high. That is one of the main reasons HDMI came out.

It appears you aren't mean, just another ignorant poster.
__________________
HT- Panasonic TH-50PZ800U 1080p plasma, Panasonic PT-AX200u LCD projector, Elite Vmax Electric 100' screen, Pioneer Elite SC-71 AVR, Paradigm Titan Monitor v5 fronts, Paradigm CC-190 v6 center, Paradigm Atom Monitors v5 surrounds(x4), HSU research VTF-2 sub, HTPC/Gaming rig, Sony BDP-S550 blu-ray, Uverse HD-DVR, Roku 2

Bedroom: Insignia NS-lcd26, Samsung BDP-1400 blu-ray
Den - Vizio 20" 1080p LCD + Samsung BD-p1600


Don't believe everything you read on the internet - Albert Einstein

Last edited by jkkyler; 10-28-2012 at 09:32 PM.
jkkyler is offline   Reply With Quote
Old 10-31-2012, 09:18 AM   #14
No I don't miss LowDef
 
tkurkowski's Avatar
 

Join Date: Jul 2012
Location: Mid-Atlantic, USA
Posts: 251
Default

Quote:
Originally Posted by elwaylite View Post
On blu-ray, of course I use HDMI for lossless. I've used optical and digital coax from my HDDVR before, and IMO, the sound is not as good for DD 5.1 as the HDMI feed.

It's pretty obvious to me.
To me also. And my Yamaha receiver clearly indicates that the Toslink audio signal is not 5.1, but the HDMI audio is.
__________________
"Just because you don't understand something, doesn't mean it's wrong." Arthur Pendragon
tkurkowski is offline   Reply With Quote
Old 10-31-2012, 10:10 AM   #15
I bleed for HD
 

Join Date: Nov 2007
Location: Ohio
Posts: 11,272
Default

Quote:
Originally Posted by tkurkowski View Post
To me also. And my Yamaha receiver clearly indicates that the Toslink audio signal is not 5.1, but the HDMI audio is.
Toslink (optical) can certainly handle standard 5.1 DD or DTS. What it cannot do is handle the HD lossless codecs such as Dolby TrueHD or DTS-HD MA. It can only handle 2-channel PCM, not multichannel PCM.
__________________
HT- Panasonic TH-50PZ800U 1080p plasma, Panasonic PT-AX200u LCD projector, Elite Vmax Electric 100' screen, Pioneer Elite SC-71 AVR, Paradigm Titan Monitor v5 fronts, Paradigm CC-190 v6 center, Paradigm Atom Monitors v5 surrounds(x4), HSU research VTF-2 sub, HTPC/Gaming rig, Sony BDP-S550 blu-ray, Uverse HD-DVR, Roku 2

Bedroom: Insignia NS-lcd26, Samsung BDP-1400 blu-ray
Den - Vizio 20" 1080p LCD + Samsung BD-p1600


Don't believe everything you read on the internet - Albert Einstein
jkkyler is offline   Reply With Quote