HDMI versus Component Video
--Which is Better?
As HDMI cable connections become more and more widely used, we are often asked: which is better, HDMI or component video? The answer, as it happens, is not cut-and-dried.
First, one note: everything said here applies as much to DVI as to HDMI. DVI appears on fewer and fewer consumer electronics devices, so it isn't asked about as often, but as far as image quality goes, DVI and HDMI are essentially the same. The principal differences are that HDMI carries audio as well as video and uses a different type of connector; both use the same encoding scheme, which is why a DVI source can be connected to an HDMI monitor, or vice versa, with a DVI/HDMI cable and no intervening converter box.
The upshot of this article--in case you're not inclined to read all the details--is that it's very hard to predict whether an HDMI connection will produce a better or worse image than an analogue component video connection. There will often be significant differences between the digital and the analogue signals, but those differences are not inherent in the connection type and instead depend upon the characteristics of the source device (e.g., your DVD player) and the display device (e.g., your TV set). Why that is, however, requires a bit more discussion.
What are HDMI and Component Video?
HDMI and Component Video are both video standards which support a variety of resolutions, but which deliver the signal from the source to the display in very different ways. The most important difference is that HDMI delivers the signal in a digital format, much the same way a file is delivered from one computer to another over a network, while Component Video is an analogue format, delivering the signal not as a bitstream but as a set of continuously varying voltages representing (albeit indirectly, as we'll get to in a moment) the red, green and blue components of the signal.
Both HDMI and Component Video deliver signals as three discrete colour components, together with sync information which allows the display to determine when a new line, or a new frame, begins. The HDMI standard delivers these along three data channels in a format called T.M.D.S., which stands for "Transition Minimized Differential Signaling." Big words aside, the T.M.D.S. format basically involves a blue channel to which horizontal and vertical sync are added, and separate green and red channels (though HDMI can also be configured to use "colour-difference" colourspace--see below).
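For readers curious about what goes on behind those big words, the first, transition-minimizing stage of T.M.D.S. encoding can be sketched in a few lines of Python. This is an illustrative sketch of the stage-one algorithm only; the second stage, which balances the running count of ones and zeroes to produce the final 10-bit symbol sent down the wire, is omitted for brevity.

    def tmds_stage1(byte):
        """Transition-minimizing stage of T.M.D.S.: turn an 8-bit pixel value
        into a 9-bit intermediate word using XOR or XNOR chaining, whichever
        yields fewer 0->1 / 1->0 transitions on the wire."""
        d = [(byte >> i) & 1 for i in range(8)]   # d[0] is the least significant bit
        ones = sum(d)
        use_xnor = ones > 4 or (ones == 4 and d[0] == 0)
        q = [d[0]]
        for i in range(1, 8):
            if use_xnor:
                q.append(1 - (q[i - 1] ^ d[i]))   # XNOR with the previous encoded bit
            else:
                q.append(q[i - 1] ^ d[i])         # XOR with the previous encoded bit
        q.append(0 if use_xnor else 1)            # 9th bit records which operation was used
        return q

    # Example: encode pixel value 0xA5 and count the transitions in the result
    encoded = tmds_stage1(0xA5)
    transitions = sum(encoded[i] != encoded[i + 1] for i in range(8))
    print(encoded, transitions)

The point is simply that each 8-bit colour value ends up as a longer symbol chosen to minimize transitions on the wire--a detail that matters later, when we come to cable length.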
Component Video is delivered, similarly, with the colour information split up three ways. However, component video uses a "colour-difference" type signal, which consists of Luminance (the "Y", or "green," channel, representing the total brightness of the image), Red Minus Luminance (the "Pr," or "Red," channel), and Blue Minus Luminance (the "Pb," or "Blue," channel). The sync pulses for both horizontal and vertical are delivered on the Y channel. The display calculates the values of red, green and blue from the Y, Pb, and Pr signals.
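As an illustration of that last step, here is a minimal sketch of the arithmetic the display performs, assuming the standard-definition BT.601 coefficients (high-definition material uses the slightly different BT.709 set):

    def ypbpr_to_rgb(y, pb, pr):
        """Recover R'G'B' from Y'PbPr, assuming BT.601 coefficients and
        signals normalized so Y runs 0..1 and Pb/Pr run -0.5..+0.5."""
        r = y + 1.402 * pr
        b = y + 1.772 * pb
        g = (y - 0.299 * r - 0.114 * b) / 0.587   # green recovered from the luminance definition
        return r, g, b

    # A 75% red, expressed in Y'PbPr terms, comes back out as roughly (0.75, 0, 0)
    print(ypbpr_to_rgb(0.2243, -0.1266, 0.3750))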
Both signal types, then, are fundamentally quite similar; they break up the image in similar ways, and deliver the same type of information to the display, albeit in different forms. How they differ, as we'll see, will depend to a great extent upon the particular characteristics of the source and display devices, and can depend upon cabling as well.
Isn't Digital Just Better?
It is often supposed by writers on this subject that "digital is better." Digital signal transfer, it is assumed, is error-free, while analogue signals are always subject to some amount of degradation and information loss. There is an element of truth to this argument, but it glosses over some real-world considerations. First, there is no reason why any perceptible degradation of an analogue component video signal should occur even over rather substantial distances; the maximum runs in home theater installations do not present a challenge for analogue cabling built to professional standards, and we have had customers run analogue component video for more than 200 feet without trouble, and without even the need for a booster. Second, it is a flawed assumption to suppose that digital signal handling is always error-free. The video data in an HDMI signal carry no error correction; once information is lost, it's lost for good. That is not normally a consideration with well-made cable over short distances, but it can easily become a factor at distance.
So What Does Determine Image Quality?
Video doesn't travel straight from the source material to the screen unaltered, for a variety of reasons. Most displays do not operate at the native resolutions of common source material, so when you're viewing material in 480p, 720p, 1080i or 1080p, there is, of necessity, some scaling going on. Meanwhile, the signals representing colours have to be accurately rendered, which depends on black level and on "delta"--the relationship between signal level and the colour level actually rendered on screen. Original signal formats don't correspond well to display hardware: DVD recordings, for example, have 480 lines but non-square pixels, and their colour is recorded in colour-difference format, while HDMI ordinarily runs in RGB colourspace. And many displays do not correspond to any common output resolution at all; instead of 720 or 1080 lines, they often have 768, or 1024, or some other number. What all of this means is that scaling and conversion have to happen somewhere along the signal chain, as the sketch below suggests.
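To put some hypothetical numbers on that, here is a small sketch of the raw scale factors a typical 1366x768 flat panel would have to apply to common source resolutions. This is arithmetic only; a real scaler also has to deal with filtering, overscan and aspect-ratio correction.

    # Hypothetical illustration: how far common source resolutions are from a
    # typical 1366x768 flat panel's native grid (ignoring overscan and filtering).
    panel_w, panel_h = 1366, 768

    sources = {
        "DVD (720x480, non-square pixels)": (720, 480),
        "720p broadcast (1280x720)": (1280, 720),
        "1080i/1080p (1920x1080)": (1920, 1080),
    }

    for name, (w, h) in sources.items():
        # None of these is a clean integer ratio, so every one requires interpolation.
        print(f"{name}: x{panel_w / w:.3f} horizontally, x{panel_h / h:.3f} vertically")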
The argument often made for the HDMI signal format is the "pure digital" argument: that by taking a digital recording, such as a DVD or a digital satellite signal, rendering it straight into digital form as an HDMI signal, and then delivering that digital signal straight to the display, one gets a perfect, no-loss, no-alteration signal chain. If the display itself is natively digital (e.g. an LCD or plasma display), the argument goes, the signal never has to undergo digital-to-analogue conversion and is therefore less altered along the way.
That might be true, were it not for the fact that digital signals are encoded in different ways and have to be converted, and that they have to be scaled and processed before they can be displayed. Consequently, there are always conversions going on, and those conversions aren't always clean. "Digital to digital" conversion is no more a guarantee of signal quality than "digital to analogue," and in practice may be substantially worse. Whether it's better or worse will depend upon the circuitry involved--and that is something which usually can't be figured out on paper. As a general rule, with consumer equipment, one simply doesn't know how signals are processed, or how that processing varies by input. Analogue and digital inputs must either be scaled through separate circuits, or one must be converted to the other so that both can use the same scaler. How is that done? In general, you won't find an answer anywhere in your instruction manual, and even if you did, it would be hard to judge which scaler is better without viewing the actual video output. It's fair to say that, even in very high-end consumer gear, the quality of signal-processing and scaling circuitry is quite variable.
Additionally, it's not uncommon to find that the display characteristics of different inputs have been set up differently. Black level, for example, may vary considerably between the digital and the analogue inputs, and depending on how sophisticated your display's setup options are, that may or may not be an easy thing to recalibrate. We have frequently found dramatic, unmistakable differences in image quality between HDMI and component video--sometimes favoring one, sometimes the other--on the default calibration settings of sources and displays.
The Role of Cable and Connection Quality
Cable quality, in general, should not be a significant factor in the HDMI versus Component Video comparison, as long as the cables in question are of high quality. There are, however, ways in which cable quality issues can come into play.
Analogue component video is an extremely robust signal type; customers of ours have run it for over 200 feet, without boosters, relays or other special equipment, and without any signal quality issues at all. At long lengths, however, cable quality can be a consideration--in particular, impedance needs to be held to a tight tolerance (ideally, 75 +/- 1.5 ohms) to prevent signal reflections, which can cause ghosting or ringing.
HDMI, unfortunately, is not so robust, and the problem comes down to the very thing that makes analogue component over coax so forgiving: impedance control. When the professional video industry went digital, it settled upon a standard--SDI, the serial digital interface--designed to be run over coaxial cable, where impedance can be controlled very tightly; consequently, uncompressed, full-bandwidth HD signals can be run hundreds of feet over SDI with no loss of information. For reasons known only to the designers of the HDMI standard, this very sound design principle was ignored: instead of coaxial cable, HDMI signals are run balanced, through twisted-pair cable, and the best twisted pairs control impedance only to about +/- 10%. When a digital signal is run through a cable, the edges of the bits (represented by sudden transitions in voltage) round off, and the rounding increases dramatically with distance. Meanwhile, poor control over impedance results in signal reflections--portions of the signal bounce off the display end of the line, propagate back down the cable, and return, interfering with later information in the same bitstream. At some point the data become unrecoverable, and with no error correction available, there is no way to restore the lost information.
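To put rough numbers on the impedance point: the fraction of a signal reflected at a mismatch is given by the standard reflection-coefficient formula, (Zload - Zcable) / (Zload + Zcable). The sketch below compares the worst cases for the two tolerances mentioned above, taking HDMI's nominal differential impedance to be 100 ohms; the figures are illustrative arithmetic, not measurements of any particular cable.

    def reflection_coefficient(z_load, z_cable):
        """Fraction of the incident voltage reflected at an impedance mismatch."""
        return (z_load - z_cable) / (z_load + z_cable)

    # Worst case for a coaxial component-video run held to 75 +/- 1.5 ohms
    print(abs(reflection_coefficient(76.5, 75.0)))     # ~0.01, about 1% reflected

    # Worst case for a twisted-pair HDMI run held to +/- 10% of a nominal 100 ohms
    print(abs(reflection_coefficient(110.0, 100.0)))   # ~0.048, roughly 5% reflected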
HDMI connections, for this reason, are subject to the "digital cliff" phenomenon. Up to some length, an HDMI cable will perform just fine; the rounding and reflections will not compromise the ability of the display device to reconstruct the original bitstream, and no information will be lost. As we make the cable longer and longer, the difficulty of reconstructing the bitstream increases. At some point, unrecoverable bit errors start to occur; these are colloquially described in the home theater community as "sparklies," because the bit errors manifest themselves as pixel dropouts which make the image sparkle. If we make the cable just a bit longer, so much information is lost that the display becomes unable to reconstitute enough information to even render an image; the bitstream has fallen off the digital cliff, so called because of the abruptness of the failure. A cable design that works perfectly at 20 feet may get "sparkly" at 25, and stop working entirely at 30.
In practice, it's very hard to say when an HDMI signal will fail. Well-made HDMI cables can be quite reliable up to 50 feet or so with most devices, and we have run HDMI at high-definition resolutions as far as 150 feet in Belden bonded-pair cable without bit errors. But because the ability to reconstitute the bitstream varies with the quality of the circuitry in the source and display devices, it's not uncommon for a cable to work fine at 30, 40, or 50 feet on one source/display combination and not work at all on another. Meanwhile, the demands on the HDMI interface keep increasing. A few years ago, no one needed to run anything beyond 1080i through a cable; now 1080p is common, and "deep colour" devices may soon become prevalent in the market. Each of these developments brings a large increase in the bitrate being pushed through the HDMI cable, and a cable that once worked may stop working--not because the cable has changed, but because the signal being run through it has.
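To see how quickly those demands grow, here is a back-of-the-envelope sketch of the raw T.M.D.S. payload rate for the formats mentioned, using the standard pixel clocks and the 10-bits-on-the-wire-per-8-bit-value encoding described earlier:

    def tmds_bitrate_gbps(pixel_clock_mhz, bits_per_component=8):
        """Approximate total T.M.D.S. payload rate across the three HDMI data
        channels: each 8 bits of colour is carried as a 10-bit symbol, and
        deep colour raises the effective clock in proportion to the bit depth."""
        effective_clock = pixel_clock_mhz * bits_per_component / 8.0
        return effective_clock * 10 * 3 / 1000.0

    print(f"720p/1080i (74.25 MHz pixel clock): {tmds_bitrate_gbps(74.25):.2f} Gbps")
    print(f"1080p (148.5 MHz pixel clock):      {tmds_bitrate_gbps(148.5):.2f} Gbps")
    print(f"1080p, 12-bit deep colour:          {tmds_bitrate_gbps(148.5, 12):.2f} Gbps")

Roughly speaking, the step from 1080i to 1080p doubles the bitrate, and 12-bit deep colour adds another 50% on top of that.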
The Upshot: It Depends
So, which is better, HDMI or component? The answer--unsatisfying, perhaps, but true--is that it depends. It depends upon your source and display devices, and there's no good way, in principle, to say in advance whether the digital or the analogue connection will render a better picture. You may even find, say, that your DVD player looks better through its HDMI output while your satellite or cable box looks better through its component output, on the same display. In the end, there's no real substitute for plugging it in and trying it both ways.