HD stands for "High Definition" and it is a new video standard.
What's "high" about the "definition"? Well, it's "higher" than "standard" definition, or the TVs we've all been used to since...forever.
Although HD is supposed to be a standard, there are a lot of different varieties, which leads to much confusion.
The Thirty-One Flavors of HD
Okay, there really aren't thirty-one different varieties of HD. But there are a lot. You've probably heard terms like 1080i, 720p, 24p, and progressive scan when people are talking about HD. And we'll admit, it can be very confusing.
When someone talks about HD, they're usually referring to the resolution of the image. Basically, the higher the resolution, the better the picture quality and clarity. And as long as the resolution is higher than "standard" resolution, the video signal can be called HD.
Resolution is simply the number of pixels in an image. A pixel is a single colored dot. So, the more pixels in an image, the sharper the picture.
The resolution of an image is usually defined as a product of two numbers. For instance, 1920 x 1080 refers to 1920 horizontal pixels and 1080 vertical pixels for a total of 2,073,600 pixels. Most TV manufacturers only tell you the number of vertical pixels. That's why you see TVs described as having 720 or 1080 lines of resolution.
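If you want to see that multiplication spelled out, here's a quick illustration (in Python, purely for the arithmetic) of how the total pixel count falls out of the two numbers:

```python
# A resolution like 1920 x 1080 is just width times height.
# TV makers usually quote only the second (vertical) number.
width, height = 1920, 1080
total_pixels = width * height
print(total_pixels)  # 2073600 -- a hair over 2 million pixels
```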
What resolution do I need?
That all depends. Let's investigate the difference between standard definition with 480 lines of vertical resolution and an HD TV with 720 lines of vertical resolution.
An image with 480 lines of resolution is actually 720 x 486 pixels = 349,920 pixels.
An image with 720 lines of resolution is actually 1280 x 720 pixels = 921,600 pixels.
So, an HDTV with 720 lines of resolution has roughly two and a half times the pixels of a standard TV. What about a TV with 1080 lines of resolution?
An image with 1080 lines of resolution is actually 1920 x 1080 pixels = 2,073,600 pixels. Wow, that's about 6 times better than a standard TV and more than double the pixels of a TV with only 720 lines of resolution.
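The comparisons above are just ratios of pixel counts. Here's a small sketch (Python, for illustration only, using the same figures quoted above) that works them out:

```python
# Total pixels for each format, per the numbers in the text
sd = 720 * 486         # standard definition: 349,920 pixels
hd_720 = 1280 * 720    # 720-line HD: 921,600 pixels
hd_1080 = 1920 * 1080  # 1080-line HD: 2,073,600 pixels

print(round(hd_720 / sd, 1))       # ~2.6x a standard TV
print(round(hd_1080 / sd, 1))      # ~5.9x a standard TV
print(round(hd_1080 / hd_720, 2))  # 2.25x a 720-line HDTV
```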
Everything depends on the source signal.
So, once I buy that super expensive wall-mounted, flat-screen LCD TV, it's going to look like LeBron James is literally dunking on top of my face, right?
No. If you take a standard definition signal with only 349,920 pixels and display it on a TV that has over 2 million, there's going to be a problem. Your TV has to scale the image up. Just like when you blow up an image on a copy machine, the bigger it gets, the worse it looks.
But the guy at Best Buy told me...
I know where you're going with this. The guy at Best Buy said that your TV had the best upconverting circuitry ever devised, the same stuff used in all of those movies where they "enhance" grainy security camera footage to read the name of the manufacturer of the shoelaces on the criminal. Uh huh, sure. OK, so I'm being a little harsh. Yes, upconverting circuitry can do an excellent job and is definitely worth the price.
However, to enjoy the best image possible, you want to make sure that you have an HD signal going into your TV. So, you want HD satellite or HD cable.
But what about my DVDs?
The guy at Best Buy said that if I get an upconverting DVD player, I can watch my DVDs in High Definition. Not quite. An upconverting player scales the disc's standard-definition image up to your TV's resolution, but it can't create detail that was never recorded. Just like with any scaling technology, you are going to experience some artifacting. However, if you sit an average distance away from your TV, you'll probably never notice the difference.
A fun fact about Standard Definition
Standard Definition (SD) in the US has 480 lines of vertical resolution. That means that each image you see on your TV is "drawn" using 480 horizontal lines stacked on top of each other.
Interestingly enough, European TVs' "standard definition" actually has 576 lines of vertical resolution.
Wait a second! Are you saying that Europe's SD would actually be considered HD here in the US? Weeellll, not really. The lowest number of vertical lines of resolution that is considered HD is 720.