A high definition (HD) Ready TV is a television capable of displaying high definition signals but relies on an outside source, such as a cable or satellite receiver, to process and convert those signals.
HDTV has been in development in the United States since it was first demonstrated there in 1981, based on a format created in Japan. Progress was slow: HD signals required far more bandwidth than standard broadcasts, manufacturers backed competing formats, and producing not only the television sets but also the infrastructure needed to create and transmit the signals was prohibitively expensive.
The first public HD broadcast in the United States was in Raleigh, North Carolina, in 1996. The creation of digital compression technology and the Federal Communications Commission’s adoption of the Advanced Television Systems Committee (ATSC) standard for digital broadcasting paved the way for commercially viable HDTV sets.
HD Ready TVs can display up to 1,080 lines of resolution, far more than the roughly 480 visible lines of standard TVs. They support interlaced and progressive HD signals at up to 60 frames per second and must have a component video, DVI or high definition multimedia interface (HDMI) input to receive HD signals.
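The jump from 480 lines to 1,080 lines understates the difference, since resolution grows in both dimensions. A quick sketch of the pixel-count comparison, using 640×480 as an illustrative standard definition frame and 1920×1080 for full HD (the exact SD figures vary by format):

```python
# Compare total pixel counts of an illustrative SD frame and a full HD frame.
# 640x480 is a common digitized NTSC-style resolution; 1920x1080 is full HD.
sd_width, sd_height = 640, 480
hd_width, hd_height = 1920, 1080

sd_pixels = sd_width * sd_height   # total pixels per SD frame
hd_pixels = hd_width * hd_height   # total pixels per HD frame

print(f"SD frame: {sd_pixels:,} pixels")
print(f"HD frame: {hd_pixels:,} pixels")
print(f"HD carries {hd_pixels / sd_pixels:.2f}x the pixels of SD")
```

Under these assumptions, a 1080-line frame carries 6.75 times as many pixels as a 480-line one, which is why HD required both new compression techniques and higher-bandwidth connections.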
HD Ready vs. Full HD
As with most new technology, early HDTV sets were quite expensive. The higher-end models were Full HD, which included an HD-capable tuner, while the HD Ready TVs needed a device with an HD tuner, such as a cable or satellite receiver, to decode the HD signals.
With the United States’ switch to digital-only broadcasting in June 2009, along with the growing popularity, technological advances and shrinking prices of HDTV sets, most new TVs are Full HD, and HD Ready sets have been phased out.