History of Video Cables
Many electronic devices must be connected to a monitor before they can be used. Even devices with built-in screens, such as smartphones and laptops, offer video output so users can connect to larger displays like televisions and projectors. Over the years, many new video cables have been introduced, gradually replacing older ones as technology has advanced. Modern high-definition TVs have come a long way from the black-and-white sets of old.
1956: Composite RCA is introduced, becoming a common standard for televisions, VCRs, LaserDisc players, video game consoles, computers, and more over the next several decades. Each cable carries only a single signal, a limitation that paves the way for multi-signal cables to eventually replace RCA.
1979: S-Video is released as a superior (for its time) alternative to composite RCA. It appears on VCRs, home computers, and some early video game consoles through the early '90s.
1987: VGA is created by IBM for its x86 machines. It works so well that it goes on to become an industry standard on computers, televisions, projectors, and other electronics, eventually becoming the most successful and longest-lasting analog connector.
1990s: Component RCA, an upgrade to composite, is developed. While capable for an analog cable, it has little time to catch on before digital cables begin to take over as the new standard.
1999: DVI is invented by the Digital Display Working Group, a collaboration among seven major tech companies, and becomes the first major standard for digital video cables. It holds the advantage of remaining compatible with older analog formats. Mini-DVI and Micro-DVI variants follow later, used heavily on Apple products.
2002: HDMI 1.0 is developed as a collaboration among multiple tech companies, with backing from various media and telecom companies, as a new industry-wide standard. It is quickly adopted on computers, televisions, and virtually every other device that connects to a monitor.
2004: HDMI 1.1 is released, adding support for DVD-Audio.
2005: HDMI 1.2 is released, featuring several changes that make it more computer-friendly.
2006: HDMI 1.3 is released, with much higher bandwidth that allows greater resolutions, plus support for a wider range of colors (Deep Color).
2008: Apple discontinues Mini-DVI and Micro-DVI and begins implementing Mini DisplayPort in their place, offering other companies a royalty-free license to use the connector as well.
2008: Companies involved with the development of HDMI are awarded the Technology and Engineering Emmy Award by the National Academy of Television Arts and Sciences.
2009: HDMI 1.4 is released, the first HDMI version capable of 4K (albeit only at low refresh rates). HDMI cables from this point onward can also carry Ethernet (HDMI Ethernet Channel).
2013: HDMI 2.0 is released, the first HDMI version to fully support 4K resolution at 60 Hz.
2013: Thunderbolt 2 is developed and debuts on the late-2013 MacBook Pro.
2016: Apple begins phasing out Mini DisplayPort, replacing it with USB-C.
2016: Thunderbolt 3, which uses the same physical connector as USB-C, starts appearing on Apple products.
2017: HDMI 2.1 is released, ushering in "Ultra High Speed" HDMI cables. Version 2.1 supports resolutions up to 8K and is backward compatible with older versions of HDMI.