The worst picture quality comes from conventional coax, which tends to be poorly shielded. Next up is composite video (RCA connectors). Composite encodes chroma and luma (color and brightness) on the same pin as a single analog signal, which makes it susceptible to noise, color bleeding, and poor contrast.
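To give a sense of why that single pin is fragile, here's a rough sketch of how NTSC-style composite multiplexes luma and chroma onto one wire (heavily simplified; sync, blanking, and the color burst are all omitted, and the function name is just illustrative):

```python
import math

# NTSC color subcarrier frequency, Hz
F_SC = 3.579545e6

def composite_sample(y, u, v, t):
    """One instant of a composite signal: luma plus chroma
    quadrature-modulated onto the color subcarrier."""
    chroma = u * math.sin(2 * math.pi * F_SC * t) \
           + v * math.cos(2 * math.pi * F_SC * t)
    return y + chroma

# A pure-gray pixel (no chroma) puts only the luma level on the wire.
print(composite_sample(0.5, 0.0, 0.0, t=0.0))
```

Since luma and chroma share the wire, the receiver has to filter them apart again, and any noise or high-frequency luma detail that lands near the subcarrier gets misread as color. That's where the bleeding comes from.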
S-video is an improvement since it carries chroma and luma on separate pins. It's still analog, but because the two signals are kept apart, the image appears sharper.
Component video encodes video as a vector of analog streams: Y, Pb, Pr. One pin carries luma and two carry color-difference signals. In practice it's only a modest improvement over s-video: you get far better color definition at the expense of some sharpness. Many TVs apply sharpening (unsharp masking, which subtracts a blurred copy of the image from the original) in an effort to improve the apparent definition of component video. It's enough for most purposes, but it's not winning any awards. Component video does, however, have sufficient bandwidth, and the cables have low enough impedance, to allow higher resolutions.
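The split into one luma and two color-difference channels comes straight out of a linear transform on RGB. A minimal sketch, assuming the BT.601 coefficients used for standard-definition television (the post doesn't name a standard, so that's my assumption):

```python
def rgb_to_ypbpr(r, g, b):
    """Convert normalized RGB (0..1) to analog component YPbPr,
    using BT.601 luma coefficients (SD television)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: weighted brightness
    pb = 0.564 * (b - y)                   # blue color-difference
    pr = 0.713 * (r - y)                   # red color-difference
    return y, pb, pr

# Pure white is all luma: the two color-difference wires carry ~0.
print(rgb_to_ypbpr(1.0, 1.0, 1.0))
```

This is also why component degrades gracefully: the eye is more sensitive to luma than color, and luma gets a wire all to itself.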
VGA or SCART is the best of the analog world. VGA separates the R, G, and B signals onto individual pins, with separate horizontal and vertical synchronization pins. SCART can carry either RGB (with sync on its composite video pin) or s-video-style Y/C signals, and it carries audio too. As far as I know, SCART is only really used in Europe.
The best is DVI or HDMI. They use the same underlying video signaling, but HDMI has a smaller connector, carries audio, and includes more DRM (HDCP) to punish honest consumers. These two encode video data as a series of binary pulses instead of an analog waveform. This means perfect color reproduction (as far as the monitor is capable of), no spill-over, and no interference: you either have a signal or you don't. (This is actually a simplification - there are several variants of DVI capable of carrying different kinds of analog and digital signals, at different bandwidths!) The cables and connectors are also quite expensive.
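The "you either have a signal or you don't" behavior falls out of thresholding. A toy sketch (not the actual TMDS encoding DVI/HDMI use, just the basic idea): as long as the noise stays under the decision margin, the receiver recovers every bit exactly, whereas an analog receiver would pass that same noise straight into the picture.

```python
import random

def transmit(bits, noise_amplitude):
    """Send bits as 0 V / 1 V levels with bounded random noise,
    as if over an imperfect cable."""
    return [b + random.uniform(-noise_amplitude, noise_amplitude)
            for b in bits]

def receive(levels, threshold=0.5):
    """Slice each received voltage back to a clean bit."""
    return [1 if v > threshold else 0 for v in levels]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
# Noise under the 0.5 V margin is discarded entirely by the slicer:
assert receive(transmit(bits, noise_amplitude=0.4)) == bits
```

Past the margin, of course, bits flip and the picture falls apart abruptly rather than gracefully - the digital cliff, as opposed to analog's gradual snow.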
All of the devices connected to my TV are connected with component video and optical audio. HDMI TVs were a tad uncommon when I bought mine, and DVI doesn't really offer that much in the way of image quality when you're talking about TVs.
My monitors are hooked up with DVI.
Edit: DVI's digital signal and VGA are not compatible, and those little adapters that come with your video card don't convert anything. A DVI-I port also carries an analog VGA-style signal on dedicated pins; the adapter just reroutes those pins (and ties a couple of them so the card knows to send analog). If the port or display is digital-only, the adapter does nothing useful, so don't expect it to work on your TV or monitor or what-have-you.