Movies look better on PC than TV
2006-02-02, 9:31 AM #41
Originally posted by Jon`C:
Thank you. I obviously have no idea what interlacing or progressive scan are and I clearly think VHS is a better format than DVD or H.264. I really appreciate how you dumbed-down the concept so my feeble mind could grasp the idea. Thanks!


It should have been obvious that the "dumbed down" nature of the post was for the benefit of others who might think you are speaking Greek. I made that clear in my closing paragraph to you. I should have dumbed that part down a bit too! :p
"I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16

2006-02-02, 9:38 AM #42
Originally posted by Jon`C:
Edit: "High-end displays" don't have "re-interlacing" hardware because "high-end displays" are almost always "flat panels". CRTs will step up a 480P signal to 720I or 1080I. It's not going to drop 480P back down to 480I.



High-end displays are certainly not almost always flat panels; those are simply the costliest. You'll find flat-screen CRTs (although, given my lack of interest in CRTs, I don't know where they top out now in screen size and resolution) as well as rear-projection LCD and DLP formats. "Re-interlacing" isn't an accepted term, just more accurate than the industry and technical term, deinterlacing. And all of those types of hardware, on the higher end of the quality scale, usually feature circuitry that steps the video image up to something comparable to what most progressive scan DVD players can offer. My point is that many people will not notice an improvement, not that an improvement is impossible, because it certainly is possible.
"I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16

2006-02-02, 9:49 AM #43
Originally posted by Wookie06:
"Re-interlacing" isn't an accepted term. Just more accurate than the industry and technical term deinterlacing.
...What? I don't think English is your first language... the prefix "de" is not equal to "re". Deinterlacing means to reverse the process of interlacing. I defenestrate you, not refenestrate you. If you reinterlaced an interlaced picture you'd end up with a blank screen.

And no, a CRT isn't a high-end display. Plasma is, possibly DLP depending on the viewing range, and LCD flat panel. LCD RPTV (which I own) is toward the midrange.
2006-02-02, 10:09 AM #44
Re makes more sense to the layman because it implies putting the pieces back together. What I really should have said was re-deinterlacing!

And I wouldn't personally consider CRT high-end either, but that is all semantics. You have high-end varieties of each type, but I think you mean high-end with regard to all types combined. In that regard, plasma really only recently became generally high-end, as most plasma sets from the recent past were only capable of EDTV.
"I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16

2006-02-02, 10:24 AM #45
Originally posted by Wookie06:
Re makes more sense to the layman because it implies putting the pieces back together. What I really should have said was re-deinterlacing!
So calling something a deceptive name which implies a process that isn't happening and using a term that nobody else uses is supposed to "[make] more sense to the layman"? ...k.

Quote:
In that regard, plasma really only recently became generally high-end, as most plasma sets from the recent past were only capable of EDTV.
How 'recent' are you thinking? Plasmas hit 1080p about 2 years ago already.
While you could theoretically have a decent CRT HDTV, I have yet to see one. None that I've seen can handle a reasonable cross-section of the HDTV formats, usually only 1080i. Meanwhile LCD and plasma displays almost universally downsample 1080i to 720p without any problems. If CRTs get an analog signal they don't natively support, they take it aaaaaaalllll the way down to 480i.

A CRT that displays a 720p source at 480i? Crap.
2006-02-02, 10:28 AM #46
Originally posted by Jon`C:
So calling something a deceptive name which implies a process that isn't happening and using a term that nobody else uses is supposed to "[make] more sense to the layman"? ...k.


Yes, because interlaced implies that something is already in one piece. Therefore, deinterlaced would be the deconstruction of that piece. The "re" prefix more accurately reflects the fact that something is being reassembled, which is what is happening in this case.

Originally posted by Jon`C:
How 'recent' are you thinking? Plasmas hit 1080p about 2 years ago already.
While you could theoretically have a decent CRT HDTV, I have yet to see one. None that I've seen can handle a reasonable cross-section of the HDTV formats, usually only 1080i. Meanwhile LCD and plasma displays almost universally downsample 1080i to 720p without any problems. If CRTs get an analog signal they don't natively support, they take it aaaaaaalllll the way down to 480i.

A CRT that displays a 720p source at 480i? Crap.


Two years is quite recent, considering HD-capable sets were, by definition, HD when they were released. The initial release of plasma was outrageously overpriced and under-capable in that regard. They've recently become a good enough value, and capable enough, to make them a reasonable choice.

I admitted to a lack of knowledge of true CRT capabilities due to my lack of interest in the format. My point was, and still stands, that CRT HDTV sets are probably capable of the same type of deinterlacing that makes most setups look remarkably close to what progressive scan DVD players do. Of course each piece of hardware is different, and that's why I specified "high end", as in the high end of each type. That's all, and since we basically agree on everything...

Remember, this thread was started by someone simply asking why one looked better than the other (a tangent topic, not exactly the same). In a thread where laymen are asking questions, facts like "The YCbCr DVD colorspace evaluates to 12-bit RGB color weighted toward the brighter end of the spectrum, so you lose a lot of definition at low brightnesses. Additionally, MPEG2 compresses video in blocks of 8x8 pixels so blocking artifacts are extremely commonplace. Modern DVD players and TVs use heavy motion compensation hardware and software to eliminate many of the artifacts but it's still a flaw in the underlying format. It's also huge, and there's no excuse for this amount of artifacting on 10 mbit/s video." don't do much to help.
"I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16

2006-02-02, 11:20 AM #47
Originally posted by Wookie06:
Yes, because interlaced implies that something is already in one piece. Therefore, deinterlaced would be the deconstruction of that piece. The "re" prefix more accurately reflects the fact that something is being reassembled, which is what is happening in this case.
An interlaced video is already "in one piece". Each scan is one half of a single frame; both interlaced scans together equal one frame. Interlacing is the process of dividing image or video data up by alternating scanlines.
To "re-interlace" would be to do this twice.

Quote:
Two years is quite recent, considering HD-capable sets were, by definition, HD when they were released. The initial release of plasma was outrageously overpriced and under-capable in that regard. They've recently become a good enough value, and capable enough, to make them a reasonable choice.
HD-capable sets? 2 years ago 720p and 1080i were the HDTV formats. Still are. Fact is, Plasma has kept up with other HDTV displays quite well, and surpassed them 2 years ago.

Quote:
My point was, and still stands, that CRT HDTV sets are probably capable of the same type of deinterlacing that makes most setups look remarkably close to what progressive scan DVD players do.
CRT HDTV sets don't "deinterlace" images, because the sets themselves are interlaced. An incoming 480I signal is displayed at 480I. A higher-end LCD, LCD RPTV or Plasma will deinterlace the 480I signal and display it at the screen's native resolution. A CRT will either display a 480P signal directly (EDTV) or upsample it to its image processor's native resolution (usually 1080I). CRTs aren't going to "deinterlace" images, because they're designed to display interlaced ones.
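
To illustrate the fixed-panel path described above, here is a simplified Python sketch of the cheapest approach: "bob" deinterlacing (line doubling) followed by a nearest-neighbor scale to a hypothetical panel's native line count. Real image processors use motion-adaptive deinterlacing and filtered scaling, but the sketch shows why a fixed-pixel display always deinterlaces and rescales:

```python
def bob_deinterlace(field):
    """Line-double a single field back to full frame height ("bob")."""
    frame = []
    for line in field:
        frame.extend([line, line])  # repeat each field line once
    return frame

def scale_to_native(frame, native_lines):
    """Nearest-neighbor vertical scale to the panel's fixed line count."""
    src_lines = len(frame)
    return [frame[i * src_lines // native_lines] for i in range(native_lines)]

field = [f"line {i}" for i in range(240)]               # one field of 480I
picture = scale_to_native(bob_deinterlace(field), 768)  # e.g. a 1024x768 plasma
assert len(picture) == 768
```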

Quote:
don't do much to help.
You asked me why I think DVDs are poor quality. Don't whine because you don't understand my answer.
2006-02-02, 12:20 PM #48
Originally posted by mscbuck:
You've never seen one?

[http://www.monkeyrivertown.com/images/up/flux%20capacitor.jpg]


That's a flux capacitor, actually.

ALSO

Movies look SO MUCH better on TVs (doesn't even have to be HDTV) than on computer monitors in general. I don't see how it's even debatable.
2006-02-02, 2:08 PM #49
Originally posted by Jon`C:

How 'recent' are you thinking? Plasmas hit 1080p about 2 years ago already.



Whaaat? I've only seen a few 1080p TVs hit the market. Show me one that's two years old and costs less than $10K.


Originally posted by ragna:
That's a flux capacitor, actually.

ALSO

Movies look SO MUCH better on TVs (doesn't even have to be HDTV) than on computer monitors in general. I don't see how it's even debatable.


Maybe it's because you're comparing a seven-year-old 15-inch CRT to a brand new HDTV. It really depends on how everything is set up. Technically for DVDs, a good ED TV would be best, because it's 480p just like the DVD. No scaling or anything.
2006-02-02, 3:06 PM #50
Originally posted by Jon`C:
An interlaced video is already "in one piece". Each scan is one half of a single frame; both interlaced scans together equal one frame. Interlacing is the process of dividing image or video data up by alternating scanlines.
To "re-interlace" would be to do this twice.


You're not understanding my definition and your statement contradicts itself.

Originally posted by Jon`C:
HD-capable sets? 2 years ago 720p and 1080i were the HDTV formats. Still are. Fact is, Plasma has kept up with other HDTV displays quite well, and surpassed them 2 years ago.


HDTV sets were introduced over two years ago and, as I said, were by definition already HDTV. Plasma sets were not HDTV when released and only recently, within about the past two years, became so.

Originally posted by Jon`C:
CRT HDTV sets don't "deinterlace" images, because the sets themselves are interlaced. An incoming 480I signal is displayed at 480I. A higher-end LCD, LCD RPTV or Plasma will deinterlace the 480I signal and display it at the screen's native resolution. A CRT will either display a 480P signal directly (EDTV) or upsample it to its image processor's native resolution (usually 1080I). CRTs aren't going to "deinterlace" images, because they're designed to display interlaced ones.


So you are saying that no HDTV CRTs have deinterlacing or equivalent circuits (honest question)? Other than that, you agreed with my points, although you probably don't see that you did.

Originally posted by Jon`C:
You asked me why I think DVDs are poor quality. Don't whine because you don't understand my answer.


Once again you failed to understand that my comments were meant to be understood by the layman, and I never said I didn't understand your post. I said it wouldn't be helpful to others who weren't familiar with the technical jargon you were throwing around.

I don't understand your attitude on this. I haven't disagreed with anything you've said, other than to establish that the average consumer isn't going to realize any significant benefit from progressive scan DVD players, which is mostly due to the relatively low quality of the format. You just seem determined to argue the semantics of it all.
"I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16

2006-02-03, 12:10 AM #51
Originally posted by Obi_Kwiet:
Technically for DVDs, a good ED TV would be best, because it's 480p just like the DVD. No scaling or anything.

I have wanted to pull my hair out while reading this entire thread, but that was the dumbest thing I have seen yet.

Knightrider, do not buy component. If your TV supports it, go for the gold: HDMI. Get an up-convert player and a decent HDMI cable. It'll look as good as possible. You can do this for about 200 bucks. If you've got a Wega, you can afford that.

You do not want an EDTV. Ever. Jesus. Let's think about this for a second. UHHHH... UHHH... HD is coming more and more into the mainstream. Blu-ray is coming out. Xbox 360 is out. Yeah, I'm going to spend the same amount on a ****ing EDTV as I would on a LARGER DLP TV that supports resolutions nearly triple those of a ****ing EDTV! Yeah. Just buy the EDTV. Smart move. Jesus again.

I have a 52" LG DLP. This thing is phenominal. Great sharpness and color. Displays dvds amazingly with my toshiba hdmi player. Really great. Comes with an HDTV tuner in it (like pretty much every other hdtv out there today) and it's got firewire, two hdmi, rgb, 2 component a bunch of svideo and composite and a digital coax. Supports every hdtv resolution, and also has fiber optic audio inputs (which are useless because they are found on every $300 starter surround sound system out there) but the speakers on the tv are great. Creates a nice theater effect when used in conjunction with digital surround (especially dts).

Also, yes, most high-end TVs have a 3:2 pulldown effect, but most of the time it's pointless because your DVD player should have it. Mostly, if you've got an HDMI player boosting the signal to 720p (1080p through a DVD player is not yet possible, I think, but I could be, and probably am, wrong) then it's going to make it progressive. And odds are unless you spend 6k on the TV it's not going to support 1080p anyway, only 1080i, which I don't think the best TVs out there can deinterlace (once again, I could be wrong. Feel free to correct me, Jon).
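
For reference, the 3:2 pulldown mentioned above maps film's 24 frames per second onto NTSC's roughly 60 fields per second by holding film frames for 2 and 3 fields alternately, so every 4 film frames become 10 video fields. Below is a minimal Python sketch of the cadence (the frame labels and field parity are illustrative only); pulldown removal in a player or TV just inverts this mapping to recover the progressive frames:

```python
def three_two_pulldown(film_frames):
    """Expand 24 fps film frames into a ~60 field/s NTSC cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3  # alternate 2 and 3 fields per frame
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

fields = three_two_pulldown(["A", "B", "C", "D"])
print(len(fields))             # 10 fields for every 4 film frames
print([f for f, _ in fields])  # A A B B B C C D D D
```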

But that aside, your problem is that you need a digital pipeline for that info. HDMI all the way. Make sure your TV has a plug, though. You don't wanna waste all that money for nothing.


I'll post shots of my HT sometime. It's pretty impressive (especially since I live in a one-room apartment).
>>untie shoes
2006-02-03, 7:57 AM #52
Originally posted by Bill:
And odds are unless you spend 6k on the TV it's not going to support 1080p anyway, only 1080i, which I don't think the best TVs out there can deinterlace (once again, I could be wrong. Feel free to correct me, Jon)
It's certainly possible to deinterlace 1080i to 1080p, but most HDTV manufacturers skimp on the image processor. Older HDTVs and Hitachi RPTVs have an optional fishbowl filter to resize 4:3 sources with less distortion. Other manufacturers don't bother. Hitachi HDTVs can also pull down an amazing picture off of S-video - it looks as sharp as a signal over component video but with slight color loss. If you have a GameCube, a GameCube S-video cable, and a component video cable, you can really tell the difference. Other TVs I've seen don't bother - the picture looks as foggy as composite.

I haven't had much of a chance to play with 1080p TVs, so I can't give you an acceptable cross-section. Chances are the 1080i signal just gets downsampled to 720p, which still looks better to the human eye than 1080i anyway. The human brain doesn't like interlacing - another reason CRTs suck.

Quote:
But that aside, your problem is that you need a digital pipline for that info. HDMI all the way. Make sure your tv has a plug, though. You don't wanna waste all that money for nothing.
HDMI and DVI are the same thing. HDMI is just a different vehicle for the same signal. Both DVI and HDMI support HDCP, although with HDMI it's mandatory. And, yes, if he has the option of going for DVI or HDMI he should take it.
2006-02-03, 10:06 AM #53
I should just mention that technically HDMI and DVI are not the same. The video signal is the same, but HDMI also carries the audio signal, whereas DVI does not. It doesn't make a difference as long as you have a good audio cable to go along with the DVI cable. But now I'm just nitpicking.
March, march, Dąbrowski,
From the Italian land to Poland.
Under your command
We shall rejoin the nation.
2006-02-03, 11:13 AM #54
I said that about HDMI because of the up-convert players... I'm not sure if up-convert players use DVI the same way as HDMI... I know it's the same thing... I connect my TV to my computer using a DVI/HDMI cable.
>>untie shoes
2006-02-03, 3:03 PM #55
I was going to say that I would personally prefer DVI to HDMI because I'm generally in the habit of running my audio signals directly to the receiver. I could see the convenience, though, of running all the audio signals to the TV and then out to the receiver, assuming it is kept entirely digital to the receiver. My current receiver is not digital; still running the old Pro Logic. I look forward to upgrading the receiver, but that won't be for about another eight months. To be honest, I've still been happy with the quality of my old receiver, and since I've been spending a fair amount of time in Iraq there's no real need to upgrade yet. I wanted to "future proof" my system as much as possible by allowing the technologies to continue to develop as well as prices to drop. Right now it seems that you can find very capable receivers under $500. I figure I'll spend about $1000 on a receiver and upgrading my surround speakers.
"I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16
