
This is the static archive of the Massassi Forums. The forums are closed indefinitely. Thanks for all the memories!

You can also download Super Old Archived Message Boards from when Massassi first started.

"View" counts are as of the day the forums were archived, and will no longer increase.

hdmi vs displayport?
2019-12-03, 2:35 PM #1
I bought a new 27" 144hz IPS gaming monitor. When I hooked it up to my macbook pro via hdmi -> usbc it would only allow me to choose 120hz. I looked it up and I needed to use displayport to get 144hz. The cable for this (displayport -> usbc) came in today and I plugged it in. Not only does it now choose 144hz by default, but the actual clarity of the screen is noticeably better. What gives?

I bought this hoping that 144 Hz would help, as I'm sensitive to flickering (typical screens and LED bulbs); the jury is still out on whether it helps, but I'm surprised there's a difference in quality beyond the 144 Hz itself. Thoughts?
2019-12-03, 3:18 PM #2
DisplayPort has a much higher bandwidth. Even relatively recent HDMI cables are limited to 1080p@120 Hz. If your monitor is 1440p, OSX might have been running it at a lower resolution.
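
To put rough numbers on the bandwidth point, here is a back-of-envelope sketch. The link rates and the blanking-overhead factor below are approximations, not exact CVT/CEA timing math, and the real limit also depends on which HDMI/DisplayPort versions the USB-C adapter itself supports:

```python
# Back-of-envelope estimate of uncompressed video bandwidth vs. link capacity.
# The link rates and the 1.12 blanking factor are approximations, not exact
# timing-standard math.

HDMI_1_4_GBPS = 8.16   # ~10.2 Gbit/s raw TMDS, minus 8b/10b encoding overhead
DP_1_2_GBPS = 17.28    # HBR2: ~21.6 Gbit/s raw, minus 8b/10b encoding overhead

def required_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.12):
    """Approximate bandwidth an uncompressed mode needs, with a rough
    reduced-blanking overhead folded in."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

for w, h, hz in [(1920, 1080, 120), (1920, 1080, 144), (2560, 1440, 144)]:
    need = required_gbps(w, h, hz)
    print(f"{w}x{h}@{hz} Hz needs ~{need:.1f} Gbit/s "
          f"(HDMI 1.4 ~{HDMI_1_4_GBPS} Gbit/s, DP 1.2 ~{DP_1_2_GBPS} Gbit/s)")
```

By this rough math, 1080p@120 fits under the HDMI 1.4 ceiling with room to spare, 1080p@144 lands right at it, and 1440p@144 only fits over DisplayPort, which lines up with the 120 Hz cap on the HDMI path.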
2019-12-03, 4:32 PM #3
It’s just 1080p. It’s possible it was scaled weirdly but I’m pretty sure it wasn’t. I’ll fiddle with it again when I get back to it tomorrow. Thank you.
2019-12-03, 4:47 PM #4
I was shopping for TVs this past weekend, and I have to say the "120 Hz" marketing nonsense is freaking awful when the screens can only do 60 Hz. The companies can't flat-out say 120 Hz, so I've seen all sorts of stupid terms: "120 Hz Clear Motion Index", "120 effective refresh rate", "Trumotion 120 Hz", "Motion Rate 120 Hz"; I've seen a lot.

Sorry Brian, had to vent. Didn't settle on any TV in the end...
SnailIracing:n(500tpostshpereline)pants
-----------------------------@%
2019-12-03, 5:09 PM #5
Probably using a 120 Hz panel with a 60 Hz driver. The picture would still be clearer because fine details in motion wouldn’t get smeared out as badly, and most sources are 60 Hz or lower, so most people wouldn’t be able to tell the difference. It might not be a bad deal as long as you knew what you were getting.
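
A simplified way to see the smearing point: on a sample-and-hold LCD, an object your eye tracks smears across roughly (speed × frame hold time) pixels, so a panel that puts up a new frame (even an interpolated one) twice as often roughly halves the smear. A toy calculation, ignoring pixel response time and using an arbitrary 960 px/s pan:

```python
# Toy sample-and-hold motion blur model: perceived smear is roughly the
# distance the tracked object moves while one frame is held on screen.
# Ignores pixel response time, interpolation artifacts, and eye-tracking error.

def smear_pixels(speed_px_per_s, refresh_hz):
    hold_time_s = 1.0 / refresh_hz       # how long each frame stays on screen
    return speed_px_per_s * hold_time_s  # apparent blur width in pixels

for hz in (60, 120, 144):
    print(f"{hz} Hz: ~{smear_pixels(960, hz):.1f} px of smear at 960 px/s")
```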
2019-12-03, 6:34 PM #6
I see. I'm sure there are TVs in the higher end of the market that can accept a 120 Hz signal. It's just frustrating for me because it seems like screens that use tricks like black frame insertion, motion interpolation, etc. are being sold as 120 Hz when they shouldn't be (or technically they are?)

Not going to lie, I've bought computer monitors recently but not a "smart TV" in half a decade. I just feel like a lot of it is deceptive. I've been recommended to "just buy a Samsung" by folks around me, but I've read stories online about how Samsung TVs will start displaying ads, for example. :/

On second thought, I'm sure they all pretty much work the same sans "smart" features.
SnailIracing:n(500tpostshpereline)pants
-----------------------------@%
2019-12-03, 7:12 PM #7
Smart TVs dominate the market because spyware companies subsidize them. These subsidies have put so much downward pressure on TV prices that it’s impossible to make money unless you’re taking those bribes.

I’m not sure what the answer is. Buy a smart TV but don’t hook it up to wifi, and hope that the spyware companies don’t start paying for cellular internet, I guess.
2019-12-03, 7:36 PM #8
Yeah. It was supposed to be a gift to my parents to replace a still-functioning Sony Trinitron Wega CRT in one room. It has served the family well, and I guess it'll stay for now.

I'm sure some device in the future will use S-Video...
SnailIracing:n(500tpostshpereline)pants
-----------------------------@%
2019-12-04, 2:30 AM #9
Oh, I see, there's a thing called DisplayPort these days... I thought HDMI was supposed to be the pinnacle of everything so I randomly ordered an extra HDMI cable "just in case" a few months back (haven't used it for anything yet).

Boh!
Star Wars: TODOA | DXN - Deus Ex: Nihilum
2019-12-04, 8:31 AM #10
I read on HN or somewhere that some "smart" TVs look for open wifi in the area and automatically connect. It's nuts.

Samsung does "upgrade" tvs and fill them with ads, and it was a huge PITA to get rid of all the crappy apps they installed that show up when you press the source/input button (like trying to switch between HDMI1 and HDMI2 and you get hundreds of bull**** streaming "channels" being listed as a "source." My son had connected two TVs (the samsung and a vizio) to the internet without my permission. The only way to get them OFF the internet is to change the wifi name or password or block them at the router level. There's no way to delete credentials out of either of them. There's no setting to make them disconnect. It's completely awful.

For the Vizio, there's no way to do a firmware update without connecting it to the internet and leaving it there for days in hopes that it will get the upgrade. I'm never buying one again. I'm soured on Samsung, too. I wish someone made a decent non-smart TV, but such a thing doesn't exist in any store I've been to.
2019-12-04, 8:43 AM #11
RF modem chips are one-size-fits-all commodity parts that include cellular, wifi, Bluetooth, and the device SoC in a single package.

So no, it doesn’t really surprise me that they’d glom onto any wifi they can find. I’d expect them to do the same for Bluetooth, and honestly at some point I’m expecting someone to catch a TV phoning home over cellular too. The hardware’s already there, and the real money isn’t in selling TVs to people, it’s in spying on them.
2019-12-04, 9:16 AM #12
The worst thing I found on our Samsung is the highly persistent TV Plus app. I finally managed to disable it, though not in any way I can remember now. The TV had latched onto the app as its OTA/cable source, so accidentally hitting the channel button on the remote caused the app to hijack the TV no matter what else you were doing on it.
"I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16

2019-12-06, 10:27 PM #13
I went to a talk at Samsung Research and spoke to some engineers there about how they have a Scala codebase for streaming data to and from their TVs. Now I guess I know why, lol.
