Hardware interpolation is perfectly fine these days, or why PC Evangelists are ridiculous


#1

*N.B. I like and have owned all platforms; they all have their pros and cons. Evangelists are the problem, not the platform itself.*

For years PC Evangelists (they call themselves mustard racers or something?) have dismissed TV hardware interpolation out of hand, claiming these solutions caused artifacts, stuttering and ghosting, even though that hasn’t been true for years. Yet a large number of them love using SVP Media Player to get 120/144 fps video on their PCs — software which, in my experience, is very prone to working for a bit and then struggling, creating far, far worse artifacting.

Yes, hardware interpolation apparently introduces response lag, so it’s not very suitable for online multiplayer titles. But for single-player console titles that are limited to 30fps, a decent-brand television with features like this is a very real alternative. If I were buying a television, this feature (+ high native Hz) would be far more important to me than 4K, to be honest.

Anyone with new-ish televisions with this feature, do you agree? Feel free to share your experiences and model numbers, might be a useful resource for people in the market for a panel.


#2

I got a 4K TV this winter (LG 65SK8000). If a PS4 Pro game gives an option between higher resolution or higher framerate, I choose higher resolution and turn on my TV’s motion smoothing. I have never had input lag or graphical artifacts affect my experience in any noticeable way whatsoever (although, as you mentioned, this is exclusively for single-player games).

I was afraid I would have to turn in my Tech Geek credentials if I ever admitted this in public. Thanks for providing a safe space.


#3

Can anyone let me know what hardware interpolation is? Is that the thing that gives you the soap opera effect when watching movies?


#4

Interpolation adds frames. ‘The Soap Opera Effect’ is a poor term to define it by because, afaik, soap operas are actually shot at 30fps…
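If it helps to see the idea in code, here’s a minimal sketch of the crudest possible interpolator: it just blends each pair of neighbouring frames 50/50 to invent in-between frames and double 30fps to ~60fps. Real TV motion smoothing uses motion-compensated interpolation, not plain blending, and the function names and NumPy-array frames here are only illustrative assumptions.

```python
# Crudest possible frame interpolation: blend each pair of neighbouring
# frames 50/50 to synthesize a new in-between frame, doubling 30fps to ~60fps.
# Real TVs use motion-compensated interpolation, not plain blending;
# this only illustrates the idea of inventing frames that were never rendered.
import numpy as np

def midpoint_frame(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Return a synthetic frame halfway between two source frames."""
    return ((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(np.uint8)

def double_framerate(frames: list[np.ndarray]) -> list[np.ndarray]:
    """30fps in, ~60fps out: original frames with blended midpoints inserted."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(midpoint_frame(a, b))
    out.append(frames[-1])  # last source frame has no successor to blend with
    return out
```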


#5

Huh, TIL. I guess my response would be to just play how you want and let others do the same. The PC folks bleating on about every last drop of performance you can’t get on consoles are simply not worth engaging with. Oh, your $2000 graphics card can run circles around my $300 console? Who would’ve thunk it!


#6

I’ve seen this line of thought a few times lately, often in reference to game streaming services, and it seems so odd to me. I don’t really play competitive multiplayer games at all, but I still do what I can to avoid latency. It makes action games more difficult and worse-feeling, why would I opt into that? Any kind of aiming with substantial input lag just feels awful and floaty to me. You can play however you want, of course, but I’d happily choose to play at native 30 FPS instead, or even lower if necessary.


#7

I think interpolation is fine for a video game but I cannot stand the way it looks on a film. 3:2 pulldowns are okay (not that we see those much anymore) but the 24fps->60 interpolation that a modern TV does is intolerable.
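For anyone wondering what the actual difference is: 3:2 pulldown just repeats existing film frames in a fixed cadence to fill a 60Hz signal, while interpolation synthesizes brand-new frames. Here’s a rough sketch of the cadence; real pulldown operates on interlaced fields rather than whole frames, and the function below is only illustrative.

```python
# Rough sketch of the 3:2 pulldown cadence: 24 film frames are repeated
# in an alternating 3-2 pattern to fill 60 output slots per second.
# No new frames are invented, which is why it still looks like film -
# unlike interpolation, which synthesizes in-between frames.
# (Real pulldown works on interlaced fields; this only shows the cadence.)
def three_two_pulldown(film_frames: list) -> list:
    out = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # 3, 2, 3, 2, ...
        out.extend([frame] * repeats)
    return out

# One second of 24fps film becomes 60 output slots.
assert len(three_two_pulldown(list(range(24)))) == 60
```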


#8

For live action I’m kind of the same, but I quite like it for 3D animated flicks or CGI-heavy movies.


#9

The added lag from motion interpolation on a decent modern set is about 20ms, usually low enough to keep the total lag from input to output below 60ms, which is generally considered the acceptable cut-off. That’s not to say it wouldn’t be noticeable to you - people are different. But when I got my new TV, I ran through a couple of test sections in AC Odyssey and Tomb Raider and never once missed a jump or a shot with motion smoothing on. A 20ms increase basically amounts to 1-2 frames, and I would guess most games that aren’t twitch-reflex intensive have enough buffer built in to cover that difference.
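For the arithmetic behind that “basically 1-2 frames” claim, a quick back-of-envelope calculation (the 20ms and 30/60fps values are just the ballpark figures above):

```python
# Back-of-envelope: how many frames of delay does an extra chunk of lag cost?
def lag_in_frames(extra_ms: float, fps: float) -> float:
    frame_time_ms = 1000.0 / fps     # duration of one frame in milliseconds
    return extra_ms / frame_time_ms

print(lag_in_frames(20, 30))  # ~0.6 of a frame at 30fps
print(lag_in_frames(20, 60))  # ~1.2 frames at 60fps
```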

In case anyone is curious, this topic reminded me of something. Back when the split over 30 vs 60+ fps started getting heated a few years ago, there was a lot of discussion about how the human eye can’t see beyond {24fps; 30fps; 60fps}, so those extra frames were wasted.

As should be obvious, all of those are untrue. This myth seems to have grown out of an oversimplification of a couple of studies and a lack of understanding of how vision works. (To be clear: I’m not claiming I “understand” how vision works. I’m just a layman who was curious and did a little reading.)

There was one study that showed people could distinguish 12 individual images per second, which is different from distinguishing motion. There was an older “study” by the motion picture industry that said people couldn’t distinguish between 24fps and interpolated motion (wow, what an amazing coincidence that the perfect frame rate just happens to be the one they’d been using for the industry’s entire history!). The “flicker-fusion threshold” (the rate at which a flickering image is perceived as steady) is between 60 and 90Hz, but again, that’s just one factor. And all of these thresholds depend on individual performance and environmental conditions. Fighter pilots have been able to detect a target shown for 1/255 of a second. I’ve even seen 1,000-2,000Hz cited recently as the upper threshold.

And still, all of this stuff is a massive oversimplification of how vision works. We don’t see in consistent “frames.” It’s like asking what the frame rate is on a mirror. We process images differently from movement, direct attention differently from peripheral, people differently from objects, and conscious differently from subconscious. We perceive things changing when they don’t and we perceive things staying the same when they’re changing. The brain is wildly complicated, and vision might be one of the more complicated things the brain does.

tl;dr - Vision is complex, and trying to reduce it to a single number that can be slapped on TV marketing material is silly.