Y'all I just noticed something weird


#1

I was toying around with my PS4 Pro (because it's basically a hodge-podged computer cosplaying as a console; what kind of console tells you to choose a lower resolution to get better performance lmao good luck) and uhhhhh can someone tell me why, when I switch from 4K to 1080p on my 4K TV, that shit looks clear as fuck? Like I can see every pixel popping out sharper than anime hair. Does the PS4 Pro have some sort of weird 4K-to-1080p scaler that nobody told me about, or am I just losing my sight?

UPDATE:
I am shocked. I am bald. I am appalled. So on my 4K TV I tried 4 games running at 1080p:
Nier Automata, Fortnite, RE remake and Project Diva Future Tone.
On my 4K TV it was the same result for each game: at native 4K everything looked normal, but at 1080p everything looked ULTRA sharp. Like in one of the music videos in Project Diva I witnessed, for the first time since I bought the game, WALL TEXTURES that I didn't even know were there.
So I wired my PS4P to my PC monitor, which has a native 1920x1080 (1080p) resolution. Keep in mind that supersampling wasn't supported on my 4K TV, but on my monitor it was. I tried all 4 games on the monitor at 1080p with supersampling both on and off, and neither was able to replicate the super sharp effect my 4K TV provided. The thing is, my 4K TV has HDR+; idk if that even plays a role in sharpening an image, but looking more closely into things, it says my monitor has HDCP 1.4 while my 4K TV has HDCP 2.2 (idk what those even ARE). So like… can someone who's really tech savvy explain to me why my 4K TV provides a better image at a lower resolution? I'm genuinely interested and also super confused.
LOVE U ALL XOXO

UPDATE 2:
I just noticed something:
When I switched the PS4 Pro's resolution on my 4K TV to 1080p, it sharpened everything.
When I turned off the HDR+ special viewing mode in my TV settings, it sharpened things significantly more.
When I turned on "Game Mode" in my TV's special viewing settings, not only did it sharpen things the most, it also unlocked a new 4K resolution for my PS4P to recognise, called "2160p - RGB".
Now that I think about it, this "Game Mode" on my TV is supposed to make performance better by dumbing down the graphics, but the first time I tried it I was playing Rise of the Tomb Raider in 4K and it messed up my frame rate CONSIDERABLY, while the HDR+ option enhanced it.
Overall… smart TVs are fucking weird and I'm mad that I spent so much time playing games on a blurry, not-vibrant screen :frowning:


#2

I don’t own a PS4 Pro, but I’m curious whether the same thing happens with a different HD console, or a normal Blu-Ray player. It could just be that you’re close enough to your TV to see the pixels?


#3

If it were a PC monitor, I would say it's because 1080p is the native resolution of the monitor.


#4

hold on lemme get back to u, I'm seriously about to try this shit on my PC monitor


#5

I own 2 other 1080p consoles, a PS3 and a Nintendo Switch. I don't see this drastic sharpness when I play a game on either console.


#6

there are different kinds of 1080, like 1920x1080 is the normal one but there are other blahx1080


#7

If we're talking about 16:9 displays, no, there isn't "another" 1080, because the ratio dictates what the other number is (1080 × 16/9 = 1920), so it's always 1920.


#8

is it though? I mean, don't TVs sometimes fudge it?


#9

I think it's because 1080p is exactly half the horizontal and half the vertical resolution of 4K, so a 4K TV displaying a 1080p signal can map each 1080p pixel onto an exact 2x2 block of panel pixels, and it'll still look sharp.
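To put numbers on it, the scale factors are whole numbers in both directions, so no fractional blending is needed. A quick Python sketch of that arithmetic (purely illustrative; obviously not what the TV's scaler actually runs):

```python
# Compare how cleanly different input resolutions map onto a 4K (3840x2160) panel.
panel_w, panel_h = 3840, 2160

for name, (w, h) in {"1080p": (1920, 1080), "900p": (1600, 900)}.items():
    sx, sy = panel_w / w, panel_h / h
    if sx.is_integer() and sy.is_integer():
        verdict = "integer scale, each input pixel maps to a clean block of panel pixels"
    else:
        verdict = "fractional scale, the TV has to blend/filter across pixel boundaries"
    print(f"{name}: {sx} x {sy} -> {verdict}")
```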


#10

The 5.50 patch that just came out enabled 4K supersampling. So even if you hook a PS4 Pro up to a 1080p screen (or set it to output 1080p to a 4K screen), it is internally rendering at 4K and then scaling down (which your TV will then scale back up).

This is effectively SSAA (supersample anti-aliasing), something that provides significantly higher quality (on a per-pixel basis) than rendering that only samples once per pixel (even with post-processing effects like FXAA that are designed to reduce the visible jaggies, there is no extra underlying information there).
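Conceptually the downscale step is just averaging several rendered samples into each output pixel. A minimal NumPy sketch, assuming a simple 2x2 box filter (a simplification; Sony's actual downsampling filter and checkerboard pipeline are more involved):

```python
import numpy as np

def supersample_to_1080p(frame_4k: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of a 2160p render into one 1080p pixel (box-filter SSAA)."""
    h, w, c = frame_4k.shape                      # e.g. (2160, 3840, 3)
    blocks = frame_4k.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))               # each output pixel = mean of 4 samples

# Toy usage: a 4K float RGB frame comes out as a 1080p frame built from 4 samples per pixel.
frame = np.random.rand(2160, 3840, 3).astype(np.float32)
print(supersample_to_1080p(frame).shape)          # (1080, 1920, 3)
```

Every output pixel is built from four real rendered samples instead of one, which is where the extra per-pixel quality comes from.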

While the input signal has been crushed down to only a quarter of the pixels, each of them is closer to the ground truth (i.e. what you'd get if Pixar spent an hour rendering each frame rather than using your GPU to render it in milliseconds), so when scaled back up to 4K it's a lot closer to what you'd want compared to something like the output of a PS3 or Switch. Those often only sample at 720p in their underlying render (sometimes 900p), so they only have a grid of e.g. 1280 x 720 or 1600 x 900 points at which they work out which polygon is under the pixel and what colour it should be, and that's a lot less data than 4K (even checkerboarded, so only really half of 3840 x 2160). The PS4 Pro should look a lot sharper than the Switch or PS3 even though all of them can be set to output at 1080p.


#11

Thanks for the heads up! But I posted an update to my original post and found out it really was my TV doing all that magic, not the supersampling trick, which is really weird and confusing.


#12

Hmm, I'm not sure I would call it 'good' sharp then (without seeing it with a set of eyeballs it's hard to diagnose or comment with much certainty, so the following is speculative); rather, I'd say you might be responding to an oversharpened image (via a sharpening kernel the TV runs on 1080p input).

A genuine 4K input should look sharper (as in letting you see the details of the actual image, like the most distant elements at their highest fidelity) than a 1080p input being upscaled to a 4K screen (even if both started from the same image before the cable/output). If it doesn't, then either something is wrong with the 4K input, or the 1080p upscale is such a transformative process (I don't say that as a good thing; images being totally chewed up by TVs is a terrible failure of modern hardware) that you're picking up on how it mangles the image. Even if your sharpening kernel doesn't cause terrible halos, it's still artificially pumping up the local contrast, so it's generally a destructive effect. It's often mistaken for an increase in underlying image detail (a bit like how blind testing shows people think making something louder is the same as increasing the fidelity of the speakers).
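If it helps, here's roughly what I mean by a sharpening kernel pumping up local contrast: a tiny NumPy sketch using a generic 3x3 sharpening kernel (TVs use their own proprietary filters, so treat this as illustrative only):

```python
import numpy as np

# A common 3x3 sharpening kernel: boosts each pixel relative to its neighbours.
KERNEL = np.array([[ 0, -1,  0],
                   [-1,  5, -1],
                   [ 0, -1,  0]], dtype=float)

def sharpen(img: np.ndarray) -> np.ndarray:
    """Naive 2D convolution of a grayscale image with the sharpening kernel."""
    out = np.zeros_like(img)
    padded = np.pad(img, 1, mode="edge")
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = (padded[y:y + 3, x:x + 3] * KERNEL).sum()
    return out

# A soft 0.4 -> 0.6 edge gains overshoot on both sides (0.2 and 0.8): that's the halo,
# and it reads as "extra detail" even though no real detail was added.
edge = np.tile(np.array([0.4, 0.4, 0.4, 0.6, 0.6, 0.6]), (6, 1))
print(sharpen(edge).round(2))
```

The edge looks punchier afterwards, but the values on either side of it have been pushed apart artificially; that's the destructive local-contrast pumping, not recovered image detail.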

I would go into the picture settings on the TV (often under Advanced) and compare the settings enabled for native 4K vs a 1080p input (even the basics like mode/colour/contrast/sharpness/brightness may differ for a 1080p input on some TVs; some brands keep them fixed per input port, some don't). Look for stuff labelled "Ultra Resolution" or similar under Advanced Sharpness that may be inactive when a 4K signal comes in but switches on for 1080p, and so should be visible when you toggle it. Work out which setting on your TV you're associating with the sharpness (hopefully it isn't a hidden one, another marvel of modern screens).

Edit: but also I'm the sort of girl who spends several hours with every new screen (which, when you think of how many hours you'll spend looking at it, isn't really that much as a percentage of time used) making sure it's configured to be the least bad it can be, because lots of TVs simply can't be configured to take the array of pixel colour values you feed into the cable and just display them on the screen without some deterioration from the TV trying to be fancy, even in "game mode"s that turn off at least the most lag-inducing parts of the processing. YMMV.


#13

Now that u mentioned “game modes” I’ve discovered something new


#14

So "2160p - RGB" is the HDR-off version of 4K, which is the only way to send a full 4K@60fps signal down an HDMI 2.0 cable. Why is HDR off? Because that mode can only use 24-bit colour (8 bits per sub-pixel, so each R, G, and B sub-element is assigned a value between dark (0) and light (255)), whereas the HDR spec requires 30- to 36-bit colour (10 or 12 bits per sub-pixel).

Basically, as HDMI 2.0 only has so much bandwidth down the cable, HDR doesn't really fit alongside full 4K@60fps RGB (HDMI 2.1 will fix this by increasing the bandwidth). So it's implemented via a hack known as chroma subsampling (which was originally used to squeeze 4K at 60fps down HDMI 1.4 cables when they only had enough bandwidth for 30fps). That sends a full-resolution grayscale 4K signal along with lower-fidelity colour information (in the case of 4:2:2 that's a 1920 x 2160 colour grid rather than the native 3840 x 2160; with 4:2:0 it's a 1080p colour grid). That link has a good visualisation to understand what that does to the final output.
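Rough bandwidth arithmetic for why that is (ignoring blanking intervals and the exact way HDMI packs chroma-subsampled signals into its container format, and taking HDMI 2.0's usable data rate as roughly 14.4 Gbit/s after encoding overhead, so back-of-envelope numbers only):

```python
# Does a given 4K @ 60fps pixel format fit in HDMI 2.0's usable data rate?
# (~14.4 Gbit/s after 8b/10b encoding; blanking intervals and HDMI's actual
#  4:2:2 container packing are ignored, so this is only rough arithmetic.)
HDMI20_GBPS = 14.4
PIXELS_PER_SEC = 3840 * 2160 * 60

formats = {
    "RGB / 4:4:4, 8-bit ('2160p - RGB', no HDR)": 3 * 8,    # 24 bits per pixel
    "RGB / 4:4:4, 10-bit (what HDR would want)":  3 * 10,   # 30 bits per pixel
    "YCbCr 4:2:2, 10-bit":                        2 * 10,   # ~20 bits per pixel on average
    "YCbCr 4:2:0, 10-bit":                        1.5 * 10, # ~15 bits per pixel on average
}

for name, bits_per_pixel in formats.items():
    gbps = PIXELS_PER_SEC * bits_per_pixel / 1e9
    verdict = "fits" if gbps <= HDMI20_GBPS else "too big"
    print(f"{name:45s} ~{gbps:5.1f} Gbit/s  {verdict}")
```

Full-fat 10-bit RGB blows the budget, which is why the colour information gets subsampled instead.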

Having the PS4 Pro see the RGB option is good, as that's really something any good 4K TV should expose to the console. The way the PS4 works is that in a game that doesn't support HDR it'll send the full RGB signal, but when you switch to an HDR game it should switch to 4:2:2 output. The only slight issue with this is that some 4K HDR TVs don't like the PS4's 4:2:2 output (which is 12-bit colour) and show banding. It's not a universal issue, but if you do see banding you may be forced to tell the PS4 to use 4:2:0 mode instead (which fixes the issue on HDR TVs that don't interpret the 4:2:2 signal correctly).

If anyone says “HDR/4K is a mess” then this is probably the mess they’re talking about. In theory then some of the spec changes and tweaks in HDMI 2.1 will fix all of this but that’s not necessarily going to help anyone who already has a 4K TV.

It also sounds like you may have hit a bug in Game Mode when you enabled it. It shouldn't break the framerate, but possibly your TV turned off the motion interpolation badly and ended up giving you stuttering output; I've definitely seen a Philips TV break that way when disabling "Natural Motion", and I had to reboot the TV to fix it. Motion interpolation is generally very bad for games because it has to wait for extra frames to arrive (so it can work out some extra in-between frames that it thinks are a good fit), which means the thing you're seeing on the screen is much further behind where the actual game is (i.e. it becomes less responsive). It's also an acquired taste for films and TV (see: the Soap Opera Effect).
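For a sense of scale: to interpolate, the TV has to hold back at least one future frame before it can show you anything, and each held frame at 60 fps is another ~16.7 ms of lag on top of whatever time the interpolation processing itself takes. A quick sketch of just that unavoidable minimum (real TVs buffer varying amounts, so this is a lower bound, not a measurement):

```python
# Minimum extra lag from waiting on future frames before interpolating between them.
fps = 60
frame_time_ms = 1000 / fps                      # ~16.7 ms per frame at 60 fps

for buffered_frames in (1, 2, 3):
    added_lag_ms = buffered_frames * frame_time_ms
    print(f"holding {buffered_frames} future frame(s): at least +{added_lag_ms:.1f} ms of lag")
```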


#15

The thing is, this HDR+ mode on my TV not only limits the colours, it also blends some of the frames together, so you get what you'd get if you forcibly converted a 30 fps video to 60 fps. Even PS1 games I've played with that HDR+ mode have had moments where they blend frames to achieve this weird fluidity, even if it's only for a short period of time. Basically: I don't understand anything LMFAOO. But even at 4K and with the HDR mode on my TV, Nier didn't really have that much of a problem keeping 60 fps. Now that it's disabled and the resolution is lowered, it never dips below 60 fps, which is stupid to me, because why waste so much effort stressing a console when you can get more colourful, sharp, well-performing visuals instead? Thank god the PS4 Pro adds more than just 4K resolution to games or I would've been mad.


#16

I don't know much about how HDR works, but depending on your brand, I'm pretty sure the 'HDR+' mode is like the "Sports" modes available on a lot of televisions. Those types of modes are gimmicks that usually make the image worse and less accurate than leaving them off. My parents have a Samsung UHD TV with an 'HDR+' mode that just uses different picture settings; it does not make HDR better in any way. Just wanted to mention this, as I was confused by it as well when I got a 4K Blu-ray player.


#17

yeah that's probably it. btw asdlkasdlksdlksdlksdkl my TV is LYING to me, selling me some bullshit HDR, I am DONE with technology


#18

This whole thread makes me never want to go past 1080p ever.