I was toying around with my PS4 Pro (because it's basically a hodgepodged computer cosplaying as a console, like what kind of console tells you to choose a lower resolution to get better performance lmao good luck) and uhhhhhh can someone tell me why, when I switch from 4K to 1080p resolution on my 4K TV, that shit looks clear as fuck? Like I can see every pixel popping out sharper than anime hair. Does the PS4 Pro have some sort of weird 4K-to-1080p scaler that nobody told me about, or am I just losing my sight?
UPDATE:
I am shocked. I am bald. I am appalled. So on my 4K TV I tried 4 games running at the 1080p resolution:
Nier Automata, Fortnite, RE remake and Project Diva Future Tone.
On my 4K TV it was the same result for each game: at native 4K resolution everything looked normal, but at 1080p everything looked ULTRA sharp. Like in one of the music videos in Project Diva I witnessed, for the first time since I bought the game, WALL TEXTURES that I didn't even know were there.
So, I wired my PS4P to my PC monitor, which has a native 1920x1080 (1080p) resolution. Keep in mind that supersampling wasn't supported on my 4K TV, but on my monitor it was. I tried all 4 games on my monitor at 1080p with supersampling both on and off, and it wasn't able to replicate the super sharp effect my 4K TV provided. The thing is, my 4K TV has HDR+. Idk if that even plays a role in sharpening an image, but looking more closely into things, it says my monitor has HDCP 1.4 while my 4K TV has HDCP 2.2 (idk what those even ARE), so like… Can someone who's really tech savvy explain to me why my 4K TV provides a better image at a lower resolution? I'm genuinely interested and also super confused.
LOVE U ALL XOXO
UPDATE 2:
I just noticed something:
When I switched the PS4 Pro's resolution on my 4K TV to 1080p, it sharpened everything.
When I turned off the HDR+ special viewing mode in my TV settings, it sharpened it significantly more.
When I turned on "Game Mode" in my TV's special viewing settings, not only did it sharpen things to the fullest, but it unlocked a new 4K resolution for my PS4P to recognise, called "2160p - RGB".
Now that I think about it, this "Game Mode" on my TV is supposed to improve performance by dumbing down the graphics, but the first time I tried it was when I was playing Rise of the Tomb Raider in 4K, and it messed up my frame rate CONSIDERABLY while the HDR+ option enhanced it.
Overall… Smart TVs are fucking weird, and I'm mad that I spent so much time playing games on a blurry, not-vibrant screen.
I don't own a PS4 Pro, but I'm curious whether the same thing happens with a different HD console, or a normal Blu-ray player. It could just be that you're close enough to your TV to see the pixels?
If we're talking about 16:9 displays, no, there isn't "another" 1080, because the aspect ratio dictates what the other number is, and it's always 1920.
I think it's because 1080p is exactly half the horizontal and half the vertical resolution of 4K, so a 4K TV displaying a 1080p signal will map every source pixel cleanly onto a 2x2 block of panel pixels, and it'll still look sharp.
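To make that point concrete (my own illustration, nothing the TV actually runs): the scale factor from 1080p to a 4K panel is a whole number on both axes, so the scaler never has to blend neighbouring pixels, unlike, say, a 900p source.

```python
# Hypothetical sketch: integer vs fractional scale factors on a 4K panel.
# 1080p -> 2160p is exactly 2x on both axes, so each source pixel maps
# to a clean 2x2 block of panel pixels; 900p -> 2160p is 2.4x, which
# forces the scaler to blend across pixel boundaries (softer output).

def scale_factor(src, dst):
    """Per-axis scale factor from a source resolution to a panel resolution."""
    return (dst[0] / src[0], dst[1] / src[1])

print(scale_factor((1920, 1080), (3840, 2160)))  # (2.0, 2.0) -> integer, clean
print(scale_factor((1600, 900), (3840, 2160)))   # (2.4, 2.4) -> fractional, soft
```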
Patch 5.50 that just came out enabled 4K supersampling. So even if you hook a PS4 Pro up to a 1080p screen (or set it to output 1080p to a 4K screen) then it is internally rendering to 4K and then scaling it down (which your TV will then scale back up).
This is effectively SSAA, something that provides significantly higher quality (on a per-pixel basis) than rendering that only samples once per pixel (even with post-processing effects like FXAA that are designed to reduce the visible jaggies, there is no extra underlying information there).
While the input signal has been crushed down to only a quarter of the pixels, each of them is closer to the ground truth (i.e. what you'd get if Pixar spent an hour rendering each frame rather than using your GPU to render it in milliseconds). So when scaled back up to 4K, it's a lot closer to what you'd want compared to something like the output of a PS3 or Switch, which often only sample at 720p in their underlying render (sometimes 900p), and so only have a grid of e.g. 1600 x 900 points where they work out which polygon is under the pixel to calculate what colour it is there. That's a lot less data than 4K (even checkerboarded, so really only half of 3840 x 2160). The PS4 Pro should look a lot sharper than the Switch or PS3 even though all of them can be set to output at 1080p.
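A toy sketch of that supersample-then-downscale idea (my own illustration of the principle, not Sony's actual scaler): render at double resolution in each axis, then average each 2x2 block down to one output pixel. Each output pixel then carries information from four samples instead of one, so hard edges end up with intermediate values rather than jaggies.

```python
# Illustrative SSAA-style downscale (assumed averaging, not the PS4's
# real algorithm): average each 2x2 block of a high-res grayscale image
# into one pixel of the low-res output.

def downsample_2x(image):
    """Average 2x2 blocks of a grayscale image given as a list of rows."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard diagonal edge rendered at 2x resolution...
hi_res = [
    [0, 0, 0, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [255, 255, 255, 255],
]
# ...downsamples to pixels with intermediate gray values along the edge,
# instead of a single hard 0/255 step per output pixel.
print(downsample_2x(hi_res))
```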
Thanks for the heads up! But I posted an update to my original post and found out it really was my TV doing all that magic, not the supersampling trick, which is really weird and confusing.
Hmm, I'm not sure I would call it "good" sharp then (without seeing it with a set of eyeballs it's hard to diagnose or comment with much certainty, so the following is speculative); rather, I'd say you might be responding to an oversharpened image (via a sharpen kernel the TV runs on 1080p input).
A genuine 4K input should look sharper (as in the ability to see the details of the actual image, like the most distant elements at their highest fidelity) than a 1080p input being upscaled on a 4K screen, even if both started from the same image before the cable/output. If it doesn't, then either there is something wrong with the 4K input, or the 1080p upscale is such a transformative process (I don't say that as a good thing; images being totally chewed up by TVs is a terrible failure of modern hardware) that you're picking up on how it mangles the image. Even if your sharpening kernel doesn't cause terrible halos, it's still artificially pumping up the local contrast, so it's generally a destructive effect. It's often mistaken for an increase in underlying image detail (a bit like how blind testing shows people think making something louder is the same as increasing the fidelity of the speakers).
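For a concrete picture of what a sharpening pass does (a generic textbook sharpen kernel as an assumption here; what the TV's firmware actually runs is unknown): it boosts the difference between each pixel and its neighbours, and around an edge the result overshoots past the original value range, which is exactly the halo / pumped-contrast effect described above.

```python
# Assumed generic sharpen kernel [-a, 1 + 2a, -a] applied to a 1-D
# signal (edges clamped). Not the TV's actual processing; just an
# illustration of how sharpening fakes extra contrast at edges.

def sharpen_1d(signal, amount=1.0):
    """Apply a 3-tap sharpen kernel to a 1-D list of pixel values."""
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        out.append((1 + 2 * amount) * signal[i] - amount * (left + right))
    return out

edge = [50, 50, 50, 200, 200, 200]
print(sharpen_1d(edge))
# The pixels either side of the edge overshoot below 50 and above 200:
# "fake" local contrast that reads as extra sharpness, not extra detail.
```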
I would go into the picture settings on the TV (often under Advanced) and compare the settings enabled for native 4K vs 1080p input (even the basics like mode/colour/contrast/sharpness/brightness may differ for 1080p input on some TVs; some brands keep them fixed per input port, some don't). Look for stuff labelled "Ultra Resolution" or similar under Advanced Sharpness that may be inactive when a 4K signal comes in but switches on for 1080p, and so should be visible when you toggle it. Work out which setting in your TV you're associating with sharpness (hopefully they aren't hidden ones; another marvel of modern screens).
Edit: but also, I'm the sort of girl who spends several hours with every new screen (which, when you think of how many hours you'll spend looking at them, isn't really that much as a percentage of time used) making sure it's configured to be the least bad it can be (because lots of TVs simply can't take the array of pixel colour values you feed into the cable and just display them on the screen without some deterioration from the TV trying to be fancy, even in "game mode"s that turn off at least the most lag-inducing parts of the processing). YMMV.
So "2160p - RGB" is the HDR-off version of 4K, which is the only way you can send a full 4K@60fps signal down an HDMI 2.0 cable. Why is HDR off? Because RGB output can only use 24-bit colour (8 bits per sub-pixel, so each R, G, & B sub-element is assigned a value between dark (0) and light (255)), rather than the HDR spec that requires 30- to 36-bit colour (either 10 or 12 bits per sub-pixel).
Basically, as HDMI 2.0 only has so much bandwidth down the cable, HDR doesn't really fit (HDMI 2.1 will fix this by increasing the bandwidth). So it's implemented via a hack known as chroma subsampling (which was originally used to squeeze 4K at 60fps down HDMI 1.4 cables when they only had enough bandwidth for 30fps): the cable sends a full-resolution grayscale 4K signal along with a lower-fidelity colour offset (in the case of 4:2:2 that's a 1920 x 2160 grid rather than the native 3840 x 2160; with 4:2:0 it's a 1080p colour offset). That link has a good visualisation to understand what that does to the final output.
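A back-of-the-envelope check of that bandwidth squeeze (my own numbers, offered as an approximation): HDMI 2.0 carries roughly 14.4 Gbit/s of video data (18 Gbit/s raw, less 8b/10b coding overhead), and the standard 4K60 timing includes blanking, giving a 594 MHz pixel clock. 8-bit RGB just fits; 10-bit full-chroma HDR doesn't; 12-bit 4:2:2 fits because HDMI packs 4:2:2 into the same 24 bits per pixel.

```python
# Rough HDMI 2.0 bandwidth arithmetic (approximate figures, for
# illustration). The full 4K60 raster including blanking is 4400 x 2250
# pixels, i.e. a 594 MHz pixel clock.

HDMI20_DATA_GBPS = 14.4          # ~18 Gbit/s raw minus 8b/10b coding overhead
PIXEL_CLOCK = 4400 * 2250 * 60   # 594,000,000 pixels/s for 4K60

def gbps(bits_per_pixel):
    """Video data rate in Gbit/s for a given bits-per-pixel format."""
    return PIXEL_CLOCK * bits_per_pixel / 1e9

for label, bpp in [
    ("8-bit RGB 4:4:4 (SDR)", 24),
    ("10-bit RGB 4:4:4 (HDR, full chroma)", 30),
    ("12-bit YCbCr 4:2:2 (HDR, subsampled)", 24),  # 4:2:2 packs into 24 bpp
]:
    rate = gbps(bpp)
    print(f"{label}: {rate:.2f} Gbit/s, fits={rate <= HDMI20_DATA_GBPS}")
```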
Having the PS4 Pro see the RGB option is good, as that's really something any good 4K TV should expose to the console. The way the PS4 works: in a game that doesn't support HDR it'll send the full RGB signal, but when you switch to an HDR game it should switch to 4:2:2 output. The only slight issue is that some 4K HDR TVs don't like the PS4's 4:2:2 output (which is 12-bit colour) and so you get banding; it's not a universal issue, but if you do see it then you may be forced to tell the PS4 to use 4:2:0 mode (which fixes the issue with HDR TVs that don't interpret the 4:2:2 signal correctly).
If anyone says "HDR/4K is a mess", this is probably the mess they're talking about. In theory, some of the spec changes and tweaks in HDMI 2.1 will fix all of this, but that's not necessarily going to help anyone who already has a 4K TV.
It also sounds like you may have hit a bug in Game Mode when you enabled it. It shouldn't break the framerate, but possibly your TV turned off the motion interpolation badly and ended up giving you stuttering output; I've definitely seen a Philips TV break that way when disabling "Natural Motion", and I had to reboot the TV to fix it. Motion interpolation is generally very bad for games because it has to wait for extra frames to arrive (so it can work out some extra in-between frames that it thinks are a good fit), so the thing you're seeing on the screen is much further behind where the actual game is (i.e. it becomes less responsive). It's also an acquired taste for films and TV (see the Soap Opera Effect).
The thing is, this HDR+ mode on my TV not only limits the colours but also blends some of the frames together, and you get what you'd get if you forcibly converted a 30 fps video to 60 fps. Even PS1 games I've played with that HDR+ mode have had moments where they blend frames to achieve this weird fluidity, even if it's only for a short period of time. Basically, I don't understand anything LMFAOO, but even at 4K with the HDR mode on my TV, Nier didn't really have that much of a problem keeping 60 fps. Now that it's disabled and the resolution is lowered, it never dips below 60 fps, which is stupid to me, because why waste so much effort stressing a console when you can get more colourful, sharp, and well-performing visuals instead. Thank god the PS4 Pro adds more than just 4K resolution to games, or I would've been mad.
I don't know much about how HDR works, but depending on your brand, I'm pretty sure the "HDR+" mode is like the "Sports" modes that are available on a lot of televisions. Those are gimmick modes that usually make the image worse and less accurate than leaving them off. My parents have a Samsung UHD TV with an "HDR+" mode that just uses different settings for the image. That mode does not make HDR better in any way. Just wanted to mention this, as I was confused by it as well when I got a 4K Blu-ray player.
A discovery I made recently was that different modes on your fancy TV can introduce varying amounts of input lag (no, really). I was sucking hard when I tried playing Bayonetta 2 on my Switch, and doing much better with the precise dodging in handheld mode, which is what piqued my interest.
I dug into it and learned that putting my TV in game mode would reduce input lag by over a tenth of a second!! So that's another way your 4K screen might be messing up your games.