One question that's posed every new console generation is "When will games finally run at 60 frames-per-second across the board?" It never happens, with developers always erring on the side of throwing cooler stuff on the screen, and folks chasing the 60 FPS lifestyle being forced to buy a PC and an expensive graphics card. And yet, the dream lives on.
It’s something I’ve never really given a shit about beyond extreme cases (like Broforce), but since I don’t actually have a PS5 and don’t like to watch a lot of gameplay for stuff I’m looking forward to, all I have to fill the void is backwards compatibility performance stuff. I feel this.
I’ve spent way too much time thinking about the MediEvil remake’s performance.
The 60/30 dilemma is really impacting my Destiny playing. Bungie has the next gen patch dated for December 8, so for now both PS5 and Xbox are stuck at 30 FPS. A very beautiful 4K 30 FPS, but when my “ancient” gaming PC can run it at 1080/60, it’s making the wait for the patch more than a little rough. But that’s not to say the game is “unplayable” at 30, I’m just being a spoiled brat at this point!
It’s silly but yeah I’m feeling this. Spider-Man and Ghost of Tsushima at 60 is transformative, it makes the worlds more alive and a lot less dreamlike (which is incidentally why HFR for films was such a bad idea).
I wish more games had future-proofed themselves by allowing for an uncapped framerate. There are too many games from last gen that will unfortunately never get updated for this, and we’re right back to the modularity of PC versus the instant convenience of consoles.
I was considering putting off playing Beyond Light until December 8th for this reason, but that would also put me way behind on getting raid-ready anytime in the near future. So I’m dealing with 30 FPS (and I’ve never played Destiny at 60 FPS, so it’s not like I fully know what I’m missing). Can’t wait to play it at 60 FPS, though. AC Valhalla at 60 makes me never want to revisit the last (pretty darn good!) games in the series due to how smooth it is.
Destiny always looked absolutely fine to me, until literally this week when I happened to put a lot of hours into Titanfall 2 multiplayer and then jumped directly into Destiny 2. I’m sure the difference in movement speed is a contributing factor in how much I noticed, but holy framing hell.
I was fortunate enough to pick up a 144Hz monitor about two years ago, and the difference is night and day compared to my much older TV. Games built with a lower frame rate in mind still look fine on it, but for, say, a shooter or a driving game, a low framerate just doesn’t look good at all to me.
I think a lot of the greater internet framerate discourse has been awful. But also, I have a 144Hz monitor, and I notice when Forza Horizon has decided to reset my options back down to 60fps. Our eyes do notice higher framerates, and particularly in games, where moving is the thing you do, it can contribute greatly to the feel of the game.
24fps works for movies, in part because directors know how fast you can do things before the illusion is spoiled. We aren’t all directors, and action games are not limited by cinema conventions.
Because consoles are working within constraints, and “next gen graphics” gets attention, I think there’s a real incentive for developers to prioritize fidelity over framerate, and I think I like that “resolution” vs “performance” is a tradeoff that we can make this generation. It’s sort of a way out of the gravity well of nicer graphics.
Even making the options a choice between “performance” and “resolution” is a qualitative decision on the part of developers - personally I’d sacrifice fidelity for both framerate AND native display resolution. Especially since, given where graphics are today, that might not be much of a sacrifice! Luckily I’m in no particular hurry to jump to 4K, so next-gen “performance” modes may well offer me both.
This is kind of funny to me because PC folks have been stuck with this dilemma for ages, except we usually get ten billion small options rather than one big one.
I’m gonna be honest, I find a very odd joy in the games that let you run a five-minute preview of scripted gameplay and then hand you all the performance stats. I remember liking the GTA IV version in college because it had line graphs, I want to say?
Having some of the visual options from PC games would be great, the only problem being that console games are under more aggressive certification rules than on PC. If uncapping the framerate can cause consistent crashes (or in a rare case like Anthem, potentially damage the console), then it just won’t pass cert.
That, and the complicated ramifications of 30 vs 60 online situations, is why a game like Bloodborne probably won’t get an update this far after release to uncap the framerate.
Wasn’t there a racing game not too long ago where people managed to unlock the frame rate on PC, and it turned out the physics were directly tied to it, causing cars to completely flip out and fly away?
I can definitely see why you can’t in cases like that, but I wonder if we’ll see more options that don’t break games, similar to low, medium, and high settings.
It’s happened before with Souls games, too. Mechanics like weapon durability were tied to the framerate, so degradation ticked twice as fast when the framerate doubled to 60.
Yeah, there have been a handful of games I can think of off the top of my head that have weird ties to frame rate.
The original release of Dark Souls on PC was infamous for this. There was the DSFix fan patch that unlocked the framerate, but it also bound the Backspace key to cap it back down to 30, because the game would sometimes break while sliding down ladders at too high a framerate.
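For anyone curious why this class of bug happens: a lot of older game logic ticks a fixed amount every frame instead of every second. Here’s a minimal Python sketch with made-up numbers (not actual Souls code) showing how a per-frame durability tick doubles the wear at 60fps, while a delta-time version wears at the same rate regardless of framerate:

```python
# Minimal sketch (made-up numbers, not actual Souls code) of why
# per-frame logic breaks when the framerate changes.

def degrade_per_frame(durability, seconds, fps):
    # Buggy approach: lose a fixed amount every frame, so doubling the
    # framerate doubles the wear over the same stretch of real time.
    for _ in range(int(seconds * fps)):
        durability -= 0.01
    return durability

def degrade_per_second(durability, seconds, fps):
    # Framerate-independent approach: scale the per-frame loss by delta
    # time, so total wear depends only on elapsed seconds.
    dt = 1.0 / fps
    for _ in range(int(seconds * fps)):
        durability -= 0.3 * dt  # 0.3 durability per second
    return durability

print(degrade_per_frame(100.0, 10, 30))   # ~97.0
print(degrade_per_frame(100.0, 10, 60))   # ~94.0  (twice the wear at 60fps)
print(degrade_per_second(100.0, 10, 30))  # ~97.0
print(degrade_per_second(100.0, 10, 60))  # ~97.0  (same wear at any fps)
```

The standard fix is exactly that second version: multiply every per-frame change by the frame’s delta time.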
Also, as someone who grew up on PC gaming with a 60Hz monitor and who’s since used a 144Hz monitor, I don’t get the discourse around this. Framerate being a dealbreaker or a sticking point is wild to me.
Yeah, I remember in the 1024x768 days when most games would struggle to run at a consistent 20fps unless you stepped down to like 800x600. But I think I’ve heard that there’s a real difference between CRT and LCD tech for this kind of thing, like the slight image persistence of CRT phosphors made both low resolutions and low refresh rates more tolerable or something.
Frame rates (and graphics options in general) have always had some weird gameplay ramifications. There are certain jumps in Quake 3 that you can only do if you cap your frame rate at specific numbers. There are certain combos in Street Fighter 5 that only work in the PC version after disabling VSync.
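Those Quake 3 framerate-locked jumps come from the physics being integrated once per frame: the size of the timestep changes the numerical error, so your actual jump height depends on your fps. A toy sketch (made-up gravity and jump values, not Quake 3’s actual movement code):

```python
# Toy sketch of framerate-dependent jump physics (made-up numbers,
# not Quake 3's actual movement code).

GRAVITY = 800.0      # units per second squared (assumed)
JUMP_SPEED = 270.0   # initial upward velocity, units per second (assumed)

def max_jump_height(fps):
    dt = 1.0 / fps
    z, vz, peak = 0.0, JUMP_SPEED, 0.0
    while z >= 0.0:
        # One physics step per rendered frame: the discretization error
        # depends on dt, so the peak height shifts with the framerate.
        vz -= GRAVITY * dt
        z += vz * dt
        peak = max(peak, z)
    return peak

for fps in (30, 60, 125):
    print(fps, round(max_jump_height(fps), 2))
# Higher framerates integrate more finely and reach a slightly higher
# peak, which is how a jump can be possible at one fps cap and not another.
```

Modern engines usually sidestep this by running physics on a fixed timestep, decoupled from the render rate.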
Destiny 2 will be an interesting one shortly, because the aim assist you get on a controller is tied to frame rate (more frames = stronger aim assist), so I have a feeling a lot of people on PS5/XSX are magically gonna start hitting a lot more headshots in a few weeks. Won’t be as significant as for the people getting 200+ FPS on PC, I guess.
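One plausible mechanism for that, purely as a guess (this is not Bungie’s actual code): if aim assist pulls your reticle a fixed fraction of the way toward the target each frame, then more frames per second means more pulls, and the assist compounds faster:

```python
# Purely illustrative guess at how per-frame aim assist could compound
# with framerate (not Bungie's actual code).

def reticle_error_after(seconds, fps, pull_per_frame=0.02):
    # Each frame, move the reticle 2% of the remaining distance to the
    # target. More frames per second = more pulls = stronger assist.
    error = 1.0  # normalized distance from reticle to target
    for _ in range(int(seconds * fps)):
        error *= (1.0 - pull_per_frame)
    return error

for fps in (30, 60, 144):
    print(fps, round(reticle_error_after(0.5, fps), 3))
# 30  -> 0.739  (26% of the gap closed in half a second)
# 60  -> 0.545  (45% closed: same code, stronger assist)
# 144 -> 0.233  (77% closed)
```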
Wow, I’m facing this same dilemma. I’ve never been one to care about frame rates, and have never even been able to tell whether a game was running at 30 or 60fps.
When I first loaded up the Demon’s Souls Remake, I changed my settings to Cinematic mode right out of the gate. While I was playing, I kept feeling this very tiny input delay on some of my actions, like rolling and parrying, but didn’t think to tie it to the cinematic mode; I just thought, “it’s been so long since I played the original Demon’s Souls, maybe it was always like this.” Then for fun, a couple hours in, I decided to try out Performance mode just to see what it was like, and I immediately noticed that everything felt smoother, particularly panning the camera, and that input delay I was feeling earlier was gone.
I don’t think I’ve been ruined (or at least I hope not) to the point that I can never go back to Cinematic mode for this game, but at the very least I prefer how Performance mode feels, so I’ll be sticking with that for the time being.