Next-gen has to mass-transition to 60fps

I mean the vast majority of gamers don't know what ray-tracing or multi-sample anti-aliasing is either, but they can visually see a difference and care about the way games look. If the new COD suddenly looked like a Sega Saturn game, there would be an uproar.

That's why it's virtually impossible to market FPS. You have to physically get a controller into someone's hands and have them feel the difference.
Exactly.

Also worth mentioning: 60fps YouTube didn’t exist until 2014. TV commercials and off-screen footage online were pretty much the only ways marketers could show off 60fps to their audience outside of having them play it.
 
I think of all the things they can improve with games, 60fps is the hardest sell. It's the least marketable.

I know it seems to be a divisive thing to say in this thread, but outside of heavily competitive multiplayer games like beat 'em ups and FPS games, the majority of the gaming public just don't care. Even with those games, I don't think most casual gamers notice they're 60fps compared to their single-player 30fps games (although I'm sure they would notice if they were dropped to 30).
 
Yes, I’m familiar with that Insomniac ‘study,’ which was just a poorly executed Internet survey with a bad case of selection bias.

If there were a study of people being presented a game at both 30fps and 60fps, only for most to say they prefer 30, then that would be something.

Without that, the tired “nobody cares about framerate” line is both ignorant and (frankly) idiotic.
I do not support the assertion that "nobody cares about framerate", primarily because, thanks to places like DigitalFoundry, a consistent, playable framerate has become one of the highest priorities for 3D games today. However, looking at the games that sell like hot cakes today, it is evident that they do not sell by virtue of being 60fps rather than 30fps. The bittersweet reality is that visual fidelity sells games as much as other salient factors. And in a closed-spec box, it is all a trade-off: a playable 30fps allied to added post-processing effects (like the aforementioned motion blur) and gorgeous visuals (including more robust LoD systems), or 60fps and comparatively simpler visuals.
 
Y'know, I care about framerate, so I play on PC. That said, I have LOVED console games such as [insert Sony exclusive here] and Breath of the Wild, and wouldn't have loved them any more even if the framerate was better. If anything, I'd rather graphics options were removed from console games in general - I just want to play the game MAXED, or the way the developer thought was sufficient; save the tinkering for PC.

I also just want to say that developers in general are hugely talented folks; let's leave all of this to them and trust them.
 
Please read the OP before replying to the thread. If you don't understand the topic, ask other posters questions.

The OP explains that technology and the realities of game development are aligning in such a way that we are on the brink of 60fps becoming the norm next gen.

This current generation has already seen a shift in this direction. Halo, Battlefield, Gears of War and Uncharted multiplayer, Resident Evil, MGS5, all Codemasters racing games. They’ve all gone 60fps this gen despite being 30 in the prior generation. This trend is going to accelerate next gen.

This isn’t a debate as much as a simple statement of fact.
We have always been on the "brink" of 60 fps being the norm, except developers and publishers will always use the extra power for better graphics. Every single time. Unlike your speculation, this is a fact.
 
Guess what, folks: the vast majority of people are just fine with 30fps, me included. 60fps isn't going to go through some sort of "cultural transition". It will always be relegated to the odd few games. The rest will target 30fps and higher visual fidelity.

I for one would gladly take better graphics at 30fps any day, because 30fps is a perfectly acceptable and playable frame rate.
This. It also doesn't make me nauseated after 20 minutes of playtime. Shit looks gross.

I'd rather have a locked 30fps than 45-60fps depending on the complexity of the scene.
 
I wonder if more and more games will offer 60fps options at lower resolutions or graphics settings on new consoles. We already see this on the Pro and X with God of War and Forza Horizon 4.

I hope it becomes more common.
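To put rough numbers on what those performance/quality toggles actually trade, here's a minimal sketch in Python. The mode names, resolutions, and targets are hypothetical, not the actual God of War or Forza Horizon 4 settings:

```python
# Hypothetical performance/quality mode table -- numbers are
# illustrative, not taken from any real game.
MODES = {
    "quality":     {"resolution": (3840, 2160), "target_fps": 30},
    "performance": {"resolution": (1920, 1080), "target_fps": 60},
}

def frame_budget_ms(target_fps: int) -> float:
    """Milliseconds the renderer gets to finish one frame."""
    return 1000.0 / target_fps

for name, cfg in MODES.items():
    w, h = cfg["resolution"]
    print(f"{name}: {w}x{h} @ {cfg['target_fps']}fps -> "
          f"{frame_budget_ms(cfg['target_fps']):.1f} ms/frame, "
          f"{w * h / 1_000_000:.1f} Mpix/frame")
```

In this toy setup, the performance mode renders a quarter of the pixels to buy half the frame time, which is roughly the shape of the trade those options make.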
 
Show people COD, LoL, BF, FIFA and Fortnite at 30fps and they will instantly realize something is wrong.

Because that's what most people play.

At least most MP games made the jump to 60fps. Now we just need to wait for SP. Hoping "performance mode" becomes a standard.
That didn't answer my question.
 
Playing most games the past few months at 120+fps is gonna make it hard for me to buy a console next gen. 60fps minimum please. It's better, no contest.
 
That didn't answer my question.
People care about the framerate.
It's just that most people don't know what it is that makes a game more fluid, responsive and fast.

People noticed last gen, in COD vs BF, how "fluid" COD was.
And my 12-year-old sister asked me why The Last Guardian stutters.

People care. They won't see it in ads, but the game will FEEL better, improving the chances of keeping users or getting them to buy a sequel.
 
The vast majority of gamers don't even know what 60 fps is. Dorks like us who post on video game message boards do. That's it.
They may not know what it is, but they may actually unconsciously understand that certain games are better because of the framerate. I don't think it's a coincidence that some of the biggest games out there target 60fps.
 
We have always been on the "brink" of 60 fps being the norm, except developers and publishers will always use the extra power for better graphics. Every single time. Unlike your speculation, this is a fact.
You’re still not understanding the bigger picture here.

We are entering a crossing point between graphical fidelity, game budgets, and hardware. At a certain point making big AAA games look notably better than they do now starts to cost exponentially more money.

It costs nothing to make your game run at 60fps. It costs tens of millions to make another huge graphical leap from what we have in the best games right now.
 
I do not support the assertion that "nobody cares about framerate", primarily because, thanks to places like DigitalFoundry, a consistent, playable framerate has become one of the highest priorities for 3D games today. However, looking at the games that sell like hot cakes today, it is evident that they do not sell by virtue of being 60fps rather than 30fps. The bittersweet reality is that visual fidelity sells games as much as other salient factors. And in a closed-spec box, it is all a trade-off: a playable 30fps allied to added post-processing effects (like the aforementioned motion blur) and gorgeous visuals (including more robust LoD systems), or 60fps and comparatively simpler visuals.
Please, do explain.

I'm genuinely curious how you don't think buyers see higher framerate (and therefore, smoother gameplay) as a selling point.
 
I prefer they give an option. I'd rather put my resources into bells and whistles unless it's needed, like in MP games (I prefer SP, though). I'm shallow; I like my graphics to be the best possible.
 
Playing most games the past few months at 120+fps is gonna make it hard for me to buy a console next gen. 60fps minimum please. It's better, no contest.
There's something I don't get, because I'm in the same position.

Whenever the topic of console framerates appears, people are consistently saying "you can't market frames per second", but over in desktop land we are constantly upgrading for that very reason, almost as if the industry is built around it.

The thought that you can't sell based on superior performance baffles me.
 
People care about the framerate.
It's just that most people don't know what it is that makes a game more fluid, responsive and fast.

People noticed last gen, in COD vs BF, how "fluid" COD was.
And my 12-year-old sister asked me why The Last Guardian stutters.

People care. They won't see it in ads, but the game will FEEL better, improving the chances of keeping users or getting them to buy a sequel.
Your 12 year old sister noticed that The Last Guardian stuttered because its framerate was shitty and unstable, not because it was 30 fps instead of 60fps.

30 vs 60fps is just not that important to most people. Especially not in comparison to pretty graphics.
 
Please, do explain.

I'm genuinely curious how you don't think buyers see higher framerate (and therefore, smoother gameplay) as a selling point.
Well, with due respect, if you stopped your selective reading perhaps you could understand the point better. You continue to put words in my mouth, and this is the last time I will engage in a response.

I have never asserted that consumers "don't" see higher framerate. Unless someone is visually impaired in some way, I would assume anyone can readily see the difference. However, a consistent 30fps is the benchmark for playability that has been established over the last few generations of consoles. The sales of games like Horizon Zero Dawn, Spider-Man, Arkham Knight, Dark Souls III, Bloodborne, FFXV, AC Origins, Uncharted 4, UC LL, Gears of War 4, God of War etc. were not handicapped because they were not 60fps (presumably RDR 2 will be no different). If there were some form of consensus on a 60fps mandate for next gen, without exception, among all developers, then that would indeed be an objective improvement. However, the desire to attain more from fixed hardware year after year - pushing the technological boundaries afforded by halving the framerate (especially when 30fps is considered playable across the industry) and ultimately attracting more potential customers - will win out on consoles.

The only way to be certain of getting over 30fps consistently for all releases in console space comes courtesy of VR, which mandates 90fps.
 
You’re still not understanding the bigger picture here.

We are entering a crossing point between graphical fidelity, game budgets, and hardware. At a certain point making big AAA games look notably better than they do now starts to cost exponentially more money.

It costs nothing to make your game run at 60fps. It costs tens of millions to make another huge graphical leap from what we have in the best games right now.
Come again?
 
You’re still not understanding the bigger picture here.

We are entering a crossing point between graphical fidelity, game budgets, and hardware. At a certain point making big AAA games look notably better than they do now starts to cost exponentially more money.

It costs nothing to make your game run at 60fps. It costs tens of millions to make another huge graphical leap from what we have in the best games right now.
Explain how it costs nothing to make a game run at 60fps.
 
Just have two options: one for framerate, one for highest res with all the bells and whistles. Devs can make games look waaaay better at 30fps in some cases.
That only works if you actually have multiple SKUs, and hardware that can actually take advantage of the FPS gained by dialing back graphics or resolution. Consoles aren't PCs after all, and it will never work as smoothly as it does for PC games, as the Pro and X show. It's much easier for devs to simply work from a solid base and scale up from there, as is already customary on consoles. 100% of games are made primarily for the base machines.
 
Well, with due respect, if you stopped your selective reading perhaps you could understand the point better. You continue to put words in my mouth, and this is the last time I will engage in a response.

I have never asserted that consumers "don't" see higher framerate. Unless someone is visually impaired in some way, I would assume anyone can readily see the difference. However, a consistent 30fps is the benchmark for playability that has been established over the last few generations of consoles. The sales of games like Horizon Zero Dawn, Spider-Man, Arkham Knight, Dark Souls III, Bloodborne, FFXV, AC Origins, Uncharted 4, UC LL, Gears of War 4, God of War etc. were not handicapped because they were not 60fps (presumably RDR 2 will be no different). If there were some form of consensus on a 60fps mandate for next gen, without exception, among all developers, then that would indeed be an objective improvement. However, the desire to attain more from fixed hardware year after year - pushing the technological boundaries afforded by halving the framerate (especially when 30fps is considered playable across the industry) and ultimately attracting more potential customers - will win out on consoles.

The only way to be certain of getting over 30fps consistently for all releases in console space comes courtesy of VR, which mandates 90fps.
Ironically, you're putting words in my mouth. I haven't once said/suggested that a game coming out at 30fps rather than 60fps will limit its sale potential. I've only dismissed this asinine notion that people 'don't care' for 60fps, which you also think is wrong. Good, we agree on something.

You then suggested that certain games don't sell on the virtue of their high framerates: when I asked for proof, you listed me some 30fps games that have sold well (half of which have high framerate support on PS4 Pro, btw).

However, 30 fps games selling well says absolutely nothing about a buyer's feelings towards 60fps. They only prove that consumers like 30fps itself, something we've known for decades. I can also name many 60fps games that sell well, thus proving that consumers also like 60fps. PSVR games selling well shows that people like 90fps. However, all of this data says nothing about which they'd prefer.

Right now, there is absolutely no data to support that consumers, when given a choice, will choose 30fps over 60fps, or vice versa. None whatsoever. Therefore, nobody can make a definitive point about how framerate impacts sales. Nobody.

Perhaps Crystal Dynamics, Team Ninja, and Sony Santa Monica could enlighten us on player preference. But everyone else? Pure speculation. Myself included.
 
We have always been on the "brink" of 60 fps being the norm, except developers and publishers will always use the extra power for better graphics. Every single time. Unlike your speculation, this is a fact.
FIFA, Call of Duty, Battlefield, Wolfenstein, Doom, Halo, Fortnite, and many other extremely popular games are 60fps. Games like Call of Duty pride themselves on their fluid animation, responsive controls, and bleeding-edge graphical technology. Epic Games are quite specifically pushing for developers to target 60fps through the various performance optimizations they've been rolling into Unreal Engine through Fortnite. You'll notice that Microsoft have a bit of a 60fps fetish. Forza Horizon 4 is 60fps on Xbox One X. Gears of War 4 was 60fps on Xbox One X, and of course Gears of War 5 is 60fps, too.

Shadow of the Tomb Raider is 60fps on both Xbox One X and PS4 Pro, although the X holds the framerate better.

Dying Light 2 is targeting 60fps on consoles and Techland are quite notably arguing that higher framerates trump resolution.

There's also VR, where high framerates are mandatory.

So no, developers will not "always use the extra power for better graphics".
 
Surely this is good enough fidelity
It's not even fucking close.

Just have two options: one for framerate, one for highest res with all the bells and whistles. Devs can make games look waaaay better at 30fps in some cases.
Hey Devs just do this thing that requires a bunch of extra work and technology when you're already crunching to meet deadlines and make demos and other bullshit. Just, like, do it.
 
60 fps isn't something we lack due to power. PS2 titles were frequently at 60 frames per second.

Frame rate is a design decision a development team makes. A racing game should probably be designed with 60fps in mind for a given piece of hardware or console, but does something like God of War or Spider-Man feel awful by virtue of not being 60fps? No.

Was GTA 3 appalling because Rockstar decided "Hey, we want to be able to see a few blocks away and have some textures on every NPC, so this isn't going to be 60fps?" No, they just made design decisions around what the current target hardware could handle.

Studios will always have to make the decision between resolution, frame rate, and the actual fidelity assets being manipulated on screen.
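For concreteness, the arithmetic behind that decision is just the per-frame time budget. A quick sketch (the 90fps line reflects the VR floor mentioned earlier in the thread):

```python
# Per-frame time budget: simulation, draw calls, and post-processing
# all have to fit inside this window, every single frame.
for fps in (30, 60, 90):
    print(f"{fps:>2} fps -> {1000.0 / fps:6.2f} ms per frame")

# Dropping from 60fps to 30fps frees up ~16.7 ms per frame -- the
# headroom teams typically spend on LoD, draw distance, and effects.
print(f"Headroom gained at 30 vs 60: {1000 / 30 - 1000 / 60:.2f} ms")
```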
 
FIFA, Call of Duty, Battlefield, Wolfenstein, Doom, Halo, Fortnite, and many other extremely popular games are 60fps. Games like Call of Duty pride themselves on their fluid animation, responsive controls, and bleeding-edge graphical technology. Epic Games are quite specifically pushing for developers to target 60fps through the various performance optimizations they've been rolling into Unreal Engine through Fortnite. You'll notice that Microsoft have a bit of a 60fps fetish. Forza Horizon 4 is 60fps on Xbox One X. Gears of War 4 was 60fps on Xbox One X, and of course Gears of War 5 is 60fps, too.

Shadow of the Tomb Raider is 60fps on both Xbox One X and PS4 Pro, although the X holds the framerate better.

Dying Light 2 is targeting 60fps on consoles and Techland are quite notably arguing that higher framerates trump resolution.

There's also VR, where high framerates are mandatory.

So no, developers will not "always use the extra power for better graphics".
For VR, of course not.

But we will see with the non-multiplayer titles at the end of this generation.
 
Figures you had nothing to say on the topic after all.

Bye.
Easy there champ. I totally did have stuff to say on the topic, remember? You already replied to me. Chill.


Hey Devs just do this thing that requires a bunch of extra work and technology when you're already crunching to meet deadlines and make demos and other bullshit. Just, like, do it.
And, like, all that extra work will surely be fully appreciated by a minuscule but very loud minority! It'll totally be worth it!
 
And my 12-year-old sister asked me why The Last Guardian stutters.
That's because The Last Guardian often runs at like 18fps or something. Once you go below around 20fps, or your framerate becomes highly inconsistent, that's when ordinary consumers start to notice bad performance. Most don't notice or care about the difference between 30 and 60 as long as it's consistent. The only time people didn't complain about this was maybe on the N64, because 3D was still pretty new back then. Ocarina of Time was probably 15fps most of the time, but we didn't care in 1998 because it was freaking Ocarina of Time.

Some franchises will undoubtedly stay 60fps next gen, like fighting games, FIFA, Madden, or Call of Duty, because their developers think that's crucial to their games, and many people will notice the difference in gamefeel you usually get with 60fps. But that's still something really hard to advertise without hands-on play.
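As a toy illustration of that consistency point, here's a sketch with made-up frame-time traces (not real measurements from The Last Guardian or anything else):

```python
import statistics

# Invented frame-time traces in milliseconds -- purely illustrative.
locked_30 = [33.3] * 8
unstable = [33.3, 33.3, 50.0, 33.3, 33.3, 66.7, 33.3, 33.3]

for name, trace in (("locked 30fps", locked_30), ("unstable", unstable)):
    avg_fps = 1000.0 / statistics.mean(trace)
    worst_fps = 1000.0 / max(trace)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst_fps:.0f} fps, "
          f"frame-time stdev {statistics.pstdev(trace):.1f} ms")

# The unstable trace still averages in the mid-20s, but its worst
# frames dip to ~15 fps -- the territory where, per the post above,
# ordinary players start noticing.
```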
 
You think this is bad? The same people are the ones pushing for 60 FPS or more on motion pictures and TV just because they like smooth video games.
Criticism of the crappy 24fps standard that the American film industry pushed onto the world because it was cheap has nothing to do with videogames. It was a point of contention decades before videogames were a thing.
 