Any of you feel like we’ve become so fixated on graphics and performance that the actual game part of a video game is often overlooked, or at least underemphasized? I don’t know about the rest of you, but all I come across on social media regarding gaming is about resolution, ray tracing, DLSS/FSR, frame rates, frame time, CPU and GPU utilization, and all of that stuff, and I’m honestly sick of it! I mean, performance markers have always been discussed when it comes to PC gaming, but now even console gaming is getting this treatment! Don’t you miss the days when you just installed the game and played it? I know I do. What do you think?
To me it sounds more like the social media algorithms put you into the “gaming tech” corner so that’s all you see. Indie gaming is huge and not at all about graphics. Look at the currently popular games on Steam and a ton of them are technologically very basic.
Exactly. My feeling is more the opposite of what OP is saying. Gameplay and innovation is king. Just not in AAA games.
Also don’t forget retro games.
Even new games can be run on midrange hardware if you don’t crank up the settings.
People want big numbers and companies want to sell the latest stuff. Nobody gives a platform to advocates of low budgets, cheaper hardware and patient gaming.
I definitely think you’re in a bubble of AAA games. This is literally the middle of an indie game renaissance.
Get off of consoles, and get a midrange gaming PC.
You may as well have typed this in 2009 or 2015.
It used to be that people argued it was worth getting the new game console because of “better graphics”. The console wars haven’t gone anywhere, they’ve just expanded.
In any case, in regards to just installing a game and playing it, no, not really. When I was playing games in college in 2012 it was still a time when you would open a game and go to the settings menu to adjust settings.
Sometimes it was just turning off motion blur, but there were always settings to change to try to reach a stable 60FPS.
Nothing changed, it just expanded. Now instead of 60FPS it’s a variable 60-240FPS. Instead of just 720p-1080p resolution, unless it’s a portable it’s 1080p minimum, otherwise variable up to 4K. Instead of “maxing out” we now have ray tracing, which pushes software further than our hardware is capable of.
These aren’t bad things, they’re just now 1) more heavily marketed, and 2) better known in the social sphere. There isn’t anything stopping you from opening up the game and going right away, and there’s nothing stopping other people from wondering about frame timings and other technical details.
Sure, focusing on the little things like that can take away from the wider experience, but people pursue things for different reasons. When I got Cyberpunk 2077 I knew that there were issues under the hood, but my experience with the game at launch was also pretty much perfect because I was focused on different things. I personally don’t think a dip here and there is worth fretting over, but for some people it ruins the game. Other people just like knowing that they’re taking full advantage of their hardware, hence figuring out the utilization of their components.
There’s one last aspect not mentioned: architectures. 10 years ago games would just boot up and run… But what about games from 10 years before then? Most players not on consoles were having to do weird CPU timing shenanigans to be able to boot up a game from (now 20) years ago. We’re in the same boat now; emulation is faring better, but X360/PS3-generation games that had PC ports are starting to have issues on modern Windows. Even just 5 or 6 years ago, games like Sleeping Dogs wouldn’t play nice on modern PCs, so there’s a whole extra aspect of tinkering on PC that hasn’t even been touched on.
All this to say, we are in the same boat we’ve always been in. The only difference is that social media now has more knowledge about these aspects of gaming so it’s being focused on more.
The one thing I do agree with, though, is that this is all part of software development. Making users need better hardware, intentional or not, is pretty crazy. The fact that consoles themselves now have Quality vs Performance modes is also crazy. But I will never say no to more options. I actually think it’s wrong that console versions of games are often missing settings adjustments when the PC counterpart has full control. I understand when it’s to keep performance at an acceptable level, but it can be annoying.
Always turn off motion blur and DoF if you can.
Well, game journalists need to sell gaming hardware and AAA games. Those guys have the ad money.
Just play what you like.
I definitely don’t see a fixation on performance lol
The reliance on AI upscaling and frame generation, while the entire game takes up half of your SSD, shows that optimization is an afterthought. These solutions make everything look pretty and smooth at the cost of how it actually feels to play (input lag up the fucking ass that makes the game feel way worse). Couple that with the myriad of performance issues the majority of AAA games have at launch.
The focus is entirely on making something visually good looking that will sell millions in pre-orders alone.
Valheim was one of the best-selling games and is still a huge success. Indies are getting better and more popular, to the point that even big companies like Nexon are indiewashing their studios and pretending that Dave the Diver is an indie game with pixel art instead of the work of one of the biggest publishers there is. In my experience, most gamers nowadays are people that grew up on Minecraft, Terraria or, probably more likely today, Roblox.
So basically no, I don’t think so. Maybe big studios want you to believe that, and it might be true for a casual FIFA or CoD gamer, but for anyone else there are more options than ever and the supply of good smaller, simpler games is just overwhelming; the days are too short to even keep track of them anymore.
I don’t really relate, as I typically linger two or more years behind the cutting-edge games and tech, so by the time I get it my hardware can easily run it and I can actually just install the game and play.
That and all the good games float to the top of the pile in that time, so I rarely end up spending money on something I don’t enjoy.
That’s just triple-A games, I think. Indie games are varied and don’t necessarily focus on graphics.
There are a lot of phenomenal indie games. There are also still a couple of really good AAA games, but AAA doesn’t mean what it used to. In fact, I’d be careful with AAA by default unless reviews state that the game is actually good. Ubisoft even tried to establish an “AAAA quality” game with Skull and Bones, or whatever it’s called, and it’s a total flop.
The real quality these days lies in indie games or (mostly) independent gaming studios. I think it’s kind of safe at this point to just assume by default that Bethesda, Microsoft, EA, Activision-Blizzard and so on simply cannot produce actually good games anymore. There may be some exceptions, but again, wait for independent reviews, and unless it’s been independently verified, don’t trust them to produce a good game.
Another problem is the sheer mass of games flooding the market, which means that true gems aren’t found so easily. But they exist. There’s no shortage of great games; you just have to look harder, and look in the right places.
this makes me nostalgic… people were saying this about fallout 3.
To be faaaaaaaaaaaiiiiir, a lot of that was tied up in the switch from overhead isometric view to first-person view.
Fallout 1/2 didn’t focus on graphics; they were in many ways point-and-click adventures. A lot of things you had to hover over for “flavor text”, and every once in a while there was something only four pixels wide that you needed to notice.
So the gameplay actually actively eschewed graphics in favor of things like flavor text and reading.
Further, the switch to first person broke the SPECIAL system, because how do you even manage a gun skill in a first-person shooter without it feeling absurd? It made sense in isometric, even if it was often frustrating to miss an enemy when you had a 79% chance to shoot them in the balls. Putting that in first person, where you mag dump into someone standing right in front of you and half your shots miss, feels a lot less realistic, and can quickly become frustrating in a more fast-paced first-person-shooter environment. The SPECIAL system feels absolutely slapped on as an afterthought in Fallout 3.
Also, the writing in Fallout 3 was that shitty Bethesda writing, just subpar compared to the prior two installments. Especially the fucking stupid ass ending of the game.
I’d say a lot of those complaints were driven more by the perspective switch than anything else.