Whenever you launch a game for the first time and head to the graphics settings menu, you may have noticed that it has already made most of the decisions for you. Things like texture quality, shadows, and post-processing are all selected based on what the game believes your GPU can handle. Some games, like Battlefield 6, default to the "Auto" preset to balance performance and visuals. Even the Nvidia app has an "Optimize" button for each game that does pretty much the same thing.
In my first couple of years as a PC gamer, I used to settle for these recommended settings because I didn't know any better. I assumed fiddling with the settings too much would mess up my experience unless I knew exactly what I was doing. But over the years, as I started tinkering more after watching game optimization tutorials on YouTube and paying closer attention to how games actually felt while playing, I realized those presets were far from optimal. Now I know they're nothing more than a safe guess.

Recommended settings aren't for smoothness
Most of the time, they aim for a balance between visuals and performance

The problem with recommended settings, at least for me, is that they're still chasing visuals to an extent. For someone who's expecting buttery-smooth performance at high frame rates, that trade-off becomes noticeable very quickly. Sure, these settings may be better than the "High" or "Ultra" preset, but they still aim to make the game look good rather than making it feel as smooth as possible. While that makes sense for a default preset, it doesn't line up with how I want my games to feel while playing on a high refresh rate monitor.
For instance, if shadow quality, post-processing, or view distance are set to high, they can introduce frametime spikes, especially during fast camera pans. Those frametime spikes add up over longer sessions and make the game feel less consistent than it should. You might not notice it immediately, but once you do, it's hard to ignore. At that point, the game never quite feels locked in, even though nothing looks obviously wrong when you monitor your frame rates using MSI Afterburner.
Recommended presets don't account for bottlenecks
They focus on how powerful your GPU is, not how balanced your PC really is

You probably already know that the recommended preset relies heavily on GPU detection, and that's the problem. If a game detects a high-end GPU like the RTX 4090, it automatically assumes that the rest of the system can keep up. In reality, most PCs aren't perfectly balanced, including my own. I have my RTX 4090 paired with the 5800X3D, which is a 4-year-old CPU at this point. In fact, at one point, I had it paired with a 5900X, which used to limit its performance at 1440p, especially in competitive FPS titles.
Recommended settings don't account for that kind of imbalance. They see GPU headroom and crank up view distance, shadow quality, and world detail without realizing that doing so would overwhelm older CPUs. That's exactly why I complained about frametime spikes and inconsistencies earlier. When an aging CPU, or even slower RAM, becomes the limiting factor, recommended settings stop being a reliable starting point and hurt performance more than they help.
They're "good enough" for most people
But a balanced preset doesn't do the job when you're chasing extremes
I get that not everyone has the time or patience to dig through the graphics menu and understand what each setting does. For many gamers, a balanced preset is exactly what they want, because games usually run fine and look good without you having to do any trial and error. If you're someone who switches between multiple games in a day or just wants something that works without thinking about it, recommended settings do their job reasonably well.
The problem is that balance only works if you're happy sitting in the middle. The moment you start chasing a specific outcome, whether that's peak visuals or the smoothest possible performance, the recommended preset stops making sense. It doesn't push graphics settings as far as they can go without unnecessary compromises, nor does it dial things down enough for consistent frame delivery. That's why I said it's "good enough." It works for the average case, but falls short the moment you care about optimizing for what you actually want.
I've learned to trust my own judgement over presets
Over the years, I've slowly learned exactly what each setting does, whether it makes the game more CPU- or GPU-bound, and how it affects smoothness in real gameplay. I didn't learn that from chasing average FPS or watching an optimization guide on YouTube. I just paid attention to how games actually felt while playing, especially during longer sessions. Once I knew which options caused frametime spikes, unnecessary CPU load, or visual changes I barely noticed, tweaking graphics settings felt more predictable and rewarding than recommended presets ever did.
