CPU Sweet Spot - Multi-Use

You have a fast CPU; what is a somewhat faster and considerably more expensive CPU going to gain you?
My wife games in 4K. When she selects some of the options such as RTX (not that I care one bit), it absolutely uses up processing and GPU capability. It "seems" that games can, if you select the options, use more computing power than is commonly available, at which point problems arise. Her video card is nearly always running its fans at 100% and showing that its processing power is maxed out.

So by purchasing the best you can afford, you get the best results. A $200 processor upgrade may not get you much, but over a generation or two, it does seem to add up.

(attached CPU gaming benchmark chart)
 
I usually assemble my own computers, but sometimes I will order one from a custom shop. They only upcharge $44 to go from 16GB to 64GB, and I go for 64 because I have had great luck selling it afterwards.

At times in the past you got scalped for 8GB. When it's cheap, I buy.
 
More RAM doesn't necessarily mean a faster computer. If you're playing a game or doing whatever and using 12GB of RAM, having 64GB over, say, 16GB or 32GB isn't going to improve performance.
Correct, unless you are under memory pressure and the kernel starts paging; that's when the slowdown starts. I routinely run video and photo manipulation software that will consume massive amounts of RAM, up to 50GB! If I only had 8 or 16GB of RAM, my system would be intolerably slow when editing.
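For anyone wondering whether they're actually hitting that point, here's a rough sketch of how I'd check for paging from Python (this assumes the psutil package is installed; the thresholds are just arbitrary picks, not any official rule):

```python
# Minimal sketch: is this workload pushing the system into swap, i.e. the
# point where extra RAM would actually help? Assumes psutil is installed.
import psutil

vm = psutil.virtual_memory()
sw = psutil.swap_memory()

print(f"RAM used:  {vm.used / 2**30:.1f} GiB of {vm.total / 2**30:.1f} GiB ({vm.percent}%)")
print(f"Swap used: {sw.used / 2**30:.1f} GiB of {sw.total / 2**30:.1f} GiB ({sw.percent}%)")

# Rule of thumb (arbitrary thresholds): if swap keeps climbing while RAM is
# nearly full, you're paging, and more RAM - not a faster CPU - is the upgrade
# that pays off.
if vm.percent > 90 and sw.percent > 10:
    print("Likely under memory pressure - the editor is probably hitting the page file.")
```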
 
Interesting and it shows you what I know about gaming, nothing.
 
Me neither. She keeps trying to get me to play some of the easier games. I am beyond inept at them; that is probably why I don't find them fun.

I can play the old Portal game, the first one, but I can't finish the last chapter no matter how hard I try. Now I've noticed that Steam will no longer support my PC, so I'll be out of luck. What a shame :)
 
I don't find video games to be a good use of my time. So many other interesting things to do in life, but that's just my opinion.
 
Yeah me neither...
Never had any interest in games
 
In most games at 1080p, a top-tier graphics card is no longer the bottleneck. But I have to ask the question: what is gained by going with a $700 CPU that yields an extra ~60 fps over a $300 CPU? Most midrange CPUs, according to the chart, are maintaining 170+ fps. Is there something gained from all that extra FPS above a 170 fps baseline?

I find games at 60Hz (the 4K TV I'm using as a monitor can only support that) perfectly playable. Personally, I'd rather have the image quality of a higher resolution (say QHD or 4K) than an uber-high frame rate, which most monitors don't support anyhow.
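Just to put rough numbers on the "what do you gain above 170 fps" question: frame time is simply 1000/fps in milliseconds, so the returns shrink fast. A quick back-of-the-envelope sketch (the fps values are just examples):

```python
# Quick arithmetic on what extra FPS actually buys in frame time (ms per frame).
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 144, 170, 230):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# 170 fps is ~5.9 ms/frame and 230 fps is ~4.3 ms/frame: roughly a 1.5 ms
# difference, which a 60 Hz panel (16.7 ms per refresh) can't display anyway.
```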
 
I run a heavily modified Fallout 4.
It is resource hungry even in the base game, despite being older.
Some of my mods break precombines, multiplying the load on the system.

My specs are well above what was recommended back then for 60 fps (the game's hard ceiling), but with 270 mods acting on it at the same time I still get stutter.
 
You can't just look at the average FPS. You need to look at the 1% and 0.1% lows, as a CPU that's capable of cranking out 100 fps with all settings cranked might dip down to 60 or lower for very brief periods, and you can feel that as stutter.

Lots of gaming monitors now are also high refresh rate; they're definitely smoother than a 60Hz panel.
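For anyone curious how those 1% / 0.1% lows are usually calculated from a frame-time log, here's a rough sketch; the exact method varies between review outlets, and the frame times below are made up just to show how stutter hides behind a good average:

```python
# Rough sketch: average FPS vs 1% / 0.1% "low" FPS from a list of frame times
# in milliseconds. This version averages the slowest 1% / 0.1% of frames and
# reports that as FPS; reviewers differ slightly in methodology.
def lows(frame_times_ms, fraction):
    worst = sorted(frame_times_ms, reverse=True)        # slowest frames first
    n = max(1, int(len(frame_times_ms) * fraction))     # how many frames to average
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                        # convert back to FPS

# Made-up capture: mostly 10 ms frames (100 fps) with a handful of 33 ms stutters.
frame_times = [10.0] * 990 + [33.0] * 10

avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(f"average : {avg_fps:.0f} fps")                   # ~98 fps, looks great
print(f"1% low  : {lows(frame_times, 0.01):.0f} fps")   # ~30 fps, the stutter you feel
print(f"0.1% low: {lows(frame_times, 0.001):.0f} fps")
```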
 
If it were me and I wanted an Intel chip, definitely a 13th gen. The 13600k equals the 12900k in most tasks and beats it in games. And it's more power efficient than the 12600k.

I went with AMD and for that I would recommend staying with the prior AM4 platform and a Ryzen 5800x. These are very well priced now and their new gen AM5 chips don’t have as big of a generational jump as the Intel ones.

Future proofing is just a slogan to get people to spend more money now. After 5 years or so, there is no swapping single components to make a meaningful upgrade, except maybe a graphics card.
 
The so-called "CPU" bottlenecking is really only applicable to low-res gaming at 1080p. Pretty much all modern games, once you go to 1440p and above, will be GPU bound.

In reality it only happens in the reviews where these guys set specific game settings or mismatch the hardware to make the CPU the bottleneck. In practice it doesn't really happen, especially with new builds. I can only see it happening if someone had something like a 9th-gen Intel and decided to get an RTX 4090 for some odd reason. But who would actually do that?
 
Even top-end CPUs paired with a 4090 will average high FPS but have dips that halve the FPS or worse, granted it's still high.
 

You have to understand that they are "forcing" the CPU bottleneck just for the sake of the comparison. The game is only at 1080p and medium settings, which probably has the 4090 running at 30-40% utilization, so no wonder the CPU cannot keep up. Crank up the resolution and settings, and the CPUs, even the lower-end ones, will be just fine.

No one in their right mind would buy top-of-the-line hardware only to game at the lowest settings. This is just a test to make a comparison, because some people always want the fastest, so the reviewers cater to them to keep the viewership.
GamersNexus actually mentions on every CPU gaming test that it's just for comparison's sake.

Under a normal use case these CPUs do not bottleneck the 4090, or a GPU in general; even something like the Ryzen 3600 will keep up just fine. Will it be slower overall than the latest-gen CPUs? Sure, but you will still get a nice gaming experience.
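If you want to sanity-check whether your own system is GPU-bound or CPU-bound, the usual quick test is to watch GPU utilization while the game runs. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH (the 95% cutoff is just a rule of thumb, not a hard rule):

```python
# Rough sketch: query GPU utilization via nvidia-smi. If the GPU sits well
# below ~95% while a game is running, the CPU (or the game engine) is the
# limit rather than the graphics card. NVIDIA-only; run while the game is up.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()

gpu_util = int(out.splitlines()[0])   # first GPU only
print(f"GPU utilization: {gpu_util}%")
print("Likely GPU-bound" if gpu_util >= 95 else "Likely CPU- or engine-bound")
```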
 
I used to obsess over CPUs when the performance jumps mattered, roughly 1987-2010. After 2011, CPU performance became abstract and less important versus SSDs and GPUs. Today I'm humming along with an Intel Core i5-9400F from just before the pandemic hit, and a Radeon 6700 XT for gaming, which handles everything I care about, mostly RPGs or tactical/strategy games.
 
There was a time when AMD essentially died in the PC market. This was during the Phenom / Phenom II X2/X3/X4 and Bulldozer/Piledriver shenanigans, where AMD was conjuring up any marketing reason possible (like incorrect core counts, which resulted in a class-action settlement) to buy their chips, which were seriously outclassed by Intel.

Intel responded by basically... doing nothing but sucking up market share and paying dividends to shareholders. They made minor / semi-minor improvements via the tick-tock cycle from perhaps Sandy Bridge/Ivy Bridge (if memory serves) on, until AMD finally released first-gen Ryzen. That period before Ryzen was the only time I thought CPU/platform updates were a complete waste. I encouraged people looking for new systems at that time to buy on the off-lease business market, where you could get a top-notch PC for half the price of new.

From then on, Intel has largely been playing catch-up, and it's impressive what they've done considering they're still using a 10nm fab process (albeit an advanced one). There are decent gains to be had now in the CPU realm with each generation, but it only makes sense if you need it.
 
The 13600k is right up there with the top tier. The increase above it is pretty minimal, and I agree... The most cost-effective to me is the 12600k at about $150 less. But the 13600k is very impressive.
 