PC finally too old

If he went from a 1080 to a 4060 to a 4080 Super, it's safe to say that he plays games. Seven years after its release, an i7-7700K is fine for office tasks but won't hold a candle to a Z790 system.
Sure but it depends on the game. Most of them are far more demanding of the GPU than the CPU.
 

Indeed, most of them are more GPU-demanding, but pairing a 4080 with a 7700K is like putting a Hellcat motor in an otherwise unmodified '91 Civic. When I remote into a user's company computer, I can almost always tell whether they have an older pre-11th-gen CPU or something newer by how smooth the mirroring program is (network strength notwithstanding). FWIW, I have an i5-7500 at work because I don't need anything stronger, but it would drag down my 4090 at home like a lead anchor. An i7-7700K is roughly equal to an i3-14100.
 
At 4K, which is what an RTX 4080 owner would likely be playing, the CPU bottleneck is negligible. Even a dinosaur CPU would not drag it down much in modern games.
CPU bottleneck talk is an exaggerated problem.

Don't get me wrong, I'm not advocating for pairing top-tier GPUs with bottom-tier CPUs, but at the same time, just because one buys a top-tier GPU doesn't mean you need a top-tier CPU. A modern i5 or i7 will not bottleneck a 4080.
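To put that reasoning in plain arithmetic: the frame rate you actually see is roughly capped by the slower of the two components, and raising the resolution lowers the GPU's ceiling while leaving the CPU's ceiling mostly alone. A minimal sketch (the throughput figures are invented for illustration, not benchmarks):

```python
# Rough model of a CPU/GPU bottleneck: delivered FPS is capped by
# whichever component is slower. All numbers below are made up.

def delivered_fps(cpu_fps_ceiling, gpu_fps_ceiling):
    """The slower component sets the frame rate."""
    return min(cpu_fps_ceiling, gpu_fps_ceiling)

# Hypothetical ceilings: the CPU's limit is resolution-independent,
# while the GPU's limit drops as resolution rises.
cpu_limit = 90                                        # e.g. an older quad-core
gpu_limits = {"1080p": 200, "1440p": 140, "4K": 75}   # e.g. a high-end GPU

for res, gpu in gpu_limits.items():
    fps = delivered_fps(cpu_limit, gpu)
    bound = "CPU-bound" if cpu_limit < gpu else "GPU-bound"
    print(f"{res}: {fps} fps ({bound})")
```

With these made-up numbers the system is CPU-bound at 1080p and 1440p but GPU-bound at 4K, which is why the bottleneck argument depends so heavily on resolution.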
 

Right, a 13600K would be fine to pair with a 4090, but not a 7700K.
 
Are you sure about that? These seem to be doing pretty OK, and the guy doesn't even test 4K performance, likely because he knows the difference would be even less apparent.



Yes, I'm sure. That video is horrible and nothing close to an actual test or benchmark. While his average FPS mirrors my own experience with generational upgrades, he doesn't list the 1% low FPS: the frame-time stability metric that reveals stutters/microstutters.
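For reference, the "1% low" is commonly computed by averaging the slowest 1% of frame times (exact definitions vary between benchmarking tools). A quick sketch with made-up frame times shows why a healthy average FPS can hide stutter:

```python
# Sketch of one common way to compute average FPS and "1% low" FPS
# from per-frame render times. The frame times are invented, not a
# real capture; tools differ in exactly how they define 1% lows.

def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in ms."""
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    # Average the slowest 1% of frames:
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_pct_low = 1000 * n / sum(worst[:n])
    return avg_fps, one_pct_low

# 99 smooth 10 ms frames plus one 100 ms hitch:
frame_times = [10.0] * 99 + [100.0]
avg, low = fps_stats(frame_times)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")  # avg ~92 fps, 1% low 10 fps
```

The average still reads as roughly 92 fps, but the 1% low of 10 fps exposes the hitch, which is exactly the kind of stutter an average-only video hides.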
 
You have plenty of horsepower there. The problem is most likely Windows. I get 10+ years from PCs. It's not like the 1990s, when computers got twice as fast every year and we were constantly upgrading. Linux, unlike Windows, doesn't gradually consume the disk with crap and get incrementally slower over time. If you absolutely must keep running Windows, wipe/reformat the system/OS partition and reinstall; that often restores performance. It's at least worth a try before you bite the bullet and start replacing perfectly good hardware.
Problem is, at 2560x1440 I am no longer able to play at 100+ fps with graphics set to Ultra. It is time. I am noticing a lot of stutters in PUBG without any associated lag/ping spikes. I have rolled back NVIDIA drivers, set up an automated cache cleaner, changed shader caches, all the tricks. The i7-7700K kinda caps out with the 4060 Ti, and it's just not enough VRAM and power to get the job done. The 1080 Ti actually seemed more stable, but it developed heatsink issues, and there is no sense paying for ANOTHER new card to go into a nearly decade-old system.

My Windows is actually super stable with zero issues.
 
Exactly, my 1% lows are KILLING me.
 
I bought my PC back in 2017. Since then, I upgraded the PSU and graphics card, and added another 32GB of RAM.

It began:

i7-7700K
GTX 1080 ---> 1080 Ti ---> 4060 Ti
32GB 2133MHz ---> 64GB
500W PSU ---> 850W Gold+ PSU

Anyway, it's no longer pulling the performance I want, so...

i9-14900KF
RTX 4080 Super
32GB 5600MHz


I'm proud to have gotten 7 years of use out of it with minor plug-and-play upgrades. How long do you all usually get from a PC?
Yours would be an upgrade for me. I am not a power user which is why my relics get the job done for me.
 
Well, there aren’t many people willing to test these old CPUs. But the point of this was to showcase that an older CPU can still work for some, especially if you’re on a budget and play non-competitive games like Elden Ring (it’s locked to 60fps anyway) or similar.

Now, OP stated he plays PUBG, and since that's a competitive shooter you want high and smooth FPS, so naturally it would not make sense to keep using his old CPU when it already stutters with a 4060 Ti.
 
I was disappointed with the performance of Windows after Windows 7. Windows 10 and 11 have so much bloat and overhead and use so much memory unnecessarily. Also, due to the poor design of the Windows OS, where each module depends on several other OS modules, Windows will always have numerous new zero-day vulnerabilities no matter how much you patch it.

Now, I don't like the deterioration in privacy of Windows 10, 11 and Office 365 where your data is no longer your data.

I'm looking at alternatives to Microsoft OS's.
 
Every 10 or so years I drop $25-50 and get a new computer. Last year I got an i5-6500 for $40, seems pretty fast to me, I even dropped the coin to up it to 16GB of RAM and a 128GB SSD. Who knows what will come next to replace it.
 
Our W10 desktop was purchased from Dell in Jan 2013. It gets used 4-6 hours per day. Nothing swapped out or upgraded.
It originally ran Windows 8 and is certainly not eligible for W11. When Microsoft drops W10 support in 2025, that's when I'll upgrade both the desktop and the nine-year-old Toshiba laptop.
 