As we’ve already reported, The Last of Us Part I is a really CPU-heavy game. This made me wonder: how would the game run on our older Intel Core i9 9900K? According to various reports, it should perform horribly. However, it appears that this old Intel CPU can run the game at a constant 60fps on Ultra Settings.
For our tests, we used our Intel Core i9 9900K with 16GB of DDR4 at 3800MHz, paired with NVIDIA’s RTX 4090. We also used Windows 10 64-bit and the GeForce 531.26 drivers. Yep, you read that right. Although the game recommends using the latest driver, we used an older one. In theory, this should introduce some stability problems, right? Well, the game ran fine and we didn’t experience any crashes.
As always, we used the game’s Prologue sequence for our benchmarks. This sequence is more demanding than the game’s later areas, so it can give us a good idea of how the rest of the game will run.
As the title suggests, our Intel Core i9 9900K was able to push over 60fps at all times. And yes, that was on Ultra Settings. In fact, in order to further stress the CPU, we used a native 1440p resolution (instead of 4K). At lower resolutions, the GPU is less likely to be the bottleneck, which makes any CPU limitations easier to spot.
So, here is the game running at 1440p/Ultra settings on our Intel Core i9 9900K without Hyper-Threading.
As you will see, the frametime graph is a bit uneven. Since all eight CPU cores were maxed out, we did witness some frame-pacing issues. And that’s where Hyper-Threading comes to save the day.
By enabling HT, we were able to get smooth frametimes. Surprisingly enough, our framerates did not increase in the CPU-heavy scenes. Regardless, with HT enabled, our Intel Core i9 9900K was able to provide an enjoyable experience.
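For those wondering what these frame-pacing issues actually look like in numbers, here is a minimal sketch of how frametime consistency can be quantified. To be clear, this is not our actual capture tooling; the filename and the "msBetweenPresents" column are assumptions based on PresentMon-style capture logs:

```python
import csv
import statistics

# Hypothetical PresentMon-style capture log: one "msBetweenPresents" value
# per frame. The filename and column name are assumptions for illustration.
with open("tlou_capture.csv", newline="") as f:
    frametimes_ms = [float(row["msBetweenPresents"]) for row in csv.DictReader(f)]

avg_fps = 1000.0 / statistics.mean(frametimes_ms)

# "1% low" fps: the framerate implied by the slowest 1% of frames.
slowest = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
low_1pct_fps = 1000.0 / statistics.mean(slowest)

# Two runs can share the same ~60fps average (16.7ms per frame), but the one
# with a larger spread (e.g. alternating 8ms/25ms frames) feels stuttery.
print(f"avg: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps, "
      f"stdev: {statistics.stdev(frametimes_ms):.2f} ms")
```

In other words, two captures can report the same average fps, yet the one with the wider frametime spread is the one that feels stuttery, which is exactly what we saw without Hyper-Threading.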
Now, I’m not saying that the game’s CPU utilization is justified. We’ve said it before and we’ll say it again: this game should be running better, and Naughty Dog should further optimize it. However, we’ve shown that even a 2018 CPU can run the game at a constant 60fps. So no, you don’t need a top-of-the-line CPU in order to enjoy it.
What also surprised me was that 16GB of total RAM was enough for the game. Truth be told, there were some additional stutters (compared to our AMD Ryzen 9 7950X3D system, which has 32GB of RAM). However, the game still has fewer stutters than, say, Hogwarts Legacy.
Lastly, I should note that it took over 40 minutes to compile the shaders on our Intel Core i9 9900K. And no, the game didn’t crash during that process. So apparently, we now have two NASA PC systems.
In summary, here are the things we believe Naughty Dog should be focusing on. The team should optimize the shader compilation process, as well as the initial loading times. It’s inexcusable for this game to take this long to compile its shaders or load a save. The team should also improve performance on GPUs with 8GB of VRAM, as well as the quality of its Medium Textures. Then we have the mouse issues (which should be addressed later today). And lastly, Naughty Dog should optimize the game to reduce its overall CPU usage.
So yeah, there is a lot of work to be done here. Still, The Last of Us Part I is nowhere close to being an unoptimized mess. If it were, our Intel Core i9 9900K would not be running it as smoothly as it does. It’s nowhere near as bad as Batman: Arkham Knight was. And no, it’s nowhere close to the launch versions of WILD HEARTS, Forspoken or Gotham Knights.
Stay tuned for more!