Stacked 3D cache is coming to Intel CPUs, and gamers should be excited (should we?)
Submitted 1 year ago by ekZepp@lemmy.world to games@lemmy.world
https://www.pcgamer.com/stacked-3d-cache-is-coming-to-intel-cpus-and-gamers-should-be-excited/
Comments
notaviking@lemmy.world 1 year ago
Imagine a ±6 GHz CPU with 3D cache. Now we just have to wait for LTT to fuck up the graphs.
KalabiYau@lemmy.world 1 year ago
I think so. The AMD 3D V-Cache CPUs are impressive in terms of gaming performance (though the inability to overclock them still leaves some benefit to non-3D-cache CPUs, which are also great for everything other than gaming).
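Rough illustration of why the extra cache matters (my own toy sketch in Python, not from the article; buffer sizes are assumptions, tune them to your chip): chase random indices through buffers of growing size and watch per-access time jump once the working set outgrows L3.

```python
# Toy cache-sensitivity probe: pointer-chase a random single-cycle
# permutation and time each dependent load. Once the buffer outgrows L3,
# per-access latency jumps. CPython overhead blunts the numbers, but the
# trend shows; the big sizes take a while to build in pure Python.
import random
import time
from array import array

def single_cycle_array(n):
    """Sattolo's algorithm: shuffle into one big cycle so the chase
    visits every slot before repeating."""
    buf = array('q', range(n))           # compact 8-byte slots
    for i in range(n - 1, 0, -1):
        j = random.randrange(i)          # j < i guarantees a single cycle
        buf[i], buf[j] = buf[j], buf[i]
    return buf

def ns_per_access(buf_bytes, steps=1_000_000):
    buf = single_cycle_array(max(2, buf_bytes // 8))
    i = 0
    start = time.perf_counter()
    for _ in range(steps):
        i = buf[i]                       # dependent loads defeat prefetching
    return (time.perf_counter() - start) / steps * 1e9

for mb in (1, 4, 16, 64, 256):           # brackets a 32 MB vs 96 MB L3
    print(f"{mb:3d} MB buffer: {ns_per_access(mb * 2**20):6.1f} ns/access")
```

Games are full of pointer-chasing workloads like that, which is roughly why the 96 MB of L3 on the X3D parts wins so hard.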
candyman337@sh.itjust.works 1 year ago
Oh boy, can’t wait to have CPUs that burn a hole right through their coolers
deranger@sh.itjust.works 1 year ago
This only got bad with the most recent generation of CPUs. The AMD 5xxx series is very efficient, as demonstrated by Gamers Nexus. The Intel CPUs from the 2500K to, idk, the 8xxx series were efficient, until they started slapping on more cores and then cranking the power.
candyman337@sh.itjust.works 1 year ago
Yes, the second thing about cranking power and cores is what I’m talking about.
Also, as far as GPUs go, the 2000 series was ridiculously power hungry at the time, and it looks downright reasonable now. It’s like the Overton window of power consumption lol.
Fermion@feddit.nl 1 year ago
The 7 series is actually more efficient than the 5 series; it’s just programmed to go as fast as thermals allow. So reviewers who put really powerful coolers on the CPUs saw really high power draw. If you instead set a power cap, you get higher performance per watt than the previous generations.
Having the clocks scale to a thermal limit is a nice feature to have, but I don’t think it should have been the default mode.
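You can check the perf-per-watt claim yourself. Hedged Python sketch using Linux’s RAPL powercap counters (the sysfs path is the standard intel-rapl one; recent kernels expose the same interface for Zen chips, and reading it usually needs root):

```python
# Estimate average package power for a workload from the RAPL energy
# counter, then eyeball performance per watt at different power caps.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy, microjoules

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

def perf_per_watt(workload, *args):
    e0, t0 = read_uj(), time.perf_counter()
    result = workload(*args)              # the thing you're measuring
    e1, t1 = read_uj(), time.perf_counter()
    joules = (e1 - e0) / 1e6              # note: ignores counter wraparound
    watts = joules / (t1 - t0)
    print(f"{t1 - t0:.2f} s at {watts:.1f} W average package power")
    return result

# Toy workload (stand-in for a real benchmark): count primes below n.
perf_per_watt(lambda n: sum(all(x % d for d in range(2, int(x**0.5) + 1))
                            for x in range(2, n)),
              200_000)
```

Run it once at stock and once with a power cap and compare the wall time against the watts.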
dandroid@dandroid.app 1 year ago
I know someone who works at Nvidia, and he said the problem is that Moore’s law is dead. Apparently the only way to generate more performance right now is to put in more energy and/or increase size.
Obviously that doesn’t scale forever, and the 40 series are already fucking massive. So where does that leave us with the 50 series? We need some breakthrough.
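That tracks with the basic physics: dynamic power goes roughly as P ∝ C·V²·f, and the last few hundred MHz usually need extra voltage on top. Quick back-of-the-envelope in Python (all numbers invented for illustration, not real chip specs):

```python
# Dynamic power scales roughly as P ∝ C * V^2 * f; chasing clocks usually
# means chasing voltage too, so power grows much faster than performance.
BASE_F, BASE_V, BASE_P = 4.5, 1.10, 120.0   # GHz, volts, watts (hypothetical)

for f, v in [(4.5, 1.10), (5.0, 1.20), (5.5, 1.35), (6.0, 1.50)]:
    p = BASE_P * (f / BASE_F) * (v / BASE_V) ** 2
    print(f"{f:.1f} GHz @ {v:.2f} V -> ~{p:3.0f} W "
          f"({f / BASE_F - 1:+.0%} clock for {p / BASE_P - 1:+.0%} power)")
```

With those made-up voltages, a +33% clock bump costs about +148% power, which is why the top of the curve looks so ugly.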
candyman337@sh.itjust.works 1 year ago
The real answer is ARM-based systems and a new PCIe slot standard that brings the traces closer to the CPU, similar to that Dell RAM standard (CAMM).
Also, I genuinely doubt an architecture as recent as Lovelace is optimized as far as it could be.
ono@lemmy.ca 1 year ago
I felt the same when the current-gen CPUs were announced, but when I looked closer at AMD’s chips, I learned that they come with controls for greatly reducing the power use with very little performance loss. Some people even report a performance gain from using these controls, because their custom power limits avoid thermal throttling.
It seems like the extreme heat and power draw shown in the marketing materials are more like competitive grandstanding than a requirement, and those same chips can instead be tuned for pretty good efficiency.
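For the curious, on Linux the generic knob looks like this (a sketch via the powercap/RAPL sysfs, which needs root; it’s the Intel-flavored path, and AMD desktop chips usually take their PPT cap from BIOS or Ryzen Master instead, so whether the write works there is an assumption):

```python
# Cap the long-term package power limit through Linux's powercap sysfs.
ZONE = "/sys/class/powercap/intel-rapl:0"
LIMIT = f"{ZONE}/constraint_0_power_limit_uw"   # constraint 0 = long-term limit (PL1)

def set_long_term_watts(watts):
    with open(LIMIT) as f:
        print(f"current limit: {int(f.read()) / 1e6:.0f} W")
    with open(LIMIT, "w") as f:
        f.write(str(int(watts * 1e6)))          # value is in microwatts
    print(f"new limit: {watts:.0f} W -- now re-run your benchmark")

set_long_term_watts(88)   # example cap; pick something sane for your chip
```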
candyman337@sh.itjust.works 1 year ago
Yeah, I’m talking about Nvidia and Intel here. But tbh Ryzen 4000 CPUs run pretty hot, though they also optimized Ryzen quite a bit before they changed to this new chipset, which makes sense to me. Seems like Nvidia and Intel sometimes worry about what looks good power-wise on paper rather than about optimization.