I still have no idea what you mean by “denial”. Again, no one is denying that Intel draws 254W (or 241W) in rendering tests like Cinebench, for example.
But that’s not the whole point; it seems you simply haven’t read my post. Of course you have to fully load a 13900K in Cinebench, and then you will indeed see that kind of consumption. Where it goes wrong is that the average reviewer runs only Cinebench as the consumption test, records “this CPU consumes 254W”, and tests no further — and so never shows that in other workloads a stock 13900K (or 13700K, if you like) can be more efficient and can draw less than, say, a stock 7900X or 7950X.
Which is why you see people here, but also on other forums or Reddit (like the one I answered at the beginning), who have no idea about TDPs and actual consumption across different workloads on both Intel and AMD CPUs, and think, for example, that the 13900K always draws 254W — while for a good portion of potential buyers this will never be the case, since their daily workloads never push the CPU anywhere near that much power. Which is a shame, because it happens only because many reviewers don’t do their jobs properly. Reviewers should simply paint a complete picture of the consumption of different CPUs across all kinds of workloads, not just a single table or graph as many do. That seems all the more relevant for AMD now that they have opted for 230W PPTs.
By the way, I think AMD made a rather stupid mistake there: opting for a 230W PPT to chase Intel, and then bolting a new Eco Mode onto the “old” 142W PPT, was not a smart move in my view. I would have preferred that 142W PPT remain the standard, with an added “performance” mode at 230W PPT, and the 65W Eco Mode kept the same as before. That way you keep the significantly better efficiency in heavy workloads at stock, while there would be fewer complaints about the performance mode’s higher consumption and lower efficiency, since it would be optional.
In addition, I definitely don’t think you should praise Intel for drawing a few watts less in games, and you certainly won’t see me doing that; I don’t think I’ve recommended a high-end Intel CPU to anyone in a long time, except for very specific scenarios. After all, you’re not going to win any wars by being slightly more efficient in games, so to speak.
Well, I’m a little tired of the Intel crowd defending that consumption (though the 7000 series isn’t defensible at AMD’s stock settings either; the non-X and X3D versions actually fix that).
I hope you don’t mean that, because you don’t really know me. My systems are all AMD and I’ve been an AMD shareholder for years. However, I am also a tech enthusiast, and from that point of view I would like people who rely on reviews to get a complete picture — which doesn’t happen very often right now.
[Comment edited by Dennism on 26 February 2023 09:41]