After reviewing CPUs throughout the year and revisiting the most interesting matchups post-launch, in this buying guide we put it all together to keep your CPU shopping as simple as possible.
> The 13900K and 14900K are faster than the 7950X3D in productivity apps. See here: https://www.pugetsystems.com/labs/a...tion-review/#Video_Editing_Adobe_Premiere_Pro

You mean, Intel is slightly quicker specifically in Photoshop, and Intel pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Which, let's be honest, if you're spending that much on a top-end CPU you'll probably have a GPU in your system capable of the same tasks anyway.
And since when did power users care about energy consumption?
> You mean, Intel is slightly quicker specifically in Photoshop, and Intel pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Which, let's be honest, if you're spending that much on a top-end CPU you'll probably have a GPU in your system capable of the same tasks anyway.

Love is irrational.
It’s not that power users care all that much about the power consumption, but why buy a slower CPU that eats nearly twice the power? And thanks to all that extra power consumed, you need to think harder about cooling as well.
I'll never understand the love for Intel; for over 10 years they were incredibly lazy and got themselves into this mess…
> You mean, Intel is slightly quicker specifically in Photoshop, and Intel pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Which, let's be honest, if you're spending that much on a top-end CPU you'll probably have a GPU in your system capable of the same tasks anyway.

Did you even look at the benches? It wins in everything but Blender and V-Ray...
> Did you even look at the benches? It wins in everything but Blender and V-Ray...

Oh yeah, 5-10% (at most) "winning" for 92% more power usage…
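Taking the figures traded in this exchange at face value, the perf-per-watt implication is easy to check. This is a back-of-the-envelope sketch, assuming the ~10% performance lead and 92% power gap claimed above; the ratios are the thread's numbers, not independent measurements:

```python
# Perf-per-watt implied by the thread's own figures: ~10% higher
# multithreaded score (the claimed best case) for 92% more power draw.
perf_ratio = 1.10    # 14900K score relative to the 7950X3D (assumed upper bound)
power_ratio = 1.92   # from the 533 W vs 277 W Cinebench draw cited in the article

relative_efficiency = perf_ratio / power_ratio
print(f"relative perf-per-watt: {relative_efficiency:.2f}")  # ~0.57
```

On those assumptions the 14900K delivers roughly 57% of the 7950X3D's perf-per-watt in that workload, which is the gap being argued about.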
> You mean, Intel is slightly quicker specifically in Photoshop, and Intel pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Which, let's be honest, if you're spending that much on a top-end CPU you'll probably have a GPU in your system capable of the same tasks anyway.

Intel draws double the power if you want it to draw double the power. Reviewers go into the BIOS, remove the power limit, and then test. That's nonsense. A 14900K at the same power limit as the 7950X offers very similar performance. AMD is not more power efficient; reviewers are just running Intel balls to the wall so they can pretend Intel isn't efficient.
> Oh yeah, 5-10% (at most) "winning" for 92% more power usage…

Again, that's nonsense. Performance and power draw don't scale linearly. You say the difference is only 10%? OK, how much power do you think the 7950X would need to run 10% faster? Probably double, exactly as much as the 14900K needs.
Kinda embarrassing that Intel doesn't wipe the floor, let's be honest. For such massive power draw I'd want a minimum of 20-30% more performance, and for 92% more power you'd be looking for 50% more performance.
As it stands, the fact that AMD keeps up with or even beats Intel is simply embarrassing.
> Fact is, at 220 W the 14900K does 40k. Pushing it to 400 watts to get 43k is just dumb; you can't do that and then complain about the power draw.

Then why does the 14900K take 400 watts and not 220? Because Intel decided so.
> Then why does the 14900K take 400 watts and not 220? Because Intel decided so.

No, because the reviewer went into the BIOS and CHOSE the unlimited option.
> "For example, running Cinebench, the 14900K system consumed 533 watts, which was 92% more than the 277 watts used by the 7950X3D"

Try to make the 7950X3D match the 14900K's performance and then tell me how much power it used. If I had to bet, I'd say more than the 14900K, which is the point.
> Try to make the 7950X3D match the 14900K's performance and then tell me how much power it used. If I had to bet, I'd say more than the 14900K, which is the point.

Soo, when you limit both CPUs to, let's say... 80 watts? Sound fair?
> Soo, when you limit both CPUs to, let's say... 80 watts? Sound fair?

He tested a multithreaded workload and the 14900K is laughably behind? I call that BS. Sorry, I haven't seen the video, but if that's what he found I'm not clicking it; it's just flawed.
Let's go with 80 watts... let me find a reviewer that's done that, hold on...
Ok, der8auer has done exactly this, and found the 14900K to be laughably behind AMD.
Are you going to now try and argue that the Intel CPU is better on power at a very specific range? Because if you are, you're just making AMD look better and better.
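For anyone who wants to run that iso-power comparison themselves rather than argue about it, here is a minimal sketch of capping package power on Linux through the powercap/RAPL sysfs interface. The path below is the standard package-domain PL1 constraint, but whether it exists (and is writable) depends on your CPU, kernel, and privileges; BIOS PL1/PL2 settings achieve the same thing:

```python
import os
from pathlib import Path

# Sketch: set a long-term (PL1) package power limit via Linux's intel_rapl
# powercap interface. Assumes an Intel CPU and root; verify the path exists
# on your machine before relying on it.
RAPL_PL1 = Path("/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw")

def set_package_limit(watts: int) -> int:
    """Write a PL1 package power limit; RAPL expects microwatts."""
    microwatts = watts * 1_000_000
    if os.access(RAPL_PL1, os.W_OK):  # silently skip where the path is absent
        RAPL_PL1.write_text(str(microwatts))
    return microwatts

set_package_limit(80)  # the 80 W cap proposed in the thread
```

Run the same multithreaded benchmark on both systems at the same cap and the efficiency question answers itself, which is what der8auer's test is being cited for.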
> He tested a multithreaded workload and the 14900K is laughably behind? I call that BS. Sorry, I haven't seen the video, but if that's what he found I'm not clicking it; it's just flawed.

Oh yeah, der8auer is famous for flawed reviews /s
> Oh yeah, der8auer is famous for flawed reviews /s

So you are saying that he actually tested multithreaded workloads with both CPUs at 80 W and the 14900K was laughably bad? Is that what you are saying? I asked before but you never answered. I have to assume you are lying, because, as you've said yourself, he is not famous for flawed reviews, but you get the benefit of the doubt. Please answer the question, thanks.
I'll leave it at that; it's just hilarious to see Intel fanboys like yourself literally just cover your eyes and ears and pretend it all away.
> You call me an Intel fanboy, but you keep squirming around the actual point so as not to admit the failure of your position.

I'm squirming? The failure of my position?! HAHAHAHA! You are a funny one.
> I'm squirming? The failure of my position?! HAHAHAHA! You are a funny one.

Your original position was that the 14900K needs 90% more power for 10% more MT performance than its competitors. Which is of course correct, but on the same note, those competitors ALSO need 90% more power for 10% more performance. So it's completely irrelevant: in MT workloads both AMD's and Intel's high-end solutions are kinda tied (and Intel wins in the mid to low end). Set them both to the same wattage and then test; that's what sane people who care about efficiency do. If you allow one to draw 4096 watts and limit the other one to 150 W, of course the latter is going to be more efficient, lol.
I have no position. Fact is, the AMD CPU is better in gaming (it's usually faster), and in other workloads it's 5-10% behind for 90% less power usage.
> Your original position was that the 14900K needs 90% more power for 10% more MT performance than its competitors.

Is that what I said? Wasn't I just quoting the article?
> Set them both to the same wattage and then test...

> Then you posted a video of der8auer that supposedly tested a 14900K at 80 W in MT workloads and it was laughably bad. Which is a lie, cause that never happened; I in fact checked the video. So... why are you making stuff up? I don't get it.

Okay, okay, you got me. I very quickly searched for anyone who'd done this (not a huge amount, to be fair, since nobody buying a 14900K would limit it like this), but it still shows Intel sucks.