It's time we take a fresh look at DDR4 vs DDR5 memory performance using the Core i9-14900K. We're benchmarking with fast DDR5-7200 and DDR4-4000 memory, noting that these days DDR5 is cheaper than DDR4.
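For context on the two kits being compared, their theoretical peak bandwidths work out as follows. This is a rough sketch assuming a standard 64-bit data path per DIMM and a dual-channel setup; real-world throughput is always lower than the theoretical peak.

```python
# Hedged sketch: theoretical peak bandwidth for the two kits in the article.
# Assumes a 64-bit data path per DIMM and dual-channel operation (both are
# assumptions, not figures from the article itself).
def peak_bandwidth_gbs(transfer_rate_mts: int, bus_bits: int = 64, channels: int = 2) -> float:
    """MT/s * bytes-per-transfer * channels -> GB/s (decimal GB)."""
    return transfer_rate_mts * (bus_bits / 8) * channels / 1000

ddr4 = peak_bandwidth_gbs(4000)  # DDR4-4000
ddr5 = peak_bandwidth_gbs(7200)  # DDR5-7200
print(f"DDR4-4000: {ddr4:.1f} GB/s, DDR5-7200: {ddr5:.1f} GB/s")
```

On paper that is roughly a 1.8x bandwidth gap, which is why the single-digit gaming gains discussed below are smaller than the raw specs suggest.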
Interestingly, in a role reversal, DDR4 memory now costs quite a bit more than DDR5 (!)
> Spending money on new memory and a mobo for 7% doesn't make sense to me. Just keep what you have.

True.
> True.

They're both actually pretty cheap now, too. You can get "optimal" speeds for both for not much over the base speeds. If you're building a completely new system, the speed difference is more than the cost difference at this point. If it was this time last year, that'd be a vastly different answer, but things have settled down now in the memory space.
I think this comparison is most useful for someone building a new system, to gauge how much DDR5 is worth over DDR4 (as both are widely available currently).
> For casual gaming DDR4 is more than sufficient.

Yeah, it is, but if you're building a new system, DDR5 is cheaper or equivalent in price and performs slightly better.
> Did you run Gear 1 mode for the DDR4? I doubt you did, because you didn't even mention it in your article. If you didn't run your DDR4 in Gear 1 mode, then those benchmarks, as well as this article, are meaningless. As far as I know, most motherboards run DDR4 memory in Gear 2 mode by default.

Then you'd be absolutely wrong. The typical setting is "auto" on all the brands: it determines the best gear based on memory speed. ASRock does it this way, and so do the other major brands. Also, there is no Gear 1 on Raptor Lake; it's 2 and 4. They have different "gear ratios" from previous generations.
> Yeah, it is, but if you're building a new system, DDR5 is cheaper or equivalent in price and performs slightly better.

If you're a casual gamer, it won't, because your GPU is nowhere near 4090 level and your CPU is not a 14900K.
> Spending money on new memory and a mobo for 7% doesn't make sense to me. Just keep what you have.

Those who upgrade every 1-2 years don't care about "sense"; they buy the fastest they can whenever possible. A 7% increase is enough to justify it to them.
> Then you'd be absolutely wrong. The typical setting is "auto" on all the brands: it determines the best gear based on memory speed. ASRock does it this way, and so do the other major brands. Also, there is no Gear 1 on Raptor Lake; it's 2 and 4. They have different "gear ratios" from previous generations.

That's funny, because I've been running Gear 1 DDR4-3600 CL14 on my MSI Tomahawk Z690 with Raptor Lake ("13700K") since release...
> Yeah, it is, but if you're building a new system, DDR5 is cheaper or equivalent in price and performs slightly better.

See my earlier post. Comparing DDR5-6400 vs DDR4-3600, DDR4 was substantially cheaper.
> If you're a casual gamer, it won't, because your GPU is nowhere near 4090 level and your CPU is not a 14900K.

Those who upgrade every year wouldn't buy the DDR4 board version for 12th gen+. They would pay the premium for a DDR5 board and memory.
> Those who upgrade every 1-2 years don't care about "sense"; they buy the fastest they can whenever possible. A 7% increase is enough to justify it to them.

Future-proofing would be a better argument. For everyone else, CPUs have long stopped being an upgrade item and have become something you use until they fail, like PSUs or cases. Really, only the GPU needs upgrading frequently anymore.
> If all you care about is FPS and price, go DDR4. The human eye cannot perceive anything beyond about 60 fps. All of these people shooting for 100+ are just wasting money. You literally cannot see the difference. Same as we can't really visually spot the differences between 4K and 8K. Buy a nice 4K monitor that does at least 60 FPS and build your computer to run that. The tests I want to see are loading times, how fast apps and games open, how smooth they operate. Show the difference between DDR4 and DDR5 on speed, even between similarly priced parts.

https://www.techspot.com/article/2769-why-refresh-rates-matter/
If all you care about is FPS and price, go DDR4. The human eye cannot perceive anything beyond about 60 fps. All of these people shooting for 100+ are just wasting money. You literally cannot see the difference. Same as we can't really visually spot the differences between 4K and 8K. Buy a nice 4K monitor that does at least 60 FPS and build your computer to run that. The tests I want to see are loading times, how fast apps and games open, how smooth they operate. Show the difference between DDR4 and DDR5 on speed, even between similarly priced parts.
> You'll find that the human eye cannot perceive anything beyond about 24 fps.

This is backwards: it takes about 15-20 fps to create the illusion of motion to our eyes. Our eyes do not have a frame rate. As we get older, the number of frames we can perceive drops; depending on age, it's between 40 and 90. As a general rule, the refresh rate of a display should be twice our perceivable frame rate. At 37 years old I don't get any benefit from anything higher than 100-120 fps, so I'm going to assume I can perceive about 50 fps. I had a teenager sit down in front of my PC and he immediately noticed my "low refresh rate" 120 Hz display, but you can put a 120 and a 240 next to each other and I can't tell the difference.
> DDR5 looks more like a marketing gimmick imo. Or it could be that it's still in early stages. The performance we're seeing now is not worth it imo.

Currently, I agree. Right now, DDR4 is the best it's ever going to be, and I feel we still have another two-ish years of DDR5 development before we see it peak. Although the cost difference now is really only about $20-30 for comparable sets, so I wouldn't let that stop me if I was doing a new build. Not like it was a year ago, when there was a 100% cost difference for a slower set of DDR5, and that was if it was even in stock at the time.
> Then you'd be absolutely wrong. The typical setting is "auto" on all the brands: it determines the best gear based on memory speed. ASRock does it this way, and so do the other major brands. Also, there is no Gear 1 on Raptor Lake; it's 2 and 4. They have different "gear ratios" from previous generations.

No Gear 1 for DDR5*. Gear 1 still exists for DDR4; I'm not sure what the actual limit is for G1D4, but I think it's 4000MHz for a 13th-gen i7.
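The "gear" ratios being debated here set how fast the memory controller runs relative to the DRAM clock: Gear 1 is 1:1, Gear 2 is 1:2, Gear 4 is 1:4. A rough sketch of the arithmetic, for illustration only; which gears a given CPU and board actually support varies by generation, as the comments above show.

```python
# Hedged sketch of Intel memory "gear" ratios: the memory controller runs at
# the DRAM clock divided by the gear. The DRAM clock is half the MT/s rating,
# since DDR makes two transfers per clock. Illustrative numbers, not a spec.
def controller_clock_mhz(transfer_rate_mts: int, gear: int) -> float:
    dram_clock = transfer_rate_mts / 2   # e.g. DDR4-3600 -> 1800 MHz
    return dram_clock / gear

print(controller_clock_mhz(3600, 1))  # DDR4-3600, Gear 1 -> 1800.0
print(controller_clock_mhz(7200, 2))  # DDR5-7200, Gear 2 -> 1800.0
```

Note that fast DDR5 in Gear 2 can land the controller at the same clock as moderately fast DDR4 in Gear 1, which is part of why Gear 1 DDR4 stays competitive in latency-sensitive workloads.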
> If all you care about is FPS and price, go DDR4. The human eye cannot perceive anything beyond about 60 fps. All of these people shooting for 100+ are just wasting money. You literally cannot see the difference. Same as we can't really visually spot the differences between 4K and 8K. Buy a nice 4K monitor that does at least 60 FPS and build your computer to run that. The tests I want to see are loading times, how fast apps and games open, how smooth they operate. Show the difference between DDR4 and DDR5 on speed, even between similarly priced parts.

You got a source for that 60 fps claim? Cause right now it looks like you just pulled it out of your rear end.
> This is backwards: it takes about 15-20 fps to create the illusion of motion to our eyes. Our eyes do not have a frame rate. As we get older, the number of frames we can perceive drops; depending on age, it's between 40 and 90. As a general rule, the refresh rate of a display should be twice our perceivable frame rate. At 37 years old I don't get any benefit from anything higher than 100-120 fps, so I'm going to assume I can perceive about 50 fps. I had a teenager sit down in front of my PC and he immediately noticed my "low refresh rate" 120 Hz display, but you can put a 120 and a 240 next to each other and I can't tell the difference.

90 is about the minimum I think is "fluid" enough. I can tell the difference between 120 and 144. I got someone a 240 Hz screen, and I can tell the difference from my own 144 Hz screen.
> You'll find that the human eye cannot perceive anything beyond about 24 fps.

I don't feel comfortable with you on the road.
> 90 is about the minimum I think is "fluid" enough. I can tell the difference between 120 and 144. I got someone a 240 Hz screen, and I can tell the difference from my own 144 Hz screen.

It's my experience that response time matters more at high refresh rates than the refresh rate itself. I've seen 240 Hz displays that look like they're 60 Hz, and I've seen some OLED panels running at 60 Hz that look much higher than that. But with all things being equal, I can't tell the difference between 120 and 240. I remember looking at a 144 Hz OLED once and thinking it looked like a glasses-free 3D display.
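One way to ground the refresh-rate debate in these comments is in frame times: each step up in Hz shaves off fewer milliseconds per frame, which is a plausible reason the 60-to-120 jump is far easier to notice than 120-to-240. A quick sketch of the arithmetic:

```python
# Frame time in milliseconds at common refresh rates: 1000 ms / Hz.
frame_ms = {hz: 1000 / hz for hz in (60, 90, 120, 144, 240)}
for hz, ms in frame_ms.items():
    print(f"{hz:>3} Hz -> {ms:.2f} ms per frame")
```

Going from 60 to 120 Hz saves about 8.3 ms per frame, while going from 120 to 240 Hz saves only about 4.2 ms, with diminishing returns at every further step.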
Boy, you act like you're 97 and not 37. There are professional athletes out there your age hitting 100 mph fastballs and returning tennis serves that are even faster.