AMD’s Ryzen 5 2400G is competent at stock clocks, but how does it handle overclocking? Howdy howdy guys, Ponchato here, and welcome to part 2 of my Ryzen 5 2400G review: overclocking. In my last video I gave an overview and the specs of the 2400G, so if you want the full details of the processor, that video is linked in a card and in the description. I also added Fortnite and Rainbow Six Siege to my benchmark lineup – you guys requested them, so here they are. Now we’re all caught up, so let’s get into the benchmarks. Using my Deepcool Gammaxx 400 to cool the 2400G, I was able to overclock it to 3.95GHz at 1.35V from a stock speed of 3.6GHz – it just couldn’t hit that magic 4GHz number, but it’s close enough. I also overclocked the built in Vega 11 graphics from 1.25GHz to 1.6GHz at 1.2V. That’s about a 10% OC for the CPU and a fairly significant 28% overclock on the graphics core, but we’ll see how much of a difference that makes, given that the integrated graphics rely on DDR4, which has significantly lower bandwidth than the memory typically included with discrete GPUs.
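As a quick sanity check on those percentages, the arithmetic works out like this (a minimal sketch; the 1.6GHz Vega target is inferred from the ~28% figure):

```python
# Overclock gains quoted above, as percentages.
cpu_stock, cpu_oc = 3.6, 3.95   # GHz, CPU cores
gpu_stock, gpu_oc = 1.25, 1.6   # GHz, Vega 11 graphics

cpu_gain_pct = (cpu_oc / cpu_stock - 1) * 100   # ~9.7%, "about 10%"
gpu_gain_pct = (gpu_oc / gpu_stock - 1) * 100   # 28%
print(f"CPU: +{cpu_gain_pct:.1f}%, GPU: +{gpu_gain_pct:.1f}%")
```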
The test setup for the 2400G is an MSI B350M Gaming PRO motherboard with two 4GB sticks of DDR4-2400 memory. Big thanks to Seasonic for providing the Focus Plus Gold 850FX, which is used for all CPU and GPU tests so power supply efficiency stays constant. First up we’ll look at the CPU benchmarks. Starting with the synthetic benchmarks, here are the Cinebench results. Even the stock 2400G demolished every other processor I’ve tested so far in multicore performance, and the overclock pushed it even higher. An interesting thing to note here is that single core performance was only slightly higher after overclocking, likely because second gen Ryzen processors can already pull a large frequency boost on their own in single threaded applications. Next, another synthetic benchmark with CPU-Z. Single core performance was slightly under the i5-7500 at stock clocks and slightly over after overclocking, but multithreaded performance was in a completely different class. The 2400G scored almost 2500 after overclocking, versus 2300 at stock, versus the i5’s 1700.
That’s a huge increase in performance. Now we’ll look at the gaming results. First up, Battlefield 1. Overall, not much difference between stock and OC results. Low settings showed almost no change, while medium showed a 6% gain in average FPS and slightly better lows. Keep in mind, however, that’s only a 2FPS difference for the average, almost within measurement error. Performance on ultra improved quite a bit but still fell well short of the minimum playable 30FPS, so it’s kind of a moot point. Looks like the biggest limiting factor in Battlefield is something other than the CPU or GPU clock speeds – as mentioned earlier, probably memory bandwidth. Next we’ll look at CSGO. The OC results averaged 3 to 6% faster than stock, which is a decent improvement. The biggest difference, however, showed up in the 1% lows; almost 50% higher on medium settings and 65% on low – that’s a big change. Even 1% lows on ultra increased by a third, but that was only from 12 to 16 FPS. Improvement of 0.1% lows was less impressive but still pretty significant with a 21% increase on both low and medium settings.
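For reference, 1% and 0.1% lows like these are typically derived from per-frame time captures. Here’s a generic sketch of that calculation (not necessarily the exact tool or method used for these charts):

```python
# Compute average FPS and 1%/0.1% lows from frame times in milliseconds.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    def low(pct):
        k = max(1, int(n * pct / 100))            # slowest pct% of frames
        slow = worst[:k]
        return 1000 * len(slow) / sum(slow)       # average FPS of that slice
    return avg_fps, low(1), low(0.1)

# Example: mostly ~16.7ms frames with a few 50ms stutters mixed in.
avg, one_pct, point_one = fps_stats([16.7] * 995 + [50.0] * 5)
# avg ~59.3, 1% low ~30.0, 0.1% low = 20.0
```

This is why the lows can move so much while the average barely changes: a handful of slow frames dominates the bottom percentiles without shifting the overall mean.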
Counter Strike does seem to make good use of higher clock speeds. Next we’ll look at Deus Ex: Mankind Divided. This is a very demanding game and I was kind of hoping the overclock would push it into playable territory, but unfortunately it didn’t. The roughly 1 to 2 FPS improvement at all settings still fell well below that 30FPS cutoff, so performance gains here aren’t worth much. Now we’ll look at the first of two new additions to my benchmarking suite: Fortnite. This game has a surprising range of performance between the lowest and highest settings, but that’s mostly because at lower settings it significantly decreases its rendering resolution. As I’ve occasionally seen while overclocking other components, overclocking the 2400G raised average FPS slightly but dropped the lows quite a bit. Having an extra 5 or 6 FPS on average isn’t worth the significantly more noticeable stuttering. A more conservative OC could help mitigate this, but since you’re only looking at a max of a few FPS increase, for Fortnite it probably isn’t worth the extra power and noise to overclock at all.
Next in line is GTA 5. The margins between stock and overclocked results are so small as to be basically meaningless, and the averages were actually all less than 1 FPS apart. Lows saw some improvement but again, this could just as easily be attributed to benchmark variability. In this case, GTA 5 is probably limited mostly by memory bandwidth, rather than CPU or GPU clock speeds. Next up, Just Cause 3. After overclocking, average FPS improved by about 8 to 10% across the board, a pretty decent increase in performance. Lows, oddly enough, dropped by about 10% at the same time.
The most notable difference I saw is that running the game on the highest settings becomes viable with the overclock: it is right at that 30 FPS cutoff, but that’s good enough to call it playable. Now, Overwatch. Like several other games, overclocking gave very modest performance improvements. One FPS here, two FPS there, and you’ve got a fairly underwhelming picture. If you want every single frame you can get, overclocking for Overwatch would be worth it, but in my opinion that tiny, arguably undetectable performance difference wouldn’t be worth the extra power draw and noise. And that difference in power is far from small, but we’ll get to that later in the video. PUBG is up next. As before the only really playable settings were very low, but the OC does give a decent performance bump. Higher settings showed similar performance gains but all fell below 30 FPS so they’re all moot points.
On very low, the 3 FPS higher average and 1 FPS higher 1% lows aren’t much in absolute terms, but when you’re dealing with frame rates in the 30s, that’s well worth the time spent setting up the overclock. Second from last is the second new addition to my benchmarking lineup: Rainbow Six Siege. On low and high settings (which should really be called medium since they’re the middle option) average FPS improved by about 4.5%. Lows saw a pretty good improvement as well, but ultra settings are what stood out to me; an 8.5% increase in average FPS and about a third faster in the lows.
The big thing to note there is that on ultra settings, after overclocking, the game never dropped below 30 FPS which makes for very smooth gameplay. Rainbow Six had arguably the best results for overclocking of the entire lineup, and the fact that it can run well even on ultra is a pretty big benefit. Last in the list is Rocket League. Similar to Overwatch, performance before and after overclocking was almost identical. Averages were all the same, 1% lows were within a few frames, but 0.1% lows did see some changes.
On low settings the 0.1% low actually dropped after overclocking, but on the highest settings they improved by 6 frames per second. That may be the most significant change in the Rocket League benchmarks; after overclocking, it never dips below 30 FPS on the highest settings. To give a general overview of game performance, here are the average results in Esports titles – CSGO, Overwatch, and Rocket League. Average FPS improved by about 2 to 5%, 1% lows by 9 to 15%, and 0.1% lows were roughly the same. That’s a modest improvement, but on average the overclock does give you free extra performance, so you might as well take it.
Now the results in non-Esports titles, like PUBG, GTA 5, and Rainbow Six. Average FPS improved by about 3 to 5% while lows were generally up or down a few frames. The big takeaways here are that the 2400G is generally good for about 60FPS on the lowest settings, remains playable at medium, but falls short on high or ultra settings. While it did help in some games, you shouldn’t expect overclocking to make ultra settings playable in every title. Next, FPS per dollar. Keep in mind the 2400G is a CPU and GPU all in one – a typical graphics card can hit .6 to .8 FPS per dollar, but that’s just using the price of the GPU itself. If you include the cost of an equivalent CPU, most graphics cards would be around .3, like the 2400G is here. Finally we’ll look at power and temperatures. Measurements are taken under 3 conditions: idle, meaning sitting with no programs running; CPU load, meaning Prime95 to stress the processor; and APU load, meaning Prime95 to stress the CPU plus Unigine’s Valley to stress the integrated graphics.
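To make that FPS-per-dollar comparison concrete, here’s the math in miniature (the FPS numbers and prices below are rough illustrative assumptions, not figures from my charts):

```python
# FPS per dollar: average FPS divided by the price of the hardware.
def fps_per_dollar(avg_fps, price_usd):
    return avg_fps / price_usd

apu = fps_per_dollar(51, 170)                  # 2400G: CPU + GPU in one box, ~0.3
card_only = fps_per_dollar(91, 130)            # discrete card, card price only, ~0.7
card_plus_cpu = fps_per_dollar(91, 130 + 170)  # add an equivalent CPU, ~0.3
```

Counting only the card’s price flatters discrete GPUs; once you fold in the CPU they need to run at all, they land right around where the 2400G sits.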
Power results are for the entire test setup, and temperatures are reported as deltas; degrees above ambient temperature. First, deltas with the stock cooler. I include these to show out-of-the-box performance without an aftermarket cooler. An interesting thing about the 2400G is with Precision Boost 2, AMD’s equivalent to Intel’s Turbo Boost, the 2400G will increase its frequency and voltage right up until it hits 85C or 3.9GHz, basically overclocking itself automatically depending on the load and cooling provided.
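That self-overclocking behavior can be sketched as a simple control loop: keep stepping the clock up until you hit either the thermal ceiling or the clock ceiling, whichever comes first. The step size and heating-per-step below are made-up illustrative values, not AMD’s actual algorithm:

```python
# Toy sketch of the Precision Boost 2 behavior described above.
def boost(freq_mhz, temp_c, temp_limit=85.0, freq_limit=3900, step=25):
    while freq_mhz + step <= freq_limit and temp_c < temp_limit:
        freq_mhz += step
        temp_c += 0.5   # assumed temperature rise per clock step
    return freq_mhz, temp_c

good_cooling = boost(3600, 60.0)   # stops at the 3.9GHz clock ceiling
poor_cooling = boost(3600, 84.8)   # stops at the 85C thermal ceiling instead
```

Better cooling means the thermal limit arrives later, so the chip holds higher clocks – which is exactly why the stock cooler numbers look the way they do.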
That’s why the stock deltas are so high – the processor tries to maximize its performance while staying at a safe temperature. And while 85C does sound hot, it’s well under the maximum temperature of 105C. After manually overclocking, however, the stock cooler couldn’t take the heat and the system very quickly hit thermal shutdown. Next, the temperature deltas with all processors using the same cooler, a Deepcool Gammaxx 400. AMD seems to do a better job of reducing power consumption at idle, with their processors running 3 or 4 degrees cooler than Intel’s. Load performance quickly turns that around though, with the overclocked 2400G hitting a nearly 50 degree delta under the APU load test – the hottest processor I’ve tested so far, and by a pretty big margin. Now for power consumption with just the CPU installed – the Ryzen 3 1200 isn’t included here because it doesn’t have integrated graphics and can’t function without a discrete GPU. Not surprisingly, the 2400G drew way more power than either of the Intel CPUs, especially after overclocking. And since the only other power consumers were the motherboard, SSD, and memory, the OC’d 2400G was taking around 130W to 150W under the APU test.
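That estimate comes from simple subtraction: whole-system wall power minus everything that isn’t the processor. The wall readings and the ~25W platform figure below are illustrative assumptions, not exact measurements from my meter:

```python
# Rough APU power estimate from a wall-power measurement.
def apu_power_estimate(wall_watts, platform_watts=25):
    # Subtract the motherboard, SSD, and memory's share of the draw.
    return wall_watts - platform_watts

low_estimate = apu_power_estimate(155)    # -> 130 W
high_estimate = apu_power_estimate(175)   # -> 150 W
```

This ignores PSU conversion losses, so it slightly overstates what the chip itself pulls, but it’s close enough for comparing processors measured the same way.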
That’s a lot of power for such a tiny space. Finally, power consumption with an RX 550 – this is about the closest we can get to a direct comparison of CPU power draw between processors on different sockets. As with the CPU-only tests, the 2400G takes quite a bit more power than anything else tested here. Idle power consumption is a bit wonky, and I’m fairly certain it’s due to the drivers. The RX 550 and 2400G use different versions of AMD’s drivers, and I couldn’t get the 550 drivers to install with the 2400G drivers already installed, or vice versa. Until they fix that (or until I realize there’s some non-obvious way to install the drivers properly), idle consumption with an AMD GPU is going to suck like this. But most people getting a 2400G probably aren’t going to pair it with a graphics card to begin with.
The 2400G is hot but overclocks well, and hits both check boxes as an APU: strong CPU performance combined with reasonably strong entry level gaming. It won’t compete with entry to midrange discrete GPUs like the 1050 Ti or 1060, but for a basic budget gaming system that can handle CPU-dependent tasks like video work (and handle them well, at that), the 2400G is a really strong contender. Plus, the range of future upgrades without needing a new motherboard or memory is expansive and will continue to grow for the next few years. Maybe not future “proof” but the 2400G is certainly future ready. If you want to pick up a Ryzen 5 2400G for yourself, click the link in the description. If you want to get notified of new videos as soon as they’re up, hit subscribe then click the bell icon to enable notifications. So guys if you liked this video hit the like button, if you want to see more hit subscribe, and if you have any questions on the 2400G or these results, leave them in the comments below.
Thanks for watching, I hope I helped, and I’ll see you in the next video.