AMD Ryzen 5 2400G and Ryzen 3 2200G Integrated Graphics Frequency Scaling
by Gavin Bonshor on September 28, 2018 12:30 PM EST - Posted in
- CPUs
- AMD
- GPUs
- Overclocking
- Zen
- APU
- Vega
- Ryzen
- Ryzen 3 2200G
- Ryzen 5 2400G
Overclocking Ryzen APU Integrated Graphics
As we detailed in our dedicated AMD Ryzen 2000 series APU overclocking guide and analysis, both the Ryzen 5 2400G and the Ryzen 3 2200G benefit from overclocking the CPU core frequency. Heavy computational workloads showed noticeable improvements across the board, while the gaming results lacked the expected 'oomph'. When it comes to overclocking the integrated graphics, the key conclusion is that overclocking the CPU core frequency lessens the headroom available for overclocking the integrated graphics, and vice versa. It is a genuine push-and-pull situation.
Overclocking the integrated graphics on the MSI B350I Pro AC motherboard was quite easy, as there are three main variables to consider (a rough sketch of how these can be tracked together follows the list):
- The graphics clock frequency is the primary setting, measured in MHz.
- The SoC voltage (or CPU NB voltage) is directly linked to the integrated graphics voltage.
- The CPU core frequency and CPU core voltage can also help or hinder.
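For readers keeping notes across multiple attempts, here is a minimal sketch of how those variables can be recorded per run. This is illustrative Python only, with hypothetical example values, not a vendor tool or our actual test harness:

```python
from dataclasses import dataclass

@dataclass
class OverclockAttempt:
    """One integrated-graphics overclock attempt (hypothetical values below)."""
    gfx_clock_mhz: int    # primary graphics clock frequency, in MHz
    soc_voltage: float    # SoC / CPU NB voltage, tied to the iGPU voltage
    cpu_clock_mhz: int    # CPU core frequency, which competes for headroom
    cpu_vcore: float      # CPU core voltage
    stable: bool = False  # set to True once a stress/gaming run passes

# Hypothetical example: a 2400G run with the iGPU pushed to 1500 MHz
attempt = OverclockAttempt(gfx_clock_mhz=1500, soc_voltage=1.20,
                           cpu_clock_mhz=3600, cpu_vcore=1.25)
print(attempt)
```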
Beware the Black Hole!
When stepping the integrated graphics frequency upwards on both APUs, we experienced an anomaly, a 'black hole' of sorts, between 1350 and 1500 MHz. Regardless of the settings used, the system clearly had a problem in this range.
While the system would boot at these frequencies, the combination of overclock and drivers caused random reboots of the test system, constant and consistent crashes during gaming, and Windows Crash Reporter stating a 'Video_TDR_Failure' in atikmpag.sys. This error usually indicates a faulty or corrupt graphics driver, but in this instance it occurred across multiple systems, with three different motherboards and operating systems, using AMD's Adrenalin Edition 18.5.1 WHQL drivers (the latest at the time of testing). The issue manifested itself regardless of the GPU driver used or the BIOS revision on any of the motherboards tested.
At 1500 MHz, however, the issue went away. Our experience is not unique; other professionals in the industry have run into similar problems. We have reached out to AMD about the issue and will update this piece when an official response is received.
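For anyone scripting their own frequency sweep, the practical takeaway is simply to step around that window. A rough illustrative sketch in Python (the step size and bounds are just the values we used, not a prescription):

```python
# Frequencies (in MHz) to test, skipping the unstable 'black hole'
# strictly between 1350 and 1500 MHz that we hit on these APUs.
UNSTABLE = range(1351, 1500)  # 1400 and 1450 MHz fall inside this window
frequencies = [f for f in range(1100, 1601, 50) if f not in UNSTABLE]
print(frequencies)
# [1100, 1150, 1200, 1250, 1300, 1350, 1500, 1550, 1600]
```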
Overclocking Results
The Ryzen 5 2400G has a base graphics frequency of 1250 MHz and the Ryzen 3 2200G has a base frequency of 1100 MHz. We were able to push both of them to 1600 MHz, a 28% and a 45% overclock respectively. For completeness of data, we ran both APUs from 1100 MHz up to 1600 MHz, skipping over the black hole. Here were our settings:
| Integrated Graphics Frequency | SoC / iGPU Voltage |
|-------------------------------|--------------------|
| 1100 MHz | 1.15 V |
| 1150 MHz | 1.15 V |
| 1200 MHz | 1.15 V |
| 1250 MHz | 1.15 V |
| 1300 MHz | 1.15 V |
| 1350 MHz | 1.15 V |
| 1400 MHz | ERROR |
| 1450 MHz | ERROR |
| 1500 MHz | 1.20 V |
| 1550 MHz | 1.25 V |
| 1600 MHz | 1.25 V |
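As a quick sanity check, the headline overclock figures follow directly from the base and achieved clocks; a couple of lines of Python (illustrative only) reproduce them:

```python
def overclock_percent(base_mhz: int, achieved_mhz: int) -> float:
    """Percentage gain of the achieved clock over the stock base clock."""
    return (achieved_mhz / base_mhz - 1) * 100

print(f"Ryzen 5 2400G: {overclock_percent(1250, 1600):.0f}%")  # ~28%
print(f"Ryzen 3 2200G: {overclock_percent(1100, 1600):.0f}%")  # ~45%
```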
Comments
PeachNCream - Friday, September 28, 2018 - link
Interesting analysis, though it's a bit of a foregone conclusion these days to expect a GPU overclock to improve performance in games more than a CPU overclock since the central processor, after a point, has very little role in increasing framerates.
This one struck me as odd though - "...Ryzen APUs are marketed for 720p gaming, and while resolutions such as 2160p and 1440p are out of reach purely for performance reasons, we have opted to use moderate settings at 1080p for our testing."
Were the tests executed at 1080p so they would align better in the Bench? It seems more reasonable to test at 720p given the various limits associated with iGPUs in general and the use of 1080p just comes across as lazy in the same way Anandtech tests CPU performance in games at resolutions so high that GPU performance masks the differences in various processors. Tom's Hardware, back when the good doctor actually ran it, yanked resolution down as low as possible to eliminate the GPU as a variable in CPU tests and it was a good thing.
stuffwhy - Friday, September 28, 2018 - link
Just purely speculating, is it possible that 720p results are just great (60+ fps) and need no testing? One could hope.
gavbon - Friday, September 28, 2018 - link
My reasoning for selecting 1080p gaming tests over 720p was mainly because the other scaling pieces were running at the same resolution. Not just the iGPU tests, but the dGPU testing with the GTX 1060 too. It wasn't a case of being 'lazy' but the majority of gamers who currently use Steam use 1080p and as it's the most popular resolution for gamers, I figured that's where I would lay it down.
neblogai - Friday, September 28, 2018 - link
Even if the monitor is 1080p, a lot of 2200G users may want to run games at 1080p with resolution scaling for better fps - in effect, at 720p or 900p. Most games support it these days. So the popularity of 1080p monitors does not really make 720p tests less useful for this level of GPU performance.
V900 - Friday, September 28, 2018 - link
Would be great if you had tested just one game at 720p. I know this is what I would be interested in knowing/reading if I was a possible customer.
usernametaken76 - Sunday, September 30, 2018 - link
I honestly think this "majority of gamers who currently use steam use 1080p" argument is affected by a) laptop users (see the high number of 1366x768 users) and therefore game at whatever resolution their laptop panel is set to... Which leads one to ask what the point of testing desktop parts is when you use that as a basis for what and how to test.
TheJian - Friday, October 5, 2018 - link
agree 100%. They do a lot of dumb testing here. Ryan has been claiming 1440p was the "enthusiast resolution" since the Geforce 660ti. I don't even think you can say that TODAY as I'm staring at my two monitors (have a 3rd, also 1080p), both of which are 1200p/1080p.
For me, I need everything turned on, and you need to show where it hits 30fps at that level. Why? Because the designers of the games didn't want you to play their game at "MODERATE SETTINGS"...ROFL. Just ask them. They design EXACTLY WHAT THEY WANT YOU TO SEE. Then for some reason, reviewers ignore this, and benchmark the crap out of situations I'd avoid at all costs. I don't play a game until I can max it on one of my two monitors with my current card. If I want to play something that badly early, I'll buy a new card to do it. All tested resolutions should be MAXED OUT settings wise. Why would I even care how something runs being degraded? Show me the best, or drop dead. This is why I come to anandtech ONLY if I haven't had my fill from everywhere else.
One more point: they also turn cards, etc., down. Run as sold, PERIOD. If it's an OC card, show it running with a simple checkbox that OC's it to the max the card allows as their defaults. IE most have game mode, etc. Choose the fastest settings their software allows out of their usual 3 or 4 default settings. I'm not talking about messing with OCing yourself, I mean the 3-4 defaults they give in the software. Meaning a monkey could do this, so why pretend it isn't shipped to be used like this? What user comes home with an OC card and reverts to NV/AMD default ref card speeds? ROFL. Again, why I dropped this site for the most part. Would you test cars with 3 tires? Nope. They come with 4...LOL. I could go on, but you should get the point. Irrelevant tests are just that.
flyingpants265 - Tuesday, March 5, 2019 - link
Hate to tell you this, but 4k is the enthusiast resolution now.
808Hilo - Saturday, October 13, 2018 - link
Is it just me or are these tests just for borderline ill people?
Playing 4k with a 1080/1800/32/S970 works reasonably well. I also do everything else in 4k. Would I go back to lower res? No way. Artificial benchmarking is one thing, real world is 4k. Test this rez and we get a mixed GPU, APU, CPU bench. Build meaningful systems instead of artificially pushing single building blocks. Push for advancements.
Targon - Friday, September 28, 2018 - link
The big problem with these APUs is that they limit the number of PCI Express lanes, so if you DO decide to add a video card, the APU in this case will reduce performance compared to a normal CPU without integrated graphics.