Intel Core with Radeon RX Vega M Graphics Launched: HP, Dell, and Intel NUC
by Ian Cutress on January 7, 2018 9:02 PM EST

Intel’s Performance Numbers
*Disclaimer: All performance numbers in this section are from Intel and have not been independently verified
On the face of it, this new product is a 7th Generation H-Series CPU combined with a mid-range RX 560-570 class graphics chip, albeit paired with super-fast memory, but with an overall power budget that can cap performance. As a result, pure CPU workloads are not going to change from a Kaby Lake-H series processor. What will change is anything that needs graphics: moving up from standard HD Graphics to a discrete-class solution offers performance for gaming and for OpenCL-accelerated applications.
If you have heard that line before, it is because this is exactly how AMD has promoted its combined CPU/GPU offerings in recent generations: get the benefits of high-performance graphics at a lower cost. Ultimately the biggest issues with the AMD devices based on Carrizo and prior (we haven’t tested Ryzen Mobile yet) were OEM choices that limited memory bandwidth and crippled gaming, the poor quality of the devices on offer, and the efficiency of the first few generations of parts.
What Intel has produced here is something that sits above AMD’s APUs but below a full discrete graphics solution, and the target market is arguably the mobile devices currently running Intel CPUs with NVIDIA’s mid-range GPUs, such as the MX150 or GTX 950M/1050/1060. Intel (or Intel’s customers) clearly believes this is a market worth going after, enough of a market to buy semi-custom graphics silicon to do so. I bet NVIDIA is really happy (judging by its enterprise and high-end GPU business, it probably is).
However, in Intel’s briefings, results were given against systems such as these. Intel also has a habit of comparing new products to three-year-old systems, because in its mind those are the users who need to upgrade. As a result there are not many apples-to-apples comparisons here, with the gains shown coming from both CPU and GPU improvements.
Comparisons with Vega M GL
First up is a comparison between the Core i7-8705G with Vega M GL, the 20 CU graphics part, and a three-year-old Haswell-based Core i7-4720HQ with an NVIDIA GTX 950M.
Intel with Radeon vs 3-Year-Old System
(Data from Intel, not AnandTech)

| Benchmark | i7-8705G + Vega M GL | i7-4720HQ + GTX 950M | Improvement |
|---|---|---|---|
| SYSmark 2014 SE | ? | ? | 1.6x |
| 3DMark Time Spy, Graphics Score | ? | ? | 2.3x |
| 3DMark 11, Graphics Score | ? | ? | 2.2x |
| Vermintide 2 (Avg FPS), 1080p High | 47 FPS | 15 FPS | 3.0x |
| HandBrake 1.0.7, 4K to 1080p H.264 | 7 min | 48 min | 6.7x |
| Adobe Premiere Pro, 1 min 4K H.264 | 6.5 min | 9 min | 42% |
The big increases are in graphics, particularly the synthetic tests: the two 3DMark graphics scores came in 2.2x and 2.3x higher. It is worth noting that Intel quoted these metrics only as graphics scores, and did not include the physics scores (which would have been similar between the systems) or the combined scores, as those would not have produced as big a headline number. The game used for comparison is not one I have seen used in benchmark comparisons before, but ‘3.0x better FPS’ is listed. I would have expected Intel to lay it on thick with games here, as for Windows-based systems gaming could be a big draw. The other numbers are content-creation related, relying on OpenCL acceleration for anywhere from a 42% speedup to a 6.7x faster transcode (Intel did not specify whether the Haswell system’s transcode used Quick Sync or the NVIDIA GPU).
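Note that the quoted improvement factors mix two conventions: for frame rates and scores, higher is better, so the ratio is new over old; for render and transcode times, lower is better, so the ratio is old over new. As a quick sanity check of the slide numbers (a sketch only; Intel’s raw scores and exact methodology are not public):

```python
def speedup(new, old, higher_is_better=True):
    """Improvement factor between two benchmark results.

    Throughput metrics (FPS, 3DMark scores): higher is better, ratio is new/old.
    Time metrics (minutes to transcode): lower is better, ratio is old/new.
    """
    return new / old if higher_is_better else old / new

# Figures from Intel's slides (not independently verified)
print(round(speedup(47, 15), 1))         # Vermintide 2, FPS -> 3.1
print(round(speedup(7, 48, False), 1))   # HandBrake, minutes -> 6.9
print(round(speedup(6.5, 9, False), 2))  # Premiere Pro, minutes -> 1.38
```

The recomputed ratios land close to, but not exactly on, Intel’s quoted 3.0x, 6.7x, and 42%, presumably because the slides round from raw results that were not shown.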
The next comparison is more recent: a Kaby Lake Refresh-based Core i7-8550U with a GTX 1050, up against the same Core i7-8705G with Vega M GL, the 20 CU graphics part.
Intel with Radeon vs i7-8550U + GTX 1050
(Data from Intel, not AnandTech)

| Benchmark | i7-8705G + Vega M GL | i7-8550U + GTX 1050 | Improvement |
|---|---|---|---|
| 3DMark 11, Graphics Score | ? | ? | 1.3x |
| Hitman (Avg FPS), 1080p DX12 High | 46 FPS | 33 FPS | 1.4x |
| Deus Ex: MD (Avg FPS), 1080p DX12 High | 36 FPS | 27 FPS | 1.3x |
| Total War: Warhammer, 1080p DX12 High | 47 FPS | 42 FPS | 1.1x |
The graphics score on the 3DMark synthetic quoted by Intel is 1.3x that of the system with NVIDIA graphics; as a graphics-only score, this should be close to a straight GPU-to-GPU comparison. The three games listed are interesting choices, however: all are somewhat dependent on CPU performance.
What Intel has done here is compare a 15W Core-U class i7 paired with 60W NVIDIA graphics against a new system whose Core-H class processor and Radeon graphics share a 65W power budget that either side can draw on. So for CPU-sensitive games like Deus Ex, the processor that can pull far more power is clearly going to have the advantage.
It comes across as a mismatched comparison, especially if we consider the price. Intel has not disclosed the prices for the new Intel with Radeon chips, but given the fact that it requires a semi-custom chip from AMD, a stack of HBM2, and the EMIB connection, it is fair to say that it probably costs substantially more than what an OEM would pay for a 15W Core i7 and an NVIDIA GTX 1050.
The plus side for Intel is that these tests can, arguably, show the benefit of Intel’s Dynamic Tuning technology, which allows the CPU or the GPU to pull in extra power as each needs it.
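Intel has published few details of how Dynamic Tuning actually allocates power. As a rough illustration of why a shared budget helps CPU-heavy games, the toy model below (all numbers hypothetical, not Intel’s algorithm) lets a 65W package shift power between CPU and GPU on demand, something a fixed 15W CPU plus 60W GPU split cannot do:

```python
def shared_budget(cpu_demand, gpu_demand, total=65.0):
    """Toy power allocator: grant each unit its demand if the package
    budget allows; otherwise scale both down proportionally.
    Purely illustrative -- not Intel's actual Dynamic Tuning logic."""
    asked = cpu_demand + gpu_demand
    if asked <= total:
        return cpu_demand, gpu_demand
    scale = total / asked
    return cpu_demand * scale, gpu_demand * scale

# A CPU-heavy scene: the CPU asks for far more than a 15 W U-series cap
cpu_w, gpu_w = shared_budget(cpu_demand=35.0, gpu_demand=40.0)
print(round(cpu_w, 1), round(gpu_w, 1))  # -> 30.3 34.7, scaled to fit 65 W
```

In the fixed-split system the same scene would pin the CPU at 15W regardless of how much GPU headroom sits idle, which is the mismatch the Deus Ex numbers above hint at.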
Comparisons with Vega M GH
For the high-powered testing, Intel again pitted the new chips against a three-year-old system as well as a newer NVIDIA Max-Q design. From the data provided, it is clear that Intel wants to promote the GH parts primarily for gaming.
Intel with Radeon vs 3-Year-Old System
(Data from Intel, not AnandTech)

| Benchmark | i7-8809G + Vega M GH | i7-4720HQ + GTX 960M | Improvement |
|---|---|---|---|
| SYSmark 2014 SE | ? | ? | 1.6x |
| 3DMark Time Spy, Graphics Score | ? | ? | 2.4x |
| 3DMark 11, Graphics Score | ? | ? | 2.7x |
| Hitman (Avg FPS), 1080p DX12 High | 62 FPS | 22 FPS | 2.7x |
| Vermintide 2 (Avg FPS), 1080p High | 64 FPS | 24 FPS | 2.6x |
| Total War: Warhammer*, 1080p High | 70 FPS | 34 FPS | 2.0x |
| Rise of the Tomb Raider, 1080p DX12 High | 62 FPS | 31 FPS | 2.0x |

*Total War: Warhammer was run in DX12 on the 8th Gen system and DX11 on the 7th Gen system.
For these benchmark numbers, Intel is being really selective with what it shows: ideally we would see all the systems running the same full set of tests, but that might not put Intel’s parts in the best light. We will have to do our own testing here at AnandTech to get the full picture.
For a more recent comparison, Intel put the GH up against a Core i7-7700HQ with a GTX 1060 Max-Q graphics solution, the combination announced last year to bring high-end graphics to slimmer gaming chassis with tempered power draw. This is probably the best comparison against competing designs in the market.
Intel with Radeon vs i7-7700HQ + GTX 1060 Max-Q
(Data from Intel, not AnandTech)

| Benchmark | i7-8809G + Vega M GH | i7-7700HQ + GTX 1060 Max-Q | Improvement |
|---|---|---|---|
| 3DMark 11, Graphics Score | ? | ? | 1.07x |
| Hitman (Avg FPS), 1080p DX12 High | 62 FPS | 57 FPS | 1.07x |
| Deus Ex: MD (Avg FPS), 1080p DX12 High | 49 FPS | 43 FPS | 1.13x |
| Total War: Warhammer, 1080p DX12 High | 70 FPS | 64 FPS | 1.09x |
Not much to say here, except it would be interesting to see one of the new chips go up against a Max-Q design in the same chassis.
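The gains over the GTX 1060 Max-Q system cluster in a narrow band. Aggregating Intel’s quoted ratios with a geometric mean (the conventional way to average benchmark speedups) puts the overall advantage at just under 9%, a sketch using only the slide figures:

```python
from math import prod

def geomean(ratios):
    """Geometric mean: the standard aggregate for benchmark speedup ratios,
    since it treats a 2x gain and a 0.5x loss as cancelling out."""
    return prod(ratios) ** (1 / len(ratios))

# Intel's quoted improvement factors vs the i7-7700HQ + GTX 1060 Max-Q
print(round(geomean([1.07, 1.07, 1.13, 1.09]), 2))  # -> 1.09
```

A simple arithmetic mean would give a similar figure here because the ratios are so close together, but the geometric mean is the safer habit when ratios vary widely.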
Comments
Hixbot - Sunday, January 7, 2018
Pigs just flew.

tipoo - Sunday, January 7, 2018
I'm pretty confident we'll see this in the 15" rMBP. If you remember Iris Pro, which Apple nearly single-handedly asked Intel for, I can see them being the ones to push for this as well. Wonder if that means a redesign; if not, even just more battery in the same space would be nice, or more room for cooling. And, cough, an SD card. Most likely none of that, but the move up from the meager 80GB/s feeding the current GPUs would go a long way, even aside from the extra execution resources.
tipoo - Sunday, January 7, 2018
That, uh, was meant to be a new comment. Comments, but editable.
nico_mach - Monday, January 8, 2018
Based on the TDPs and looking around at the market, they seem tailor-made for VR work on iMacs, possibly Mac minis (though I think that line is dead as soon as the stock is sold off).

willis936 - Monday, January 8, 2018
Not even Tim Cook's Apple is crazy enough to put a 65W part in a 15" laptop. Hopefully at least.

tyaty1 - Monday, January 8, 2018
It is less than the combined TDP of the CPU and the dGPU in the latest MBP.

WinterCharm - Monday, January 8, 2018
The MacBook Pro's current TDP is 87W. I'm amazed that they top out at 85ºC and barely throttle. It's kind of insane.
cditty - Sunday, January 21, 2018
Actually, the 15" throttles a whole lot if you do anything 'Pro' with it. When I do video encodes, it begins slowing down after the first 10 solid minutes or so. I love the machine, the build, the look, BUT you are wasting money if you get a CPU above the base, because you are rarely going to get full speed under sustained load. If they move to this model and can actually maintain speed, I will most certainly upgrade. The Kaby Lake MBP has been a disappointment to me.

skavi - Monday, January 8, 2018
You have to realize that we're already well past that point. There's a 45W CPU and 35W GPU in the current MacBook Pro.

lefenzy - Monday, January 8, 2018
The 15" rMBP ships with an 89W charger. So only the 65 W solutions would work here.