The Ivy Bridge Preview: Core i7 3770K Tested
by Anand Lal Shimpi on March 6, 2012 8:16 PM EST - Posted in: CPUs, Intel, Core i7, Ivy Bridge
Note: This preview was not sanctioned or supported by Intel in any way.
I still remember hearing about Intel's tick-tock cadence and not having much faith that the company could pull it off. Granted, Intel hasn't given us a new chip every 12 months on the dot, but there's something new more or less every year: either a new architecture on an established process node (a tock) or a derivative architecture on a new process node (a tick). The table below summarizes what we've seen since Intel adopted the strategy:
Intel's Tick-Tock Cadence

| Microarchitecture | Process Node | Tick or Tock | Release Year |
|-------------------|--------------|--------------|--------------|
| Conroe/Merom | 65nm | Tock | 2006 |
| Penryn | 45nm | Tick | 2007 |
| Nehalem | 45nm | Tock | 2008 |
| Westmere | 32nm | Tick | 2010 |
| Sandy Bridge | 32nm | Tock | 2011 |
| Ivy Bridge | 22nm | Tick | 2012 |
| Haswell | 22nm | Tock | 2013 |
Last year was a big one. Sandy Bridge brought a Conroe-like increase in performance across the board thanks to a massive re-plumbing of Intel's out-of-order execution engine and other significant changes to the microarchitecture. If you remember Conroe (the first Core 2 architecture), what followed it was a relatively mild upgrade called Penryn that gave you a little bit in the way of performance and dropped power consumption at the same time.
Ivy Bridge, the tick that follows Sandy Bridge, would typically be just that: a mild upgrade that inched performance ahead while dropping power consumption. Intel's microprocessor ticks are usually very conservative on the architecture side, which limits the performance improvement. Being less risky on the architecture allows Intel to focus more on working out the kinks in its next process node, in turn delivering some amount of tangible power reduction.
Where Ivy Bridge shakes things up is on the graphics side. For years Intel has been able to ship substandard graphics in its chipsets based on the principle that only gamers needed real GPUs and Windows ran just fine on integrated graphics. Over the past decade that philosophy required adjustment. First it was HD video decode acceleration, then GPU accelerated user interfaces and, more recently, GPU computing applications. Intel eventually committed to taking GPU performance (and driver quality) seriously, setting out on a path to significantly improve its GPUs.
As Ivy is a tick in Intel's cadence, we shouldn't see much of a performance improvement. On the CPU side that's mostly true: you can expect a 5 - 15% increase in performance for the same price as a Sandy Bridge CPU today. A continued desire to be aggressive on the GPU front, however, puts Intel in a tough spot. Moving to a new manufacturing process, especially one as dramatically different as Intel's 22nm 3D tri-gate node, isn't easy. Any additional complexity outside of the new process simply puts the schedule at risk. That said, Intel's GPUs continue to lag significantly behind AMD's and, more importantly, they still aren't fast enough by customer standards.
Apple has been pushing Intel for faster graphics for years, having no issues with including discrete GPUs across its lineup or even prioritizing GPU over CPU upgrades. Intel's exclusivity agreement with Apple expired around Nehalem, meaning every design win can easily be lost if the fit isn't right.
With Haswell, Intel will finally deliver what Apple and other customers have been asking for on the GPU front. Until then Intel had to do something to move performance forward. A simple tick wouldn't cut it.
Intel calls Ivy Bridge a tick+. While CPU performance steps forward modestly, GPU performance sees a more significant improvement - in the 20 - 50% range. The magnitude of improvement on the GPU side is more consistent with what you'd expect from a tock. The combination of a CPU tick and a GPU tock is how Intel arrives at the tick+ naming. I'm personally curious to see how this unfolds going forward. Will GPUs and CPUs go through alternating tocks, or will Intel try to synchronize them? Does innovation on one side slow down as the other speeds up? Does tick-tock remain on a two-year cadence now that there are two fairly different architectures that need updating? These are questions we likely won't see answered until after Haswell. For now, let's focus on Ivy Bridge.
195 Comments
sabot00 - Tuesday, March 6, 2012 - link
How long will Intel keep its HD Graphics increases?
MonkeyPaw - Tuesday, March 6, 2012 - link
I don't understand the logic of selling a high-end CPU with the best IGP. Seems like anyone running an i7 isn't going to stick with the IGP for games, and if they aren't gaming, then what good is that high-end GPU? Maybe the entire "Core i" line should use the HD 4000.
Flunk - Tuesday, March 6, 2012 - link
Because the low-end chips are just die-harvested high-end chips, it makes sense. No reason to disable it, so they leave it on. And some people do actually use high-end processors with IGPs. It's fairly easy to get one from a major OEM. It's stupid, but most people don't know any better.
aahkam - Tuesday, March 27, 2012 - link
Funny comments I saw. What's wrong with a high-end CPU that comes with an IGP?
Is High End CPU = Gaming Machine CPU? If that is your logic, you're a rich but shallow boy!
I do lots of video editing and transcoding. I need a high-end CPU, but none of the high-end GPUs beats Quick Sync in transcoding in terms of quality and speed.
dqniel - Friday, April 6, 2012 - link
"and if they aren't gaming, then what good is that high-end GPU?"I feel like you missed that part. He's not saying that only gamers use high-end CPUs. He's saying that gamers using a high-end CPU won't care about the high-end iGPU because they won't use it. Also, non-gamers who need a high-end CPU generally won't see the benefits of the included high-end iGPU. So, he proposes that the better niche for the high-end iGPU would be on the more affordable CPUs, because then budget-minded gamers could buy an affordable CPU that has a relatively powerful iGPU integrated into it.
defter - Wednesday, March 7, 2012 - link
This is a mid-range CPU, not a high-end one. High-end desktop CPUs (i7 3800-3900) don't have an IGP.
KoolAidMan1 - Wednesday, March 7, 2012 - link
It is because laptops continue to get slimmer and slimmer. Getting good GPU performance without the chassis compromises that a dedicated GPU would force is the point.
Tormeh - Wednesday, March 7, 2012 - link
This. My next laptop will have processor graphics for the sake of battery life and size, and whoever has the best graphics gets my money.
bznotins - Wednesday, March 7, 2012 - link
Seconded.
aguilpa1 - Wednesday, March 7, 2012 - link
If the HD 4000 is on the level of, let's say, a 560M, I would not hesitate to get a laptop with no dedicated graphics, but if it isn't, I'm still going to go for the dedicated GPU.