25 Comments
A5 - Monday, February 27, 2012 - link
That Z2580/XMM7160 combo would have been more exciting if not for two things:
1) Qualcomm is going to be shipping an LTE-A baseband in Q4 2012. Network support will be coming in 2013, so it is more of a "nice to have", but still...
2) Like you said in the article, that GPU will probably be behind the curve by Q1 2013. CPU performance will probably still compare favorably to Krait/Tegra 4/etc, though. Should be interesting times.
iamlilysdad - Monday, February 27, 2012 - link
We'll also have TI's OMAP 5 with the all-new A15 cores by then.
A5 - Monday, February 27, 2012 - link
Hence the "etc".
klmccaughey - Monday, February 27, 2012 - link
Here is a video: http://www.youtube.com/watch?v=66VeS0_tAJM
I look forward to the day when we can get customizable phones from Intel for builders like us :)
The battery life seems excellent given the power.
just4U - Monday, February 27, 2012 - link
(...sigh) Don't expect them to make the same mistake as IBM did back in the day. There will likely be a lot of pressure from OEMs to make sure that everyday average builders are locked out. I mean, think about it: we don't even really get to build our own laptops... the chances of getting to build our own phones are going to be fairly slim.
Kumouri - Monday, February 27, 2012 - link
In the article you said 1H next year (1H 2013) for the release of the 2580, but in the chart you said 1H 2012. Is this a typo? Just want to make sure :)
lilmoe - Monday, February 27, 2012 - link
I can see how Java and the Dalvik cache can also run on x86 (since Java is portable by nature), but wouldn't that require a LOT of optimization before an OEM can release a phone? What about updates? It would only extend the delay problems with updates and bug fixes. There are other technologies that are (currently) ARM-dependent, such as Renderscript, and now we're talking about Android kernel app development... I can only see this making Android more fragmented, since devs will now also need to optimize their apps for x86...
On the other hand, a viable market would be an x86 version of Windows Phone 8 (hint: PowerVR SGX 544 with DirectX compatibility). I hear that Metro-style apps (i.e., WinRT apps) should transcend CPU architecture. It's still unclear whether the same WinRT app build would work on both architectures without modifications or recompiling, but if it does, it'll be perfect, and exactly what .NET should have been in the first place.
A "smaller" copy of Windows 8 for tablets (without desktop view, and only running metro-style apps) might also benefit from the new Atoms. IvyBridge (and beyond) for high end tablets, and these Atoms for lower-end budget tablets... sounds cool.
Jaybus - Monday, February 27, 2012 - link
Huh? The Renderscript compiler compiles to bytecode, similar to .NET or Java. The runtime decides what to run on the GPU, what to run on the CPU, etc. Very similar to CUDA. Once there is a Renderscript runtime and compiler for a platform, apps using Renderscript simply need a recompile. The Renderscript compiler takes care of optimizations at the bytecode level.
It's much harder, in general, to develop an app for both Android and Windows than it is to develop an Android app for different hardware architectures. One of the benefits of Android's Linux roots.
Of course there have to be device drivers for any new platform's hardware, regardless of OS, but Intel has already done that. Since nearly all apps are written in high-level languages and let the compilers handle optimization, I really don't think it is a big deal.
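As a concrete illustration of the portability described above: a Renderscript compute kernel is plain C99-style source that the toolchain lowers to architecture-neutral bytecode, and nothing in it names ARM or x86. A minimal sketch follows; the file name invert.rs and the package name are hypothetical, while the root() entry point and the color pack/unpack helpers come from the stock rs_core runtime headers.

    // invert.rs -- hypothetical kernel that inverts an image's colors
    #pragma version(1)
    #pragma rs java_package_name(com.example.rsdemo)

    // Called once per element when the Java side runs forEach_root()
    // over an input/output Allocation pair. The runtime decides whether
    // this executes on the CPU, the GPU, or another available core.
    void root(const uchar4 *v_in, uchar4 *v_out) {
        float4 px = rsUnpackColor8888(*v_in);  // RGBA8888 -> floats in [0, 1]
        px.r = 1.0f - px.r;                    // invert the color channels,
        px.g = 1.0f - px.g;                    // leaving alpha untouched
        px.b = 1.0f - px.b;
        *v_out = rsPackColorTo8888(px);        // floats -> RGBA8888
    }

On the Java side the app only touches the generated ScriptC_invert wrapper and Allocation objects, so none of the application code references a particular CPU architecture; porting is a toolchain and runtime concern, as noted above.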
lilmoe - Monday, February 27, 2012 - link
Thanks for the clarification.
mosu - Monday, February 27, 2012 - link
Did Intel provide numbers regarding the power envelope or max TDP for those new chips? Is Intel preparing something DX11-capable, given that PowerVR is only DX9?
zeo - Monday, February 27, 2012 - link
Intel doesn't give out too much information until the chip is actually on the market, but the first single-core version of the Medfield platform reportedly has a 2.6W TDP at idle and a maximum power consumption of 3.6W when playing 720p Flash video. More details on the dual-core version are still pending.
The SGX544 is limited to DX9, but the PowerVR line has versions that support DX10.1, and the newest PowerVR Series 6 supports DX11.1 and OpenGL 4.2...
IntelUser2000 - Monday, February 27, 2012 - link
That's not CPU power, but overall platform power usage in idle and load scenarios. That includes power used by memory, storage, the screen, and auxiliary devices.
Lucian Armasu - Tuesday, February 28, 2012 - link
Regardless, Krait and other ARM chips use a lot less, even under maximum load, let alone at idle.
Are those numbers right? They seem quite high to me compared to what Anand presented to us earlier on.
I am getting slightly worried, because Intel has the resources and power, as well as know-how in literally everything in the semiconductor industry, and look how they've managed to catch up in such a short space of time.
22nm will bring them on par in power usage.
Then there is price, which will sort itself out once Intel manages to lead its SoC competitors by one node.
I am worried...
Lucian Armasu - Tuesday, February 28, 2012 - link
It looks like they won't have 22nm until 2014. They might be able to catch up in energy efficiency by then, but they can't do that while also keeping pace with the performance increases that ARM chips will get in the same period.
And their catching up is by no means fast. They've been trying to launch an Atom-based phone for 3-4 years now, and it still seems they have more catching up to do in the power consumption department.
Also, if a single core has a ~3W TDP, then their dual-core chip will probably have an even higher TDP. They couldn't have cut the power consumption in half while also doubling the performance in just one year; that's basically a 4x difference in performance per watt. Intel chips usually improve by only 30-40% in either power consumption or performance each year. They can't advance faster than that.
Hector2 - Tuesday, February 28, 2012 - link
"It looks like they won't have a 22nm until 2014" That's just wishful thinking.Intel's x86 PC chips could never match ARM at 45nm. They knew they'd need at least 32nm & they just used 45nm to get everything ready.
The delay until June for 22nm volume production will have no effect on the 22nm entry into smartphones. It seems that lower-volume 22nm server chips will start shipping in April, and a delay in volume fab starts won't hold up converting the 32nm Medfield design to 22nm and getting it ready for prime time. We'll see 22nm Medfields shipping in 2013. We'll probably see the first 14nm samples in 2014.
At what point do the chips get so small and power-efficient that they pass ARM in performance, power, and cost? Probably 22nm.
iwod - Tuesday, February 28, 2012 - link
3W? Anand's article stated otherwise.
And as Hector has pointed out, tri-gate 22nm LP will bring power/performance on par with ARM.
It is only a matter of cost.
mosu - Monday, February 27, 2012 - link
Thank you, zeo. Can you provide a link?
scores87 - Monday, February 27, 2012 - link
Can someone at Anand post a benchmark between Intel and Krait?
Lucian Armasu - Tuesday, February 28, 2012 - link
I'm sure they will once it hits the market. I hope they compare it with whatever is the most powerful chip at the time, and that they also do a comprehensive test, including a video of how both react at the same time.
A single SunSpider test will not be nearly enough, and even that should be done with the same browser, so the browsers' own JS engines don't skew the results too much.
twotwotwo - Monday, February 27, 2012 - link
A netbook with the extra battery life boost/capabilities of one of these chips seems potentially fun. Are there technical reasons it won't happen, or won't be as awesome as it first sounds if it does?
khimera2000 - Monday, February 27, 2012 - link
The biggest red flag for me is the GPU. Intel has never had their GPUs run right, and I'm not crazy enough to think that this problem will magically disappear overnight. Yes, they're taking the GPU from another company, but I am still skeptical. If a company that big, with that much money, can't get their stuff to work in their most heavily invested market, I am not going to assume that they will get it right on the first run in a market they have no experience in.
Lucian Armasu - Tuesday, February 28, 2012 - link
Intel is coming out with an SGX540 GPU in the second half of 2012?
I just want Windows on my phone. I want to hook it up to a display and keyboard and use Visual Studio and play old games. These SoCs seem fully capable of this.
Why? What exactly is the value in being able to use a core+chipset optimized for low power in a situation where low power is not essential?