NVIDIA Introduces Dual Cortex A9 Based Tegra 2
by Anand Lal Shimpi on January 7, 2010 2:00 PM EST
Posted in: Smartphones, Mobile
A month ago NVIDIA shared this slide with me:
It's a graph of the total available market, according to NVIDIA, for its Tegra SoC (System-on-Chip). This year alone NVIDIA estimates that there's around a $4B market for Tegra. Next year it grows to $6B. By 2013 the total available market for NVIDIA's Tegra SoC reaches over $10B. That's more money than NVIDIA ever made from the PC market.
In order to compete in that space you need a competent chip. Today NVIDIA is announcing its second generation Tegra SoC. It's creatively named the Tegra 2 and this is what it looks like in block diagram form:
The SoC is made up of 8 independent processors, up from 7 in the original Tegra. The first two are the most exciting to me - a pair of ARM Cortex A9 cores. These are dual-issue, out-of-order cores from ARM running at up to 1GHz. If you thought the A8 was fast, these things should be much faster.
The original Tegra used a single ARM11 core. It was multi-core capable, but the only version NVIDIA ever shipped had a single ARM11. By now you know that the ARM11 is unreasonably slow, and that was my biggest problem with Tegra 1. Tegra 2 addresses this in a grand way. NVIDIA skipped over Cortex A8 entirely and went to what it believes is a more power efficient, higher performing option with the A9. I'll go deeper into the A9's architecture shortly, but to put it bluntly - A8 is dead in my eyes, Cortex A9 is what you want.
The next processor is an audio decode core. NVIDIA acquired PortalPlayer in 2007 for somewhere around $350M. PortalPlayer SoCs were used in the first five generations of iPods. PortalPlayer contributed much of NVIDIA's know-how when it came to building SoCs and audio decoders. NVIDIA is particularly proud of its audio decode core, claiming that it can deliver system power in the low 10s of mW while playing an MP3. It's difficult to qualify that claim. Microsoft lists Zune HD battery life at 33 hours while playing MP3s, while Apple claims the iPod Touch can do the same for 30 hours. Is NVIDIA responsible for the Zune's longer MP3 playback battery life? I've got no clue.
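The arithmetic is easy enough to sketch, even if the inputs aren't public. Here's a minimal back-of-the-envelope estimate assuming a hypothetical ~2.5 Wh battery (my assumption, not a published spec for either player), mapping average system power to playback hours:

```python
# Back-of-the-envelope check on MP3 playback power vs. quoted battery life.
# The battery capacity is a hypothetical figure, not an official spec.

def playback_hours(battery_wh, avg_power_mw):
    """Hours of playback from a battery_wh watt-hour battery at a
    constant average system draw of avg_power_mw milliwatts."""
    return battery_wh / (avg_power_mw / 1000.0)

battery_wh = 2.5  # assumed ~2.5 Wh (roughly a 660 mAh cell at 3.7 V)

for power_mw in (30, 50, 80):
    print(f"{power_mw} mW average draw -> {playback_hours(battery_wh, power_mw):.0f} hours")

# 30 mW -> 83 hours
# 50 mW -> 50 hours
# 80 mW -> 31 hours
```

With that assumed capacity, a 30 - 33 hour rating implies an average draw closer to 75 - 80 mW, in the same neighborhood as NVIDIA's figure but hardly proof of it.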
Given that this isn't 1995, audio decoding is neither very hard nor very interesting, so let's move on. The next two cores are for video encode and decode. On the encode side NVIDIA claims to be able to accelerate the encode of 1080p H.264 video. This is up from 720p in the original Tegra and particularly important for any handsets that might include a video camera. Bitrates, power consumption and other pertinent details remain unknown.
The video decode side is where NVIDIA believes it has an advantage. Tegra's video decode processor accelerates up to 1080p high profile H.264 video at bitrates in the 10s of megabits per second. The Samsung SoC in the iPhone 3GS is limited to only 480p H.264 decode despite Samsung claiming 1080p decode support on its public Cortex A8 SoC datasheets. NVIDIA insists that no one else can do 1080p decode at high bitrates in a remotely power efficient manner. Tegra's 1080p decode can be done in the low 100s of mW. NVIDIA claims that the competition often requires well over 1W of total system power to do the same because they rely on the CPU to do some of the decoding. Again, this is one of those difficult to validate claims. Imagination has demonstrated very low CPU utilization 1080p H.264 decode on its PowerVR SGX core, but I have no idea of the power consumption.
NVIDIA's numbers are interesting, but not 3rd party verified
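To put those numbers in perspective, here's a rough sketch of what the claimed power levels would mean for playback time. The 5 Wh battery and the exact wattages are illustrative assumptions, not measurements from any shipping device:

```python
# Rough 1080p playback time at the power levels discussed above.
# Battery capacity and wattages are illustrative assumptions, not measurements.

battery_wh = 5.0  # hypothetical tablet/MID-class battery

scenarios = {
    "dedicated decode block (~0.2 W)": 0.2,
    "CPU-assisted decode (~1.2 W)": 1.2,
}

for label, watts in scenarios.items():
    print(f"{label}: {battery_wh / watts:.1f} hours of playback")

# dedicated decode block (~0.2 W): 25.0 hours of playback
# CPU-assisted decode (~1.2 W): 4.2 hours of playback
```

If the claims hold, that's the difference between a device that can play several full-length movies on a charge and one that struggles to get through two.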
So let's see, that's two ARM Cortex A9 cores, an audio core, video encode and video decode - we're up to five at this point. The next processor is used for image signal processing. In other words it's the core that drives a still/video camera in a Tegra handset. The processor supports up to 12MP sensors, auto white balance, auto focus and general video processing on either a still picture or a video stream. The output can be routed to the next core: Tegra 2's GeForce GPU.
NVIDIA wasn't willing to say much about Tegra's graphics core other than that it's NVIDIA's own design. NVIDIA confirmed that the only third-party IP in Tegra 2 is the ARM cores; the rest was designed in house. And if you were wondering, Tegra 2 is the other platform that Epic demonstrated its Unreal Engine 3 mobile technology on.
The GPU in Tegra 2 is the same architecture as Tegra 1 (OpenGL ES 2.0 is supported), just higher performance. NVIDIA expects a 2 - 3x performance increase thanks to improved efficiency, more memory bandwidth and a higher clock rate.
The original Tegra only supported LPDDR1, while Tegra 2 supports LPDDR2. The Zune HD's Tegra SoC had a 32-bit LPDDR1 memory bus running at a 333MHz data rate, resulting in 1.33GB/s of memory bandwidth. Tegra 2 in a single package with integrated memory should deliver about twice that.
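The bandwidth math is simple: bus width in bytes multiplied by the data rate. Here's that calculation, with the LPDDR2 transfer rate being an assumed round number since NVIDIA hasn't published Tegra 2 memory clocks:

```python
# Peak theoretical memory bandwidth = bus width (bytes) * data rate (transfers/s).

def peak_bandwidth_gbps(bus_width_bits, data_rate_mtps):
    """Peak bandwidth in GB/s for a bus bus_width_bits wide at data_rate_mtps MT/s."""
    return (bus_width_bits / 8) * data_rate_mtps * 1e6 / 1e9

print(peak_bandwidth_gbps(32, 333))  # Zune HD Tegra, LPDDR1-333: ~1.33 GB/s
print(peak_bandwidth_gbps(32, 667))  # Tegra 2, LPDDR2 at an assumed 667 MT/s: ~2.67 GB/s
```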
NVIDIA believes that while other SoC makers can promise higher theoretical performance, Tegra and Tegra 2 deliver better real world gaming performance thanks to everything from the hardware to the software stack. Given NVIDIA's experience in optimizing desktop GPU drivers, I've got no problem giving NVIDIA the benefit of the doubt here.
Tegra 1 was able to run Quake 3 at 720p with AA at over 40 fps, which according to NVIDIA is faster than any other SoC in a handset today. I haven't personally benchmarked Quake 3 on any SoCs, so I can't really validate that claim either.
Ok, only one processor left and this one is simple. Tegra 2 (like Tegra) has an ARM7 processor that is used for chip management. It handles dataflow, power management and other similar tasks.
You'll notice the one thing missing from NVIDIA's Tegra 2 is a cellular modem. There simply isn't one. NVIDIA's philosophy is to focus on the core compute functions of an SoC that require no carrier or FCC testing. An OEM could mate a Tegra 2 with a tried-and-true modem, losing out on the integration side but winning on time to market. Given the sheer number of different wireless networks in the world, leaving the modem out of the design makes sense to me. But then again, I don't make smartphones. It may prevent Tegra 2 from going into the cheapest solutions, but that's not where NVIDIA wants to be in any case.
55 Comments
Genx87 - Friday, January 8, 2010 - link
The fusion of the phone and entertainment is heading towards gaming on our phones. At the end of the day it is very possible we ditch laptops and home computers for a phone that does it all.
Taft12 - Thursday, January 7, 2010 - link
You sure did beat the crap out of that strawman! I'll take a Tegra for playing a decent looking 3D game on a smartphone, thank you very much!
sprockkets - Thursday, January 7, 2010 - link
They already exist and are better than what nVidia offers, thank YOU very much!!!
tomaccogoats - Thursday, January 7, 2010 - link
Though this article has me wondering if Nvidia has enough ppl to spread around, I honestly don't know. If they have the same manpower as ati (once again I have absolutely no clue) then wouldn't ati start gaining the upper hand in terms of desktop graphic solutions?
Spoelie - Friday, January 8, 2010 - link
NVIDIA is a much larger company than ATi was before AMD's acquisition, something to the tune of over double the engineers (in 2006...). You can't really compare the two at this moment anymore, though.
Taft12 - Thursday, January 7, 2010 - link
Enough people to spread around? WTF are you talking about? Companies have multiple core competencies and engineers don't get moved from desktop graphics projects to Tegra.
When they purchased PortalPlayer in 2007 for $350M, what they were buying was not just IP, but the SoC expertise of the employees, which surely was/is a huge part of this product.
janiszalitis - Thursday, January 7, 2010 - link
Where have you been? ATI is totally blasting Nvidia out of the desktop graphics market!
If NVidia wants to retain a reasonable share there, they have to release something good sometime soon.
tomaccogoats - Friday, January 8, 2010 - link
I just meant that given the same amount of manpower at each company, nvidia seems to be spreading their business to more markets than ati, which seems to be focusing more on desktop solutions.
Boushh - Thursday, January 7, 2010 - link
Desktop graphics are not that important any more. Already more than 50% of the market is based on laptops. And this will only grow.
What Nvidia is doing is finding other markets it can make products for. Because building a company on a single product line is a problem if that market disappears.
Nvidia is hoping that products like the Tegra and Tesla will eventually take over the profit they previously got from desktop graphics.
And that's not too late. CPUs are already here (Westmere) that have on-package GPUs. Those will be fully integrated, on-die GPUs in a year or so. That means Intel (and AMD when they get there) can provide more than 90% of the graphics needs for PCs and laptops. That leaves only a niche market for powerful graphics.
So there will be a market for GPUs in the future. But the real money will be in the netbook, tablet and smartphone market.
tomaccogoats - Thursday, January 7, 2010 - link
If it means longer battery life then I'm a happy camper :)