Today, as part of NVIDIA’s fall GTC event, the company announced that its Jetson embedded system kits will be getting a refresh with NVIDIA’s forthcoming Orin SoC. Due early next year, Orin is slated to become NVIDIA’s flagship SoC for automotive and edge computing applications. And, as has become customary for NVIDIA, the company is also making Orin available to non-automotive customers through its Jetson embedded computing program, which offers the SoC in a self-contained modular package.

Always a bit of a side project for NVIDIA, the Jetson single-board computers have nonetheless become an important tool for the company, serving both as an entry point for bootstrapping developers into the NVIDIA ecosystem and as an embedded computing product in and of itself. Jetson boards are sold as complete single-board systems with an SoC, memory, storage, and the necessary I/O exposed via pins, allowing them to serve as commercial off-the-shelf (COTS) systems for use in finished products. Jetson modules are also used as the basis of NVIDIA’s Jetson developer kits, which add a breakout board, power supply, and the other bits needed to fully interact with Jetson modules.

NVIDIA Jetson Module Specifications
|                 | AGX Orin                              | AGX Xavier                             | Jetson Nano                          |
|-----------------|---------------------------------------|----------------------------------------|--------------------------------------|
| CPU             | 12x Cortex-A78AE @ 2.0GHz             | 8x Carmel @ 2.26GHz                    | 4x Cortex-A57 @ 1.43GHz              |
| GPU             | Ampere, 2048 cores @ 1000MHz          | Volta, 512 cores @ 1377MHz             | Maxwell, 128 cores @ 920MHz          |
| Accelerators    | 2x NVDLA v2.0                         | 2x NVDLA                               | N/A                                  |
| Memory          | 32GB LPDDR5, 256-bit bus (204 GB/sec) | 16GB LPDDR4X, 256-bit bus (137 GB/sec) | 4GB LPDDR4, 64-bit bus (25.6 GB/sec) |
| Storage         | 64GB eMMC 5.1                         | 32GB eMMC                              | 16GB eMMC                            |
| AI Perf. (INT8) | 200 TOPS                              | 32 TOPS                                | N/A                                  |
| Dimensions      | 100mm x 87mm                          | 100mm x 87mm                           | 45mm x 70mm                          |
| TDP             | 15W-50W                               | 30W                                    | 10W                                  |
| Price           | ?                                     | $899                                   | $99                                  |

With NVIDIA’s Orin SoC set to arrive early in 2022, NVIDIA is using this opportunity to announce the next generation of Jetson AGX products. Joining the Jetson AGX Xavier will be the aptly named Jetson AGX Orin, which integrates the Orin SoC.

Orin features 12 Arm Cortex-A78AE “Hercules” CPU cores and an integrated Ampere architecture GPU with 2048 CUDA cores, adding up to 17 billion transistors. Given Orin's mobile-first design, NVIDIA is being fairly conservative with the clockspeeds here; the CPU cores for Jetson AGX Orin top out at 2GHz, while the GPU tops out at 1GHz. Otherwise, the SoC also contains a pair of NVIDIA’s latest-generation dedicated Deep Learning Accelerators (DLAs), as well as a vision accelerator to further speed up those tasks and process them more efficiently.

Rounding out the Jetson AGX Orin package, the Orin SoC is paired with 32GB of LPDDR5 RAM attached to a 256-bit memory bus, allowing for 204GB/second of memory bandwidth. Meanwhile, storage is provided by a 64GB eMMC 5.1 device, twice the capacity of the previous-generation Jetson AGX Xavier.
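As a sanity check on those bandwidth figures, peak theoretical DRAM bandwidth is just the transfer rate multiplied by the bus width in bytes. A minimal sketch, assuming speed grades of 6400 MT/s for Orin's LPDDR5, 4266 MT/s for Xavier's LPDDR4X, and 3200 MT/s for the Nano's LPDDR4 (the speed grades are assumptions on my part; only the bus widths and quoted GB/sec come from NVIDIA):

```python
def peak_bandwidth_gbps(transfer_mts: float, bus_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s (1 GB = 1e9 bytes):
    transfers/sec x bytes moved per transfer across the whole bus."""
    return transfer_mts * 1e6 * (bus_bits // 8) / 1e9

print(peak_bandwidth_gbps(6400, 256))  # AGX Orin: 204.8 GB/s, matching the quoted 204 GB/sec
print(peak_bandwidth_gbps(4266, 256))  # AGX Xavier: ~136.5 GB/s, matching the quoted 137 GB/sec
print(peak_bandwidth_gbps(3200, 64))   # Jetson Nano: 25.6 GB/s, matching the quoted 25.6 GB/sec
```

That all three quoted numbers fall out of plausible JEDEC speed grades suggests NVIDIA is quoting peak theoretical figures rather than measured throughput.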

All told, NVIDIA is promising 200 TOPS of performance in INT8 machine learning workloads, which would be a 6x improvement over Jetson AGX Xavier. Presumably those performance figures are for the module’s full 50W TDP, with performance proportionally lower as the module is dialed down toward its minimum TDP of 15W.
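The 6x claim follows directly from the quoted TOPS figures. The sketch below checks that math and illustrates the "proportionally lower" point with a naive linear model; the linear scaling is my reading of the claim, not a curve NVIDIA has published:

```python
# Headline INT8 throughput from the spec table.
ORIN_TOPS, XAVIER_TOPS = 200, 32
print(ORIN_TOPS / XAVIER_TOPS)  # 6.25 -- which NVIDIA rounds to "6x"

def est_tops(tdp_w: float, peak_tops: float = 200, peak_tdp: float = 50) -> float:
    """Naive proportional model: INT8 throughput scales linearly with the
    configured power limit. Real perf/W curves are typically nonlinear,
    so treat this as an upper-level estimate only."""
    return peak_tops * tdp_w / peak_tdp

print(est_tops(15))  # ~60 TOPS at the module's 15W floor under this naive model
```

In practice the efficiency sweet spot usually sits below the peak power limit, so the 15W figure would likely land somewhat above what pure proportionality predicts.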

Meanwhile, for this generation NVIDIA will be maintaining pin and form-factor compatibility with Jetson AGX Xavier. So Jetson AGX Orin modules will be the same 100mm x 87mm in size, and use the same mezzanine connector, making Orin modules drop-in compatible with Xavier.

Jetson AGX Orin modules and dev kits are slated to become available in Q1 of 2022. NVIDIA has not announced any pricing information at this time.

Source: NVIDIA

19 Comments

  • mode_13h - Friday, November 12, 2021 - link

    > the fact that Nvidia is doing custom ARM chips in-house is also a HUGE cost factor,
    > and the division working on this can probably only focus on one chip release at a time.

    Taking a chip design and chopping it into multiple sizes is what they do with their GPUs like almost every year.

    Looking at the specs, it seems that Xavier NX already is a cut down version of full Xavier. I don't know what else they have to do to cut the cost and power, but both are still too much to be a proper Nano successor.
  • bondr - Thursday, November 11, 2021 - link

    It is also notable that Nvidia appears to be moving toward a less-custom design with Orin, choosing to adopt Cortex-A78 cores from Arm instead of developing a custom core (and focusing its customization more on the GPU and NPU aspects).
  • bondr - Thursday, November 11, 2021 - link

    I was a bit rusty on the history of this whole setup. Now that I look back on it I'm remembering that Nvidia's custom cores were originally motivated by wanting to use Project Denver as a means of creating cores which abstract over both ARM and x86, but they were unable to obtain the licensing from Intel. That's probably why they are now winding back to using Cortex IP.
  • kgardas - Tuesday, November 9, 2021 - link

    This is NVidia, please do not confuse that with "open-source", it's quite the contrary unfortunately.
  • Yojimbo - Tuesday, November 9, 2021 - link

    most of nvidia's software is open source. And as far as open source, most of the world's open source software is designed and maintained by cloud companies like aws and google. their platforms still result in steep customer lock-in because each cloud builder has their own custom way of doing things. it's expensive for a customer to switch from one cloud to another or from cloud to on-site because of all the IT changes that are necessary (plus there are data egress fees).

    The point is that Nvidia is quite open source. but just like the cloud builders that doesn't prevent them from building a platform that allows them to profit from r&d investment. At this point nvidia surely invests more money in software r&d than hardware r&d. it's greedy to expect a company to spend billions of dollars developing functionality for its products only to turn around and make its products commodities. it might as well give away the verilog code for its chips too.
  • Sivar - Tuesday, November 9, 2021 - link

    Quick note is that Microsoft is by far the largest open source contributor.

    They do have a large cloud service, but most of their open source is unrelated (e.g. the entire .NET ecosystem, Visual Studio Code, PowerShell, and contributions to MySQL and Linux).
  • mode_13h - Tuesday, November 9, 2021 - link

    > most of nvidia's software is open source.

    Not their drivers or CUDA. Sure, the stuff around it, but they keep their crown jewels locked up tight.

    > their platforms still result in steep customer lock-in

    If Nvidia open sourced their drivers & API stack like AMD and Intel do, then we could have a community OpenCL implementation atop the bare hardware/drivers, rather than having to layer it atop CUDA.

    > it's greedy to expect a company to spend billions of dollars developing functionality for
    > its products only to turn around and make its products commodities.

    All of Nvidia's peers open source their drivers and API stacks, and they're hardly going out of business. We just want Nvidia to be a good Linux citizen, not a... um... what do you call it when an organism only takes from a host without giving back? Oh, that's right: a parasite.

    And this is the price they pay for their sociopathic tendencies:

    https://www.phoronix.com/scan.php?page=news_item&a...
  • bondr - Tuesday, November 9, 2021 - link

    I don't like that Nvidia are the worst corporate citizen that is a major player in the open source world. Still, one could argue that their competitors are only better precisely because that offers them a competitive differentiation from Nvidia.
  • mode_13h - Wednesday, November 10, 2021 - link

    > one could argue that their competitors are only better
    > precisely because that offers them a competitive differentiation

    By itself, that's not reason enough to do it. It's an added bonus, though.
