Developers, and anyone else interested in learning more about the Deep Learning Accelerator on NVIDIA’s Jetson Orin mini PC, will be pleased to know that NVIDIA has published a new article on its technical blog providing an overview of the Deep Learning Accelerator (DLA) as used with the Jetson system, which combines a CPU and GPU into a single module. The platform provides developers with an expansive NVIDIA software stack in a small, low-power package that can be deployed at the edge.
Deep Learning Accelerator
“Though the DLA doesn’t have as many supported layers as the GPU, it still supports a wide variety of layers used in many popular neural network architectures. In many instances, the layer support may cover the requirements of your model. For example, the NVIDIA TAO Toolkit includes a wide variety of pre-trained models that are supported by the DLA, ranging from object detection to action recognition.”
“While it’s important to note that the DLA throughput is typically lower than that of the GPU, it is power-efficient and allows you to offload deep learning workloads, freeing the GPU for other tasks. Alternatively, depending on your application, you can run the same model on the GPU and DLA simultaneously to achieve higher net throughput.”
“Many NVIDIA Jetson developers are already using the DLA to successfully optimize their applications. Postmates optimized their delivery robot application on Jetson AGX Xavier leveraging the DLA along with the GPU. The Cainiao ET Lab used the DLA to optimize their logistics vehicle. If you’re looking to fully optimize your application, the DLA is an important piece in the Jetson repertoire to consider.”
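The code below is a minimal sketch of how that kind of DLA offload is typically configured through the TensorRT Python API when building an inference engine on Jetson. It is not taken from the NVIDIA post, and the model file name, output file name and DLA core number are placeholder assumptions; it simply parses an ONNX model, targets DLA core 0 in FP16, and enables GPU fallback for any layers the DLA does not support.

import tensorrt as trt  # TensorRT Python bindings shipped with JetPack

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)

# Parse a placeholder ONNX model into the TensorRT network definition
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError("Failed to parse ONNX model")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)              # the DLA runs layers in FP16 or INT8
config.default_device_type = trt.DeviceType.DLA    # place layers on the DLA by default
config.DLA_core = 0                                # assumed core; Orin exposes more than one
config.set_flag(trt.BuilderFlag.GPU_FALLBACK)      # unsupported layers fall back to the GPU

# Serialize the engine so it can be deserialized and run for inference later
engine_bytes = builder.build_serialized_network(network, config)
with open("model_dla.engine", "wb") as f:
    f.write(engine_bytes)

Building one engine per DLA core, or a second engine with the GPU as the default device, is one way an application could run models on the DLA and GPU simultaneously, along the lines the post describes.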
For more information on using the Deep Learning Accelerator with the Jetson Orin, jump over to the official NVIDIA blog by following the link below.
Source: NVIDIA