Image: NVIDIA
NVIDIA today introduced its A2 Tensor Core GPU, an entry-level GPU that delivers up to 20x more inference performance than CPUs despite its low power draw and small footprint. Inference is the stage of the deep-learning process in which trained models, developed during the training (i.e., learning) phase, are applied to new data in an application. For these workloads, NVIDIA's A2 Tensor Core GPU provides up to 1.3x more performance in various intelligent edge use cases.
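To make the training/inference distinction concrete, here is a minimal sketch (not NVIDIA's software, just a generic illustration using NumPy and a linear model): the expensive training phase fits model weights from example data, while inference simply applies those fixed weights to new inputs, which is the lightweight step the A2 is aimed at accelerating.

```python
import numpy as np

# --- Training phase (the "learning" step, done once, usually offline) ---
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # example inputs
true_w = np.array([2.0, -1.0, 0.5])      # hidden relationship to learn
y = X @ true_w + 0.01 * rng.normal(size=100)

# Fit model weights via least squares.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# --- Inference phase (done repeatedly, in production) ---
def infer(x_new, weights=w):
    """Apply the already-trained weights to new data; no learning happens here."""
    return x_new @ weights

preds = infer(rng.normal(size=(5, 3)))
```

Training is a one-time optimization loop; inference is just this forward application of learned parameters, repeated for every request, which is why it dominates deployed edge workloads.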
From NVIDIA:
A2’s versatility, compact size, and low power exceed the demands for edge deployments at scale, instantly upgrading existing entry-level CPU servers to handle inference. Servers accelerated with A2 GPUs deliver higher inference performance versus CPUs and more efficient intelligent video analytics (IVA) deployments than...
Continue reading...