JAX-Toolbox
A library for accelerating Transformer models on NVIDIA GPUs, including support for 8-bit floating point (FP8) precision on Hopper and Ada GPUs, providing better performance with lower memory utilization in both training and inference.
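The FP8 path can be exercised from JAX through Transformer Engine's Python bindings. Below is a minimal sketch, assuming the `transformer_engine.jax` package and its Flax modules are installed; the layer size, input shape, and `amax_history_len` are illustrative choices, not values taken from this repository, and exact module names may vary across Transformer Engine versions.

```python
# Minimal FP8 sketch, assuming Transformer Engine's JAX bindings
# (transformer_engine.jax) are installed; check your installed version.
import jax
import jax.numpy as jnp
import transformer_engine.jax as te
import transformer_engine.jax.flax as te_flax
from transformer_engine.common.recipe import DelayedScaling, Format

# HYBRID recipe: E4M3 for forward activations/weights, E5M2 for gradients.
recipe = DelayedScaling(fp8_format=Format.HYBRID, amax_history_len=16)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 128, 512), dtype=jnp.bfloat16)

# Layers created and applied inside fp8_autocast run their matmuls in FP8
# on supported GPUs (Hopper/Ada).
with te.fp8_autocast(enabled=True, fp8_recipe=recipe):
    layer = te_flax.DenseGeneral(features=512)
    params = layer.init(key, x)
    y = layer.apply(params, x)

print(y.shape, y.dtype)
```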
- Example Docker image for JAX on AWS with EFA
- Benchmarking collective communication operations in JAX (see the timing sketch after this list)
- How to deploy a pre-trained YOLOv5 model on SageMaker (see the deployment sketch after this list)
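For the collective-communication benchmark, the core measurement can be reproduced with plain JAX: time an all-reduce (`jax.lax.psum` under `jax.pmap`) across the local devices. The sketch below is illustrative; the message size and iteration count are arbitrary assumptions, not the benchmark's actual configuration.

```python
# Minimal all-reduce timing sketch using only public JAX APIs.
import time

import jax
import jax.numpy as jnp

n_dev = jax.local_device_count()

# psum over the pmapped axis "i" performs an all-reduce across local devices.
all_reduce = jax.pmap(lambda x: jax.lax.psum(x, "i"), axis_name="i")

elems = (1 << 26) // 4  # 64 MiB of float32 per device (illustrative size)
x = jnp.ones((n_dev, elems), dtype=jnp.float32)

all_reduce(x).block_until_ready()  # compile and warm up outside the timed loop

iters = 10
t0 = time.perf_counter()
for _ in range(iters):
    out = all_reduce(x)
out.block_until_ready()  # dispatch is async; wait before stopping the clock
dt = (time.perf_counter() - t0) / iters
print(f"all-reduce: {elems * 4 / 2**20:.0f} MiB/device, "
      f"{n_dev} device(s), {dt * 1e3:.2f} ms/iter")
```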
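Deploying a pre-trained model on SageMaker typically means packaging the weights as a `model.tar.gz` in S3 together with a custom inference handler, then creating an endpoint through the SageMaker Python SDK. The sketch below uses the SDK's `sagemaker.pytorch.PyTorchModel` interface; the S3 URI, IAM role, entry point, and instance type are hypothetical placeholders, not values from the original guide.

```python
# Minimal deployment sketch with the SageMaker Python SDK; all identifiers
# below (bucket, role, entry point, instance type) are placeholders.
from sagemaker.pytorch import PyTorchModel

model = PyTorchModel(
    model_data="s3://my-bucket/yolov5/model.tar.gz",   # hypothetical artifact
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # hypothetical role
    entry_point="inference.py",  # custom handler: loads weights, runs detection
    framework_version="1.13",
    py_version="py39",
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g4dn.xlarge",  # GPU instance for real-time inference
)
```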