Easy Data-Parallel Distributed Training in Keras

Stacey Svetlichnaya
1 Mar 2020

Did you know you can substantially cut model training time with a single Keras utility wrapper function? This is especially useful if you're laser-focused on one experimental direction while extra GPUs sit idle on your system. Here's the magic trick of data-parallel distributed training:
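Below is a minimal sketch of the idea, assuming the `multi_gpu_model` utility that shipped with `tf.keras` at the time of writing (it was later deprecated in favor of `tf.distribute.MirroredStrategy`) and a machine with at least two GPUs. The toy model, random data, and GPU count are hypothetical placeholders, not the article's actual experiment:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.utils import multi_gpu_model

# Toy data standing in for a real dataset (hypothetical shapes).
x_train = np.random.rand(1024, 784).astype("float32")
y_train = np.random.randint(0, 10, size=(1024,))

# Build the template model on the CPU so its weights live in host
# memory, where the wrapper merges gradients from each replica.
with tf.device("/cpu:0"):
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

# Replicate the model across 2 GPUs: each training batch is split
# into 2 sub-batches, processed in parallel, and recombined.
parallel_model = multi_gpu_model(model, gpus=2)
parallel_model.compile(optimizer="adam",
                       loss="sparse_categorical_crossentropy",
                       metrics=["accuracy"])

# Train as usual; scale batch_size with the number of replicas so
# each GPU still sees a reasonably large sub-batch.
parallel_model.fit(x_train, y_train, batch_size=256, epochs=2)

# Save weights via the template model, not the parallel wrapper.
model.save_weights("model_weights.h5")
```

Because the batch is split across replicas, the speedup is close to linear in the number of GPUs as long as each sub-batch is large enough to keep every device busy.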
