Build, train, deploy, and operationalize Hugging Face models on Amazon SageMaker (Level 200)

The field of natural language processing (NLP) is developing rapidly, and NLP models are growing increasingly large and complex. Through its ecosystem partnership with Hugging Face and its distributed training capabilities, Amazon SageMaker makes it straightforward to train NLP models quickly. In this session, learn how to train an NLP model from the Hugging Face Transformers library with just a few lines of code, using PyTorch or TensorFlow together with SageMaker's distributed training libraries.
Speaker: Tapan Hoskeri, Principal Solutions Architect, AWS India
Duration: 30 minutes
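To give a rough sense of what "a few lines of code" looks like, below is a minimal sketch using the Hugging Face estimator from the SageMaker Python SDK. The training script name, source directory, S3 input paths, instance types, hyperparameters, and framework versions are illustrative assumptions, not details taken from the session.

```python
import sagemaker
from sagemaker.huggingface import HuggingFace

# Assumes this runs in a SageMaker-enabled environment (e.g. SageMaker Studio)
# where an execution role is available.
role = sagemaker.get_execution_role()

# Hyperparameters passed to the training script; model name and values are
# placeholders for illustration.
hyperparameters = {
    "model_name_or_path": "distilbert-base-uncased",
    "epochs": 1,
    "per_device_train_batch_size": 32,
}

# The Hugging Face estimator wraps a training script (here a hypothetical
# scripts/train.py built on the transformers Trainer) in a managed
# SageMaker training job.
huggingface_estimator = HuggingFace(
    entry_point="train.py",
    source_dir="./scripts",          # assumed local directory holding the script
    instance_type="ml.p3.2xlarge",   # single-GPU instance; adjust as needed
    instance_count=1,
    role=role,
    transformers_version="4.26",     # example framework versions; choose a
    pytorch_version="1.13",          # combination supported by the SageMaker
    py_version="py39",               # Hugging Face training containers
    hyperparameters=hyperparameters,
    # To use SageMaker's data-parallel distributed training library on a
    # multi-GPU cluster, you would instead pass something like:
    #   instance_type="ml.p3.16xlarge", instance_count=2,
    #   distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

# Launch the training job; the S3 input channels below are placeholders.
huggingface_estimator.fit({
    "train": "s3://my-bucket/train",
    "test": "s3://my-bucket/test",
})
```

The same estimator pattern applies to TensorFlow-based scripts by supplying a supported tensorflow_version in place of pytorch_version.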