The field of natural language processing (NLP) is developing rapidly, and NLP models are growing increasingly large and complex. Through strong ecosystem partnerships with organizations like Hugging Face and advanced distributed training capabilities, Amazon SageMaker makes it easy to train NLP models quickly. In this session, learn how to train an NLP model from the Hugging Face transformers library with just a few lines of code, using PyTorch or TensorFlow together with SageMaker’s distributed training libraries.
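As a rough illustration of the "few lines of code" workflow the session covers, the sketch below launches a Hugging Face training job with the SageMaker Python SDK. It is a minimal configuration fragment, not the session's actual code: the script name, instance type, framework versions, hyperparameters, and IAM role are placeholder assumptions you would replace with your own.

```python
# Minimal sketch (assumed values throughout): launch a Hugging Face
# training job on SageMaker with data-parallel distributed training.
from sagemaker.huggingface import HuggingFace

estimator = HuggingFace(
    entry_point="train.py",            # your training script (placeholder)
    source_dir="./scripts",            # directory containing the script
    role="<your-sagemaker-role-arn>",  # IAM execution role (placeholder)
    instance_type="ml.p3.16xlarge",    # example GPU instance type
    instance_count=2,                  # scale out across instances
    transformers_version="4.26",       # example framework versions; pick
    pytorch_version="1.13",            # a combination SageMaker supports
    py_version="py39",
    hyperparameters={
        "model_name": "distilbert-base-uncased",  # example model
        "epochs": 3,
    },
    # Enable the SageMaker distributed data parallel library
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

# Start training against data already staged in S3 (placeholder URI)
estimator.fit({"train": "s3://<your-bucket>/train"})
```

The same estimator pattern works for TensorFlow by supplying `tensorflow_version` in place of `pytorch_version`.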
Speaker: Praveen Jayakumar, Principal Solutions Architect, AISPL