SageMaker Fridays Season 3, Episode 5 — NLP at scale with Hugging Face and distributed training

In this episode, we use state-of-the-art natural language processing models available in the Hugging Face collection. We then fine-tune BERT on a sentiment analysis dataset and predict with the resulting model. Finally, we show you how to scale your training jobs with data parallelism and model parallelism.
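For reference, here is a minimal sketch of how such a distributed fine-tuning job might be launched with the SageMaker Python SDK's HuggingFace estimator. The training script name (train.py), S3 paths, hyperparameter names, and library versions are assumptions for illustration, not the exact code from the episode.

```python
import sagemaker
from sagemaker.huggingface import HuggingFace

# Assumes this runs inside SageMaker (notebook or Studio); otherwise,
# pass an IAM role ARN explicitly.
role = sagemaker.get_execution_role()

# Hyperparameters forwarded to the training script; names are illustrative
# and must match the arguments your script parses.
hyperparameters = {
    "model_name_or_path": "bert-base-uncased",
    "epochs": 3,
    "train_batch_size": 32,
}

# 'train.py' is a hypothetical Hugging Face Trainer-based fine-tuning script.
# Enabling SageMaker's data parallelism library only requires the
# 'distribution' argument; the library shards each batch across GPUs.
huggingface_estimator = HuggingFace(
    entry_point="train.py",
    source_dir="./scripts",
    instance_type="ml.p3.16xlarge",   # data parallelism needs multi-GPU instances
    instance_count=2,
    role=role,
    transformers_version="4.6",       # versions are assumptions; pick a
    pytorch_version="1.7",            # combination supported in your region
    py_version="py36",
    hyperparameters=hyperparameters,
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

# S3 locations are placeholders for your own dataset channels.
huggingface_estimator.fit(
    {"train": "s3://my-bucket/train", "test": "s3://my-bucket/test"}
)

# Deploy the fine-tuned model to a real-time endpoint and predict.
predictor = huggingface_estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)
print(predictor.predict({"inputs": "I loved this movie!"}))
predictor.delete_endpoint()
```

Model parallelism follows the same pattern: instead of the dataparallel entry, you pass an smdistributed modelparallel configuration (plus an MPI section) in the distribution argument, which partitions the model itself across GPUs when it is too large to fit on one device.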

About the Author

Julien Simon is the Chief Evangelist at Arcee AI, specializing in Small Language Models and enterprise AI solutions. Recognized as the #1 AI Evangelist globally by AI Magazine in 2021, he brings over 30 years of technology leadership experience to his role.

With 650+ speaking engagements worldwide and 350+ technical blog posts, Julien is a leading voice in practical AI implementation, cost-effective AI solutions, and the democratization of artificial intelligence. His expertise spans open-source AI, Small Language Models, enterprise AI strategy, and edge computing optimization.

Previously serving as Principal Evangelist at AWS and Chief Evangelist at Hugging Face, Julien has authored books on Amazon SageMaker and contributed to the open-source AI ecosystem. His mission is to make AI accessible, understandable, and controllable for everyone.