Sasha Rush, of Cornell Tech and Hugging Face, catches us up on all the things happening with Hugging Face and transformers. Last time we had Clem from Hugging Face on the show (episode 35), their transformers library wasn’t even a thing yet. Oh how things have changed! This time Sasha tells us all about Hugging Face’s open source NLP work, gives us an intro to the key components of transformers, and shares his perspective on the future of AI research conferences.
Join Changelog++ to support our work, get closer to the metal, and make the ads disappear!
- DigitalOcean – DigitalOcean’s developer cloud makes it simple to launch in the cloud and scale up as you grow. They have an intuitive control panel, predictable pricing, team accounts, worldwide availability with a 99.99% uptime SLA, and 24/7/365 world-class support to back that up. Get your $100 credit at do.co/changelog.
- The Brave Browser – Browse the web up to 8x faster than Chrome and Safari, block ads and trackers by default, and reward your favorite creators with the built-in Basic Attention Token. Download Brave for free and give tipping a try right here on changelog.com.
- Fastly – Our bandwidth partner. Fastly powers fast, secure, and scalable digital experiences. Move beyond your content delivery network to their powerful edge cloud platform. Learn more at fastly.com.
- Sasha Rush – Twitter, Website
- Chris Benson – Twitter, GitHub, LinkedIn, Website
- Daniel Whitenack – Twitter, GitHub, Website
Notes and Links
Giveaway details! Check this blog post for all the details to win a free copy of Dracula PRO and 14 Habits of Highly Productive Developers
- Hugging Face
- Transformers library
- Tokenizers library
- NLP (data and evaluation metrics) library
- Previous Practical AI episode with Hugging Face
- TechCrunch announcement about Hugging Face’s recent fundraising
- The annotated transformer
- 2000+ models in Hugging Face’s model hub
- Attention is all you need paper
- Mini Conf tools