Learn PyTorch: The best free online courses and tutorials

feature
Jul 06, 2020 | 8 mins
Analytics | Artificial Intelligence | Deep Learning

Look no further than these excellent free resources to master the development of deep learning models using PyTorch


Deep learning continues to be one of the hottest fields in computing, and while Google’s TensorFlow remains the most popular framework in absolute numbers, Facebook’s PyTorch has quickly earned a reputation for being easier to grasp and use.

PyTorch has taken the world of deep learning research by storm, outstripping TensorFlow as the implementation framework of choice in submitted papers for AI conferences in the past two years. With recent improvements for producing optimized models and deploying them to production, PyTorch is definitely a framework ready for use in industry as well as R&D labs.

But how to get started? You’ll find plenty of books and paid resources available for learning PyTorch, of course. But there are also plenty of resources on the Internet that will help you get to grips with the framework — for absolutely nothing. Plus, some of the free resources are of even higher quality than what you can pay for. Let’s take a look at what is on offer.

PyTorch.org tutorials

Perhaps the most obvious place to start is the PyTorch website itself. Along with the usual resources such as an API reference, the site offers more digestible material such as a 60-minute video-and-text blitz through PyTorch that walks you through setting up an image classification model. There are guides for both the standard and the more esoteric features of the framework, and when a major new capability is added, such as quantization or pruning of models, you'll normally get a quick tutorial on how to use it in your own applications.
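
To give a flavor of what those feature tutorials cover, here is a minimal sketch (my own illustration, not code from the site) of applying PyTorch's built-in pruning and dynamic quantization utilities to a toy model; the model itself is just a stand-in:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A toy network standing in for whatever model you are experimenting with
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Prune the 30% of weights with the smallest L1 magnitude in the first linear layer
prune.l1_unstructured(model[0], name="weight", amount=0.3)
# Make the pruning permanent (removes the reparameterization hooks)
prune.remove(model[0], "weight")

# Dynamically quantize the linear layers to int8 for faster CPU inference
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized)
```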

On the downside, the quality of the code in the various tutorials varies quite a lot, and standard steps are sometimes skipped or glossed over in order to show off the feature the tutorial is concentrating on, rather than to produce idiomatic PyTorch code. In fairness, the tutorial code has definitely improved over the past couple of years, but you do sometimes have to be a little careful. For this reason, I wouldn't recommend the PyTorch website as your primary resource for learning. Nevertheless, it's a useful resource to have on hand, and it's the best place to learn how to use the latest features.

Udacity’s and edX’s PyTorch deep learning courses

I’m bundling Udacity’s Introduction to Deep Learning with PyTorch and edX’s Deep Learning with Python and PyTorch together here because they have similar structures, cover a lot of the same ground, and appear to suffer from the same issues. Both follow a traditional series of lectures that builds up from the foundations of deep learning, introducing one concept after another, then tackling more complex scenarios such as image and text classification by the end of the course. This is a perfectly fine way to teach deep learning, but it does mean you’ll sink considerable time into the lessons before you get to do anything exciting with PyTorch, unlike, say, the Fast.ai course.

Both the Udacity and edX courses also appear to be a little out of date, both in their content and in the version of PyTorch they target. You won’t learn anything about generative adversarial networks (GANs) or Transformer-based networks in either course, and the Udacity course is based on PyTorch 0.4. This isn’t necessarily a problem, but we’re currently at PyTorch 1.5, so you may run into deprecation warnings when trying to run the course code on the latest version. If you’re choosing between these two courses, I would give Udacity a slight edge over edX due to the Facebook stamp of approval.
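
As a hypothetical example of the kind of warning you might hit (not code from either course): older material often calls activation functions through torch.nn.functional, which recent PyTorch releases flag as deprecated.

```python
import torch
import torch.nn.functional as F

x = torch.randn(4)

# Older course material tends to use F.sigmoid / F.tanh;
# recent PyTorch releases emit a deprecation warning here
# and point you toward torch.sigmoid / torch.tanh instead.
old_style = F.sigmoid(x)      # still works, but warns on PyTorch 1.x
new_style = torch.sigmoid(x)  # the current idiom

print(torch.allclose(old_style, new_style))  # True: same result, different API
```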

Fast.ai’s Practical Deep Learning for Coders

Since its beginnings in 2016, fast.ai has been the gold standard for free deep learning education. Every year it has released a new version of its two-part course, building on the previous incarnation and pushing things forward a little each time. While the first year was based on Keras and TensorFlow, fast.ai switched to PyTorch in year two and hasn’t really looked back (though it has cast a few glances at Swift for TensorFlow).

Fast.ai has a distinctive approach to teaching deep learning. Other courses devote many of the early lectures and materials to laying the foundations before you consider building even the tiniest neural network. Fast.ai is, well, faster. By the end of the first lesson, you’ll have built a state-of-the-art image classifier. This has led to some criticism that the Fast.ai course leans too heavily on “magic” rather than teaching you the basics, but the following lectures do give you a good grounding in what is happening under the covers.
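
To show how compressed that first lesson is, here is a rough sketch of a cats-versus-dogs classifier in the spirit of fast.ai’s high-level API. The names used (ImageDataLoaders, cnn_learner, fine_tune) come from the newer fastai2 library, so the code in the course you take may look slightly different:

```python
from fastai.vision.all import *

# Download the Oxford-IIIT Pet dataset that fastai hosts for its examples
path = untar_data(URLs.PETS) / "images"

# In this dataset, filenames that start with an uppercase letter are cats
def is_cat(fname):
    return fname[0].isupper()

dls = ImageDataLoaders.from_name_func(
    path,
    get_image_files(path),
    valid_pct=0.2,
    label_func=is_cat,
    item_tfms=Resize(224),
)

# Fine-tune a pretrained ResNet-34 for one epoch
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)
```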

And yet, I’d be a little hesitant to recommend Fast.ai as your sole resource for learning PyTorch. Because Fast.ai uses a library built on top of the framework rather than pure PyTorch, you tend to learn PyTorch indirectly rather than explicitly. That’s not to say it’s a bad approach; the Part Two lessons of the 2019 course include an astonishing set of lectures that builds a somewhat simplified version of PyTorch from scratch, fixing bugs in actual PyTorch along the way. (This set of lectures, I think, puts paid to any notion that Fast.ai is too magical, for what it’s worth.) That said, you might want to use Fast.ai in conjunction with another course in order to understand what Fast.ai’s library is doing for you versus standard PyTorch.

EPFL’s Deep Learning (EE-559)

Next up, how about a course from an actual university? EE-559, taught by François Fleuret at the École Polytechnique Fédérale de Lausanne in Switzerland, is a traditional university course, with slides, exercises, and video clips. While it begins with the basics, it ramps up beyond what’s on offer in the Udacity and edX courses, taking in GANs and adversarial examples and closing out with attention mechanisms and Transformer models. It also has the advantage of staying current with recent PyTorch releases, so you can be confident that the techniques and code you’re learning don’t rely on deprecated features of the framework.

Other PyTorch learning resources

There are a few more resources that are very useful but perhaps not core to learning PyTorch itself. First, there’s PyTorch Lightning, which some describe as PyTorch’s equivalent to Keras. I wouldn’t go that far, as PyTorch Lightning is not a complete high-level API for PyTorch, but it is a great way of producing organized PyTorch code. It also provides implementations of the standard boilerplate (training and validation loops, testing, handling distributed GPU/CPU setups, and so on) that you would otherwise end up rewriting for most of your PyTorch work.
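
As a rough sketch of what that organization looks like: your model and its training logic live in a LightningModule, and the Trainer supplies the loop, device handling, and so on. The exact API has shifted between Lightning releases, so treat this as an illustration rather than copy-paste code; the random stand-in dataset is mine, just to keep the example self-contained.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        # Lightning calls this for every batch; no manual loop needed
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Random stand-in data so the sketch runs on its own
dataset = TensorDataset(torch.randn(256, 28 * 28), torch.randint(0, 10, (256,)))
train_loader = DataLoader(dataset, batch_size=32)

# The Trainer handles the training loop, logging, checkpointing, GPU placement, etc.
trainer = pl.Trainer(max_epochs=1)
trainer.fit(LitClassifier(), train_loader)
```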

The documentation on the project’s website includes some good tutorials to get you started. In particular, there’s a wonderful video that walks through converting a normal PyTorch project to PyTorch Lightning. The video really shows off the flexibility and ease of use that PyTorch Lightning provides, so definitely have a look once you’ve mastered the basics.

Second, there’s Hugging Face’s Transformers library, which has become the de facto standard for Transformer-based models over the past 18 months. If you want to do anything approaching state-of-the-art with deep learning and text processing, Transformers is a wonderful place to start. Containing implementations of BERT, GPT-2, and a host of other Transformer models (with more being added seemingly every week), it is an amazing resource. Happily, it also includes a selection of Google Colab notebooks that will get you up and running with the library swiftly.
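
For instance, a minimal sketch using the library’s pipeline API looks something like this (the pretrained model is downloaded automatically on first use, and the exact score will vary):

```python
from transformers import pipeline

# Downloads and caches a pretrained sentiment-analysis model on first run
classifier = pipeline("sentiment-analysis")

result = classifier("Learning PyTorch from free online courses is a great idea.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```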

And third, I can’t write this article without mentioning Yannic Kilcher’s explainer videos. These are not PyTorch-specific at all, but they are a great way to keep track of current papers and research trends, with clear explanations and discussion. You probably won’t need them when you start learning PyTorch, but by the time you’ve gone through some of the coursework mentioned here, you’ll want to know what else is out there, and Kilcher’s videos point the way.

Learning PyTorch deep learning

If you’re looking to learn PyTorch, I think your best bet is to work through both the Fast.ai course and one of the more traditional courses at the same time. (My pick for the companion course would be EE-559, since it stays current with PyTorch.) As a bonus, there’s a Fast.ai book coming out in August that will be one of the best introductory texts for deep learning.

Based on the new FastAI2 library (which among other things has a multi-tiered API structure for easier integration with standard PyTorch), the Fast.ai book is likely to be essential for getting started in the field really quickly. And while I recommend buying a physical copy, you can read it all for free in notebook form on GitHub. Dive into the book, and you’ll be telling dogs from cats in no time at all!