FastAI Live - Lessons 1-4 Marathon Recap Session

Recording link:
Timing: Saturday, Apr 11, 3:00-6:15 PM
Webinar Zoom link:

Session Notebook links:

To mark the halfway point in the course, we’ll be reviewing all the material for Chapters 1-4. This is a great opportunity to join the live course as a newcomer or to catch up if you’ve missed any of the previous lectures. Please feel free to invite your friends and colleagues to join this study group.

The following topics will be explained via Jupyter notebooks & live coding:

  • Chapter 1: Intro to Deep Learning & FastAI
  • Chapter 2: Creating Web apps with Deep Learning Models
  • Chapter 3: MNIST Image Classification in Depth
  • Chapter 4: Data Ethics

After the review session, you’ll be able to build state-of-the-art deep learning models, create simple web apps to try out and share your models, understand how deep learning models are trained in PyTorch & fastai, and become aware of the ethical challenges involved in using AI for real-world applications.

This review session will also give you the perfect foundation to follow lectures 5-8 live, which will be streamed over the next four weeks on Wednesdays from 7-9 PM.


How to Invite Friends/Colleagues
This public meetup link contains all the details:


Will a recording of the webinar be available? I’m asking because my internet connection is quite unstable and it might be difficult to stream. Also, thank you for doing this!



  • Haven’t watched lesson 1/2/3/4
  • Setup complete/ran the notebook once
  • Did some modifications to the code/visited the docs page
  • Completed assignment - lesson 1
  • Completed assignment - lesson 2
  • Completed assignment - lesson 3
  • Started my own project/training on a different dataset

0 voters

There are problems verifying Indian phone numbers on Kaggle!
Update: Issue solved.

1 Like

What was the problem? I don’t think it’s a common issue.

Hey @akshay10-bhardwaj, it’s being recorded.

1 Like

How do we freeze/unfreeze layers? Simply learn.freeze() or learn.unfreeze()?
And is learn.fine_tune replacing learn.fit_one_cycle?

1 Like

What is nbdev? Where/how do we install it? What is the advantage of using it?

1 Like

Can’t run cell [29].

What’s the intuition for picking a pretrained model like resnet18, resnet34, or any other deeper architecture?

Can you scroll down in the output error, so that we can see what the error is?

So fine_tune automatically just replaces the last layer? Can we select how many layers to replace?

What did you mean by “top layer” while you were talking about fine_tune?
Why do we get two epochs when we asked for one? Could it be one epoch training only the top layer with the rest frozen, and one epoch with everything unfrozen?


freeze and unfreeze effectively allow you to decide which specific layers of your model you want to train at a given time (I believe it does this by setting requires_grad to False to turn off training for those layers). This is useful because we often use transfer learning: the early layers of the model are already well trained at what they do (recognizing basic lines, patterns, gradients, etc.), but the later ones (which are more specific to our exact task, like identifying an animal breed) will need more training.

unfreeze will unfreeze all layers of your model, so you will be training the early and later layers, although you still may be training the different layer groups at different learning rates. This is called ‘discriminative learning rates’ or ‘discriminative layer training’.
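The idea behind discriminative learning rates can be sketched in plain Python. Note this is just an illustration: the exact spacing fastai uses across layer groups is an implementation detail, and geometric spacing is an assumption here.

```python
def discriminative_lrs(lo, hi, n_groups):
    """Geometrically space learning rates across layer groups:
    the earliest group trains slowest (lo), the last group fastest (hi)."""
    if n_groups == 1:
        return [hi]
    ratio = (hi / lo) ** (1 / (n_groups - 1))
    return [lo * ratio ** i for i in range(n_groups)]

# Example: three layer groups spread between 1e-5 and 1e-3
lrs = discriminative_lrs(1e-5, 1e-3, 3)
print(lrs)  # roughly [1e-05, 1e-04, 1e-03]
```

In fastai you express the same idea by passing `slice(lo, hi)` as the learning rate to a fit method, and the library distributes rates across the layer groups for you.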

freeze will set all of your layer groups except the last one to be untrainable. From the documentation, this means we freeze the first layer group (the one that comes from transfer learning) and unfreeze the second (usually the last, classifier) group to train it further.

If you know the details of your architecture and want something in between unfreeze and freeze, you can use freeze_to(n:int) to specify which layer groups you want to freeze and which you want to train. The first n layer groups will be frozen and the remaining groups will be trainable.

Hope that helps!

Find more details here in fastai Forums
Explain What Freeze and Unfreeze does
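The freeze/unfreeze bookkeeping described above can be modelled with a toy class. This is only a sketch of the logic, not fastai’s implementation; the real library toggles requires_grad on the PyTorch parameters inside each layer group.

```python
class ToyLearner:
    """Toy model of fastai-style layer-group freezing.
    Each 'layer group' is just a dict with a trainable flag;
    real fastai sets requires_grad on the group's parameters."""
    def __init__(self, n_groups):
        self.groups = [{"trainable": True} for _ in range(n_groups)]

    def freeze_to(self, n):
        # Freeze the first n groups; train the rest.
        for i, g in enumerate(self.groups):
            g["trainable"] = i >= n

    def freeze(self):
        # Freeze everything except the last group (the new head).
        self.freeze_to(len(self.groups) - 1)

    def unfreeze(self):
        self.freeze_to(0)

learn = ToyLearner(3)
learn.freeze()
print([g["trainable"] for g in learn.groups])  # [False, False, True]
learn.unfreeze()
print([g["trainable"] for g in learn.groups])  # [True, True, True]
```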


Thanks, I understand those; I was just trying to understand what’s going on under the hood when we call fine_tune, which is a new function in fastai 2.

1 Like

How much data from your domain should have been present when the model was pretrained? E.g., if we start looking at a brand-new domain (X-ray, protein images, etc.) that the pretrained model never saw during training, would transfer learning still be valid/effective in those cases?

1 Like

fine_tune combines what we were doing manually in the previous version.

The first set of metrics that you see comes from training with the body frozen; it then unfreezes the layers and trains again, which produces the second set of metrics.

Docs Link
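The sequence fine_tune performs can be sketched with a stub learner that records its calls. This is a hedged approximation of the behaviour described above, not fastai’s actual source; the real default arguments and learning-rate handling are in the docs linked here.

```python
class StubLearner:
    """Records the sequence of calls to illustrate what fine_tune
    roughly does under the hood (a sketch, not fastai's source)."""
    def __init__(self):
        self.calls = []
    def freeze(self):
        self.calls.append("freeze")
    def unfreeze(self):
        self.calls.append("unfreeze")
    def fit_one_cycle(self, epochs, lr):
        self.calls.append(f"fit_one_cycle({epochs})")

def fine_tune(learn, epochs, base_lr=2e-3, freeze_epochs=1):
    # 1) Train only the new head with the body frozen...
    learn.freeze()
    learn.fit_one_cycle(freeze_epochs, base_lr)
    # 2) ...then unfreeze and train the whole model, typically
    #    with discriminative learning rates across layer groups.
    learn.unfreeze()
    learn.fit_one_cycle(epochs, slice(base_lr / 100, base_lr / 2))

learn = StubLearner()
fine_tune(learn, 1)
print(learn.calls)
# → ['freeze', 'fit_one_cycle(1)', 'unfreeze', 'fit_one_cycle(1)']
```

This also answers the earlier “two epochs when we gave one” question: one frozen epoch plus the requested unfrozen epoch.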


Yes, you can select the number of layers that you want to fine-tune.
When unfreezing, fastai also uses discriminative learning rates, which apply a different learning rate to each layer group.
The answer and details are available in the fastai documentation: Learner.freeze_to


What metrics are used in unet_learner?

learn = unet_learner(dls, resnet34) - here there is no mention of metrics?

1 Like