1. Learning Rate Finder
Choosing the right learning rate is critical for efficient training. FastAI’s `lr_find` method helps identify the optimal learning rate:
```python
learn.lr_find()
```
This command plots how the loss changes as the learning rate increases. Use the plot to pick a learning rate in the region where the loss is decreasing most steeply, before it bottoms out and starts to climb.
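The idea behind the suggestion can be sketched in plain Python: scan the losses recorded at increasing learning rates and pick the rate where the loss falls most steeply. `suggest_lr` and the sample values below are illustrative stand-ins, not part of the FastAI API.

```python
def suggest_lr(lrs, losses):
    """Return the lr with the steepest loss decrease (hypothetical helper)."""
    best_lr, best_slope = lrs[0], 0.0
    for i in range(1, len(losses)):
        slope = losses[i] - losses[i - 1]  # negative slope = loss decreasing
        if slope < best_slope:
            best_slope, best_lr = slope, lrs[i - 1]
    return best_lr

lrs = [1e-5, 1e-4, 1e-3, 1e-2, 1e-1]
losses = [2.0, 1.9, 1.2, 1.5, 3.0]
print(suggest_lr(lrs, losses))  # steepest drop starts at 1e-4
```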
2. Custom Loss Functions
You can define and use custom loss functions to tailor the `Learner` object to specific tasks:
```python
from torch.nn.functional import binary_cross_entropy_with_logits

learn = cnn_learner(dls, resnet34, loss_func=binary_cross_entropy_with_logits, metrics=accuracy)
```
3. Callbacks
Callbacks extend the functionality of the `Learner` object. Common use cases include:
- Early Stopping: Halt training when validation loss stops improving.
- Mixed-Precision Training: Reduce memory usage and speed up training:
```python
from fastai.callback.fp16 import MixedPrecision

learn = cnn_learner(dls, resnet34, metrics=accuracy, cbs=MixedPrecision())
learn.fine_tune(5)
```
- Gradient Accumulation: Simulate larger batch sizes for memory-constrained environments:
```python
from fastai.callback.training import GradientAccumulation

learn = cnn_learner(dls, resnet34, metrics=accuracy, cbs=GradientAccumulation(n_acc=64))
```
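The early-stopping rule above (FastAI provides it as `EarlyStoppingCallback`) can be sketched in plain Python: stop once the validation loss has not improved for `patience` consecutive epochs. `early_stop_epoch` is a hypothetical illustration, not FastAI code.

```python
def early_stop_epoch(val_losses, patience=2):
    """Return the epoch index at which training would halt, or None."""
    best, since_best = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, since_best = loss, 0  # new best: reset the counter
        else:
            since_best += 1
            if since_best >= patience:
                return epoch
    return None

print(early_stop_epoch([1.0, 0.8, 0.7, 0.75, 0.74, 0.73]))  # halts at epoch 4
```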
4. Exporting Models
Save and export your trained model for deployment:
```python
learn.export('model.pkl')
```
Load the model for inference:
```python
from fastai.learner import load_learner

learn_inf = load_learner('model.pkl')
# predict returns a tuple: (decoded label, label index, class probabilities)
prediction = learn_inf.predict('path/to/image.jpg')
```
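Under the hood, `export` serializes the whole `Learner` (via `torch.save`, which is pickle-based). A minimal stand-in using only the standard library shows the save/load round-trip; the `model` dict here is an illustrative placeholder, not a real Learner.

```python
import os
import pickle
import tempfile

model = {"arch": "resnet34", "classes": ["cat", "dog"]}  # stand-in object

path = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(path, "wb") as f:
    pickle.dump(model, f)       # analogous to learn.export(...)
with open(path, "rb") as f:
    restored = pickle.load(f)   # analogous to load_learner(...)

print(restored == model)  # True: the object survives the round-trip
```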
Interpreting Results
FastAI provides robust tools for interpreting and visualizing model performance:
Confusion Matrix
The confusion matrix helps identify where the model is misclassifying data:
```python
from fastai.interpret import ClassificationInterpretation

interp = ClassificationInterpretation.from_learner(learn)
interp.plot_confusion_matrix()
```
Top Losses
Visualize the samples that caused the highest errors:
```python
interp.plot_top_losses(5, nrows=1)
```
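The ranking behind `plot_top_losses` amounts to: compute each validation sample's individual loss and take the k largest. A sketch with illustrative per-sample losses:

```python
# Per-sample validation losses (stand-in values for illustration).
losses = [0.1, 2.3, 0.4, 1.7, 0.05]
k = 2

# Indices of the k samples with the highest loss, worst first.
top = sorted(range(len(losses)), key=lambda i: losses[i], reverse=True)[:k]
print(top)  # the two hardest samples for the model
```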