from fastai.vision.all import *
9 Neural Networks with fastai (draft)
9.1 What is fastai?
As mentioned in the previous chapter, we will now look at an extremely popular deep learning framework called fastai. fastai is an open-source software library for machine learning that provides high-level APIs for deep learning applications. It is built on top of PyTorch and is designed to be easy to use and flexible, allowing developers to quickly prototype and implement their ideas. fastai provides a wide range of pre-trained models, including state-of-the-art models for computer vision, natural language processing, and recommendation systems. Additionally, fastai offers a number of useful tools and utilities for data processing and model evaluation, making it a popular choice among researchers and practitioners alike.
The creators of fastai have also produced accompanying educational resources for learning deep learning with their framework. Both the course and the book are highly recommended.
The creators have also published a peer-reviewed paper explaining the high-level functionality and layered approach of the fastai library - Link.

9.2 Creating a dataloader
dPath = Path("../data/mnist_png/")
dPath.ls()
(#3) [Path('../data/mnist_png/models'),Path('../data/mnist_png/testing'),Path('../data/mnist_png/training')]
get_image_files(dPath)
(#70000) [Path('../data/mnist_png/testing/0/10.png'),Path('../data/mnist_png/testing/0/1001.png'),Path('../data/mnist_png/testing/0/1009.png'),Path('../data/mnist_png/testing/0/101.png'),Path('../data/mnist_png/testing/0/1034.png'),Path('../data/mnist_png/testing/0/1047.png'),Path('../data/mnist_png/testing/0/1061.png'),Path('../data/mnist_png/testing/0/1084.png'),Path('../data/mnist_png/testing/0/1094.png'),Path('../data/mnist_png/testing/0/1121.png')...]
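Notice that the folder layout encodes everything we need: each file's grandparent directory names the split (`training` or `testing`) and its parent directory names the digit class. This is exactly what `GrandparentSplitter` and `parent_label` rely on below. A small `pathlib` sketch (the file path here is made up for illustration):

```python
from pathlib import Path

# Hypothetical file path following the mnist_png layout
p = Path("../data/mnist_png/training/3/42.png")

split = p.parent.parent.name  # grandparent folder -> 'training' or 'testing'
label = p.parent.name         # parent folder -> digit class label

print(split, label)  # -> training 3
```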
dataset = DataBlock(
    blocks = (ImageBlock(cls = PILImageBW), CategoryBlock),
    get_items = get_image_files,
    splitter = GrandparentSplitter(train_name='training', valid_name='testing'),
    get_y = parent_label,
    item_tfms = Resize(28),
    batch_tfms = None
)
dls = dataset.dataloaders(dPath, bs=128)
print(dls.vocab) ## Prints class labels
print(dls.c) ## Prints number of classes
dls.show_batch(max_n=24, figsize=(10,6)) ## Show sample data
['0', '1', '2', '3', '4', '5', '6', '7', '8', '9']
10
dls.one_batch()[0].shape, dls.one_batch()[1].shape
(torch.Size([128, 1, 28, 28]), torch.Size([128]))
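These shapes are consistent with a batch of 128 single-channel 28x28 images: flattening each image yields the 784 input features the MLP below expects. A quick arithmetic check:

```python
bs, c, h, w = 128, 1, 28, 28   # batch shape from dls.one_batch()[0]
n_in = c * h * w               # features per image after flattening

print(n_in)        # -> 784
print((bs, n_in))  # shape after x.view(-1, 784) -> (128, 784)
```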
class MLP(nn.Module):
    def __init__(self, n_in, n_out):
        super().__init__()
        self.model = nn.Sequential(
            nn.Linear(n_in, 256),
            nn.ReLU(),
            nn.Linear(256, 128),
            nn.ReLU(),
            nn.Linear(128, n_out)
        )
    def forward(self, x):
        return self.model(x.view(-1, 784))
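As a sanity check, we can count this MLP's trainable parameters by hand: each `nn.Linear(a, b)` layer has `a * b` weights plus `b` biases, and ReLU adds none. A plain-Python tally:

```python
def linear_params(n_in, n_out):
    """Parameters in one fully connected layer: weights plus biases."""
    return n_in * n_out + n_out

# The three Linear layers of the MLP above: 784 -> 256 -> 128 -> 10
total = (linear_params(784, 256)
         + linear_params(256, 128)
         + linear_params(128, 10))
print(total)  # -> 235146
```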
## Defining the learner
model = MLP(784, 10)
mlp_learner = Learner(
    dls=dls,
    model=model,
    loss_func=F.cross_entropy,
    model_dir=dPath/"models",
    metrics=accuracy)
## Finding the ideal learning rate
mlp_learner.lr_find()
SuggestedLRs(valley=0.0005754399462603033)
mlp_learner.fit_one_cycle(5, 5e-2)
| epoch | train_loss | valid_loss | accuracy | time |
|---|---|---|---|---|
| 0 | 0.415048 | 0.640518 | 0.835600 | 00:37 |
| 1 | 0.299409 | 0.296634 | 0.935100 | 00:37 |
| 2 | 0.198500 | 0.212431 | 0.950100 | 00:38 |
| 3 | 0.119415 | 0.128112 | 0.967800 | 00:38 |
| 4 | 0.065490 | 0.106387 | 0.973300 | 00:38 |
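`fit_one_cycle` trains with a one-cycle learning-rate policy: the learning rate ramps up from a fraction of `lr_max` to `lr_max`, then anneals back down to a very small value. A rough pure-Python sketch of such a cosine-based schedule; the `pct_start`, `div`, and `div_final` values here are illustrative assumptions, not fastai's exact internals:

```python
import math

def cos_anneal(start, end, pct):
    """Cosine interpolation from start to end as pct goes 0 -> 1."""
    return start + (end - start) * (1 - math.cos(math.pi * pct)) / 2

def one_cycle_lr(pct, lr_max, pct_start=0.25, div=25.0, div_final=1e5):
    """Illustrative one-cycle schedule: warm up, then anneal down."""
    if pct < pct_start:
        return cos_anneal(lr_max / div, lr_max, pct / pct_start)
    return cos_anneal(lr_max, lr_max / div_final,
                      (pct - pct_start) / (1 - pct_start))

print(one_cycle_lr(0.0, 5e-2))    # start of training: lr_max / div
print(one_cycle_lr(0.25, 5e-2))   # peak of the cycle: lr_max
```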