Supported models¶
BackPACK expects models to be sequences of PyTorch NN modules. For example,
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(784, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
)
This page lists the layers currently supported by BackPACK.
Do not rewrite the forward() function of the Sequential or of its inner modules!
If the forward pass is non-standard, the additional backward pass used to compute second-order quantities will not match the actual function.
First-order extensions that extract information might work outside of this framework, but this is not tested.
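For illustration, here is a hypothetical module (the class name Net is made up for this sketch) whose custom forward() applies an operation outside of any submodule, which is exactly the kind of non-standard forward the warning above refers to:

import torch

class Net(torch.nn.Module):
    # Discouraged: the reshape in forward() happens outside of any
    # module, so a module-by-module backward pass cannot account
    # for it when propagating second-order information.
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(784, 10)

    def forward(self, x):
        x = x.view(x.shape[0], -1)  # manual op, not a module
        return self.linear(x)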
For first-order extensions¶
BackPACK can extract more information about the gradient with respect to the parameters of the following layers:
torch.nn.Linear
torch.nn.Conv1d
torch.nn.Conv2d
torch.nn.Conv3d
torch.nn.ConvTranspose1d
torch.nn.ConvTranspose2d
torch.nn.ConvTranspose3d
First-order extensions should support any module as long as it does not have parameters,
but some layers make the concept of an "individual gradient for a sample in a minibatch"
ill-defined, as they introduce dependencies across examples
(like torch.nn.BatchNorm). A usage sketch follows below.
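As a minimal sketch of a first-order extension, the following uses BatchGrad to extract one gradient per sample; the batch size and layer sizes are made up for this example:

import torch
from backpack import backpack, extend
from backpack.extensions import BatchGrad

# Extend the model and loss so BackPACK can hook into the backward pass.
model = extend(torch.nn.Sequential(
    torch.nn.Linear(784, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
))
lossfunc = extend(torch.nn.CrossEntropyLoss())

X = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))

loss = lossfunc(model(X), y)
with backpack(BatchGrad()):
    loss.backward()

# Each parameter now has a .grad_batch attribute holding the
# individual gradients, with shape [32, *param.shape].
for param in model.parameters():
    print(param.grad_batch.shape)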
For second-order extensions¶
BackPACK needs to know how to propagate second-order information. This is implemented for:
Parametrized layers | torch.nn.Conv2d, torch.nn.Linear
Loss functions | torch.nn.MSELoss, torch.nn.CrossEntropyLoss
Layers without parameters | torch.nn.MaxPool2d, torch.nn.AvgPool2d, torch.nn.Dropout, torch.nn.ReLU, torch.nn.Sigmoid, torch.nn.Tanh
The other convolution layers (Conv1d, Conv3d, and ConvTransposeNd) are not yet supported.
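As a sketch of a second-order extension restricted to the supported layers above, the following uses the DiagGGNExact extension to compute the diagonal of the generalized Gauss-Newton; the batch size and layer sizes are again made up:

import torch
from backpack import backpack, extend
from backpack.extensions import DiagGGNExact

# Model built only from layers with second-order support.
model = extend(torch.nn.Sequential(
    torch.nn.Linear(784, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
))
lossfunc = extend(torch.nn.CrossEntropyLoss())

X = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))

loss = lossfunc(model(X), y)
with backpack(DiagGGNExact()):
    loss.backward()

# Each parameter now has a .diag_ggn_exact attribute with the
# same shape as the parameter itself.
for param in model.parameters():
    print(param.diag_ggn_exact.shape)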