How to use BackPACK
If you haven't already installed it:
pip install backpack-for-pytorch
Extending the model and loss function
import torch
from backpack import extend
from utils import load_data

X, y = load_data()

model = torch.nn.Sequential(
    torch.nn.Linear(784, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10)
)
lossfunc = torch.nn.CrossEntropyLoss()

model = extend(model)
lossfunc = extend(lossfunc)
Calling the extension
To activate an extension, call backward() inside a with backpack(extension): block:
from backpack import backpack
from backpack.extensions import KFAC

loss = lossfunc(model(X), y)

with backpack(KFAC()):
    loss.backward()

for param in model.parameters():
    print(param.grad)
    print(param.kfac)
- backpack.extend(module: Module, debug: bool = False, use_converter: bool = False) → Module
Recursively extend a module to make it BackPACK-ready.
Modules that do not represent an operation in the computation graph (for instance containers like Sequential) will not explicitly be extended.
module – The module to extend.
debug – Print debug messages during the extension. Default: False.
use_converter – Try converting the module to a BackPACK-compatible network. The converter might alter the model, e.g. the order of parameters. Default: False.
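Models whose forward pass is not a plain sequence of modules (for instance ResNets with branching) typically need use_converter=True. The following is a minimal sketch, assuming torchvision is installed; the resnet18 model is only an illustration.

import torch
from backpack import extend
from torchvision.models import resnet18

# The converter rewrites the network into BackPACK-compatible modules
# and may alter its structure, e.g. the order of parameters.
# eval() is used here to sidestep BatchNorm's train-mode limitations
# with some extensions.
model = extend(resnet18(num_classes=10).eval(), use_converter=True)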
- class backpack.backpack(*exts: BackpropExtension, extension_hook: Callable[[Module], None] | None = None, debug: bool = False, retain_graph: bool = False)
Context manager to activate BackPACK extensions.
- __init__(*exts: BackpropExtension, extension_hook: Callable[[Module], None] | None = None, debug: bool = False, retain_graph: bool = False)
Activate BackPACK extensions.
Enables the BackPACK extensions passed as arguments in the backward calls inside the current with block.
exts – Extensions to activate in the backward pass.
extension_hook – Function called on each module after all BackPACK extensions have run. Takes a torch.nn.Module and returns None. Default: None (no operation will be performed).
debug – Print debug messages during the backward pass. Default: False.
retain_graph – Determines whether BackPACK IO should be kept for additional backward passes. Should have the same value as the retain_graph argument of backward(). Default: False.
extension_hook can be used to reduce memory overhead if the goal is to compute transformations of BackPACK quantities. Information can be compacted during a backward pass and obsolete tensors can be freed manually, as in the sketch below.
ValueError – if extensions are not valid
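A sketch of this memory-saving pattern follows. The hook name compact_hook and the attribute grad_batch_sq_sum are illustrative; model, lossfunc, X and y are assumed to come from the earlier example.

from backpack import backpack
from backpack.extensions import BatchGrad

def compact_hook(module):
    # Called on each module after BackPACK's extensions have run on it.
    for param in module.parameters(recurse=False):
        if hasattr(param, "grad_batch"):
            # Keep a compact summary and free the per-sample gradients.
            param.grad_batch_sq_sum = (param.grad_batch ** 2).sum(0)
            del param.grad_batch

loss = lossfunc(model(X), y)
with backpack(BatchGrad(), extension_hook=compact_hook):
    loss.backward()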
- class backpack.disable
Entirely disable BackPACK, including storage of input and output.
To compute the additional quantities, BackPACK needs to know the input and output of the modules in the computation graph. It saves those by default.
disable tells BackPACK to not save this information during the forward pass.
This can be useful if you only want a gradient with PyTorch on a module that is extended with BackPACK and need to avoid memory overhead. If you do not need any gradient, use the torch.no_grad context instead.
This context is not the exact opposite of the backpack context. The backpack context enables specific extensions during a backward pass. This context disables storing input/output information during a forward pass.
with backpack(...) in a with disable() context will fail, even if the forward pass is carried out in with backpack(...).
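A minimal sketch of using disable(), reusing the extended model and lossfunc from the example above:

from backpack import disable

with disable():
    # Forward pass without storing BackPACK's input/output buffers;
    # backward() yields plain PyTorch gradients only.
    loss = lossfunc(model(X), y)
    loss.backward()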