Conditional¶
- class continual.Conditional(predicate, on_true, on_false=None)[source]¶
Module wrapper for conditional invocations at runtime.
For instance, it can be used to apply a softmax if the module isn’t training:
net = co.Sequential()

def not_training(module, x):
    return not net.training

net.append(co.Conditional(not_training, torch.nn.Softmax(dim=1)))
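To illustrate the dispatch behavior, here is a minimal, framework-free sketch of the idea behind Conditional (an illustration only, not the library's actual implementation): the predicate is evaluated at call time with the module and its input, and the input is routed to on_true, to on_false, or passed through unchanged when on_false is None.

```python
class ConditionalSketch:
    """Toy stand-in for co.Conditional: routes a call based on a runtime predicate."""

    def __init__(self, predicate, on_true, on_false=None):
        self.predicate = predicate
        self.on_true = on_true
        self.on_false = on_false

    def __call__(self, x):
        # The predicate receives the wrapper and the input, mirroring the
        # (module, x) signature used in the example above.
        if self.predicate(self, x):
            return self.on_true(x)
        if self.on_false is not None:
            return self.on_false(x)
        return x  # no on_false branch: act as the identity


# Route positive inputs through a doubling branch; others pass through.
cond = ConditionalSketch(lambda module, x: x > 0, on_true=lambda x: x * 2)
print(cond(3))   # predicate true -> on_true branch
print(cond(-1))  # predicate false, no on_false -> identity
```

In the library itself, on_true and on_false would be torch.nn modules and the predicate would typically inspect state such as net.training, as in the example above.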
- Parameters: