Conditional

class continual.Conditional(predicate, on_true, on_false=None)

Module wrapper for conditional invocations at runtime.

For instance, it can be used to apply a softmax only when the network is not in training mode:

import torch
import continual as co

net = co.Sequential()

def not_training(module, x):
    # Predicate: invoke the wrapped module only when `net` is in eval mode
    return not net.training

net.append(co.Conditional(not_training, torch.nn.Softmax(dim=1)))
Parameters:
  • predicate (Callable[[CoModule, Tensor], bool]) – Function used to evaluate whether one module or the other should be invoked.

  • on_true (CoModule) – Module to invoke when the predicate evaluates to True.

  • on_false (Optional[CoModule]) – Module to invoke when the predicate evaluates to False. If no module is passed, execution is skipped.
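
Both branches can also be supplied explicitly. The sketch below is illustrative only: the choice of LogSoftmax as the on_false branch and the regular forward calls are assumptions for demonstration, not part of the API documented here.

import torch
import continual as co

net = co.Sequential()

def in_eval_mode(module, x):
    # Closure over `net`, mirroring the predicate pattern above
    return not net.training

# Softmax probabilities in eval mode, log-probabilities while training
net.append(
    co.Conditional(
        in_eval_mode,
        on_true=torch.nn.Softmax(dim=1),
        on_false=torch.nn.LogSoftmax(dim=1),
    )
)

x = torch.randn(2, 5)
net.eval()
probs = net(x)      # on_true branch: Softmax
net.train()
log_probs = net(x)  # on_false branch: LogSoftmax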
