Shortcuts

The modules for Continual Inference Networks are listed below. They are designed as drop-in replacements for the torch.nn modules of the same name: methods that share a name have identical interfaces and execute identical code. In addition, each module is extended with the forward_step and forward_steps functions, alongside the common properties found in continual.CoModule.
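
For example, a continual convolution can process a whole clip with forward, be primed with forward_steps, and then continue frame by frame with forward_step (a minimal sketch; the input sizes are arbitrary):

    import torch
    import continual as co

    #                      (batch, channels, time, height, width)
    example = torch.randn((1, 1, 5, 3, 3))
    conv = co.Conv3d(in_channels=1, out_channels=1, kernel_size=(3, 3, 3))

    # Same exact computation as torch.nn.Conv3d
    output = conv(example)

    # Prime the internal state with the first frames ...
    firsts = conv.forward_steps(example[:, :, :4])
    # ... then continue efficiently, one frame at a time
    last = conv.forward_step(example[:, :, 4])

    assert torch.allclose(output[:, :, : conv.delay], firsts)
    assert torch.allclose(output[:, :, conv.delay], last)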

Containers

CoModule

Base class for continual modules.

Sequential

A sequential container.

Broadcast

Broadcast one input stream to multiple output streams.

Parallel

Container for executing modules in parallel.

ParallelDispatch

Reorder, copy, and group streams from parallel streams.

Reduce

Reduce multiple input streams to a single stream using the selected function.

BroadcastReduce

Broadcast an input to parallel modules and reduce the results. This module is shorthand for a co.Sequential chaining co.Broadcast, co.Parallel, and co.Reduce.

Residual

Residual connection wrapper for input.

Conditional

Module wrapper for conditional invocations at runtime.
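
A sketch of how the containers compose, assuming co.Broadcast takes the number of output streams and co.Reduce accepts the name of the reduction:

    import torch
    import continual as co

    # Two parallel branches with matching delays, summed
    block = co.Sequential(
        co.Broadcast(2),
        co.Parallel(
            co.Conv1d(8, 16, kernel_size=3),
            co.Conv1d(8, 16, kernel_size=3),
        ),
        co.Reduce("sum"),
    )

    # BroadcastReduce is shorthand for the same chain
    shorthand = co.BroadcastReduce(
        co.Conv1d(8, 16, kernel_size=3),
        co.Conv1d(8, 16, kernel_size=3),
    )

    # Residual wraps a module and aligns the skip connection with its delay
    res = co.Residual(co.Conv1d(8, 8, kernel_size=3, padding=1))

    x = torch.randn(1, 8, 16)
    y = block(x)  # also supports block.forward_step / block.forward_steps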

Convolution Layers

Conv1d

Continual 1D convolution over a temporal input signal.

Conv2d

Continual 2D convolution over a spatio-temporal input signal.

Conv3d

Continual 3D convolution over a spatio-temporal input signal.
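
In step-wise operation, a continual convolution only emits an output once its temporal receptive field has been filled; before that, forward_step does not return a usable tensor. A sketch of a frame-by-frame loop (it assumes that non-tensor returns during the initial conv.delay steps can simply be skipped):

    import torch
    import continual as co

    conv = co.Conv1d(in_channels=2, out_channels=4, kernel_size=3)
    stream = torch.randn(1, 2, 10)  # (batch, channels, time)

    outputs = []
    for t in range(stream.shape[2]):
        out = conv.forward_step(stream[:, :, t])
        if isinstance(out, torch.Tensor):  # skip the initial delay steps
            outputs.append(out)

    # The collected steps match the clip-wise computation
    full = conv.forward(stream)
    assert torch.allclose(full, torch.stack(outputs, dim=2))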

Pooling Layers

AvgPool1d

Applies a Continual 1D average pooling over an input signal.

AvgPool2d

Applies a Continual 2D average pooling over an input signal composed of several input planes.

AvgPool3d

Applies a Continual 3D average pooling over an input signal composed of several input planes.

MaxPool1d

Applies a Continual 1D max pooling over an input signal.

MaxPool2d

Applies a Continual 2D max pooling over an input signal composed of several input planes.

MaxPool3d

Applies a Continual 3D max pooling over an input signal composed of several input planes.

AdaptiveAvgPool2d

Applies a Continual 2D adaptive average pooling over an input signal composed of several input planes.

AdaptiveAvgPool3d

Applies a Continual 3D adaptive average pooling over an input signal composed of several input planes.

AdaptiveMaxPool2d

Applies a Continual 2D adaptive max pooling over an input signal composed of several input planes.

AdaptiveMaxPool3d

Applies a Continual 3D adaptive max pooling over an input signal composed of several input planes.
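
The continual pooling layers follow the same pattern; a sketch assuming forward_steps reproduces the clip-wise output once the temporal kernel is filled:

    import torch
    import continual as co

    pool = co.AvgPool1d(kernel_size=3, stride=1)
    clip = torch.randn(1, 4, 8)

    full = pool.forward(clip)           # (1, 4, 6), as in torch.nn.AvgPool1d
    stepped = pool.forward_steps(clip)  # same values, computed step by step
    assert torch.allclose(full, stepped)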

Recurrent Layers

RNN

Applies a multi-layer Elman RNN with tanh or ReLU non-linearity to an input sequence.

LSTM

Applies a multi-layer long short-term memory (LSTM) RNN to an input sequence.

GRU

Applies a multi-layer gated recurrent unit (GRU) RNN to an input sequence.
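
A sketch for the recurrent layers; it assumes they consume the library's channel-first (batch, channels, time) layout, keep their hidden state internally between steps, and expose clean_state to reset it (check the API reference for the exact interface):

    import torch
    import continual as co

    lstm = co.LSTM(input_size=8, hidden_size=16)
    clip = torch.randn(1, 8, 10)

    full = lstm.forward(clip)  # whole sequence at once

    lstm.clean_state()  # reset the hidden state before a new sequence (assumed name)
    for t in range(clip.shape[2]):
        step = lstm.forward_step(clip[:, :, t])  # one step at a time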

Transformer Layers

TransformerEncoder

Continual Transformer Encoder is a stack of N encoder layers.

TransformerEncoderLayerFactory

Defines the hyper-parameters of Continual Transformer Encoder layers, where each layer contains a feed-forward network and continual multi-head attention as proposed by Vaswani et al. in "Attention is all you need".

SingleOutputTransformerEncoderLayer

Continual Single-output Transformer Encoder layer.

RetroactiveTransformerEncoderLayer

Continual Retroactive Transformer Encoder layer.

RetroactiveMultiheadAttention

MultiHeadAttention with retroactively updated attention outputs during continual inference.

SingleOutputMultiheadAttention

MultiHeadAttention which only computes the attention output for a single query during continual inference.

RecyclingPositionalEncoding

Recycling Positional Encoding with learned or static weights.
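
A sketch of the factory pattern for building a continual encoder stack; the parameter names mirror torch.nn.TransformerEncoderLayer plus an assumed sequence_len, and should be checked against the API reference:

    import continual as co

    encoder = co.TransformerEncoder(
        co.TransformerEncoderLayerFactory(
            d_model=32,
            nhead=4,
            dim_feedforward=64,
            sequence_len=16,  # assumed: length of the attended temporal window
        ),
        num_layers=2,
    )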

Linear Layers

Linear

Applies a linear transformation to a dimension of the incoming data: y = xA^T + b.

Identity

A placeholder identity operator that is argument-insensitive.

Add

Applies an additive translation to the incoming data: y = x + a.

Multiply

Applies a scaling transformation to the incoming data: y = ax.
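
A sketch of the linear-style layers; channel_dim is assumed to select the dimension that co.Linear transforms (here the channel dimension of a channel-first stream):

    import torch
    import continual as co

    lin = co.Linear(8, 4, channel_dim=1)  # channel_dim is an assumed kwarg
    mul = co.Multiply(0.5)                # y = 0.5 * x
    add = co.Add(1.0)                     # y = x + 1

    x = torch.randn(1, 8, 10)             # (batch, channels, time)
    y = add(mul(lin(x)))                  # (1, 4, 10)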

Utilities

Lambda

Module wrapper for stateless functions.

Delay

Delay an input by a number of steps.

Skip

Skip a number of input steps.

Reshape

Reshape the non-temporal dimensions of an input.

Constant

Returns constant * torch.ones_like(input).

Zero

Returns torch.zeros_like(input).

One

Returns torch.ones_like(input).
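
A sketch combining the utilities, assuming co.Delay takes the number of steps and co.Lambda wraps a per-step callable:

    import torch
    import continual as co

    net = co.Sequential(
        co.Lambda(lambda x: x.clamp(min=0.0)),  # stateless function as a module
        co.Delay(2),                            # shift the stream two steps
    )

    x = torch.randn(1, 4, 8)
    y = net.forward(x)  # clip-wise; in step-wise mode, outputs lag by two steps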

Converters

continual

Convert a torch.nn module to a Continual Inference Network enhanced with forward_step and forward_steps.

forward_stepping

Enhances a torch.nn.Module with forward_step and forward_steps.

call_mode

Context manager which temporarily specifies a call_mode.
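
A sketch of conversion and call-mode switching; it assumes the call_mode context manager makes __call__ dispatch to the named function:

    import torch
    from torch import nn
    import continual as co

    # Convert a torch.nn module to a continual one
    net = co.continual(nn.Conv3d(3, 8, kernel_size=(3, 3, 3)))

    clip = torch.randn(1, 3, 5, 7, 7)
    out = net(clip)  # default call mode: forward

    with co.call_mode("forward_steps"):
        out_steps = net(clip)  # __call__ now dispatches to forward_steps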
