nn

Accessor([nodes, activation, index])

Accessor is a neural network module that provides access to specific tensor data with optional activation.

Concatenator(activation[, nodes])

This class is designed to concatenate multiple tensors along the last dimension and apply an activation function to the concatenated result.
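
As a minimal sketch of the underlying operation in plain PyTorch (not this module's actual API; shapes and the activation are chosen only for illustration):

```python
import torch

# Two hypothetical feature tensors that agree on all but the last dimension.
a = torch.randn(10, 4)   # e.g. 10 nodes with 4 features
b = torch.randn(10, 8)   # e.g. 10 nodes with 8 features

# Concatenate along the last dimension, then apply an activation.
concatenated = torch.cat([a, b], dim=-1)   # shape: (10, 12)
activated = torch.tanh(concatenated)
```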

Dirichlet(activation, dirichlet_name[, nodes])

Dirichlet is a neural network module that overwrites input values with those of the Dirichlet field.
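
A rough sketch of the idea in plain PyTorch (not this module's actual API); the convention that unconstrained entries of the Dirichlet field are NaN is an assumption made only for illustration:

```python
import torch

x = torch.randn(5, 3)                          # hypothetical network output
dirichlet = torch.full((5, 3), float("nan"))   # hypothetical Dirichlet field
dirichlet[0] = 1.0                             # values prescribed on the first node

# Overwrite entries where a Dirichlet value is prescribed, keep x elsewhere.
out = torch.where(torch.isnan(dirichlet), x, dirichlet)
```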

Contraction(activation[, nodes])

Contraction is a neural network module that performs a contraction operation on the input tensor.
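
As an illustration of what a tensor contraction looks like in plain PyTorch (not this module's actual API; the shapes below are hypothetical), contracting a rank-2 and a rank-1 tensor field over one shared spatial index lowers the rank by one:

```python
import torch

# Hypothetical fields on 10 nodes with 4 feature channels:
# a rank-2 field (nodes, dim, dim, features) and a rank-1 field (nodes, dim, features).
u = torch.randn(10, 3, 3, 4)
v = torch.randn(10, 3, 4)

# Sum over one shared spatial index.
contracted = torch.einsum("npqf,nqf->npf", u, v)   # shape: (10, 3, 4)
```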

DeepSets(lambda_config, gamma_config, ...[, ...])

DeepSets is a neural network module that performs permutation invariant / equivariant operation on the input tensor.
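
A minimal sketch of the DeepSets idea in plain PyTorch (not this module's actual API); the two small networks below stand in for the lambda and gamma components and are purely illustrative:

```python
import torch

lam = torch.nn.Linear(4, 16)   # per-element network (lambda)
gam = torch.nn.Linear(16, 2)   # post-pooling network (gamma)

x = torch.randn(10, 4)         # a set of 10 elements with 4 features each

# Permutation-invariant form: gamma(sum_i lambda(x_i)).
invariant = gam(lam(x).sum(dim=0))        # shape: (2,)

# A common permutation-equivariant variant mixes each element with the pooled term.
equivariant = lam(x) + lam(x).sum(dim=0)  # shape: (10, 16)
```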

EnEquivariantMLP(nodes[, activations, ...])

EnEquivariantMLP is a neural network module that performs an E(n)-equivariant operation on the input tensor.

EnEquivariantTCN(nodes, kernel_sizes, ...[, ...])

EnEquivariantTCN is a neural network module that performs an E(n)-equivariant operation on the time series input tensors.

Einsum(activation, equation, **kwargs)

Einsum is a neural network module that performs an einsum operation on the input tensors.
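
For reference, the kind of operation an einsum equation expresses, written with plain torch.einsum (the equation and activation are arbitrary examples, not defaults of this module):

```python
import torch

a = torch.randn(10, 3, 4)
b = torch.randn(10, 4, 5)

# Batched matrix multiplication expressed as an einsum equation, then an activation.
out = torch.relu(torch.einsum("nij,njk->nik", a, b))   # shape: (10, 3, 5)
```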

GCN(nodes, support_name[, activations, ...])

GCN is a neural network module that performs a graph convolution operation on the input tensor.
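
A minimal sketch of one graph-convolution step in plain PyTorch (not this module's actual API); the sparse support matrix below is a hypothetical placeholder for a normalized adjacency-like operator:

```python
import torch

n_nodes, in_features, out_features = 5, 4, 8

# Hypothetical sparse support matrix (e.g. a normalized adjacency matrix).
indices = torch.tensor([[0, 1, 2, 3, 4], [1, 2, 3, 4, 0]])
values = torch.ones(5)
support = torch.sparse_coo_tensor(indices, values, (n_nodes, n_nodes))

weight = torch.nn.Linear(in_features, out_features)
h = torch.randn(n_nodes, in_features)

# One propagation step: transform features, spread them over the graph, activate.
out = torch.tanh(torch.sparse.mm(support, weight(h)))   # shape: (5, 8)
```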

Identity([nodes])

Identity is a neural network module that returns the input tensor.

IsoGCN(nodes, isoam_names, propagations, ...)

IsoGCN is a neural network module that performs a spatial differential operation on the input tensor.
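
As a rough illustration of a spatial differential operation on a graph (not this module's actual API), a gradient-like map from scalar node features to vector node features can be built from one operator per spatial direction; the dense operators below are hypothetical placeholders for the isometric adjacency matrices:

```python
import torch

n_nodes, n_features, dim = 5, 4, 3

# Hypothetical spatial operators, one per direction (placeholders for IsoAMs).
isoam = [torch.randn(n_nodes, n_nodes) for _ in range(dim)]

h = torch.randn(n_nodes, n_features)   # rank-0 (scalar) node features

# Each operator yields one spatial component, producing rank-1 (vector) features.
grad_h = torch.stack([g @ h for g in isoam], dim=1)   # shape: (5, 3, 4)
```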

MLP(nodes[, activations, dropouts, bias])

Multi Layer Perceptron
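
A generic multi-layer perceptron in plain PyTorch, shown only as a reminder of the structure (node counts and the activation are arbitrary):

```python
import torch

mlp = torch.nn.Sequential(
    torch.nn.Linear(4, 16),   # input -> hidden
    torch.nn.Tanh(),
    torch.nn.Linear(16, 2),   # hidden -> output
)
y = mlp(torch.randn(10, 4))   # shape: (10, 2)
```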

PInvMLP(reference_name, **kwargs)

Pseudo inverse of the reference MLP layer.
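
A conceptual sketch of the pseudo-inverse direction for a single linear layer y = W x + b, assuming an identity activation for simplicity (this illustrates the idea, not the module's implementation):

```python
import torch

layer = torch.nn.Linear(4, 8)   # reference layer: y = W x + b
x = torch.randn(10, 4)
y = layer(x)

# Pseudo-inverse direction: remove the bias, apply the Moore-Penrose inverse of W.
w_pinv = torch.linalg.pinv(layer.weight)   # shape: (4, 8)
x_rec = (y - layer.bias) @ w_pinv.T        # approximately recovers x
```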

Proportional(nodes)

Proportional, i.e., strictly linear, layer
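
A strictly linear (proportional) map is simply a linear layer without bias, so scaling the input scales the output by the same factor; a quick check in plain PyTorch:

```python
import torch

layer = torch.nn.Linear(4, 8, bias=False)
x = torch.randn(10, 4)

# f(a * x) == a * f(x) holds exactly for a bias-free linear layer.
assert torch.allclose(layer(2.0 * x), 2.0 * layer(x), atol=1e-6)
```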

Rearrange(pattern, axes_lengths, ...)

Rearrange is a neural network module that performs a rearrange operation on the input tensor.
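
The pattern-based rearrangement is the familiar einops-style operation; a plain einops example, independent of this module's actual API:

```python
import torch
from einops import rearrange

x = torch.randn(10, 3, 4)              # hypothetical (nodes, dim, features) tensor

# Merge the last two axes into a single feature axis.
y = rearrange(x, "n d f -> n (d f)")   # shape: (10, 12)
```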

Reducer(activation, operator[, nodes])

Reducer is a neural network module that performs a reduction operation on the input tensor.

Share(reference_name, **kwargs)

Share is a neural network module that performs the same operations as the reference module.

SimilarityEquivariantMLP(nodes, scale_names)

Similarity-equivariant Multi Layer Perceptron as in https://proceedings.mlr.press/v235/horie24a.html.

SPMM(factor, support_name[, transpose])

Sparse Matrix Multiplication Network
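
The core operation is a sparse-dense matrix product; in plain PyTorch (the sparse matrix below is a hypothetical placeholder for the named support):

```python
import torch

indices = torch.tensor([[0, 1, 2], [1, 2, 0]])
values = torch.ones(3)
support = torch.sparse_coo_tensor(indices, values, (3, 3))

h = torch.randn(3, 4)
out = torch.sparse.mm(support, h)   # shape: (3, 4)
```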

TCN(nodes, kernel_sizes, dilations, activations)

Temporal Convolutional Networks
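
The building block of a temporal convolutional network is a causal, dilated 1-D convolution; a minimal sketch in plain PyTorch (not this module's actual API):

```python
import torch

# Causal dilated convolution: pad, convolve, then trim the tail to keep causality.
conv = torch.nn.Conv1d(4, 8, kernel_size=3, dilation=2, padding=4)
x = torch.randn(1, 4, 20)                     # (batch, channels, time)
y = torch.relu(conv(x))[..., : x.shape[-1]]   # keep only the causal outputs
```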

TimeSeriesToFeatures([nodes, activation])

TimeSeriesToFeatures is a neural network module that converts a time series tensor into a non-time-series tensor by summing it along the time dimension.
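
The underlying operation in plain PyTorch (shapes are hypothetical; the time axis is assumed to be the first dimension for illustration):

```python
import torch

x = torch.randn(20, 10, 4)    # hypothetical (time steps, nodes, features) tensor

# Collapse the time dimension by summation to obtain a non-time-series tensor.
features = x.sum(dim=0)       # shape: (10, 4)
```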