Layer and Activation Functions

We use the following notation to describe layer and activation functions:

\[\begin{split}\begin{align*} N &:= \text{Set of nodes (i.e. neurons in the neural network)}\\ M_i &:= \text{Number of inputs to node $i$}\\ \hat z_i &:= \text{pre-activation value on node $i$}\\ z_i &:= \text{post-activation value on node $i$}\\ w_{ij} &:= \text{weight from input $j$ to node $i$}\\ b_i &:= \text{bias value for node $i$} \end{align*}\end{split}\]
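
Reading the notation together for a single node $i$ (with $\sigma$ standing in generically for whichever activation function applies; it is not part of the notation above):

\[\begin{split}\begin{align*} \hat z_i &= \sum_{j=1}^{M_i} w_{ij} z_j + b_i\\ z_i &= \sigma(\hat z_i) \end{align*}\end{split}\]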

Layer Functions

omlt.neuralnet.layers.full_space.full_space_conv_layer(net_block, net, layer_block, layer)[source]

Add full-space formulation of the convolutional layer to the block

omlt.neuralnet.layers.full_space.full_space_dense_layer(net_block, net, layer_block, layer)[source]

Add full-space formulation of the dense layer to the block

\[\begin{align*} \hat z_i &= \sum_{j{=}1}^{M_i} w_{ij} z_j + b_i && \forall i \in N \end{align*}\]
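
In practice these layer functions are not called directly; a formulation object builds them onto an OmltBlock. A minimal sketch, assuming net is an already-constructed omlt.neuralnet.NetworkDefinition (for example, one loaded from a trained model via omlt.io):

    import pyomo.environ as pyo

    from omlt import OmltBlock
    from omlt.neuralnet import FullSpaceNNFormulation

    # `net` is assumed to exist: a NetworkDefinition describing the
    # trained network's layers, weights, and biases.
    m = pyo.ConcreteModel()
    m.nn = OmltBlock()
    m.nn.build_formulation(FullSpaceNNFormulation(net))

In the full-space formulation, every pre-activation and post-activation value becomes a Pyomo variable, linked by constraints such as the dense-layer equation above.
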
omlt.neuralnet.layers.reduced_space.reduced_space_dense_layer(net_block, net, layer_block, layer, activation)[source]

Add reduced-space formulation of the dense layer to the block

\[\begin{align*} \hat z_i &= \sum_{j{=}1}^{M_i} w_{ij} z_j + b_i && \forall i \in N \end{align*}\]
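
The reduced-space formulation instead substitutes intermediate values out as expressions, which is why this function also receives the activation to fold into the same expression; only the network inputs and outputs remain as variables. A sketch mirroring the one above, again assuming an existing NetworkDefinition net:

    import pyomo.environ as pyo

    from omlt import OmltBlock
    from omlt.neuralnet import ReducedSpaceNNFormulation

    m = pyo.ConcreteModel()
    m.nn = OmltBlock()
    m.nn.build_formulation(ReducedSpaceNNFormulation(net))
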
omlt.neuralnet.layers.partition_based.default_partition_split_func(w, n)[source]

Default function for splitting the weight vector w into n partitions

omlt.neuralnet.layers.partition_based.partition_based_dense_relu_layer(net_block, net, layer_block, layer, split_func)[source]

Add partition-based formulation of the dense layer with ReLU activation to the block, grouping each node's inputs with split_func (see the sketch below)
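
A sketch of constructing a split function from the default splitter, using functools.partial to bind the number of partitions so the result maps a node's weight vector to a list of index groups (the single-argument form that partition_based_dense_relu_layer is assumed to expect for split_func):

    from functools import partial

    import numpy as np

    from omlt.neuralnet.layers.partition_based import default_partition_split_func

    # Bind n = 2 so the splitter takes only a weight vector and
    # returns a list of index groups, one per partition.
    split_func = partial(default_partition_split_func, n=2)

    w = np.array([0.5, -1.0, 2.0, 0.1])
    print(split_func(w))  # e.g. two groups of input indexes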

Activation Functions

omlt.neuralnet.activations.linear.linear_activation_constraint(net_block, net, layer_block, layer, add_constraint=True)[source]

Linear activation constraint generator

Generates the constraints for the linear activation function.

\[\begin{align*} z_i &= \hat z_i && \forall i \in N \end{align*}\]
omlt.neuralnet.activations.linear.linear_activation_function(zhat)[source]

Linear activation function: returns zhat unchanged

class omlt.neuralnet.activations.relu.ComplementarityReLUActivation(transform=None)[source]

Bases: object

Complementarity-based ReLU activation formulation.
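
For reference, the ReLU condition $z_i = \max(0, \hat z_i)$ can be stated as the complementarity condition below (a standard form, assumed here; the transform argument selects the Pyomo MPEC transformation used to reformulate it):

\[\begin{align*} 0 \le z_i &\perp (z_i - \hat z_i) \ge 0 && \forall i \in N \end{align*}\]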

omlt.neuralnet.activations.relu.bigm_relu_activation_constraint(net_block, net, layer_block, layer)[source]

Big-M ReLU activation formulation.
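
Writing $l_i \le \hat z_i \le u_i$ for known bounds and introducing a binary indicator $\sigma_i$ per node (notation assumed here for illustration), the standard big-M encoding of $z_i = \max(0, \hat z_i)$ is:

\[\begin{split}\begin{align*} z_i &\ge \hat z_i && \forall i \in N\\ z_i &\ge 0 && \forall i \in N\\ z_i &\le \hat z_i - l_i (1 - \sigma_i) && \forall i \in N\\ z_i &\le u_i \sigma_i && \forall i \in N\\ \sigma_i &\in \{0, 1\} && \forall i \in N \end{align*}\end{split}\]

When $\sigma_i = 0$ the constraints force $z_i = 0$, and when $\sigma_i = 1$ they force $z_i = \hat z_i$.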

omlt.neuralnet.activations.smooth.sigmoid_activation_constraint(net_block, net, layer_block, layer)[source]

Sigmoid activation constraint generator

Generates the constraints for the sigmoid activation function.

\[\begin{align*} z_i &= \frac{1}{1 + \exp(-\hat z_i)} && \forall i \in N \end{align*}\]
omlt.neuralnet.activations.smooth.sigmoid_activation_function(x)[source]

Evaluate the sigmoid activation function at x

omlt.neuralnet.activations.smooth.smooth_monotonic_activation_constraint(net_block, net, layer_block, layer, fcn)[source]

Activation constraint generator for a smooth monotonic function

Generates the constraints for an activation function fcn that is smooth and monotonic

\[\begin{align*} z_i &= \text{fcn}(\hat z_i) && \forall i \in N \end{align*}\]
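
The specific smooth generators in this module (sigmoid, softplus, tanh) follow this pattern: a particular smooth, monotonic function is bound into the generic generator. A sketch of how such a specialization can be written with functools.partial (the library's own definitions may differ):

    from functools import partial

    from omlt.neuralnet.activations.smooth import (
        sigmoid_activation_function,
        smooth_monotonic_activation_constraint,
    )

    # Binding fcn yields a constraint generator with the standard
    # (net_block, net, layer_block, layer) signature used by the other
    # generators in this module.
    my_sigmoid_constraint = partial(
        smooth_monotonic_activation_constraint,
        fcn=sigmoid_activation_function,
    )
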
omlt.neuralnet.activations.smooth.softplus_activation_constraint(net_block, net, layer_block, layer)[source]

Softplus activation constraint generator

Generates the constraints for the softplus activation function.

\[\begin{align*} z_i &= \log(\exp(\hat z_i) + 1) && \forall i \in N \end{align*}\]
omlt.neuralnet.activations.smooth.softplus_activation_function(x)[source]

Evaluate the softplus activation function at x

omlt.neuralnet.activations.smooth.tanh_activation_constraint(net_block, net, layer_block, layer)[source]

Hyperbolic tangent (tanh) activation constraint generator

Generates the constraints for the tanh activation function.

\[\begin{align*} z_i &= \tanh(\hat z_i) && \forall i \in N \end{align*}\]
omlt.neuralnet.activations.smooth.tanh_activation_function(x)[source]

Evaluate the tanh activation function at x