Layers

Activation

The Activation layer applies a non-linear activation function to its input.
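
A minimal sketch, assuming the Keras API as a stand-in for this tool's backend:

```python
import tensorflow as tf

# Apply a ReLU activation to whatever tensor the previous layer produced.
activation = tf.keras.layers.Activation("relu")
x = tf.constant([[-1.0, 0.0, 2.0]])
print(activation(x))  # [[0., 0., 2.]]
```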

Batch Generator

The Batch Generator layer generates a variable, such as weights or biases, with a specified shape and initialiser type, taking the batch size from the input.
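
One possible reading of this behaviour, sketched in TensorFlow; the function name and semantics here are hypothetical, not this tool's API:

```python
import tensorflow as tf

def batch_generator(x, shape, initializer=tf.keras.initializers.Zeros()):
    # Hypothetical sketch: produce an initialised tensor of `shape` whose
    # leading dimension matches the batch size of the input `x`.
    batch_size = tf.shape(x)[0]
    values = initializer(shape=shape)  # e.g. zeros with the requested shape
    return tf.broadcast_to(values, [batch_size, *shape])

x = tf.zeros([8, 32])                 # a batch of 8 examples
b = batch_generator(x, shape=(10,))   # -> tensor of shape (8, 10)
```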

Batch Normalization

Applies normalization (usually to zero mean and unit variance) to the input. This helps a layer learn somewhat independently of previous layers, without having to adapt to the scale of the values in each layer's output.
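
For illustration, in Keras (shown here as an assumed backend) this is a single layer placed between others:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.BatchNormalization(),  # zero mean, unit variance per channel
    tf.keras.layers.Dense(10),
])
```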

Bi-Directional RNN

Bi-Directional Recurrent Neural Networks (BRNNs) combine two independent RNNs: the input sequence is fed in normal time order to one network and in reverse time order to the other. The basic idea is to connect two hidden layers of opposite directions to the same output, so that the output layer receives information from both past and future states. BRNNs are especially useful when the context of the input is needed. For example, in handwriting recognition, performance can be enhanced by knowledge of the letters located before and after the current letter.
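
A minimal Keras sketch (assumed backend) of the two-directions-one-output idea:

```python
import tensorflow as tf

# One LSTM reads the sequence forward, a second reads it backward,
# and their final outputs are merged (concatenated by default).
birnn = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64))
x = tf.random.normal([32, 20, 8])   # (batch, time steps, features)
y = birnn(x)                        # -> shape (32, 128): forward + backward
```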

Convolution 1D

The Convolution 1D layer applies a 1D convolution operation to the given input tensor.
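
Sketched with Keras (an assumption, not necessarily this tool's implementation):

```python
import tensorflow as tf

# Slide 16 filters of width 3 along the time/steps axis.
conv1d = tf.keras.layers.Conv1D(filters=16, kernel_size=3, padding="same")
x = tf.random.normal([4, 100, 8])   # (batch, steps, channels)
y = conv1d(x)                       # -> shape (4, 100, 16)
```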

Convolution 2D

The Convolution 2D layer applies a 2D convolution operation to a 2D input to extract multiple distinct feature maps.
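
For example, in Keras (assumed backend), each filter yields one feature map:

```python
import tensorflow as tf

# Each of the 32 filters produces one distinct feature map.
conv2d = tf.keras.layers.Conv2D(filters=32, kernel_size=3, padding="same")
x = tf.random.normal([4, 28, 28, 1])   # (batch, height, width, channels)
y = conv2d(x)                          # -> shape (4, 28, 28, 32)
```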

Convolution 3D

A 3D ConvNet is well suited to spatio-temporal feature learning; it models temporal information better owing to its 3D convolution and 3D pooling operations.
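
A rough Keras illustration (assumed backend) of convolving and pooling over time as well as space:

```python
import tensorflow as tf

# Convolve across frames (time) as well as height and width,
# e.g. over short video clips.
conv3d = tf.keras.layers.Conv3D(filters=8, kernel_size=3, padding="same")
pool3d = tf.keras.layers.MaxPooling3D(pool_size=2)
x = tf.random.normal([2, 16, 32, 32, 3])   # (batch, frames, h, w, channels)
y = pool3d(conv3d(x))                      # -> shape (2, 8, 16, 16, 8)
```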

DeConvolution

Applies a transposed convolution. It is often used to upsample the output of a ConvNet to the original image resolution.
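
Sketched with Keras (an assumption) to show the upsampling effect:

```python
import tensorflow as tf

# A transposed convolution with stride 2 doubles the spatial resolution.
deconv = tf.keras.layers.Conv2DTranspose(filters=16, kernel_size=3,
                                         strides=2, padding="same")
x = tf.random.normal([1, 14, 14, 32])   # e.g. a downsampled feature map
y = deconv(x)                           # -> shape (1, 28, 28, 16)
```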

Dropout

Applies dropout to the given input, keeping each unit with the specified keep probability during training.
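
A small Keras sketch (assumed backend); note that Keras parameterizes dropout by the drop rate, so a keep probability of 0.8 corresponds to rate=0.2:

```python
import tensorflow as tf

dropout = tf.keras.layers.Dropout(rate=0.2)   # keep probability 0.8
x = tf.ones([1, 10])
y = dropout(x, training=True)   # ~20% of units zeroed, rest scaled by 1/0.8
```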

Fully Connected

The high-level reasoning in the neural network is done via fully connected layers. Neurons in a fully connected layer have connections to all activations in the previous layer.
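
For example, in Keras (shown as an assumption, not this tool's actual backend):

```python
import tensorflow as tf

# Every output neuron is connected to all 128 input activations.
fc = tf.keras.layers.Dense(units=10, activation="softmax")
x = tf.random.normal([4, 128])   # e.g. flattened ConvNet features
y = fc(x)                        # -> shape (4, 10)
```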

Highway Convolution

Highway convolution layers build deep networks from multiple stacked convolutional layers for feature preprocessing, using learned gates that control how much of each layer's transformed output versus its unchanged input is carried forward, as in the sketch below.
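
A sketch of one highway convolution block, assuming the standard highway gating y = T * H(x) + (1 - T) * x; class and variable names here are illustrative, not this tool's API:

```python
import tensorflow as tf

class HighwayConv2D(tf.keras.layers.Layer):
    # A transform gate T mixes the convolved features H(x)
    # with the unchanged input x: y = T * H(x) + (1 - T) * x.
    def __init__(self, kernel_size=3):
        super().__init__()
        self.kernel_size = kernel_size

    def build(self, input_shape):
        channels = input_shape[-1]
        self.h = tf.keras.layers.Conv2D(channels, self.kernel_size,
                                        padding="same", activation="relu")
        # Gate bias initialized negative so the block starts close to identity.
        self.t = tf.keras.layers.Conv2D(
            channels, self.kernel_size, padding="same", activation="sigmoid",
            bias_initializer=tf.keras.initializers.Constant(-1.0))

    def call(self, x):
        t = self.t(x)
        return t * self.h(x) + (1.0 - t) * x
```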

Highway Fully Connected

Highway fully connected layers build deep networks from multiple stacked fully connected layers for feature preprocessing, with the same learned gating mechanism controlling the mix of transformed output and unchanged input.
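
The fully connected counterpart of the block above, again with illustrative names:

```python
import tensorflow as tf

class HighwayDense(tf.keras.layers.Layer):
    # Same gating as the convolutional version: y = T * H(x) + (1 - T) * x.
    def build(self, input_shape):
        units = input_shape[-1]
        self.h = tf.keras.layers.Dense(units, activation="relu")
        self.t = tf.keras.layers.Dense(
            units, activation="sigmoid",
            bias_initializer=tf.keras.initializers.Constant(-1.0))

    def call(self, x):
        t = self.t(x)
        return t * self.h(x) + (1.0 - t) * x
```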

Morph

Creates a new layer that exactly mimics the shape of the input layer but holds different values (all zeros or all ones).
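
One way to express this, assuming TensorFlow-style semantics:

```python
import tensorflow as tf

x = tf.random.normal([4, 28, 28, 3])
zeros = tf.zeros_like(x)   # same shape as x, filled with zeros
ones = tf.ones_like(x)     # same shape as x, filled with ones
```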

Neural Accumulator

The Neural Accumulator (NAC) is a special case of a linear (affine) layer whose transformation matrix W consists of just −1's, 0's, and 1's; that is, its outputs are additions or subtractions (rather than arbitrary rescalings) of rows in the input vector.
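
A sketch following the NAC formulation of Trask et al. (2018), where W = tanh(Ŵ) ⊙ σ(M̂) biases the learned weights toward the discrete values {−1, 0, 1}; the class is illustrative, not this tool's API:

```python
import tensorflow as tf

class NAC(tf.keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        dim = input_shape[-1]
        self.w_hat = self.add_weight(shape=(dim, self.units), name="w_hat",
                                     initializer="glorot_uniform")
        self.m_hat = self.add_weight(shape=(dim, self.units), name="m_hat",
                                     initializer="glorot_uniform")

    def call(self, x):
        # tanh * sigmoid saturates the effective weights near -1, 0, or 1.
        w = tf.tanh(self.w_hat) * tf.sigmoid(self.m_hat)
        return tf.matmul(x, w)  # pure additions/subtractions of inputs
```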

Pooling

Pooling progressively reduces the spatial size of the representation, reducing the number of parameters and the amount of computation in the network, and hence also helping to control overfitting.
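
For example, with Keras max pooling (assumed backend):

```python
import tensorflow as tf

# 2x2 max pooling halves the spatial size, keeping the strongest response.
pool = tf.keras.layers.MaxPooling2D(pool_size=2)
x = tf.random.normal([1, 28, 28, 16])
y = pool(x)   # -> shape (1, 14, 14, 16)
```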

Recurrent NN

Recurrent Neural Networks (RNNs) are a powerful and robust type of neural network with internal memory. RNNs are able to remember important things about the input they have received, which enables them to be very precise in predicting what comes next. This makes them useful for sequential data such as sentences (NLP) or time series.
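
A minimal Keras sketch (assumed backend) of a recurrent layer consuming a sequence:

```python
import tensorflow as tf

# The hidden state acts as the network's memory across time steps.
rnn = tf.keras.layers.SimpleRNN(units=32)
x = tf.random.normal([8, 20, 4])   # (batch, time steps, features)
y = rnn(x)                         # -> shape (8, 32): final hidden state
```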

Separable Convolution

The Separable Convolution 2D layer applies a separable convolution operation to the input tensor.
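
Sketched with Keras's SeparableConv2D, which implements the depthwise separable variant; taking that as the intended operation is an assumption:

```python
import tensorflow as tf

# A depthwise spatial convolution followed by a pointwise (1x1)
# convolution that mixes channels, with far fewer parameters
# than a full Conv2D.
sep = tf.keras.layers.SeparableConv2D(filters=32, kernel_size=3, padding="same")
x = tf.random.normal([4, 28, 28, 8])
y = sep(x)   # -> shape (4, 28, 28, 32)
```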

Switchable Normalization

Learns different normalization operations for different normalization layers in a deep neural network, in an end-to-end manner.
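
A simplified sketch of Switchable Normalization (Luo et al., 2018) for NHWC inputs: instance-, layer-, and batch-norm statistics are mixed with learned softmax weights. The class is illustrative, not this tool's API:

```python
import tensorflow as tf

class SwitchableNorm(tf.keras.layers.Layer):
    def __init__(self, epsilon=1e-5):
        super().__init__()
        self.epsilon = epsilon

    def build(self, input_shape):
        channels = input_shape[-1]
        self.gamma = self.add_weight(shape=(channels,), initializer="ones",
                                     name="gamma")
        self.beta = self.add_weight(shape=(channels,), initializer="zeros",
                                    name="beta")
        # One mixing logit per statistic (IN, LN, BN), for means and variances.
        self.mean_logits = self.add_weight(shape=(3,), initializer="ones",
                                           name="mean_logits")
        self.var_logits = self.add_weight(shape=(3,), initializer="ones",
                                          name="var_logits")

    def call(self, x):
        # Statistics over different axes give the three normalizers.
        mean_in, var_in = tf.nn.moments(x, axes=[1, 2], keepdims=True)
        mean_ln, var_ln = tf.nn.moments(x, axes=[1, 2, 3], keepdims=True)
        mean_bn, var_bn = tf.nn.moments(x, axes=[0, 1, 2], keepdims=True)
        w_mean = tf.nn.softmax(self.mean_logits)
        w_var = tf.nn.softmax(self.var_logits)
        mean = w_mean[0] * mean_in + w_mean[1] * mean_ln + w_mean[2] * mean_bn
        var = w_var[0] * var_in + w_var[1] * var_ln + w_var[2] * var_bn
        x_hat = (x - mean) / tf.sqrt(var + self.epsilon)
        return self.gamma * x_hat + self.beta
```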