Released Neural Network Libraries v1.6.0!

Monday, March 09, 2020

Release

Posted by shin

We have released Neural Network Libraries 1.6.0! RNN, GRU, and LSTM are now available for CPUs, and many new features including adaptive separable convolution, NHWC deconvolution, and patch correlation layer have been added!

Spotlight

CPU implementation of RNN, GRU, and LSTM

RNN, GRU, and LSTM, which were previously available only on GPUs, can now be used on CPUs as well, making them feasible for lightweight, low-powered devices with limited resources!
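To illustrate what an LSTM layer computes at each time step, here is a minimal pure-Python sketch of a single LSTM cell. This is a didactic example of the standard LSTM equations, not NNabla's API; the function name, parameter layout, and toy sizes are all made up for illustration.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def lstm_cell(x, h, c, W, U, b):
    """One LSTM step (didactic sketch, plain Python lists).

    x: input vector, h: previous hidden state, c: previous cell state.
    W, U, b: per-gate parameters, ordered (input, forget, cell, output).
    """
    def affine(Wg, Ug, bg):
        # Wg @ x + Ug @ h + bg, computed with explicit loops.
        return [sum(Wg[i][j] * x[j] for j in range(len(x)))
                + sum(Ug[i][j] * h[j] for j in range(len(h)))
                + bg[i]
                for i in range(len(h))]

    i_t = [sigmoid(v) for v in affine(W[0], U[0], b[0])]    # input gate
    f_t = [sigmoid(v) for v in affine(W[1], U[1], b[1])]    # forget gate
    g_t = [math.tanh(v) for v in affine(W[2], U[2], b[2])]  # cell candidate
    o_t = [sigmoid(v) for v in affine(W[3], U[3], b[3])]    # output gate

    # New cell state mixes the old state (via the forget gate)
    # with the candidate (via the input gate).
    c_new = [f * cp + i * g for f, cp, i, g in zip(f_t, c, i_t, g_t)]
    h_new = [o * math.tanh(cv) for o, cv in zip(o_t, c_new)]
    return h_new, c_new
```

Because every operation here is an elementwise product, sum, or small matrix-vector multiply, the same computation maps naturally onto a CPU implementation.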

Adaptive Separable Convolution

This layer implements 2D Adaptive Separable Convolution for NCHW (channel-first) tensors, which is useful for video frame interpolation as demonstrated in this paper. Sample- and pixel-dependent vertical/horizontal kernels are dynamically generated and used to approximate a feature-independent 2D kernel. We support gradients with respect to all inputs: the images, the vertical kernels, and the horizontal kernels.
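The key idea is that the dense 2D kernel at each output pixel is approximated by the outer product of a per-pixel vertical kernel and a per-pixel horizontal kernel. A minimal single-channel sketch, assuming valid (no-padding) output and plain nested lists (not NNabla's actual function or layout):

```python
def adaptive_separable_conv2d(img, kv, kh):
    """Didactic sketch of 2D adaptive separable convolution (one channel).

    img: H x W input (list of lists).
    kv:  per-pixel vertical kernels,   shape K x H_out x W_out.
    kh:  per-pixel horizontal kernels, shape K x H_out x W_out.
    The 2D kernel at output pixel (y, x) is the outer product
    kv[:, y, x] (x) kh[:, y, x], so every pixel gets its own kernel.
    """
    K = len(kv)
    H_out = len(img) - K + 1
    W_out = len(img[0]) - K + 1
    out = [[0.0] * W_out for _ in range(H_out)]
    for y in range(H_out):
        for x in range(W_out):
            out[y][x] = sum(
                img[y + i][x + j] * kv[i][y][x] * kh[j][y][x]
                for i in range(K) for j in range(K)
            )
    return out
```

Since the kernels are generated from the inputs, training requires gradients with respect to `img`, `kv`, and `kh` alike, which is what the layer provides.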

NHWC Deconvolution (channel_last option)

We had already implemented NHWC layout for convolution and pooling, and now the same layout is available for deconvolution as well, allowing for more degrees of freedom for mixed precision training!
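To make the channel-last layout concrete, here is a didactic transposed-convolution (deconvolution) sketch where the channel axis sits last, as in NHWC. This is plain Python for illustration, not NNabla's implementation; kernel shape and stride handling are simplifying assumptions.

```python
def deconv2d_nhwc(x, w, stride):
    """Didactic transposed convolution in channel-last (HWC) layout.

    x: H x W x C_in input (nested lists).
    w: K x K x C_in x C_out kernel.
    Each input pixel is scattered into the output through the kernel,
    which is why deconvolution upsamples spatially.
    """
    H, W = len(x), len(x[0])
    C_in, K = len(x[0][0]), len(w)
    C_out = len(w[0][0][0])
    H_out = (H - 1) * stride + K
    W_out = (W - 1) * stride + K
    out = [[[0.0] * C_out for _ in range(W_out)] for _ in range(H_out)]
    for y in range(H):
        for xx in range(W):
            for ky in range(K):
                for kx in range(K):
                    for ci in range(C_in):
                        for co in range(C_out):
                            out[y * stride + ky][xx * stride + kx][co] += \
                                x[y][xx][ci] * w[ky][kx][ci][co]
    return out
```

Keeping channels innermost (last) is the layout that Tensor Cores favor, which is why having deconvolution in NHWC widens the options for mixed precision training.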

Patch Correlation

This layer correlates patches of two input images by comparing a patch from the first input at incremental coordinates with patches of the second input shifted around those coordinates. The patch and shift increments are configurable, and padding may also be applied. It is particularly useful for applications involving optical flow, such as FlowNet or PWC-Net.
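The operation can be sketched as follows for single-channel images. This is a simplified illustration of the general idea, not the layer's actual signature; shift increments of 1, no padding, and dense output coordinates are assumptions made here for brevity.

```python
def patch_correlation(img1, img2, patch=3, max_shift=1):
    """Didactic sketch of patch correlation between two images.

    For each patch position in img1, correlate it with patches of img2
    shifted by up to +/- max_shift pixels in each direction, as in the
    correlation layers used by optical-flow networks such as FlowNet.
    Images are H x W lists of lists.
    """
    H, W = len(img1), len(img1[0])
    # Restrict base coordinates so every shifted patch stays in bounds.
    ys = range(max_shift, H - patch + 1 - max_shift)
    xs = range(max_shift, W - patch + 1 - max_shift)
    out = {}
    for y in ys:
        for x in xs:
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    # Dot product of the img1 patch with the shifted
                    # img2 patch; high values indicate a likely match.
                    out[(y, x, dy, dx)] = sum(
                        img1[y + i][x + j] * img2[y + dy + i][x + dx + j]
                        for i in range(patch) for j in range(patch)
                    )
    return out
```

For optical flow, the shift `(dy, dx)` that maximizes the correlation at a given position is a strong cue for where that patch moved between the two frames.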

Layers

Utilities/Format Conversion

Examples

Bug Fix/Documentation