
Released v1.0.11! Pre-trained models, new functions, and better compatibility with ONNX.

Friday, February 08, 2019

News

Posted by shin

Last week, we released the nnabla packages v1.0.11! The main updates from the previous version are described below:

  • Spotlight:

Subpackage nnabla.models:

This release adds the nnabla.models API, which lets users run state-of-the-art pretrained models for both inference and training without having to train a model from scratch, as shown below:

import nnabla as nn
from nnabla.models.imagenet import ResNet

# Load ResNet-18 pretrained on ImageNet (weights download on first use).
model = ResNet(18)
batch_size = 1
# Input variable matching the model's expected input shape.
x = nn.Variable((batch_size,) + model.input_shape)
# Build the inference graph; pass training=True to fine-tune instead.
y = model(x, training=False)

The following models pretrained on ImageNet are available:

ResNet-18, 34, 50, 101, 152

MobileNetV2

SqueezeNet

SENet-154
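
Since the same call can build a training graph, the pretrained weights can also serve as a starting point for fine-tuning. Below is a minimal sketch; the use_up_to argument and its 'pool' value follow the nnabla.models documentation, and the 10-class head is purely illustrative:

import nnabla as nn
import nnabla.parametric_functions as PF
from nnabla.models.imagenet import ResNet

model = ResNet(18)
batch_size = 32
x = nn.Variable((batch_size,) + model.input_shape)

# Build the network up to the global average pooling layer with
# training=True, then attach a fresh head for a 10-class task.
pool = model(x, training=True, use_up_to='pool')
y = PF.affine(pool, 10, name='finetune_head')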


  • Functions:

Added functions: IsInf, IsNaN, ResetNaN, ResetInf, and Where (see the sketch after this list)

Added a clear_buffer flag to forward_all

Added 3D support for pooling functions

[Experimental] PyTorch-like functions

Added the AMSGRAD solver

Serialization of SolverState

Added CuDNN max and average pooling for the 3D case

Use a dedicated function to determine the workspace size for the algorithm
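
To give a feel for the new functions, here is a minimal sketch. It assumes the Python bindings follow nnabla's usual lowercase naming (F.isnan, F.isinf, F.reset_nan, F.where), so treat the exact names and defaults as assumptions:

import numpy as np
import nnabla as nn
import nnabla.functions as F

# Toy input containing NaN and Inf entries.
x = nn.Variable.from_numpy_array(
    np.array([1.0, np.nan, np.inf, -2.0], dtype=np.float32))

nan_mask = F.isnan(x)  # 1 where the element is NaN, 0 elsewhere
inf_mask = F.isinf(x)  # 1 where the element is +/-Inf, 0 elsewhere

# Where(condition, x_true, x_false) picks elements by condition;
# here it zeroes out the NaN entries.
cleaned = F.where(nan_mask, F.constant(0, x.shape), x)
cleaned.forward()
print(cleaned.d)  # [ 1.  0.  inf  -2.]

# ResetNaN does the same in one call (the val=0 default is an assumption).
reset = F.reset_nan(x)

# The new 3D pooling support applies the same API to 5D inputs (N, C, D, H, W).
v = nn.Variable((1, 3, 8, 8, 8))
p = F.max_pooling(v, kernel=(2, 2, 2))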
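
AMSGRAD and the SolverState serialization pair naturally; a minimal sketch, assuming the solver class is nnabla.solvers.AMSGRAD and that states are saved and restored with save_states/load_states:

import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF
import nnabla.solvers as S

# A tiny graph, just to have parameters to optimize.
x = nn.Variable((4, 10))
loss = F.mean(F.squared_error(PF.affine(x, 1), F.constant(0, (4, 1))))

solver = S.AMSGRAD(alpha=1e-3)  # the AMSGrad variant of Adam
solver.set_parameters(nn.get_parameters())

loss.forward()
solver.zero_grad()
loss.backward()
solver.update()

# Persist and restore the optimizer internals (momentum buffers etc.).
solver.save_states('solver_states.h5')
solver.load_states('solver_states.h5')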


  • Utilities:

Improved ONNX exporter

Improved ONNX importer

[Experimental] Trainer API

On-memory API

Support for multiple datasets in .proto

Added a learning rate scheduler (see the sketch after this list)

Added graph converters for inference
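
For the new learning rate scheduler, a minimal sketch; the module path nnabla.utils.learning_rate_scheduler and the ExponentialScheduler / get_learning_rate names are assumptions about where the utility landed:

import nnabla.solvers as S
from nnabla.utils.learning_rate_scheduler import ExponentialScheduler

# Decay the learning rate by gamma once per iter_interval iterations
# (the constructor arguments are assumptions).
scheduler = ExponentialScheduler(init_lr=1e-2, gamma=0.9, iter_interval=1000)
solver = S.Sgd(lr=1e-2)

for it in range(10000):
    # ... forward / backward on the training graph ...
    solver.set_learning_rate(scheduler.get_learning_rate(it))
    solver.update()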


  • Examples:

DeepLabv3

Self-attention GAN

Efficient Neural Architecture Search (ENAS)