A little past the anniversary of Neural Network Libraries 1.0’s release, we are now finally releasing Neural Network Libraries v1.1.0!
We have added double backward for most function layers, enabling computations of second-order derivatives!
Thank you for supporting Neural Network Libraries over the past year, and please continue to support us!
Spotlight
Double Backward
Double backward, i.e., the second-order gradients of outputs with respect to inputs, is critical for implementing many state-of-the-art deep learning techniques.
We have enabled double backward for more than 70 function layers, highly enriching the applicability of Neural Network Libraries.
grads = nn.grad(outputs, inputs)
# Manipulate grads as usual Variables.
Utilities
- Added setitem methods for NdArray and Variable
- Use a module-wise random generator to initialize parameter weights
Bug Fixes / Documentation
Examples
Important Notes
* We have decided to change Neural Network Libraries’ versioning policy to semantic versioning. This change applies from version 1.1.0 onward.
* The “nnabla-ext-cuda” package is temporarily unavailable, and its use is not recommended. Please install nnabla-ext-cuda101, nnabla-ext-cuda100, nnabla-ext-cuda90, or nnabla-ext-cuda80 instead.
* The following nnabla CUDA extension packages have been deprecated, and their PyPI repositories have been closed.
– nnabla-ext-cuda91
– nnabla-ext-cuda92
* The following “nnabla-ext-cuda” docker images have been deprecated.
– py37-cuda92
– py36-cuda92
– py27-cuda92
– py37-cuda92-v1.0.xx
– py36-cuda92-v1.0.xx
– py27-cuda92-v1.0.xx