We have released Neural Network Libraries v1.0.17! This release adds the Xception pre-trained model, STFT and inverse STFT functions, and synchronized batch normalization!
Spotlight
STFT and inverse STFT
The short-time Fourier transform (STFT) is a technique widely used in speech and audio processing. This release adds both the STFT and its inverse at the function level.
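As a quick illustration, here is a minimal sketch of round-tripping a signal through the STFT and its inverse. The `F.stft`/`F.istft` names and parameters follow the function-level API documented around this release; the window size and stride below are arbitrary choices for the example.

```python
import numpy as np
import nnabla as nn
import nnabla.functions as F

# A one-second dummy signal: shape (batch, samples).
x = nn.Variable.from_numpy_array(
    np.random.randn(1, 16000).astype(np.float32))

# STFT returns the real and imaginary parts of the spectrogram.
y_r, y_i = F.stft(x, window_size=512, stride=128, fft_size=512)

# Inverse STFT reconstructs the time-domain signal from both parts.
x_rec = F.istft(y_r, y_i, window_size=512, stride=128, fft_size=512)
x_rec.forward()
```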
Synchronized Batch Normalization
Synchronized batch normalization computes batch statistics (mean and variance) across all GPUs instead of per device, which is useful for distributed training with multiple GPUs, especially when the per-GPU batch size is small.
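A minimal sketch of how this can be used in a data-parallel setup, following nnabla's usual multi-process recipe (one process per GPU). The communicator class name and the `PF.sync_batch_normalization` signature are taken from the documentation of this era and may differ in later versions.

```python
import nnabla as nn
import nnabla.parametric_functions as PF
import nnabla.communicators as C
from nnabla.ext_utils import get_extension_context

# One process per GPU; the communicator aggregates batch statistics.
ctx = get_extension_context('cudnn')
comm = C.MultiProcessDataParalellCommunicator(ctx)  # spelling follows the nnabla API
comm.init()
ctx.device_id = str(comm.rank)
nn.set_default_context(ctx)

x = nn.Variable((32, 3, 224, 224))
# Mean/variance are computed over the batches of every GPU in the
# group, not just the local one.
h = PF.sync_batch_normalization(x, comm, group='world', batch_stat=True)
```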
Xception
The Xception model has been added to our pre-trained model lineup. Current lineup of pre-trained models:
* ResNet: 18, 34, 50, 101, 152 layers available
* MobileNet: Both v1 and v2 available
* SENet: 154 layers available
* SqueezeNet: v1.1
* VGG: 11, 13, 16 layers available
* DenseNet: 161 layers available
* Network in Network
* Inception: v3
* Xception
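Loading any of these follows the same pattern through `nnabla.models`. Here is a minimal sketch for Xception; the input shape is taken from the model object itself.

```python
import nnabla as nn
from nnabla.models.imagenet import Xception

model = Xception()  # downloads pre-trained ImageNet weights on first use

batch_size = 1
x = nn.Variable((batch_size,) + model.input_shape)
# Build the inference graph; training=True would build it for fine-tuning.
y = model(x, training=False)
```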
Other Updates
Function Layers
- Various activation functions (hard_sigmoid, hard_tanh, log_sigmoid, relu6, softplus, softsign, tanh_shrink, sinc)
- Implement assign function
- gather_nd and selection by index variable (advanced indexing); see the sketch after this list
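For example, `gather_nd` picks elements of a variable by multi-dimensional indices, where each column of the index array addresses one element. The shapes below are illustrative.

```python
import numpy as np
import nnabla as nn
import nnabla.functions as F

data = nn.Variable.from_numpy_array(
    np.arange(12).reshape(3, 4).astype(np.float32))

# indices has shape (index_depth, num_elements); column i holds the
# multi-index of the i-th gathered element.
indices = nn.Variable.from_numpy_array(np.array([[0, 2],
                                                 [1, 3]]))

y = F.gather_nd(data, indices)
y.forward()
print(y.d)  # [ 1. 11.] -- data[0, 1] and data[2, 3]
```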
Utilities
- Add tests for the min and max functions when returning indices
- Improve the CLI train, profile, and forward commands
- Add a variable name option to the dump command
Examples
Build
Bug Fixes
- Fix LSTM memory leak
- Fix access violation in C++ MNIST training example
- Fix bugs in min and max functions when used to return indices.