
Released Neural Network Libraries v.1.3.0!

Wednesday, October 23, 2019


Posted by shin

We have released Neural Network Libraries 1.3.0! A multi-head attention layer and a Deep Q-Learning (DQN) example have been added!

Spotlight

Multi-Head Attention

The multi-head attention layer is a building-block component for the Transformer, which has become a state-of-the-art model for many tasks such as language modeling. It is also being actively applied to a variety of computer vision tasks.
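To illustrate what such a layer computes, here is a minimal multi-head attention sketch built from basic nnabla ops. This is not the new library layer itself; the projection layout, shapes, and names (multi_head_attention_sketch, q_proj, etc.) are assumptions made for this example.

```python
# Illustrative sketch of multi-head attention from basic nnabla ops.
# Not the library's built-in layer; shapes and names are assumptions.
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF


def multi_head_attention_sketch(query, key, value, num_heads, d_model):
    # query, key, value: (batch, seq_len, d_model)
    batch, seq_len, _ = query.shape
    d_head = d_model // num_heads

    def project_and_split(x, name):
        # Linear projection, then split into heads:
        # (batch, seq_len, d_model) -> (batch * num_heads, seq_len, d_head)
        h = PF.affine(x, d_model, base_axis=2, name=name)
        h = F.reshape(h, (batch, seq_len, num_heads, d_head))
        h = F.transpose(h, (0, 2, 1, 3))
        return F.reshape(h, (batch * num_heads, seq_len, d_head))

    q = project_and_split(query, "q_proj")
    k = project_and_split(key, "k_proj")
    v = project_and_split(value, "v_proj")

    # Scaled dot-product attention, computed per head.
    scores = F.batch_matmul(q, k, transpose_b=True) / (d_head ** 0.5)
    weights = F.softmax(scores, axis=2)
    context = F.batch_matmul(weights, v)  # (batch * num_heads, seq_len, d_head)

    # Merge the heads back and apply the output projection.
    context = F.reshape(context, (batch, num_heads, seq_len, d_head))
    context = F.transpose(context, (0, 2, 1, 3))
    context = F.reshape(context, (batch, seq_len, d_model))
    return PF.affine(context, d_model, base_axis=2, name="out_proj")


# Example usage with a dummy self-attention input.
x = nn.Variable((8, 16, 64))  # (batch, seq_len, d_model)
y = multi_head_attention_sketch(x, x, x, num_heads=4, d_model=64)
```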

Deep Q-Learning

DQN is one of the most widely used deep reinforcement learning (RL) algorithms. Check this blog post for more details!
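For reference, below is a minimal sketch of the core DQN update in nnabla: the Bellman target and the loss on the action actually taken. This is not the released example; the network, sizes, and names are placeholders, and target-network freezing and the replay buffer are omitted.

```python
# Sketch of the DQN temporal-difference update (placeholder network/sizes).
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF

NUM_ACTIONS = 2
GAMMA = 0.99


def q_network(obs, scope):
    # Small fully connected Q-network; the architecture is an assumption.
    with nn.parameter_scope(scope):
        h = F.relu(PF.affine(obs, 64, name="fc1"))
        return PF.affine(h, NUM_ACTIONS, name="q")


obs = nn.Variable((32, 4))       # batch of states
next_obs = nn.Variable((32, 4))  # batch of next states
action = nn.Variable((32, 1))    # indices of the actions taken
reward = nn.Variable((32, 1))
done = nn.Variable((32, 1))      # 1.0 where the episode terminated

q = q_network(obs, scope="online")
q_next = q_network(next_obs, scope="target")

# TD target: r + gamma * max_a' Q_target(s', a') for non-terminal transitions.
target = reward + GAMMA * (1.0 - done) * F.max(q_next, axis=1, keepdims=True)

# Squared error between the target and the Q-value of the taken action.
q_taken = F.sum(q * F.one_hot(action, (NUM_ACTIONS,)), axis=1, keepdims=True)
loss = F.mean(F.squared_error(q_taken, target))
```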

Layers

Utilities

Examples

Bug Fixes / Documentation

NOTE

Asynchronous sort support in CUDA 10.1 is known to cause a segmentation fault due to improper concurrency. This issue will be addressed in the next release.