Released Neural Network Libraries v1.18.0!

Tuesday, March 23, 2021

Release

Posted by shin

We have released Neural Network Libraries v1.18.0!
Higher-order gradients for functions are now available, and we have also implemented SLE-GAN!

Spotlight

Higher-Order Gradients (GPU)

We have revamped our double-backward APIs: the backward graph is now constructed from existing functions, so gradients of any order (n-th order gradients) can be computed by applying the gradient computation repeatedly!

You can use higher-order gradients as follows:

import numpy as np
import nnabla as nn
import nnabla.functions as F

x = nn.Variable.from_numpy_array(np.random.randn(2, 2)).apply(need_grad=True)
x.grad.zero()
y = F.sin(x)

def grad(y, x, n=1):
    # Apply nn.grad n times to build the n-th order gradient graph.
    dx = [y]
    for _ in range(n):
        dx = nn.grad([dx[0]], [x])
    return dx[0]

# The 10th derivative of sin(x) is -sin(x) (derivatives cycle with period 4).
dnx = grad(y, x, n=10)
dnx.forward()
print(np.allclose(-np.sin(x.d), dnx.d))
# One more backward gives the 11th derivative, -cos(x), accumulated in x.g.
dnx.backward()
print(np.allclose(-np.cos(x.d), x.g))
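As a quick sanity check of the expected values above, independent of nnabla, the derivative cycle of sin can be confirmed symbolically (a sketch using sympy, which is not part of the library):

```python
import sympy as sp

x = sp.Symbol("x")
# Derivatives of sin(x) cycle with period 4: sin, cos, -sin, -cos, ...
d10 = sp.diff(sp.sin(x), x, 10)  # 10 mod 4 == 2 -> -sin(x)
d11 = sp.diff(sp.sin(x), x, 11)  # 11 mod 4 == 3 -> -cos(x)
print(d10, d11)  # -sin(x) -cos(x)
```

This matches the checks in the snippet above: the 10th-order gradient equals -sin(x), and one further backward pass yields -cos(x).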

Users can also inspect the registry, and register the grad function of their own implementation as follows:

# Show the registry status
from nnabla.backward_functions import register, show_registry
show_registry()

# Register the grad function of a function
register("<FancyFunction>", <fancy_function>_backward)

SLE-GAN

We have also implemented SLE-GAN from ICLR 2021, a lightweight generative model for high-fidelity image synthesis. It enables efficient few-shot training from around 100 training samples, with roughly half as many parameters as StyleGAN2.

Layers

Utilities

Examples

Bugfix / Documentation