We have released Neural Network Libraries v1.0.16!
This release adds the InceptionV3 model and the AdaBound/AMSBound solvers!
Spotlight
AdaBound & AMSBound
The AdaBound optimizer applies dynamic bounds to the per-parameter learning rates of Adam, and AMSBound does the same for AMSGrad.
See this paper for more details!
import nnabla as nn
import nnabla.solvers as S

solver = S.AdaBound(lr=1e-3)
solver.set_parameters(nn.get_parameters())

for itr in range(num_itr):
    x.d = ...  # set input data
    t.d = ...  # set labels
    loss.forward()
    solver.zero_grad()  # zero all gradient buffers
    loss.backward()
    solver.weight_decay(decay_rate)  # apply weight decay
    solver.update()  # update parameters
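To illustrate the idea behind AdaBound, here is a minimal numpy sketch of the bound-clipping schedule described in the paper: the Adam-style step size is clipped elementwise between a lower and an upper bound that both converge to a final learning rate as training progresses. The function name, defaults, and loop below are illustrative only and are not nnabla's API.

```python
import numpy as np

def adabound_step(p, g, m, v, t, alpha=1e-3, beta1=0.9, beta2=0.999,
                  final_lr=0.1, gamma=1e-3, eps=1e-8):
    """One AdaBound-style update on parameter p with gradient g (sketch)."""
    m = beta1 * m + (1 - beta1) * g        # first moment, as in Adam
    v = beta2 * v + (1 - beta2) * g * g    # second moment, as in Adam
    # bias-corrected base step size
    alpha_t = alpha * np.sqrt(1 - beta2 ** t) / (1 - beta1 ** t)
    # dynamic bounds: both approach final_lr as t grows,
    # so the update transitions from Adam-like to SGD-like
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    step = np.clip(alpha_t / (np.sqrt(v) + eps), lower, upper)
    return p - step * m, m, v

# toy usage: minimize f(p) = p**2, whose gradient is 2p
p = np.array([5.0])
m = np.zeros_like(p)
v = np.zeros_like(p)
for t in range(1, 501):
    p, m, v = adabound_step(p, 2 * p, m, v, t)
```

After 500 steps the parameter has moved toward the minimum at zero, while the clipped step size stays within the shrinking bound interval.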
New Model: InceptionV3
See this paper for details.
Functions
- Add tile function (c.f. numpy.tile or torch.repeat)
- Add random_choice function and tests.
- Add CUDA implementation for random_choice function.
- Support backward of BatchNormalization for batch_stat=False
- Add TopKDataCuda and TopKGradCuda function implementations.
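Since the new tile and random_choice functions follow numpy semantics (as the notes above indicate for tile), here is a short numpy sketch of the reference behavior; the assumption that nnabla's random_choice samples with replacement according to given weights is ours, not stated above.

```python
import numpy as np

# tile repeats an array along each axis, numpy.tile semantics
a = np.array([1, 2, 3])
print(np.tile(a, 2))       # [1 2 3 1 2 3]
print(np.tile(a, (2, 2)))  # shape (2, 6): each row is [1 2 3 1 2 3]

# random_choice reference behavior: draw indices according to weights
rng = np.random.default_rng(0)
samples = rng.choice(4, size=1000, p=[0.1, 0.2, 0.3, 0.4])
print(np.bincount(samples) / 1000)  # empirical frequencies near the weights
```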
Utilities
- Remove the existing NNP file before saving a new one in the CLI train command.
- Create a CLI header so the CLI implementation can be shared across multiple applications.