Article
10 March 2022
In Press
Institute of Electrical and Electronics Engineers (IEEE)
10.1109/tnnls.2022.3143554
2162-237X
National Natural Science Foundation of China; Engineering and Physical Sciences Research Council (EPSRC)
Zhou, Y., Huang, K., Cheng, C., Wang, X., Hussain, A., & Liu, X. (in press). FastAdaBelief: Improving Convergence Rate for Belief-Based Adaptive Optimizers by Exploiting Strong Convexity. IEEE Transactions on Neural Networks and Learning Systems. https://doi.org/10.1109/tnnls.2022.3143554
Professor, School of Computing, Engineering and the Built Environment
0131 455 2239
A.Hussain@napier.ac.uk
Adaptive Learning Rate, Stochastic Gradient Descent, Online Learning, Optimization Algorithm, Strong Convexity
4MB