Communication-efficient distributed cubic Newton with compressed lazy Hessian.

Recently, second-order distributed optimization algorithms have become a research hotspot in distributed learning because of their faster convergence rates compared with first-order algorithms. However, second-order algorithms typically suffer from a severe communication bottleneck. To address this challenge, we propose communication-efficient second-order distributed optimization algorithms in the parameter-server framework by combining cubic Newton methods with compressed lazy Hessians. Specifically, our algorithms require each worker to communicate compressed Hessians to the server only at certain iterations, which saves both communication bits and communication rounds. For non-convex problems, we theoretically prove that our algorithms reduce the communication cost compared with state-of-the-art second-order algorithms, while maintaining the same iteration complexity order O(ϵ^{-3/2}) as centralized cubic Newton methods. By further using a gradient regularization technique, our algorithms achieve global convergence for convex problems. Moreover, for strongly convex problems, our algorithms achieve a local superlinear convergence rate without any requirements on the initial conditions. Finally, numerical experiments demonstrate the efficiency of the proposed algorithms.
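
For context, the centralized cubic-regularized Newton method referenced in the abstract (Nesterov and Polyak) updates the iterate by minimizing a cubic model of the objective; in LaTeX notation,

x_{k+1} = x_k + \arg\min_{s \in \mathbb{R}^d} \Big\{ \langle \nabla f(x_k), s \rangle + \tfrac{1}{2} \langle \nabla^2 f(x_k)\, s, s \rangle + \tfrac{M}{6} \|s\|^3 \Big\},

where M > 0 is the cubic regularization parameter. This is the method whose O(ϵ^{-3/2}) iteration complexity for reaching an ϵ-approximate stationary point the abstract says is preserved; per the abstract, the distributed algorithms replace the exact Hessian \nabla^2 f(x_k) with an aggregate built from compressed, lazily refreshed worker Hessians.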

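As a rough illustration of the communication pattern described in the abstract (a hypothetical sketch, not the authors' algorithm: the top-k compressor, the fixed lazy period, the ridge-regression objective, and the bisection subproblem solver are all assumptions made here for concreteness), the following Python snippet simulates a parameter server that collects exact gradients every iteration but receives only compressed corrections to stale worker Hessians every few iterations, then takes a cubic-regularized Newton step:

```python
# Hypothetical sketch of a parameter-server cubic Newton loop with lazy,
# compressed Hessian uploads. Names and choices (top_k, LAZY_PERIOD, M, the
# ridge objective, the bisection solver) are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
n_workers, d, m = 4, 10, 50        # workers, dimension, samples per worker
M = 10.0                           # cubic regularization parameter (assumed)
LAZY_PERIOD = 5                    # Hessian info is uploaded every 5 iterations
K = 20                             # entries kept by the (assumed) top-k compressor

# Each worker holds a local ridge-regression objective
# f_i(x) = ||A_i x - b_i||^2 / (2m) + 0.05 ||x||^2.
A = [rng.standard_normal((m, d)) for _ in range(n_workers)]
b = [rng.standard_normal(m) for _ in range(n_workers)]

def local_grad(i, x):
    return A[i].T @ (A[i] @ x - b[i]) / m + 0.1 * x

def local_hess(i, x):
    return A[i].T @ A[i] / m + 0.1 * np.eye(d)

def top_k(mat, k):
    """Keep only the k largest-magnitude entries (a simple compressor)."""
    flat = mat.ravel().copy()
    flat[np.argsort(np.abs(flat))[:-k]] = 0.0
    return flat.reshape(mat.shape)

def cubic_step(g, H, M):
    """Solve min_s <g,s> + 0.5 s^T H s + (M/6)||s||^3 via bisection on r = ||s||,
    using the optimality condition (H + (M r / 2) I) s = -g."""
    def s_of(r):
        return np.linalg.solve(H + 0.5 * M * r * np.eye(len(g)), -g)
    lo, hi = 0.0, 1.0
    while np.linalg.norm(s_of(hi)) > hi:   # enlarge the bracket until ||s(hi)|| <= hi
        hi *= 2.0
    for _ in range(60):                    # bisection on the scalar fixed-point equation
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(s_of(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return s_of(hi)

x = np.zeros(d)
H_server = [np.eye(d) for _ in range(n_workers)]   # stale per-worker Hessian copies

for t in range(30):
    # Every iteration: each worker uploads its exact local gradient (O(d) numbers).
    g = sum(local_grad(i, x) for i in range(n_workers)) / n_workers

    # Lazy rounds only: each worker uploads a compressed correction to the
    # server's stale copy of its Hessian, saving both bits and rounds.
    if t % LAZY_PERIOD == 0:
        for i in range(n_workers):
            H_server[i] = H_server[i] + top_k(local_hess(i, x) - H_server[i], K)

    # Aggregate the (possibly stale, compressed) Hessian estimate; symmetrize and
    # clip its eigenvalues so the cubic subproblem solver stays well-posed.
    H = sum(H_server) / n_workers
    H = 0.5 * (H + H.T)
    w, V = np.linalg.eigh(H)
    H = (V * np.maximum(w, 1e-3)) @ V.T

    x = x + cubic_step(g, H, M)

print("final gradient norm:",
      np.linalg.norm(sum(local_grad(i, x) for i in range(n_workers)) / n_workers))
```

The communication accounting is the point of the sketch: each worker uploads O(d) gradient entries per iteration, but only every LAZY_PERIOD-th iteration uploads K ≪ d² Hessian entries, which is the saving in both bits and rounds that the abstract refers to.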