Better Mini-Batch Algorithms via Accelerated Gradient Methods

Part of Advances in Neural Information Processing Systems 24 (NIPS 2011)

Bibtex | Metadata | Paper | Supplemental

Authors

Andrew Cotter, Ohad Shamir, Nati Srebro, Karthik Sridharan

Abstract

Mini-batch algorithms have recently received significant attention as a way to speed up stochastic convex optimization. In this paper, we study how such algorithms can be improved using accelerated gradient methods. We provide a novel analysis, which shows how standard gradient methods may sometimes be insufficient to obtain a significant speed-up. We propose a novel accelerated gradient algorithm which addresses this deficiency and enjoys a uniformly superior guarantee. We conclude our paper with experiments on real-world datasets, which validate our algorithm and substantiate our theoretical insights.
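To make the setting concrete, the following is a minimal Python sketch of Nesterov-style accelerated gradient descent driven by mini-batch stochastic gradients, applied here to least-squares regression. It illustrates the general technique the paper studies, not the authors' specific algorithm; the function name, step size, batch size, and the least-squares objective are all illustrative assumptions.

import numpy as np

def accelerated_minibatch_gd(X, y, n_iters=300, batch_size=32, lr=0.1, seed=0):
    """Nesterov-style accelerated gradient descent with mini-batch
    gradients for least squares: min_w ||X w - y||^2 / (2 n).
    A generic sketch of accelerated mini-batch optimization;
    hypothetical names/parameters, not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)      # current iterate x_t
    z = w.copy()         # extrapolated ("lookahead") point y_t
    lam = 1.0            # Nesterov momentum sequence lambda_t
    for _ in range(n_iters):
        lam_next = (1.0 + np.sqrt(1.0 + 4.0 * lam**2)) / 2.0
        gamma = (1.0 - lam) / lam_next
        idx = rng.choice(n, size=batch_size, replace=False)
        # mini-batch stochastic gradient, evaluated at the lookahead point
        g = X[idx].T @ (X[idx] @ z - y[idx]) / batch_size
        w_next = z - lr * g                      # gradient step
        z = (1.0 - gamma) * w_next + gamma * w   # momentum extrapolation
        w, lam = w_next, lam_next
    return w

# Tiny demo on synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=1000)
w_hat = accelerated_minibatch_gd(X, y)
print("estimation error:", np.linalg.norm(w_hat - w_true))

Note that the gradient is evaluated at the lookahead point z rather than at the current iterate w; this extrapolation step is what distinguishes accelerated methods from standard mini-batch gradient descent.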


Name Change Policy

Requests for name changes in the electronic proceedings will be accepted with no questions asked. However, name changes may cause bibliographic tracking issues. Authors are asked to consider this carefully and to discuss it with their co-authors prior to requesting a name change in the electronic proceedings.

Use the "Report an Issue" link to request a name change.

