Asymptotically Optimal Regularization in Smooth Parametric Models

Part of Advances in Neural Information Processing Systems 22 (NIPS 2009)


Authors

Percy Liang, Guillaume Bouchard, Francis R. Bach, Michael I. Jordan

Abstract

Many types of regularization schemes have been employed in statistical learning, each one motivated by some assumption about the problem domain. In this paper, we present a unified asymptotic analysis of smooth regularizers, which allows us to see how the validity of these assumptions impacts the success of a particular regularizer. In addition, our analysis motivates an algorithm for optimizing regularization parameters, which in turn can be analyzed within our framework. We apply our analysis to several examples, including hybrid generative-discriminative learning and multi-task learning.
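The paper's analysis concerns choosing regularization data-dependently in smooth parametric models. As context only (this is not the paper's algorithm, just a generic sketch of regularization-parameter tuning), the following shows the standard setup: a ridge-regularized linear model whose penalty strength is selected by minimizing held-out validation risk over a grid. All names and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic smooth parametric model: linear regression with Gaussian noise.
n, d = 50, 10
theta_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ theta_true + rng.normal(scale=1.0, size=n)

def ridge_fit(X, y, lam):
    """Closed-form ridge estimate: argmin ||y - X theta||^2 + lam * ||theta||^2."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

# Generic data-driven selection of the regularization parameter:
# minimize empirical risk on a held-out validation split over a log grid.
X_tr, y_tr = X[:40], y[:40]
X_va, y_va = X[40:], y[40:]
lams = np.logspace(-3, 2, 20)
val_err = [np.mean((y_va - X_va @ ridge_fit(X_tr, y_tr, lam)) ** 2)
           for lam in lams]
best_lam = lams[int(np.argmin(val_err))]
```

The paper replaces this kind of grid-plus-holdout heuristic with an asymptotic analysis that characterizes the optimal regularizer directly in terms of the model and problem assumptions.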


Name Change Policy

Requests for name changes in the electronic proceedings will be accepted with no questions asked. However, name changes may cause bibliographic tracking issues. Authors are asked to consider this carefully and discuss it with their co-authors prior to requesting a name change in the electronic proceedings.

Use the "Report an Issue" link to request a name change.

