Neural Networks with Cheap Differential Operators

Part of Advances in Neural Information Processing Systems 32 (NeurIPS 2019)

AuthorFeedback · Bibtex · MetaReview · Metadata · Paper · Reviews · Supplemental

Authors

Ricky T. Q. Chen, David K. Duvenaud

Abstract

Gradients of neural networks can be computed efficiently for any architecture, but some applications require computing differential operators with higher time complexity. We describe a family of neural network architectures that allow easy access to a family of differential operators involving dimension-wise derivatives, and we show how to modify the backward computation graph to compute them efficiently. We demonstrate the use of these operators for solving root-finding subproblems in implicit ODE solvers, exact density evaluation for continuous normalizing flows, and evaluating the Fokker-Planck equation for training stochastic differential equation models.
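
The common quantity behind all three applications is a dimension-wise derivative such as the diagonal of the Jacobian of a network f: R^D -> R^D (whose sum is the divergence tr(df/dx) used in continuous normalizing flow density evaluation and in the Fokker-Planck equation). Below is a minimal JAX sketch, not the authors' code: jac_diag_naive shows the generic cost of D passes, and jac_diag_hollow illustrates the one-pass idea under the assumption (one reading of the paper's construction) that the network splits into a conditioner h = c(x) with h_i independent of x_i and an elementwise transformer out_i = tau(x_i, h_i). The conditioner, tau, and the toy check are all hypothetical stand-ins.

```python
import jax
import jax.numpy as jnp

def jac_diag_naive(f, x):
    """Exact Jacobian diagonal of f at x via D Jacobian-vector
    products: one forward-mode pass per input dimension."""
    def entry(e):
        _, je = jax.jvp(f, (x,), (e,))  # je = J @ e
        return e @ je                   # e_i^T J e_i = J[i, i]
    return jax.vmap(entry)(jnp.eye(x.shape[0]))

def jac_diag_hollow(tau, conditioner, x):
    """One-pass diagonal for a 'hollow' split f(x) = tau(x, c(x)),
    where c(x)_i does not depend on x_i and tau is elementwise in
    its first argument. Detaching h removes every off-diagonal
    path, so the Jacobian of z -> tau(z, h) is diagonal and a
    single JVP with the all-ones vector recovers it exactly."""
    h = jax.lax.stop_gradient(conditioner(x))  # the "detach" trick
    _, diag = jax.jvp(lambda z: tau(z, h), (x,), (jnp.ones_like(x),))
    return diag

# Toy check (hypothetical functions): h_i = sum_{j != i} x_j is hollow.
conditioner = lambda z: jnp.sum(z) - z
tau = lambda z, h: jnp.tanh(z) * h      # elementwise in z
f = lambda z: tau(z, conditioner(z))

x = jnp.arange(1.0, 5.0)
assert jnp.allclose(jac_diag_naive(f, x),
                    jac_diag_hollow(tau, conditioner, x))
# Summing the diagonal gives the divergence tr(df/dx).
print(jac_diag_hollow(tau, conditioner, x).sum())
```

Under this split the diagonal is exact rather than stochastically estimated, which is what makes the one-pass computation useful for exact density evaluation and related differential-operator terms.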


Name Change Policy

Requests for name changes in the electronic proceedings will be accepted with no questions asked. However, name changes may cause bibliographic tracking issues. Authors are asked to consider this carefully and discuss it with their co-authors prior to requesting a name change.

Use the "Report an Issue" link to request a name change.

