{KFAC,EKFAC,Diagonal,Implicit} Fisher Matrices and finite width NTKs in PyTorch


NNGeometry


NNGeometry allows you to:

  • compute Gauss-Newton or Fisher Information Matrices (FIM and FIM_MonteCarlo), as well as any matrix written as the covariance of gradients w.r.t. parameters, using efficient approximations such as low-rank matrices, KFAC, EKFAC, diagonal, and so on. Some of these representations also work for Hessians (Hessian).
  • compute finite-width Neural Tangent Kernels evaluated on a set of examples (GramMatrix), even for functions with multiple outputs.
  • compute per-example Jacobians of the loss w.r.t. network parameters (Jacobian), or of any function such as the network's output.
  • easily and efficiently compute linear algebra operations involving these matrices, regardless of their approximation.
  • compute implicit operations on these matrices, which do not require explicitly storing large matrices that would not fit in memory.
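To make the "covariance of gradients" definition concrete, here is a toy sketch in plain Python (illustrative values, not NNGeometry code): the diagonal representation keeps, for each parameter, only the average squared per-example gradient.

```python
# Illustrative sketch (plain Python, hypothetical toy gradients): the
# diagonal Fisher approximation keeps only the per-parameter second
# moment of the per-example gradients g, i.e. E[g_i^2].
per_example_grads = [
    [0.5, -1.0, 2.0],   # gradient of example 1 w.r.t. 3 parameters
    [1.5,  0.0, -2.0],  # gradient of example 2
]
n = len(per_example_grads)
dim = len(per_example_grads[0])

# Diagonal FIM: average of squared gradients, one scalar per parameter.
F_diag = [sum(g[i] ** 2 for g in per_example_grads) / n for i in range(dim)]
print(F_diag)  # [1.25, 0.5, 4.0]
```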

It offers a high-level abstraction over the parameter and function spaces described by neural networks. As a simple example, a parameter space vector PVector actually contains the weight matrices, bias vectors, and convolution kernels of the whole neural network (a set of tensors). Using NNGeometry's API, performing a step in parameter space (e.g. an update of your favorite optimization algorithm) is abstracted as a Python addition: w_next = w_previous + epsilon * delta_w.
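The abstraction can be sketched with a toy class (hypothetical, not NNGeometry's actual PVector implementation) that bundles several tensors and overloads arithmetic so the update above reads literally as written:

```python
# Hypothetical sketch of the PVector idea (not NNGeometry's actual class):
# one object bundles all of a network's parameter tensors so that update
# steps read like plain arithmetic.
class ToyPVector:
    def __init__(self, tensors):
        # tensors: dict mapping layer name -> flat list of floats
        self.tensors = tensors

    def __add__(self, other):
        return ToyPVector({k: [a + b for a, b in zip(v, other.tensors[k])]
                           for k, v in self.tensors.items()})

    def __rmul__(self, scalar):
        return ToyPVector({k: [scalar * x for x in v]
                           for k, v in self.tensors.items()})

w_previous = ToyPVector({"fc.weight": [1.0, 2.0], "fc.bias": [0.5]})
delta_w = ToyPVector({"fc.weight": [0.5, -0.25], "fc.bias": [1.0]})
epsilon = 0.5

# The update reads exactly like the formula in the text:
w_next = w_previous + epsilon * delta_w
print(w_next.tensors)  # {'fc.weight': [1.25, 1.875], 'fc.bias': [1.0]}
```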

Example

In the Elastic Weight Consolidation continual learning technique, you want to compute $\left(\mathbf{w}-\mathbf{w}_{A}\right)^{\top}F\left(\mathbf{w}-\mathbf{w}_{A}\right)$. This can be achieved with a diagonal approximation of the FIM using:

```python
F = FIM(model=model, loader=loader, representation=PMatDiag)
regularizer = F.vTMv(w - w_a)
```

The first statement instantiates a diagonal matrix and populates it with the diagonal coefficients of the FIM of the model model, computed using the examples from the dataloader loader.
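With a diagonal representation, the quadratic form vTMv reduces to a weighted sum of squared parameter displacements. A plain-Python sketch with illustrative values (not the library's implementation):

```python
# Toy sketch of v^T F v for a *diagonal* F (illustrative values, not the
# library's implementation): the quadratic form reduces to a weighted
# sum of squared parameter displacements.
F_diag = [2.0, 0.5, 1.0]          # diagonal Fisher coefficients
w      = [1.0, -1.0, 3.0]         # current parameters
w_a    = [0.0,  1.0, 1.0]         # parameters after task A

v = [wi - wai for wi, wai in zip(w, w_a)]          # v = w - w_A
regularizer = sum(f * vi ** 2 for f, vi in zip(F_diag, v))
print(regularizer)  # 2*1 + 0.5*4 + 1*4 = 8.0
```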

If the diagonal approximation is not sufficiently accurate, you can instead choose a KFAC approximation by simply changing PMatDiag to PMatKFAC in the above. Note that very different operations are involved internally, depending on the chosen representation (e.g. KFAC, EKFAC, ...).
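To see why a Kronecker-factored representation changes the underlying operations, here is a plain-Python sketch (toy values, not NNGeometry code) that computes v^T F v for F ≈ A ⊗ G without ever materializing the dense Kronecker product, using the identity (A ⊗ G) vec(V) = vec(A V Gᵀ) for row-major vec:

```python
# Hypothetical sketch of why a Kronecker-factored (KFAC-like) representation
# is cheap: for F ≈ A ⊗ G, the quadratic form v^T F v only needs products
# with the small factors, never the dense Kronecker product.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(r) for r in zip(*X)]

A = [[2.0, 1.0], [1.0, 3.0]]      # first Kronecker factor (toy values)
G = [[1.0, 0.0], [0.0, 2.0]]      # second Kronecker factor (toy values)
V = [[1.0, -1.0], [0.5, 2.0]]     # parameter displacement, as a matrix

# Factored quadratic form: sum of V ⊙ (A V Gᵀ), no dense F needed.
AVGt = matmul(matmul(A, V), transpose(G))
quad_factored = sum(V[i][j] * AVGt[i][j] for i in range(2) for j in range(2))

# Dense check: build F = A ⊗ G explicitly and compute v^T F v.
v = [x for row in V for x in row]                     # row-major vec(V)
F = [[A[i][j] * G[k][l] for j in range(2) for l in range(2)]
     for i in range(2) for k in range(2)]
quad_dense = sum(v[p] * F[p][q] * v[q] for p in range(4) for q in range(4))

print(quad_factored, quad_dense)  # 23.75 23.75
```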

Documentation

You can visit the documentation at https://nngeometry.readthedocs.io.

More usage examples are available in the repository https://github.com/tfjgeorge/nngeometry-examples.

Feature requests, bugs, contributions, or any kind of request

Many of you are now using NNGeometry in your work: do not hesitate to drop me a line (tfjgeorge@gmail.com) about your project so that I get a better understanding of your use cases and of the current limitations of the library.

We welcome any feature request or bug report in the issue tracker.

We also welcome contributions, please submit your PRs!

Citation

If you use NNGeometry in a published project, please cite our work using the following BibTeX entry:

```
@software{george_nngeometry,
  author       = {Thomas George},
  title        = {{NNGeometry: Easy and Fast Fisher Information
                   Matrices and Neural Tangent Kernels in PyTorch}},
  month        = feb,
  year         = 2021,
  publisher    = {Zenodo},
  version      = {v0.3},
  doi          = {10.5281/zenodo.4532597},
  url          = {https://doi.org/10.5281/zenodo.4532597}
}
```

License

This project is distributed under the MIT license (see the LICENSE file). It also includes code licensed under the BSD 3-Clause license, as it borrows some code from https://github.com/owkin/grad-cnns.

