function transforms (aka torch.func, functorch)
Manuel edited this page Jul 3, 2024 · 3 revisions
Page Maintainers: @zou3519
- understand what composable function transforms are and their most common use cases
- understand what DynamicLayerStack is and how it is used to implement composition of function transforms
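To make the first bullet concrete, here is a minimal sketch of composing two function transforms, `vmap` and `grad`, to compute per-sample gradients (assuming a recent PyTorch where `torch.func` is available):

```python
import torch
from torch.func import grad, vmap

# A scalar-valued loss for a single sample.
def loss(x):
    return (x ** 2).sum()

x = torch.randn(4, 3)

# grad(loss) differentiates the loss for ONE sample;
# vmap lifts that gradient computation over the batch dimension,
# so the composition returns one gradient per sample.
per_sample_grads = vmap(grad(loss))(x)
# Analytically, d/dx of (x ** 2).sum() is 2 * x, per sample.
```

Nesting transforms like this (each transform wrapping the function the previous one produced) is exactly the composition that `DynamicLayerStack` exists to implement.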
- Read through the whirlwind tour
- Read through the advanced autodiff tutorial
- Read through the per-sample-gradients tutorial
- Read through the model ensembling tutorial
The advanced autodiff tutorial explains how to compute Jacobians via a composition of vmap and vjp.
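A minimal sketch of that composition, assuming a single 1-D Tensor input (this mirrors the strategy `torch.func.jacrev` uses):

```python
import torch
from torch.func import vjp, vmap

def f(x):
    return x.sin()

x = torch.randn(3)

# vjp returns a function that computes v^T J for one cotangent v;
# vmap-ing it over the rows of the identity matrix recovers every
# row of the Jacobian in a single batched call.
_, vjp_fn = vjp(f, x)
(jac,) = vmap(vjp_fn)(torch.eye(3))
# For elementwise sin, the Jacobian is diag(cos(x)).
```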
- Without looking at the source code for jacfwd or torch.autograd.functional.jacobian, write a function that computes the Jacobian using forward-mode AD and a for-loop. Note that forward-mode AD (jvp) computes Jacobian-vector products, while reverse-mode AD (vjp, grad) computes vector-Jacobian products.
- Write a function to compute the Jacobian by composing vmap and jvp.
The APIs should have the following signature:

```python
def jacobian(f, *args): pass
```

You can assume that `f` accepts multiple Tensor arguments and returns a single Tensor.
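One possible sketch of the forward-mode exercise, simplified to a single Tensor argument (`jacobian_fwd` is a hypothetical helper name; it uses `torch.func.jvp`, which computes J @ v for a tangent v):

```python
import torch
from torch.func import jvp

def jacobian_fwd(f, x):
    # Each jvp call with basis vector e_i produces one Jacobian
    # column, J @ e_i; the for-loop collects all of them.
    basis = torch.eye(x.numel()).reshape(-1, *x.shape)
    cols = []
    for e in basis:
        _, col = jvp(f, (x,), (e,))
        cols.append(col)
    # Stack the columns along the last dimension: jac[:, i] == J @ e_i.
    return torch.stack(cols, dim=-1)

x = torch.randn(3)
jac = jacobian_fwd(torch.sin, x)
# For elementwise sin, the Jacobian is diag(cos(x)).
```

Replacing the for-loop with `vmap` over the basis vectors gives the second exercise's batched variant.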
Read through this gdoc.
Back to the Core Frontend Onboarding