
function transforms (aka torch.func, functorch)

Manuel edited this page Jul 3, 2024 · 3 revisions

Page Maintainers: @zou3519

Scope

  • understand what composable function transforms are and their most common use cases
  • understand what DynamicLayerStack is and how it is used to implement composition of function transforms

Learn about function transforms

Exercise

The advanced autodiff tutorial explains how to compute Jacobians via a composition of vmap and vjp.
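As a refresher before the exercises, here is a minimal sketch of that composition (the helper name `jacobian_via_vmap_vjp` and the single-Tensor-input restriction are assumptions for brevity, not the tutorial's exact code): each vector-Jacobian product recovers one row of the Jacobian, and vmap batches the vjp over all rows of an identity matrix.

```python
import torch
from torch.func import vjp, vmap

def jacobian_via_vmap_vjp(f, x):
    # Reverse-mode sketch: vjp gives a pullback (vector-Jacobian product),
    # and vmap applies it to every row of the identity at once.
    out, pullback = vjp(f, x)
    basis = torch.eye(out.numel(), dtype=out.dtype).reshape(-1, *out.shape)
    (rows,) = vmap(pullback)(basis)   # one Jacobian row per basis vector
    return rows.reshape(*out.shape, *x.shape)
```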

  1. Without looking at the source code for `jacfwd` or `torch.autograd.functional.jacobian`, write a function that computes the Jacobian using forward-mode AD and a for-loop. Note that forward-mode AD computes Jacobian-vector products, while reverse-mode AD (`vjp`, `grad`) computes vector-Jacobian products.
  2. Write a function that computes the Jacobian by composing `vmap` and `jvp`.

Both functions should have the following signature:

```python
def jacobian(f, *args):
    pass
```

You can assume that `f` accepts multiple Tensor arguments and returns a single Tensor.
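One possible sketch for exercise 1 (an assumption, not the reference solution, and restricted to a single Tensor input for brevity): each `jvp` call pushes one standard basis vector through `f`, yielding one column of the Jacobian.

```python
import torch
from torch.func import jvp

def jacobian(f, *args):
    # Forward-mode sketch: one jvp call per input element,
    # accumulating Jacobian columns in a plain for-loop.
    (x,) = args                     # single-input simplification
    out = f(x)
    cols = []
    for e in torch.eye(x.numel(), dtype=x.dtype):
        _, tangent_out = jvp(f, (x,), (e.reshape(x.shape),))
        cols.append(tangent_out.reshape(-1))
    return torch.stack(cols, dim=-1).reshape(*out.shape, *x.shape)
```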

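For exercise 2, a hedged sketch of the `vmap` + `jvp` composition (again assuming a single Tensor input; the shape handling is illustrative, not the canonical implementation): `vmap` replaces the explicit for-loop by pushing every basis vector through `jvp` in one batched call.

```python
import torch
from torch.func import jvp, vmap

def jacobian(f, *args):
    # Batched forward-mode sketch: vmap maps the jvp over all
    # basis vectors at once instead of looping over them.
    (x,) = args                     # single-input simplification
    out = f(x)
    basis = torch.eye(x.numel(), dtype=x.dtype).reshape(-1, *x.shape)
    push = lambda v: jvp(f, (x,), (v,))[1]
    cols = vmap(push)(basis)        # shape: (x.numel(), *out.shape)
    return cols.movedim(0, -1).reshape(*out.shape, *x.shape)
```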
Understand how PyTorch implements composable function transforms

Read through this gdoc.

Next

Back to the Core Frontend Onboarding

I would love to contribute to PyTorch!
