torch._logging

Created On: Apr 24, 2023 | Last Updated On: Jun 17, 2025

PyTorch has a configurable logging system, where different components can be given different log level settings. For instance, one component's log messages can be completely disabled, while another component's log messages can be set to maximum verbosity.

Warning

This feature is in beta and may have compatibility-breaking changes in the future.

Warning

This feature has not been expanded to control the log messages of all components in PyTorch yet.

There are two ways to configure the logging system: through the environment variable TORCH_LOGS or the Python API torch._logging.set_logs.

set_logs

Sets the log level for individual components and toggles individual log artifact types.

The environment variable TORCH_LOGS is a comma-separated list of [+-]<component> pairs, where <component> is a component specified below. The + prefix decreases the log level of the component, displaying more log messages, while the - prefix increases the log level of the component, displaying fewer log messages. The default setting applies when a component is not specified in TORCH_LOGS.

In addition to components, there are also artifacts. Artifacts are specific pieces of debug information associated with a component that are either displayed or not displayed, so prefixing an artifact with + or - is a no-op. Since they are associated with a component, enabling that component will typically also enable its artifacts, unless an artifact was specified to be off_by_default. This option is specified in _registrations.py for artifacts that are so spammy they should only be displayed when explicitly enabled.

The following components and artifacts are configurable through the TORCH_LOGS environment variable (see torch._logging.set_logs for the Python API):
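To make the +/- semantics concrete, here is an illustrative re-implementation of the parsing rules above in plain Python. It is a sketch, not PyTorch's actual parser: the component and artifact registries are abbreviated, and the level mapping mirrors the examples at the end of this page (+ maps to logging.DEBUG, a bare name to logging.INFO, - to logging.ERROR).

```python
import logging

# Abbreviated registries for illustration only; the real ones live in
# PyTorch's _registrations.py.
ARTIFACTS = {"bytecode", "aot_graphs", "graph_breaks", "schedule"}

def parse_torch_logs(spec):
    """Return ({component: level}, {enabled artifact names}) for a
    TORCH_LOGS-style spec string such as "+dynamo,aot,schedule"."""
    levels, artifacts = {}, set()
    for entry in spec.split(","):
        entry = entry.strip()
        if not entry:
            continue
        prefix, name = "", entry
        if entry[0] in "+-":
            prefix, name = entry[0], entry[1:]
        if name in ARTIFACTS:
            artifacts.add(name)           # +/- on an artifact is a no-op
        elif prefix == "+":
            levels[name] = logging.DEBUG  # '+' lowers the level: more output
        elif prefix == "-":
            levels[name] = logging.ERROR  # '-' raises the level: less output
        else:
            levels[name] = logging.INFO   # bare component name
    return levels, artifacts

levels, artifacts = parse_torch_logs("+dynamo,aot,schedule")
# dynamo at DEBUG, aot at INFO, the `schedule` artifact enabled
```

Note that unrecognized names fall through to the component branch, which matches how an arbitrary module path like some.random.module is treated.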

Components:
all

Special component which configures the default log level of all components. Default: logging.WARN

dynamo

The log level for the TorchDynamo component. Default: logging.WARN

aot

The log level for the AOTAutograd component. Default: logging.WARN

inductor

The log level for the TorchInductor component. Default: logging.WARN

your.custom.module

The log level for an arbitrary unregistered module. Provide the fully qualified name and the module will be enabled. Default: logging.WARN

Artifacts:
bytecode

Whether to emit the original and generated bytecode from TorchDynamo. Default: False

aot_graphs

Whether to emit the graphs generated by AOTAutograd. Default: False

aot_joint_graph

Whether to emit the joint forward-backward graph generated by AOTAutograd. Default: False

compiled_autograd

Whether to emit logs from compiled_autograd. Default: False

ddp_graphs

Whether to emit graphs generated by DDPOptimizer. Default: False

graph

Whether to emit the graph captured by TorchDynamo in tabular format. Default: False

graph_code

Whether to emit the Python source of the graph captured by TorchDynamo. Default: False

graph_breaks

Whether to emit a message when a unique graph break is encountered during TorchDynamo tracing. Default: False

guards

Whether to emit the guards generated by TorchDynamo for each compiled function. Default: False

recompiles

Whether to emit a guard failure reason and message every time TorchDynamo recompiles a function. Default: False

output_code

Whether to emit the TorchInductor output code. Default: False

schedule

Whether to emit the TorchInductor schedule. Default: False

Examples:

TORCH_LOGS="+dynamo,aot" will set the log level of TorchDynamo to logging.DEBUG and AOTAutograd to logging.INFO

TORCH_LOGS="-dynamo,+inductor" will set the log level of TorchDynamo to logging.ERROR and TorchInductor to logging.DEBUG

TORCH_LOGS="aot_graphs" will enable the aot_graphs artifact

TORCH_LOGS="+dynamo,schedule" will set the log level of TorchDynamo to logging.DEBUG and enable the schedule artifact

TORCH_LOGS="+some.random.module,schedule" will set the log level of some.random.module to logging.DEBUG and enable the schedule artifact