torch._logging
Created On: Apr 24, 2023 | Last Updated On: Jun 17, 2025
PyTorch has a configurable logging system, where different components can be given different log level settings. For instance, one component's log messages can be completely disabled, while another component's log messages can be set to maximum verbosity.
Warning
This feature is in beta and may have compatibility-breaking changes in the future.
Warning
This feature has not been expanded to control the log messages of all components in PyTorch yet.
There are two ways to configure the logging system: through the environment variable TORCH_LOGS or the Python API torch._logging.set_logs.
set_logs: Sets the log level for individual components and toggles individual log artifact types.
The environment variable TORCH_LOGS is a comma-separated list of [+-]<component> pairs, where <component> is a component specified below. The + prefix decreases the log level of the component, displaying more log messages, while the - prefix increases the log level of the component, displaying fewer log messages. The default setting is the behavior when a component is not specified in TORCH_LOGS.

In addition to components, there are also artifacts. An artifact is a specific piece of debug information associated with a component that is either displayed or not displayed, so prefixing an artifact with + or - is a no-op. Since an artifact is associated with a component, enabling that component will typically also enable the artifact, unless the artifact was registered as off_by_default. This option is specified in _registrations.py for artifacts that are so spammy they should only be displayed when explicitly enabled.

The following components and artifacts are configurable through the TORCH_LOGS environment variable (see torch._logging.set_logs for the Python API):
- Components:
  - all: Special component which configures the default log level of all components. Default: logging.WARN
  - dynamo: The log level for the TorchDynamo component. Default: logging.WARN
  - aot: The log level for the AOTAutograd component. Default: logging.WARN
  - inductor: The log level for the TorchInductor component. Default: logging.WARN
  - your.custom.module: The log level for an arbitrary unregistered module. Provide the fully qualified name and the module will be enabled. Default: logging.WARN
- Artifacts:
  - bytecode: Whether to emit the original and generated bytecode from TorchDynamo. Default: False
  - aot_graphs: Whether to emit the graphs generated by AOTAutograd. Default: False
  - aot_joint_graph: Whether to emit the joint forward-backward graph generated by AOTAutograd. Default: False
  - compiled_autograd: Whether to emit logs from compiled_autograd. Default: False
  - ddp_graphs: Whether to emit graphs generated by DDPOptimizer. Default: False
  - graph: Whether to emit the graph captured by TorchDynamo in tabular format. Default: False
  - graph_code: Whether to emit the Python source of the graph captured by TorchDynamo. Default: False
  - graph_breaks: Whether to emit a message when a unique graph break is encountered during TorchDynamo tracing. Default: False
  - guards: Whether to emit the guards generated by TorchDynamo for each compiled function. Default: False
  - recompiles: Whether to emit a guard failure reason and message every time TorchDynamo recompiles a function. Default: False
  - output_code: Whether to emit the TorchInductor output code. Default: False
  - schedule: Whether to emit the TorchInductor schedule. Default: False
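The [+-]<component> syntax described above can be illustrated with a small, self-contained sketch. This is not PyTorch's actual parser (the real artifact registry lives in _registrations.py), and the function name parse_torch_logs is hypothetical; it only models the documented behavior: an unprefixed component gets logging.INFO, + lowers the level to logging.DEBUG, - raises it to logging.ERROR, and prefixes on artifacts are ignored.

```python
import logging

def parse_torch_logs(value):
    """Illustrative sketch: split a TORCH_LOGS-style string into
    per-component log levels and a set of enabled artifacts."""
    # Hypothetical subset of the artifact registry for demonstration.
    artifacts = {"bytecode", "aot_graphs", "graph", "graph_code", "schedule"}
    levels, enabled_artifacts = {}, set()
    for item in value.split(","):
        item = item.strip()
        prefix, name = ("", item)
        if item[:1] in "+-":
            prefix, name = item[0], item[1:]
        if name in artifacts:
            # +/- prefixes are a no-op for artifacts; presence enables them.
            enabled_artifacts.add(name)
        elif prefix == "+":
            levels[name] = logging.DEBUG   # more verbose
        elif prefix == "-":
            levels[name] = logging.ERROR   # less verbose
        else:
            levels[name] = logging.INFO    # unprefixed component
    return levels, enabled_artifacts

levels, arts = parse_torch_logs("+dynamo,-inductor,aot_graphs")
# levels: dynamo at DEBUG, inductor at ERROR; arts: {"aot_graphs"}
```

The sketch mirrors the first two examples below: a + prefix maps to logging.DEBUG, a - prefix to logging.ERROR, and a bare artifact name simply toggles that artifact on.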
- Examples:
  - TORCH_LOGS="+dynamo,aot" will set the log level of TorchDynamo to logging.DEBUG and AOT to logging.INFO
  - TORCH_LOGS="-dynamo,+inductor" will set the log level of TorchDynamo to logging.ERROR and TorchInductor to logging.DEBUG
  - TORCH_LOGS="aot_graphs" will enable the aot_graphs artifact
  - TORCH_LOGS="+dynamo,schedule" will set the log level of TorchDynamo to logging.DEBUG and enable the schedule artifact
  - TORCH_LOGS="+some.random.module,schedule" will set the log level of some.random.module to logging.DEBUG and enable the schedule artifact