torch.cuda.memory.reset_max_memory_cached#

torch.cuda.memory.reset_max_memory_cached(device=None)[source]#

Reset the starting point in tracking maximum GPU memory managed by the caching allocator for a given device.

See max_memory_cached() for details.

Parameters

device (torch.device or int, optional) – selected device. Returns statistic for the current device, given by current_device(), if device is None (default).

Warning

This function now calls reset_peak_memory_stats(), which resets *all* peak memory stats.
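As a sketch of typical usage, the snippet below (assuming a CUDA-capable machine) allocates a tensor, reads the peak reserved (cached) memory, and then resets the peak statistics. Because of the behavior noted in the warning above, calling reset_peak_memory_stats() directly is the more explicit choice; the tensor shape and device string here are illustrative only.

```python
import torch

if torch.cuda.is_available():
    # Allocate something so the caching allocator reserves memory.
    x = torch.randn(1024, 1024, device="cuda")
    del x  # freed back to the caching allocator, but still reserved

    peak = torch.cuda.max_memory_reserved()
    print(f"peak reserved bytes: {peak}")

    # reset_max_memory_cached() forwards to reset_peak_memory_stats(),
    # which resets *all* peak memory stats, so call the latter directly.
    torch.cuda.reset_peak_memory_stats()
```

On a machine without CUDA the block is a no-op, since all of these calls require an initialized CUDA context.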

Note

See Memory management for more details about GPU memory management.