torch.cuda.memory.reset_max_memory_cached#
- torch.cuda.memory.reset_max_memory_cached(device=None)[source]#
Reset the starting point in tracking maximum GPU memory managed by the caching allocator for a given device.
See max_memory_cached() for details.
- Parameters
device (torch.device or int, optional) – selected device. Returns statistic for the current device, given by current_device(), if device is None (default).
Warning
This function now calls reset_peak_memory_stats(), which resets *all* peak memory stats.
Note
See Memory management for more details about GPU memory management.
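A minimal sketch of how this function might be used to measure the peak memory reserved by the caching allocator around a workload. The helper name `report_cached_peak` and the workload tensor are illustrative, not part of the API; the example guards on CUDA availability so it degrades gracefully on CPU-only machines.

```python
import torch

def report_cached_peak(device=None):
    # Illustrative helper: reset the peak "cached" (reserved) memory counter,
    # run a small workload, then read the new peak in bytes.
    # Note: per the warning above, this resets ALL peak memory stats,
    # since it delegates to reset_peak_memory_stats().
    if not torch.cuda.is_available():
        return None  # no GPU available: nothing to measure
    torch.cuda.reset_max_memory_cached(device)
    x = torch.empty(1024, 1024, device="cuda")   # ~4 MiB allocation to move the peak
    peak = torch.cuda.max_memory_cached(device)  # bytes reserved by the caching allocator
    del x
    return peak

peak = report_cached_peak()
print(peak)
```

On a CUDA machine this prints a positive byte count; on a CPU-only machine it prints `None`.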