torch.hub#
Created On: Jun 13, 2025 | Last Updated On: Jun 13, 2025
PyTorch Hub is a pre-trained model repository designed to facilitate research reproducibility.
Publishing models#
PyTorch Hub supports publishing pre-trained models (model definitions and pre-trained weights) to a GitHub repository by adding a simple hubconf.py file.
hubconf.py can have multiple entrypoints. Each entrypoint is defined as a Python function (for example, a pre-trained model you want to publish).
```python
def entrypoint_name(*args, **kwargs):
    # args & kwargs are optional, for models which take positional/keyword arguments.
    ...
```
How to implement an entrypoint?#
Here is a code snippet that specifies an entrypoint for the resnet18 model, expanding the implementation in pytorch/vision/hubconf.py. In most cases importing the right function in hubconf.py is sufficient; here we just use the expanded version as an example to show how it works. You can see the full script in the pytorch/vision repo.
```python
dependencies = ['torch']
from torchvision.models.resnet import resnet18 as _resnet18

# resnet18 is the name of entrypoint
def resnet18(pretrained=False, **kwargs):
    """ # This docstring shows up in hub.help()
    Resnet18 model
    pretrained (bool): kwargs, load pretrained weights into the model
    """
    # Call the model, load pretrained weights
    model = _resnet18(pretrained=pretrained, **kwargs)
    return model
```
- The dependencies variable is a list of package names required to load the model. Note this might be slightly different from the dependencies required to train the model.
- args and kwargs are passed along to the real callable function.
- The docstring of the function works as a help message. It explains what the model does and what the allowed positional/keyword arguments are. It's highly recommended to add a few examples here.
- An entrypoint function can either return a model (nn.Module), or auxiliary tools to make the user workflow smoother, e.g. tokenizers.
- Callables prefixed with an underscore are considered helper functions which won't show up in torch.hub.list().
- Pretrained weights can either be stored locally in the GitHub repo, or be loadable by torch.hub.load_state_dict_from_url(). If less than 2GB, it's recommended to attach them to a project release and use the URL from the release.

In the example above torchvision.models.resnet.resnet18 handles pretrained; alternatively you can put the following logic in the entrypoint definition.
```python
if pretrained:
    # For checkpoint saved in local GitHub repo, e.g. <RELATIVE_PATH_TO_CHECKPOINT>=weights/save.pth
    dirname = os.path.dirname(__file__)
    checkpoint = os.path.join(dirname, <RELATIVE_PATH_TO_CHECKPOINT>)
    state_dict = torch.load(checkpoint)
    model.load_state_dict(state_dict)

    # For checkpoint saved elsewhere
    checkpoint = 'https://download.pytorch.org/models/resnet18-5c106cde.pth'
    model.load_state_dict(torch.hub.load_state_dict_from_url(checkpoint, progress=False))
```
Important Notice#
Published models must live in a branch or tag; they can't be loaded from a random commit.
Loading models from Hub#
PyTorch Hub provides convenient APIs to explore all available models in hub through torch.hub.list(), show docstrings and examples through torch.hub.help(), and load the pre-trained models using torch.hub.load().
- torch.hub.list(github, force_reload=False, skip_validation=False, trust_repo=None, verbose=True)[source]#
List all callable entrypoints available in the repo specified by github.
- Parameters
github (str) – a string with format "repo_owner/repo_name[:ref]" with an optional ref (tag or branch). If ref is not specified, the default branch is assumed to be main if it exists, and otherwise master. Example: 'pytorch/vision:0.10'
force_reload (bool, optional) – whether to discard the existing cache and force a fresh download. Default is False.
skip_validation (bool, optional) – if False, torch.hub will check that the branch or commit specified by the github argument properly belongs to the repo owner. This will make requests to the GitHub API; you can specify a non-default GitHub token by setting the GITHUB_TOKEN environment variable. Default is False.
trust_repo (bool, str or None) – "check", True, False or None. This parameter was introduced in v1.12 and helps ensure that users only run code from repos that they trust.
  - If False, a prompt will ask the user whether the repo should be trusted.
  - If True, the repo will be added to the trusted list and loaded without requiring explicit confirmation.
  - If "check", the repo will be checked against the list of trusted repos in the cache. If it is not present in that list, the behaviour will fall back onto the trust_repo=False option.
  - If None: this will raise a warning, inviting the user to set trust_repo to either False, True or "check". This is only present for backward compatibility and will be removed in v2.0.
  Default is None and will eventually change to "check" in v2.0.
verbose (bool, optional) – If False, mute messages about hitting local caches. Note that the message about first download cannot be muted. Default is True.
- Returns
The available callable entrypoints
- Return type
list
Example

```python
>>> entrypoints = torch.hub.list("pytorch/vision", force_reload=True)
```
- torch.hub.help(github, model, force_reload=False, skip_validation=False, trust_repo=None)[source]#
Show the docstring of entrypoint model.
- Parameters
github (str) – a string with format <repo_owner/repo_name[:ref]> with an optional ref (a tag or a branch). If ref is not specified, the default branch is assumed to be main if it exists, and otherwise master. Example: 'pytorch/vision:0.10'
model (str) – a string of entrypoint name defined in the repo's hubconf.py
force_reload (bool, optional) – whether to discard the existing cache and force a fresh download. Default is False.
skip_validation (bool, optional) – if False, torch.hub will check that the ref specified by the github argument properly belongs to the repo owner. This will make requests to the GitHub API; you can specify a non-default GitHub token by setting the GITHUB_TOKEN environment variable. Default is False.
trust_repo (bool, str or None) – "check", True, False or None. This parameter was introduced in v1.12 and helps ensure that users only run code from repos that they trust.
  - If False, a prompt will ask the user whether the repo should be trusted.
  - If True, the repo will be added to the trusted list and loaded without requiring explicit confirmation.
  - If "check", the repo will be checked against the list of trusted repos in the cache. If it is not present in that list, the behaviour will fall back onto the trust_repo=False option.
  - If None: this will raise a warning, inviting the user to set trust_repo to either False, True or "check". This is only present for backward compatibility and will be removed in v2.0.
  Default is None and will eventually change to "check" in v2.0.
Example

```python
>>> print(torch.hub.help("pytorch/vision", "resnet18", force_reload=True))
```
- torch.hub.load(repo_or_dir, model, *args, source='github', trust_repo=None, force_reload=False, verbose=True, skip_validation=False, **kwargs)[source]#
Load a model from a GitHub repo or a local directory.
Note: Loading a model is the typical use case, but this can also be used to load other objects such as tokenizers, loss functions, etc.
If source is 'github', repo_or_dir is expected to be of the form repo_owner/repo_name[:ref] with an optional ref (a tag or a branch).
If source is 'local', repo_or_dir is expected to be a path to a local directory.
- Parameters
repo_or_dir (str) – If source is 'github', this should correspond to a GitHub repo with format repo_owner/repo_name[:ref] with an optional ref (tag or branch), for example 'pytorch/vision:0.10'. If ref is not specified, the default branch is assumed to be main if it exists, and otherwise master. If source is 'local' then it should be a path to a local directory.
model (str) – the name of a callable (entrypoint) defined in the repo/dir's hubconf.py.
*args (optional) – the corresponding args for callable model.
source (str, optional) – 'github' or 'local'. Specifies how repo_or_dir is to be interpreted. Default is 'github'.
trust_repo (bool, str or None) – "check", True, False or None. This parameter was introduced in v1.12 and helps ensure that users only run code from repos that they trust.
  - If False, a prompt will ask the user whether the repo should be trusted.
  - If True, the repo will be added to the trusted list and loaded without requiring explicit confirmation.
  - If "check", the repo will be checked against the list of trusted repos in the cache. If it is not present in that list, the behaviour will fall back onto the trust_repo=False option.
  - If None: this will raise a warning, inviting the user to set trust_repo to either False, True or "check". This is only present for backward compatibility and will be removed in v2.0.
  Default is None and will eventually change to "check" in v2.0.
force_reload (bool, optional) – whether to force a fresh download of the GitHub repo unconditionally. Does not have any effect if source='local'. Default is False.
verbose (bool, optional) – If False, mute messages about hitting local caches. Note that the message about first download cannot be muted. Does not have any effect if source='local'. Default is True.
skip_validation (bool, optional) – if False, torch.hub will check that the branch or commit specified by the github argument properly belongs to the repo owner. This will make requests to the GitHub API; you can specify a non-default GitHub token by setting the GITHUB_TOKEN environment variable. Default is False.
**kwargs (optional) – the corresponding kwargs for callable model.
- Returns
The output of the model callable when called with the given *args and **kwargs.
Example

```python
>>> # from a github repo
>>> repo = "pytorch/vision"
>>> model = torch.hub.load(
...     repo, "resnet50", weights="ResNet50_Weights.IMAGENET1K_V1"
... )
>>> # from a local directory
>>> path = "/some/local/path/pytorch/vision"
>>> model = torch.hub.load(path, "resnet50", weights="ResNet50_Weights.DEFAULT")
```
- torch.hub.download_url_to_file(url, dst, hash_prefix=None, progress=True)[source]#
Download object at the given URL to a local path.
- Parameters
url (str) – URL of the object to download
dst (str) – Full path where object will be saved, e.g. /tmp/temporary_file
hash_prefix (str, optional) – If not None, the SHA256 hash of the downloaded file should start with hash_prefix. Default: None
progress (bool, optional) – whether or not to display a progress bar to stderr. Default: True
Example

```python
>>> torch.hub.download_url_to_file(
...     "https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth",
...     "/tmp/temporary_file",
... )
```
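The hash_prefix verification can be sketched as follows; check_hash_prefix is a hypothetical helper for illustration, not the actual torch.hub code:

```python
import hashlib
import tempfile

def check_hash_prefix(path, hash_prefix):
    # Hypothetical sketch: hash the file incrementally and check that the
    # SHA256 hex digest starts with the expected prefix, as
    # download_url_to_file does before accepting a download.
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest().startswith(hash_prefix)

# Write some stand-in "checkpoint" bytes to a temporary file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"fake checkpoint bytes")
    path = f.name

expected = hashlib.sha256(b"fake checkpoint bytes").hexdigest()[:8]
print(check_hash_prefix(path, expected))    # True
print(check_hash_prefix(path, "zzzzzzzz"))  # False -- download would be rejected
```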
- torch.hub.load_state_dict_from_url(url, model_dir=None, map_location=None, progress=True, check_hash=False, file_name=None, weights_only=False)[source]#
Loads the Torch serialized object at the given URL.
If the downloaded file is a zip file, it will be automatically decompressed.
If the object is already present in model_dir, it's deserialized and returned. The default value of model_dir is <hub_dir>/checkpoints where hub_dir is the directory returned by get_dir().
- Parameters
url (str) – URL of the object to download
model_dir (str, optional) – directory in which to save the object
map_location (optional) – a function or a dict specifying how to remap storage locations (see torch.load)
progress (bool, optional) – whether or not to display a progress bar to stderr. Default: True
check_hash (bool, optional) – If True, the filename part of the URL should follow the naming convention filename-<sha256>.ext where <sha256> is the first eight or more digits of the SHA256 hash of the contents of the file. The hash is used to ensure unique names and to verify the contents of the file. Default: False
file_name (str, optional) – name for the downloaded file. Filename from url will be used if not set.
weights_only (bool, optional) – If True, only weights will be loaded and no complex pickled objects. Recommended for untrusted sources. See load() for more details.
- Return type
Dict[str, Any]
Example

```python
>>> state_dict = torch.hub.load_state_dict_from_url(
...     "https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth"
... )
```
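When check_hash=True, the expected hash prefix is parsed out of the filename. A sketch of that parsing, using a regex in the spirit of what torch.hub does (the helper name is hypothetical):

```python
import re

# Pull an expected hash prefix out of a filename that follows the
# filename-<sha256>.ext convention, e.g. resnet18-5c106cde.pth.
HASH_REGEX = re.compile(r"-([a-f0-9]*)\.")

def hash_prefix_from_filename(filename):
    # Return the hex hash prefix embedded in the filename, or None
    # if the filename does not follow the convention.
    match = HASH_REGEX.search(filename)
    return match.group(1) if match else None

print(hash_prefix_from_filename("resnet18-5c106cde.pth"))  # 5c106cde
print(hash_prefix_from_filename("weights.pth"))            # None
```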
Running a loaded model#
Note that *args and **kwargs in torch.hub.load() are used to instantiate a model. After you have loaded a model, how can you find out what you can do with the model? A suggested workflow is:
- dir(model) to see all available methods of the model.
- help(model.foo) to check what arguments model.foo takes to run.
To help users explore without referring to documentation back and forth, we stronglyrecommend repo owners make function help messages clear and succinct. It’s also helpfulto include a minimal working example.
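The suggested workflow can be tried out as follows. FakeModel is a stand-in for an object returned by torch.hub.load(), since loading a real hub model requires network access:

```python
class FakeModel:
    """Stand-in for a model returned by torch.hub.load()."""

    def forward(self, x):
        """forward(x): run inference on input x."""
        return x

    def _private_buffer(self):
        pass  # underscore-prefixed: an internal detail

model = FakeModel()

# dir(model) lists everything; filter out the private/dunder names to see
# the methods the repo owner intends users to call.
methods = [name for name in dir(model) if not name.startswith("_")]
print(methods)  # ['forward']

# help(model.forward) prints the docstring -- the repo owner's help message.
help(model.forward)
```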
Where are my downloaded models saved?#
The locations are used in the order of:
- calling hub.set_dir(<PATH_TO_HUB_DIR>)
- $TORCH_HOME/hub, if environment variable TORCH_HOME is set.
- $XDG_CACHE_HOME/torch/hub, if environment variable XDG_CACHE_HOME is set.
- ~/.cache/torch/hub
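That lookup order can be sketched as a pure-Python function (a simplified illustration, not the actual torch.hub.get_dir() source):

```python
import os

def resolve_hub_dir(manually_set_dir=None, env=os.environ):
    # 1. A directory set via hub.set_dir() wins.
    if manually_set_dir is not None:
        return manually_set_dir
    # 2. $TORCH_HOME/hub, if TORCH_HOME is set.
    if "TORCH_HOME" in env:
        return os.path.join(env["TORCH_HOME"], "hub")
    # 3. $XDG_CACHE_HOME/torch/hub, if XDG_CACHE_HOME is set.
    if "XDG_CACHE_HOME" in env:
        return os.path.join(env["XDG_CACHE_HOME"], "torch", "hub")
    # 4. Fall back to ~/.cache/torch/hub.
    return os.path.join(os.path.expanduser("~"), ".cache", "torch", "hub")

print(resolve_hub_dir(env={"TORCH_HOME": "/opt/torch"}))  # /opt/torch/hub
print(resolve_hub_dir(env={}))  # falls back to ~/.cache/torch/hub
```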
- torch.hub.get_dir()[source]#
Get the Torch Hub cache directory used for storing downloaded models & weights.
If set_dir() is not called, the default path is $TORCH_HOME/hub where environment variable $TORCH_HOME defaults to $XDG_CACHE_HOME/torch. $XDG_CACHE_HOME follows the X Desktop Group specification of the Linux filesystem layout, with a default value of ~/.cache if the environment variable is not set.
- Return type
str
Caching logic#
By default, we don't clean up files after loading them. Hub uses the cache by default if it already exists in the directory returned by get_dir().
Users can force a reload by calling hub.load(..., force_reload=True). This will delete the existing GitHub folder and downloaded weights, and reinitialize a fresh download. This is useful when updates are published to the same branch, so users can keep up with the latest release.
Known limitations#
Torch hub works by importing the package as if it was installed. There are some side effects introduced by importing in Python. For example, you can see new items in the Python caches sys.modules and sys.path_importer_cache, which is normal Python behavior. This also means that you may run into import errors when importing different models from different repos, if the repos have the same sub-package names (typically, a model subpackage). A workaround for these kinds of import errors is to remove the offending sub-package from the sys.modules dict; more details can be found in this GitHub issue.
A known limitation that is worth mentioning here: users CANNOT load two different branches of the same repo in the same Python process. It's just like installing two packages with the same name in Python, which is not good. The cache might join the party and give you surprises if you actually try that. Of course, it's totally fine to load them in separate processes.
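The workaround mentioned above — removing the offending sub-package from sys.modules between loads — can be sketched as follows (the helper name is hypothetical):

```python
import sys
import types

def purge_modules(prefix):
    # Hypothetical helper: drop a package and all its sub-modules from
    # sys.modules so the next torch.hub.load() re-imports them fresh,
    # instead of reusing a clashing module from a previous repo.
    for name in list(sys.modules):
        if name == prefix or name.startswith(prefix + "."):
            del sys.modules[name]

# Simulate two repos that both ship a sub-package called "models".
sys.modules["models"] = types.ModuleType("models")
sys.modules["models.resnet"] = types.ModuleType("models.resnet")

purge_modules("models")
print("models" in sys.modules)         # False
print("models.resnet" in sys.modules)  # False
```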