Important
This PEP is a historical document. The up-to-date, canonical spec, Inline script metadata, is maintained on the PyPA specs page.
See the PyPA specification update process for how to propose changes.
This PEP specifies a metadata format that can be embedded in single-file Python scripts to assist launchers, IDEs and other external tools which may need to interact with such scripts.
Python is routinely used as a scripting language, with Python scripts as a (better) alternative to shell scripts, batch files, etc. When Python code is structured as a script, it is usually stored as a single file and does not expect the availability of any other local code that may be used for imports. As such, it is possible to share it with others over arbitrary text-based means such as email, a URL to the script, or even a chat window. Code that is structured like this may live as a single file forever, never becoming a full-fledged project with its own directory and pyproject.toml file.
An issue that users encounter with this approach is that there is no standard mechanism to define metadata for tools whose job it is to execute such scripts. For example, a tool that runs a script may need to know which dependencies are required or the supported version(s) of Python.
There is currently no standard tool that addresses this issue, and this PEP does not attempt to define one. However, any tool that does address this issue will need to know what the runtime requirements of scripts are. By defining a standard format for storing such metadata, existing tools, as well as any future tools, will be able to obtain that information without requiring users to include tool-specific metadata in their scripts.
This PEP defines a mechanism for embedding metadata within the script itself, and not in an external file.
The metadata format is designed to be similar to the layout of data in the pyproject.toml file of a Python project directory, to provide a familiar experience for users who have experience writing Python projects. By using a similar format, we avoid unnecessary inconsistency between packaging tools, a common frustration expressed by users in the recent packaging survey.
The following are some of the use cases that this PEP wishes to support:
hatch run /path/to/script.py [args] and Hatch will manage the environment for that script. Such tools could be used as shebang lines on non-Windows systems e.g. #!/usr/bin/env hatch run.

This PEP defines a metadata comment block format loosely inspired [2] by reStructuredText Directives.
Any Python script may have top-level comment blocks that MUST start with the line # /// TYPE where TYPE determines how to process the content. That is: a single #, followed by a single space, followed by three forward slashes, followed by a single space, followed by the type of metadata. Blocks MUST end with the line # ///. That is: a single #, followed by a single space, followed by three forward slashes. The TYPE MUST only consist of ASCII letters, numbers and hyphens.
Every line between these two lines (# /// TYPE and # ///) MUST be a comment starting with #. If there are characters after the # then the first character MUST be a space. The embedded content is formed by taking away the first two characters of each line if the second character is a space, otherwise just the first character (which means the line consists of only a single #).
Precedence for an ending line # /// is given when the next line is not a valid embedded content line as described above. For example, the following is a single fully valid block:
# /// some-toml
# embedded-csharp = """
# /// <summary>
# /// text
# ///
# /// </summary>
# public class MyClass { }
# """
# ///
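For clarity (this illustration is not part of the original text), stripping the leading # and the following space from each interior line of the block above yields the embedded content shown below; only the final # /// line acts as the closing line, because the line that follows it is not a valid embedded content line:

embedded-csharp = """
/// <summary>
/// text
///
/// </summary>
public class MyClass { }
"""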
A starting line MUST NOT be placed between another starting line and its ending line. In such cases tools MAY produce an error. Unclosed blocks MUST be ignored.
When there are multiple comment blocks of the same TYPE defined, tools MUST produce an error.
Tools reading embedded metadata MAY respect the standard Python encoding declaration. If they choose not to do so, they MUST process the file as UTF-8.
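As a minimal, non-normative sketch, a tool that chooses to respect the encoding declaration could read the script with the standard library’s tokenize module, which detects a PEP 263 declaration and otherwise defaults to UTF-8:

import tokenize

def load_script_text(path: str) -> str:
    # tokenize.open() honors a "# -*- coding: ... -*-" declaration if present
    # and falls back to UTF-8, matching the behavior described above.
    with tokenize.open(path) as f:
        return f.read()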
This is the canonical regular expression that MAY be used to parse the metadata:
(?m)^# /// (?P<type>[a-zA-Z0-9-]+)$\s(?P<content>(^#(| .*)$\s)+)^# ///$
In circumstances where there is a discrepancy between the text specification and the regular expression, the text specification takes precedence.
Tools MUST NOT read from metadata blocks with types that have not been standardized by this PEP or future ones.
The first type of metadata block is named script which contains script metadata (dependency data and tool configuration).
This document MAY include top-level fields dependencies and requires-python, and MAY optionally include a [tool] table.
The [tool] table MAY be used by any tool, script runner or otherwise, to configure behavior. It has the same semantics as the tool table in pyproject.toml.
The top-level fields are:
dependencies: A list of strings that specifies the runtime dependencies of the script. Each entry MUST be a valid PEP 508 dependency.

requires-python: A string that specifies the Python version(s) with which the script is compatible. The value of this field MUST be a valid version specifier.

Script runners MUST error if the specified dependencies cannot be provided. Script runners SHOULD error if no version of Python that satisfies the specified requires-python can be provided.
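As a non-normative sketch (not part of the specification), a script runner might check requires-python against the running interpreter using the third-party packaging library:

import sys

from packaging.specifiers import SpecifierSet

def python_satisfies(requires_python: str) -> bool:
    # e.g. requires_python = ">=3.11"
    current = ".".join(str(part) for part in sys.version_info[:3])
    return current in SpecifierSet(requires_python)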
The following is an example of a script with embedded metadata:
# /// script
# requires-python = ">=3.11"
# dependencies = [
#   "requests<3",
#   "rich",
# ]
# ///

import requests
from rich.pretty import pprint

resp = requests.get("https://peps.python.org/api/peps.json")
data = resp.json()
pprint([(k, v["title"]) for k, v in data.items()][:10])
The following is an example of how to read the metadata on Python 3.11 or higher.
import re
import tomllib

REGEX = r'(?m)^# /// (?P<type>[a-zA-Z0-9-]+)$\s(?P<content>(^#(| .*)$\s)+)^# ///$'

def read(script: str) -> dict | None:
    name = 'script'
    matches = list(
        filter(lambda m: m.group('type') == name, re.finditer(REGEX, script))
    )
    if len(matches) > 1:
        raise ValueError(f'Multiple {name} blocks found')
    elif len(matches) == 1:
        content = ''.join(
            line[2:] if line.startswith('# ') else line[1:]
            for line in matches[0].group('content').splitlines(keepends=True)
        )
        return tomllib.loads(content)
    else:
        return None
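For illustration, a short usage sketch of the read() function above (the file name is hypothetical):

from pathlib import Path

metadata = read(Path("example.py").read_text(encoding="utf-8"))
if metadata is not None:
    print(metadata.get("dependencies", []))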
Often, tools such as package managers or dependency update automation in CI will edit dependencies. The following is a crude example of modifying the content using the tomlkit library.
import re

import tomlkit

REGEX = r'(?m)^# /// (?P<type>[a-zA-Z0-9-]+)$\s(?P<content>(^#(| .*)$\s)+)^# ///$'

def add(script: str, dependency: str) -> str:
    match = re.search(REGEX, script)
    content = ''.join(
        line[2:] if line.startswith('# ') else line[1:]
        for line in match.group('content').splitlines(keepends=True)
    )

    config = tomlkit.parse(content)
    config['dependencies'].append(dependency)
    new_content = ''.join(
        f'# {line}' if line.strip() else f'#{line}'
        for line in tomlkit.dumps(config).splitlines(keepends=True)
    )

    start, end = match.span('content')
    return script[:start] + new_content + script[end:]
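Similarly, a usage sketch of the add() helper above (again with a hypothetical file name):

from pathlib import Path

script_path = Path("example.py")
original = script_path.read_text(encoding="utf-8")
script_path.write_text(add(original, "numpy"), encoding="utf-8")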
Note that this example used a library that preserves TOML formatting. This is not a requirement for editing by any means but rather is a “nice to have” feature.
The following is an example of how to read a stream of arbitrary metadata blocks.
import re
from typing import Iterator

REGEX = r'(?m)^# /// (?P<type>[a-zA-Z0-9-]+)$\s(?P<content>(^#(| .*)$\s)+)^# ///$'

def stream(script: str) -> Iterator[tuple[str, str]]:
    for match in re.finditer(REGEX, script):
        yield match.group('type'), ''.join(
            line[2:] if line.startswith('# ') else line[1:]
            for line in match.group('content').splitlines(keepends=True)
        )
At the time of writing, the # /// script block comment starter does not appear in any Python files on GitHub. Therefore, there is little risk of existing scripts being broken by this PEP.
If a script containing embedded metadata is run using a tool that automatically installs dependencies, this could cause arbitrary code to be downloaded and installed in the user’s environment.
The risk here is part of the functionality of the tool being used to run the script, and as such should already be addressed by the tool itself. The only additional risk introduced by this PEP is if an untrusted script with embedded metadata is run, when a potentially malicious dependency or transitive dependency might be installed.
This risk is addressed by the normal good practice of reviewing code before running it. Additionally, tools may be able to provide locking functionality to ameliorate this risk.
To embed metadata in a script, define a comment block that starts with the line # /// script and ends with the line # ///. Every line between those two lines must be a comment and the full content is derived by removing the first two characters.
# /// script
# dependencies = [
#   "requests<3",
#   "rich",
# ]
# requires-python = ">=3.11"
# ///
The allowed fields are described in the following table:
| Field | Description | Tool behavior |
|-------|-------------|---------------|
| dependencies | A list of strings that specifies the runtime dependencies of the script. Each entry must be a valid PEP 508 dependency. | Tools will error if the specified dependencies cannot be provided. |
| requires-python | A string that specifies the Python version(s) with which the script is compatible. The value of this field must be a valid version specifier. | Tools might error if no version of Python that satisfies the constraint can be executed. |
In addition, a [tool] table is allowed. Details of what is permitted are similar to what is permitted in pyproject.toml, but precise information must be included in the documentation of the relevant tool.
It is up to individual tools whether or not their behavior is altered based on the embedded metadata. For example, not every script runner may be able to provide an environment for specific Python versions as defined by the requires-python field.
The tool table may be used by any tool, script runner or otherwise, to configure behavior.
Tools that support managing different versions of Python should attempt to use the highest available version of Python that is compatible with the script’s requires-python metadata, if defined.
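As a hedged, non-normative sketch (the list of available versions is hypothetical), such a tool might select an interpreter like this, again using the third-party packaging library:

from packaging.specifiers import SpecifierSet
from packaging.version import Version

def pick_interpreter(available: list[str], requires_python: str | None) -> str | None:
    # 'available' is whatever set of Python versions the tool manages,
    # e.g. ["3.10.14", "3.11.9", "3.12.4"].
    candidates = [Version(v) for v in available]
    if requires_python:
        spec = SpecifierSet(requires_python)
        candidates = [v for v in candidates if v in spec]
    # Prefer the highest compatible version, per the guidance above.
    return str(max(candidates)) if candidates else None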
The following is a list of tools that have expressed support for this PEP or have committed to implementing support should it be accepted:
This PEP considers there to be different types of users for whom Python code would live as single-file scripts:
PATH environment variable. Some examples:

PATH, for example, but are unlikely to be familiar with Python concepts like virtual environments. These users often operate in isolation and have limited need to gain exposure to tools intended for sharing like Git.

This PEP argues that the proposed TOML-based metadata format is the best for each category of user and that the requirements-like block comment is only approachable for those who have familiarity with requirements.txt, which represents a small subset of users.
requirements.txt. These users will very likely rely on snippets found online via a search engine or utilize AI in the form of a chat bot or direct code completion software. The similarity with dependency information stored in pyproject.toml will provide useful search results relatively quickly, and while the pyproject.toml format and the script metadata format are not the same, any resulting discrepancies are unlikely to be difficult for the intended users to resolve.

Additionally, these users are most susceptible to formatting quirks and syntax errors. TOML is a well-defined format with existing online validators that features assignment that is compatible with Python expressions and has no strict indenting rules. The block comment format on the other hand could be easily malformed by forgetting the colon, for example, and debugging why it’s not working with a search engine would be a difficult task for such a user.
requirements.txt. For either format they would have to read documentation. They would likely be more comfortable with TOML since they are used to structured data formats and there would be less perceived magic in their systems.

Additionally, for maintenance of their systems /// script would be much easier to search for from a shell than a block comment with potentially numerous extensions over time.
These users are responsible for the security of their systems and most likely have security scanners set up to automatically open PRs to update versions of dependencies. Automated tools like Dependabot would have a much easier time using existing TOML libraries than writing their own custom parser for a block comment format.
requirements.txt file, unless they are a Python programmer who has had previous experience with writing applications. In the case of experience with the requirements format, it necessarily means that they are at least somewhat familiar with the ecosystem and therefore it is safe to assume they know what TOML is.

Another benefit of this PEP to these users is that their IDEs like Visual Studio Code would be able to provide TOML syntax highlighting much more easily than each writing custom logic for this feature.
Additionally, since the original block comment alternative format (double #) went against the recommendation of PEP 8 and as a result linters and IDE auto-formatters that respected the recommendation would fail by default, the final proposal uses standard comments starting with a single # character without any obvious start nor end sequence.
The concept of regular comments that do not appear to be intended for machines (i.e. encoding declarations) affecting behavior would not be customary to users of Python and goes directly against the “explicit is better than implicit” foundational principle.
Users typing what to them looks like prose could alter runtime behavior. This PEP takes the view that the possibility of that happening, even when a tool has been set up as such (maybe by a sysadmin), is unfriendly to users.
Finally, and critically, the alternatives to this PEP like PEP 722 do not satisfy the use cases enumerated herein, such as setting the supported Python versions, the eventual building of scripts into packages, and the ability to have machines edit metadata on behalf of users. It is very likely that the requests for such features persist and conceivable that another PEP in the future would allow for the embedding of such metadata. At that point there would be multiple ways to achieve the same thing which goes against our foundational principle of “there should be one - and preferably only one - obvious way to do it”.
A previous version of this PEP proposed that the metadata be stored as follows:
__pyproject__ = """
...
"""
The most significant problem with this proposal is that the embedded TOML would be limited in the following ways:
r prefix requirement may be potentially confusing to users.

A previous version of this PEP proposed to reuse the existing metadata standard that is used to describe projects.
There are two significant problems with this proposal:
name and version fields are required and changing that would require its own PEP

By limiting the metadata to just dependencies, we would prevent the known use case of tools that support managing Python installations, which would allow users to target specific versions of Python for new syntax or standard library functionality.
By not allowing the [tool] table, we would prevent known functionality that would benefit users. For example:
gorun can do).

The author of the Rust RFC for embedding metadata mentioned to us that they are actively looking into that as well based on user feedback saying that there is unnecessary friction with managing small projects.
There has been a commitment to support this by at least one major build system.
A previous version of this PEP proposed that non-script running tools SHOULD NOT modify their behavior when the script is not the sole input to the tool. For example, if a linter is invoked with the path to a directory, it SHOULD behave the same as if zero files had embedded metadata.
This was done as a precaution to avoid tool behavior confusion and generating various feature requests for tools to support this PEP. However, during discussion we received feedback from maintainers of tools that this would be undesirable and potentially confusing to users. Additionally, this may allow for a universally easier way to configure tools in certain circumstances and solve existing issues.
Again, a key issue here is that the target audience for this proposal is people writing scripts which aren’t intended for distribution. Sometimes scripts will be “shared”, but this is far more informal than “distribution” - it typically involves sending a script via an email with some written instructions on how to run it, or passing someone a link to a GitHub gist.
Expecting such users to learn the complexities of Python packaging is a significant step up in complexity, and would almost certainly give the impression that “Python is too hard for scripts”.
In addition, if the expectation here is that the pyproject.toml will somehow be designed for running scripts in place, that’s a new feature of the standard that doesn’t currently exist. At a minimum, this isn’t a reasonable suggestion until the current discussion on Discourse about using pyproject.toml for projects that won’t be distributed as wheels is resolved. And even then, it doesn’t address the “sending someone a script in a gist or email” use case.
The idea would be to automatically recognize import statements in the source file and turn them into a list of requirements.
However, this is infeasible for several reasons. First, the points above about the necessity to keep the syntax easily parsable, for all Python versions, also by tools written in other languages, apply equally here.
Second, PyPI and other package repositories conforming to the Simple Repository API do not provide a mechanism to resolve package names from the module names that are imported (see also this related discussion).
Third, even if repositories did offer this information, the same import name may correspond to several packages on PyPI. One might object that disambiguating which package is wanted would only be needed if there are several projects providing the same import name. However, this would make it easy for anyone to unintentionally or malevolently break working scripts, by uploading a package to PyPI providing an import name that is the same as an existing project. The alternative where, among the candidates, the first package to have been registered on the index is chosen, would be confusing in case a popular package is developed with the same import name as an existing obscure package, and even harmful if the existing package is malware intentionally uploaded with a sufficiently generic import name that has a high probability of being reused.
A related idea would be to attach the requirements as comments to the import statements instead of gathering them in a block, with a syntax such as:
import numpy as np  # requires: numpy
import rich  # requires: rich
This still suffers from parsing difficulties. Also, where to place the comment in the case of multiline imports is ambiguous and may look ugly:
from PyQt5.QtWidgets import (
    QCheckBox, QComboBox, QDialog, QDialogButtonBox,
    QGridLayout, QLabel, QSpinBox, QTextEdit
)  # requires: PyQt5
Furthermore, this syntax cannot behave as might be intuitively expected in all situations. Consider:
import platform

if platform.system() == "Windows":
    import pywin32  # requires: pywin32
Here, the user’s intent is that the package is only required on Windows, but this cannot be understood by the script runner (the correct way to write it would be requires: pywin32; sys_platform == 'win32').
(Thanks to Jean Abou-Samra for the clear discussion of this point)
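For comparison (this example is an illustration added here, not part of the original discussion), the same conditional intent is directly expressible in the script metadata defined by this PEP, because each dependency entry is a full PEP 508 requirement that may carry an environment marker:

# /// script
# dependencies = [
#   "pywin32; sys_platform == 'win32'",
# ]
# ///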
Putting your requirements in a requirements file doesn’t require a PEP. You can do that right now, and in fact it’s quite likely that many ad hoc solutions do this. However, without a standard, there’s no way of knowing how to locate a script’s dependency data. And furthermore, the requirements file format is pip-specific, so tools relying on it are depending on a pip implementation detail.
So in order to make a standard, two things would be required:

1. A standardised replacement for the requirements file format.
2. A standardised way of locating the requirements file associated with a given script.
The first item is a significant undertaking. It has been discussed on a number of occasions, but so far no-one has attempted to actually do it. The most likely approach would be for standards to be developed for individual use cases currently addressed with requirements files. One option here would be for this PEP to define a new file format, simply a text file containing PEP 508 requirements, one per line. That would just leave the question of how to locate that file.
The “obvious” solution here would be to do something like name the file the same as the script, but with a .reqs extension (or something similar). However, this still requires two files, where currently only a single file is needed, and as such, does not match the “better batch file” model (shell scripts and batch files are typically self-contained). It requires the developer to remember to keep the two files together, and this may not always be possible. For example, system administration policies may require that all files in a certain directory are executable (the Linux filesystem standards require this of /usr/bin, for example). And some methods of sharing a script (for example, publishing it on a text file sharing service like GitHub’s gist, or a corporate intranet) may not allow for deriving the location of an associated requirements file from the script’s location (tools like pipx support running a script directly from a URL, so “download and unpack a zip of the script and its dependencies” may not be an appropriate requirement).
Essentially, though, the issue here is that there is an explicitly stated requirement that the format supports storing dependency data in the script file itself. Solutions that don’t do that are simply ignoring that requirement.
This would typically involve storing metadata as multiple special variables, such as the following.
__requires_python__ = ">=3.11"
__dependencies__ = [
    "requests",
    "click",
]
The most significant problem with this proposal is that it requires all consumers of the dependency data to implement a Python parser. Even if the syntax is restricted, the rest of the script will use the full Python syntax, and trying to define a syntax which can be successfully parsed in isolation from the surrounding code is likely to be extremely difficult and error-prone.
Furthermore, Python’s syntax changes in every release. If extracting dependency data needs a Python parser, the parser will need to know which version of Python the script is written for, and the overhead for a generic tool of having a parser that can handle multiple versions of Python is unsustainable.
With this approach there is the potential to clutter scripts with many variables as new extensions get added. Additionally, intuiting which metadata fields correspond to which variable names would cause confusion for users.
It is worth noting, though, that the pip-run utility does implement (an extended form of) this approach. Further discussion of the pip-run design is available on the project’s issue tracker.
These can be handled without needing special metadata and tooling, simply by adding the location of the dependencies to sys.path. This PEP simply isn’t needed for this case. If, on the other hand, the “local dependencies” are actual distributions which are published locally, they can be specified as usual with a PEP 508 requirement, and the local package index specified when running a tool by using the tool’s UI for that.
None at this point.
This document is placed in the public domain or under the CC0-1.0-Universal license, whichever is more permissive.
Source: https://github.com/python/peps/blob/main/peps/pep-0723.rst
Last modified: 2025-08-08 15:00:59 GMT