gh-127833: Add links to token types to the lexical analysis intro #131468
Conversation
Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com>
45bb5ba
into python:main
Thanks @encukou for the PR, and @AA-Turner for merging it 🌮🎉. I'm working now to backport this PR to: 3.14.
…ro (pythonGH-131468)
(cherry picked from commit 45bb5ba)
Co-authored-by: Petr Viktorin <encukou@gmail.com>
Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com>
GH-133652 is a backport of this pull request to the 3.14 branch.
In the first part of the “lexical analysis” document, add links to token type documentation in the token module. Add a section on the ENDMARKER token.
(I plan to send separate PR(s) for the rest of the document.)
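As a quick illustration of the token the new docs section covers (not part of the PR itself), here is a minimal sketch using the standard tokenize and token modules to show that ENDMARKER is emitted once the input is exhausted:

```python
import io
import token
import tokenize

# Tokenize a tiny source snippet; the lexer emits ENDMARKER as the
# final token once the input is exhausted.
source = "x = 1\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

for tok in tokens:
    print(token.tok_name[tok.type], repr(tok.string))

# The last token is always ENDMARKER.
assert tokens[-1].type == token.ENDMARKER
```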