Repo: move shared-fixtures tests to ast-spec fixtures #6065

Closed
Assignees: JoshuaKGoldberg
Labels: accepting prs, repo maintenance, tests

Description (by @bradzacher)

Suggestion

In #3258 we added a new testing framework for our AST parsing tests.
We did this for a few reasons:

  1. it allows us to colocate tests alongside the AST definitions - making the ast-spec self-documenting, of sorts
    • i.e. "this AST node is produced by this code"
    • as an aside - in some future state we'd love to generate documentation on our website using the fixtures - making it truly self-documenting!
  2. it allows us to automatically diff against the babel AST to clearly document differences
    • in the shared-fixtures tests we rely on sets of manual patches to the AST to get our "alignment" tests to pass (which is obviously pretty hacky and hard to maintain)
    • in the shared-fixtures tests we also rely on a set of ignores to document and skip fixtures we know don't align.
      • This isn't great because there are no checks to enforce that an ignored fixture fails like it should (meaning it's easy to forget to un-ignore a fixture when it should pass)
      • This also doesn't document what's actually different - we usually comment with an explanation, but that doesn't document the actual AST difference!

With a new framework comes a migration! We need to create fixtures for all of our AST nodes to test all the many combinations!
There are already a number of examples present in the ast-spec package (for example all of the declaration nodes are done), but to walk through the process:

First, setup the fixture folder:

  1. open any AST node folder (any folder with a spec.ts, e.g. packages/ast-spec/src/declaration/ClassDeclaration).
  2. create a fixtures folder

Next, add "success" fixture cases:

  1. within your fixtures folder, create a folder with a short kebab-cased name that describes the case you're covering
    • For example, if you're testing the AST for a class declaration that implements one interface you might name it implements-one. Each fixture should only cover one hyper-targeted, specific case to keep things easy to debug and review.
  2. within your new folder, create a file called fixture.ts (e.g. packages/ast-spec/src/declaration/ClassDeclaration/implements-one/fixture.ts) and fill this file with your test case (see the sketch after this list).
    • As mentioned, the code should be hyper-targeted! Put another way - don't try to make your code "semantically" correct (i.e. it doesn't have to pass a type check!) - it just has to be syntactically correct.
      For example - in our ClassDeclaration implements-one test case, we shouldn't add a matching interface declaration!
  3. run the test to generate the snapshots: cd packages/ast-spec && yarn jest /fixtures
    • This will generate a snapshots subfolder next to your fixture.ts that contains a number of snapshots generated from your test. At the time of writing that's:
      1. 1-TSESTree-AST.shot - the AST output from parsing your fixture (parsed using our typescript-estree parser)
      2. 2-TSESTree-Tokens.shot - the AST tokens parsed out of your fixture (parsed using our typescript-estree parser)
      3. 3-Babel-AST.shot - same as (1) but parsed using babel
      4. 4-Babel-Tokens.shot - same as (2) but parsed using babel
      5. 5-AST-Alignment-AST.shot - the diff of (1) and (3)
      6. 6-AST-Alignment-Tokens.shot - the diff of (2) and (4)
    • If one of the parsers throws an error on your fixture then your fixture should instead be an error fixture (see below).
    • If there is a difference in the AST, you'll notice your fixture is auto-added to the snapshot fixtures-with-differences-errors.shot. This is intentional - it's essentially an automatically generated register of the places where our parser and babel's don't align!
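As a rough sketch of what step 2 above might produce for the implements-one example (the interface name Bar is arbitrary, and the real fixtures in the repo may differ):

```ts
// fixtures/implements-one/fixture.ts (inside the ClassDeclaration folder)
// Hyper-targeted: a single class declaration that implements exactly one interface.
// Note that Bar is deliberately left undeclared - the fixture only needs to be
// syntactically valid, it does not need to type-check.
class Foo implements Bar {}
```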

To add "error" fixture cases:

  1. within your fixtures folder, create a folder called _error_.
  2. Follow the above "success" guide (create a kebab-case folder within _error_ and then a fixture.ts within that folder, then run jest). A sketch of what such a fixture might contain follows after this list.
  3. The snapshots generated will be a little different - they're there to document the thrown errors instead of the AST output! So you'll see:
    1. 1-TSESTree-Error.shot - the error caught during the parse, or "NO ERROR" if no error was thrown (parsed using our typescript-estree parser)
    2. 2-Babel-Error.shot - same as (1) but parsed with babel
    3. 3-Alignment-Error.shot - a string describing the error state:
      • "Babel errored but TSESTree didn't"
      • "TSESTree errored but Babel didn't"
      • "Both errored"

When creating fixtures it's important to stress that each fixture should be as hyper-focused as you can make it! The more code there is, the more AST gets generated and the harder it is for a human to parse the AST output to find the relevant bits. More code also means more surface area for parser changes to impact, which means more snapshots needing unnecessary updates whenever we change the parser (which just creates noise!).

How to decide what to create a fixture for? The best way is to look at the spec for the AST node and try to create a fixture that targets each case for each property. For example ClassDeclaration has a boolean property abstract: boolean - so we want to ensure we have at least one fixture which generates a true value for the property, and one that generates a false value (see the sketch below). Again - fixtures are cheap and easy to create, so if in doubt, just add another fixture to cover a case you're thinking about.
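Continuing the ClassDeclaration example, that could translate into two tiny fixtures along these lines (the folder names abstract and non-abstract are made up for illustration):

```ts
// fixtures/abstract/fixture.ts
// Should produce a ClassDeclaration node with abstract: true.
abstract class Foo {}
```

```ts
// fixtures/non-abstract/fixture.ts
// Should produce a ClassDeclaration node with abstract: false.
class Foo {}
```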

Make sure you consider both good and bad cases! Even though our parser doesn't currently throw many syntax errors (#1852), it's good practice to also add bad code fixtures so that we can clearly document the expected current behaviour of our parser and how it aligns with babel!

Finally - don't endeavour to cover every single case in a single PR (unless you really, really want to) - feel free to just create one PR that covers one or a small number of nodes. The beauty of this framework is that we don't need to do everything at once!
