Codspeed - performance benchmarks #471

Open

tony wants to merge 6 commits into master from pytest-codspeed

Conversation

@tony (Member) commented Sep 22, 2024 (edited by sourcery-ai bot)

Note

Problem

It's difficult to catch performance regressions or improvements over time, within a PR, and so on.

Changes

Add performance benchmarks

TBD

Set up CodSpeed

Configure the project on the CodSpeed website, set the repository secret, etc.

py(deps[test]) Add pytest-codspeed

See also:

Summary by Sourcery

Add performance benchmarks using Codspeed. Integrate Codspeed into the CI workflow to automatically run performance tests and report results.

CI:

  • Integrate Codspeed into the CI workflow to trigger performance tests on pull requests and pushes, and on demand via manual dispatch.

Tests:

  • Add pytest-codspeed to enable performance testing.
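
For orientation, here is a minimal sketch of what a pytest-codspeed benchmark can look like once the dependency and CI step are in place. The test name and workload below are hypothetical and not part of this PR:

```python
import pytest


@pytest.mark.benchmark
def test_sum_of_squares_benchmark() -> None:
    """Hypothetical benchmark: with pytest-codspeed installed, marking a test
    with ``pytest.mark.benchmark`` lets CodSpeed measure the whole test body."""
    # Stand-in workload; the real PR benchmarks an existing libvcs test instead.
    total = sum(i * i for i in range(10_000))
    assert total == 333283335000
```

Run locally this behaves like an ordinary test; CodSpeed measurements are collected only when pytest is invoked with `--codspeed`, as the CI step does.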

@codecov (bot) commented Sep 22, 2024 (edited)

Codecov Report

Attention: Patch coverage is 83.64780% with 26 lines in your changes missing coverage. Please review.

Project coverage is 64.09%. Comparing base (bc6e897) to head (4ac41d3).

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/libvcs/pytest_plugin.py | 80.30% | 16 Missing and 10 partials ⚠️ |
Additional details and impacted files
```
@@            Coverage Diff             @@
##           master     #471      +/-   ##
==========================================
+ Coverage   63.85%   64.09%   +0.24%
==========================================
  Files          40       40
  Lines        3591     3724     +133
  Branches      774      790      +16
==========================================
+ Hits         2293     2387      +94
- Misses        772      800      +28
- Partials      526      537      +11
```

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

tony force-pushed the pytest-codspeed branch 3 times, most recently from 61f4aa8 to 5ec5337 on September 22, 2024 11:24
tony force-pushed the pytest-codspeed branch 5 times, most recently from 9265999 to 99fa170 on October 11, 2024 19:00
tony force-pushed the pytest-codspeed branch 2 times, most recently from f649e7f to d0e1548 on October 12, 2024 14:53
tony force-pushed the master branch 2 times, most recently from 9b797e7 to bc6e897 on October 12, 2024 15:51
`workflow_dispatch` allows CodSpeed to trigger backtest performance analysis in order to generate initial data.

See also: https://docs.codspeed.io/ci/github-actions#2-create-the-benchmarks-workflow
tests/sync/test_git.py:11: error: Skipping analyzing "pytest_codspeed.plugin": module is installed, but missing library stubs or py.typed marker  [import-untyped]
tests/sync/test_git.py:11: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
tony added a commit that referenced this pull request on Oct 12, 2024:
# Problem

Git, Mercurial, and Subversion repositories are unnecessarily reinitialized for each test.

- We're not utilizing session-based scoping.
  - A single initial repo could be created, then copied to [`tmp_path`](https://docs.pytest.org/en/8.3.x/how-to/tmp_path.html#the-tmp-path-fixture) using [`shutil.copytree`](https://docs.python.org/3/library/shutil.html#shutil.copytree) ([source](https://github.com/python/cpython/blob/v3.13.0/Lib/shutil.py#L550-L605)).

Issue #471 highlighted this inefficiency, where benchmarks showed tens of thousands of redundant function calls.

# Improvement

```
❯ hyperfine -L branch master,pytest-plugin-fixture-caching 'git checkout {branch} && py.test'
Benchmark 1: git checkout master && py.test
  Time (mean ± σ):     32.062 s ±  0.869 s    [User: 41.391 s, System: 9.931 s]
  Range (min … max):   30.878 s … 33.583 s    10 runs

Benchmark 2: git checkout pytest-plugin-fixture-caching && py.test
  Time (mean ± σ):     14.659 s ±  0.495 s    [User: 16.351 s, System: 4.433 s]
  Range (min … max):   13.990 s … 15.423 s    10 runs

Summary
  git checkout pytest-plugin-fixture-caching && py.test ran
    2.19 ± 0.09 times faster than git checkout master && py.test
```

# Changes

## Pytest fixtures overhaul

1. Create a base VCS repo.
2. For subsequent tests, copy and modify from this template.
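
A rough sketch of the fixture-caching idea described in that commit (hypothetical fixture names, not the actual libvcs plugin code): build one template repository per session, then give each test a cheap `shutil.copytree` copy.

```python
import shutil
import subprocess

import pytest


@pytest.fixture(scope="session")
def base_git_repo(tmp_path_factory: pytest.TempPathFactory):
    """Initialize a template git repository once per test session."""
    repo_path = tmp_path_factory.mktemp("cached") / "repo"
    subprocess.run(["git", "init", str(repo_path)], check=True)
    return repo_path


@pytest.fixture
def git_repo(base_git_repo, tmp_path):
    """Hand each test its own throwaway copy of the session-scoped template."""
    copy_path = tmp_path / "repo"
    shutil.copytree(base_git_repo, copy_path)
    return copy_path
```

Copying a pre-initialized directory avoids re-running `git init` (and any seeding steps) for every test, which is what the hyperfine comparison above measures.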
tony force-pushed the master branch 2 times, most recently from 7f69eab to c480bb4 on November 24, 2024 15:12
@tony (Member, Author) commented:

@sourcery-ai review


@sourcery-ai

Reviewer's Guide by Sourcery

This pull request introduces performance benchmarks using CodSpeed. It sets up CodSpeed configuration, adds the pytest-codspeed dependency, and integrates it into the testing workflow. A benchmark is added to the test_repo_git_obtain_initial_commit_repo test function.

Sequence diagram for benchmark execution flow

```mermaid
sequenceDiagram
    participant Dev as Developer
    participant CI as CI Pipeline
    participant CS as CodSpeed
    Dev->>CI: Push code changes
    activate CI
    CI->>CI: Run tests with pytest
    CI->>CI: Execute benchmarks
    CI->>CS: Send benchmark results
    activate CS
    CS->>CS: Analyze performance
    CS-->>Dev: Report performance changes
    deactivate CS
    deactivate CI
```

File-Level Changes

Change: Set up CodSpeed to collect performance benchmarks.
  • Added a CodSpeed test step to the GitHub Actions workflow.
  • Added the pytest-codspeed Python dependency.
  • Added a workflow_dispatch trigger to allow CodSpeed to perform backtests.
  • Added necessary environment variables for CodSpeed integration.
  • Added type overrides for pytest_codspeed in pyproject.toml.
Files: .github/workflows/tests.yml, pyproject.toml

Change: Added a performance benchmark to an existing test.
  • Added a benchmark to the test_repo_git_obtain_initial_commit_repo function using the BenchmarkFixture.
Files: tests/sync/test_git.py
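
Given that description, the benchmarked test presumably ends up shaped roughly like the sketch below. The signature and body are assumptions; the fixture import path follows the mypy note earlier in this thread.

```python
from pytest_codspeed.plugin import BenchmarkFixture


def test_repo_git_obtain_initial_commit_repo(
    benchmark: BenchmarkFixture,
    tmp_path,
) -> None:
    """Wrap the expensive operation in the benchmark fixture so CodSpeed
    times only that callable rather than the whole test."""

    def obtain() -> None:
        # Placeholder for the real clone/obtain call exercised in
        # tests/sync/test_git.py; omitted here because the diff is not shown.
        ...

    benchmark(obtain)
```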



@sourcery-ai bot left a comment


Hey @tony - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Remember to add the benchmark annotation/fixture as noted in your TODO. This is needed for proper benchmark function identification.
Here's what I looked at during the review:
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


