# Codspeed - performance benchmarks #471
## Conversation
**codecov** bot commented Sep 22, 2024 (edited):
**Codecov Report**

Attention: Patch coverage is …

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master     #471      +/-   ##
==========================================
+ Coverage   63.85%   64.09%   +0.24%
==========================================
  Files          40       40
  Lines        3591     3724     +133
  Branches      774      790      +16
==========================================
+ Hits         2293     2387      +94
- Misses        772      800      +28
- Partials     526      537      +11
```

☔ View full report in Codecov by Sentry.
`workflow_dispatch` allows CodSpeed to trigger backtest performance analysis in order to generate initial data.

See also: https://docs.codspeed.io/ci/github-actions#2-create-the-benchmarks-workflow
```
tests/sync/test_git.py:11: error: Skipping analyzing "pytest_codspeed.plugin": module is installed, but missing library stubs or py.typed marker  [import-untyped]
tests/sync/test_git.py:11: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
```
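One way to address this (a sketch, assuming the suppression is done inline rather than via an `ignore_missing_imports` override for `pytest_codspeed.*` in the mypy configuration, which would also work) is an explicit error-code ignore on the offending import:

```python
# Hypothetical excerpt from tests/sync/test_git.py; the error-code ignore
# suppresses mypy's missing-stubs diagnostic for this single import.
from pytest_codspeed.plugin import BenchmarkFixture  # type: ignore[import-untyped]
```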
# Problem

Git, Mercurial, and Subversion repositories are unnecessarily reinitialized for each test.

- We're not utilizing session-based scoping.
- A single initial repo could be created, then copied to [`tmp_path`](https://docs.pytest.org/en/8.3.x/how-to/tmp_path.html#the-tmp-path-fixture) using [`shutil.copytree`](https://docs.python.org/3/library/shutil.html#shutil.copytree) ([source](https://github.com/python/cpython/blob/v3.13.0/Lib/shutil.py#L550-L605)).

Issue #471 highlighted this inefficiency, where benchmarks showed tens of thousands of redundant function calls.

# Improvement

```
❯ hyperfine -L branch master,pytest-plugin-fixture-caching 'git checkout {branch} && py.test'
Benchmark 1: git checkout master && py.test
  Time (mean ± σ):     32.062 s ±  0.869 s    [User: 41.391 s, System: 9.931 s]
  Range (min … max):   30.878 s … 33.583 s    10 runs

Benchmark 2: git checkout pytest-plugin-fixture-caching && py.test
  Time (mean ± σ):     14.659 s ±  0.495 s    [User: 16.351 s, System: 4.433 s]
  Range (min … max):   13.990 s … 15.423 s    10 runs

Summary
  git checkout pytest-plugin-fixture-caching && py.test ran
    2.19 ± 0.09 times faster than git checkout master && py.test
```

# Changes

## Pytest fixtures overhaul

1. Create a base VCS repo.
2. For subsequent tests, copy and modify from this template (sketched below).
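A minimal sketch of the template-and-copy approach, assuming plain `git init` for the template; the fixture names here are illustrative, not this PR's actual code:

```python
import pathlib
import shutil
import subprocess

import pytest


@pytest.fixture(scope="session")
def git_repo_template(tmp_path_factory: pytest.TempPathFactory) -> pathlib.Path:
    """Initialize a template git repo once per test session."""
    template = tmp_path_factory.mktemp("templates") / "git_repo"
    template.mkdir()
    # ... seed the repo here: initial commits, branches, etc. ...
    subprocess.run(["git", "init", str(template)], check=True)
    return template


@pytest.fixture
def git_repo(
    git_repo_template: pathlib.Path, tmp_path: pathlib.Path
) -> pathlib.Path:
    """Hand each test a fresh copy instead of reinitializing from scratch."""
    copy = tmp_path / "git_repo"
    shutil.copytree(git_repo_template, copy)
    return copy
```

Session scoping ensures the expensive initialization runs only once, while `shutil.copytree` into `tmp_path` still gives each test an isolated working copy.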
**tony** commented Jan 11, 2025:
@sourcery-ai review
## Reviewer's Guide by Sourcery

This pull request introduces performance benchmarks using CodSpeed. It sets up CodSpeed configuration, adds the …

### Sequence diagram for benchmark execution flow

```mermaid
sequenceDiagram
    participant Dev as Developer
    participant CI as CI Pipeline
    participant CS as CodSpeed
    Dev->>CI: Push code changes
    activate CI
    CI->>CI: Run tests with pytest
    CI->>CI: Execute benchmarks
    CI->>CS: Send benchmark results
    activate CS
    CS->>CS: Analyze performance
    CS-->>Dev: Report performance changes
    deactivate CS
    deactivate CI
```

### File-Level Changes
Hey @tony - I've reviewed your changes - here's some feedback:

Overall Comments:

- Remember to add the `benchmark` annotation/fixture as noted in your TODO. This is needed for proper benchmark function identification.
Here's what I looked at during the review:

- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
> **Note**
> TBD: `benchmark` (`BenchmarkFixture`) annotation

# Problem

It's difficult to catch performance degradation or improvements over time, in a PR, etc.

# Changes

Add performance benchmarks (TBD):

- Set up CodSpeed: configure on the website, set the secret, etc.
- py(deps[test]): Add `pytest-codspeed`

See also:
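For illustration, a minimal sketch of what a `pytest-codspeed` benchmark could look like once the `benchmark` (`BenchmarkFixture`) fixture is applied; the test body is hypothetical, not one of this PR's actual tests:

```python
import pathlib
import shutil
import uuid

from pytest_codspeed.plugin import BenchmarkFixture


def test_copytree_benchmark(
    benchmark: BenchmarkFixture, tmp_path: pathlib.Path
) -> None:
    """Only the callable passed to ``benchmark`` is measured by CodSpeed."""
    src = tmp_path / "src"
    src.mkdir()
    (src / "file.txt").write_text("hello")

    def _copy() -> None:
        # Copy to a unique destination each call, since copytree refuses
        # to overwrite an existing directory.
        shutil.copytree(src, tmp_path / uuid.uuid4().hex)

    benchmark(_copy)
```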
## Summary by Sourcery

Add performance benchmarks using CodSpeed. Integrate CodSpeed into the CI workflow to automatically run performance tests and report results.

CI:

Tests:

- Add `pytest-codspeed` to enable performance testing.