Async and sync compatible wrapper #1036
Make all functions async
Use httpx instead of requests
Make decorators async to correctly wrap async methods
Change _http_auth to the direct httpx.AsyncClient.auth property
Fix some errors that are met on the way to async
This is possibly a temporary solution, because httpx raises TypeError when it tries to init DataField with a bool value
feat: httpx based python-gitlab
on_http_error should work with sync functions that return coroutines and handle errors when the coroutines are awaited
Also provide an async and sync interface to interact with GitlabList
Provide the base of the gitlab client along with implementations of the sync and async interfaces
The basic principle is that we won't use the constructor to create the object; instead, a classmethod is used that returns either the object or a coroutine that returns the object
feat(async): async and sync compatible wrapper
chore: provide docstrings for gitlab client implementations
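The classmethod-factory principle mentioned above could look roughly like this. This is a minimal sketch: the class names and the `create` method are illustrative, not the PR's actual API.

```python
import asyncio

class Gitlab:
    def __init__(self, url):
        self.url = url

    @classmethod
    def create(cls, url):
        # The sync variant returns the constructed object directly.
        return cls(url)

class AsyncGitlab(Gitlab):
    @classmethod
    async def create(cls, url):
        # The async variant is a coroutine: calling it returns an
        # awaitable that resolves to the constructed object, which
        # leaves room for awaitable setup (e.g. an initial auth request).
        return cls(url)

gl = Gitlab.create("https://gitlab.example.com")
agl = asyncio.run(AsyncGitlab.create("https://gitlab.example.com"))
```

The payoff of a factory over `__init__` is that both clients are created with the same call shape; only whether you `await` the result differs.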
vishes-shell commented Mar 5, 2020 • edited
Here are some notable changes:
I just noticed something:
That's really cool :)
@vishes-shell Thanks for all the work that you're doing! People could have been using chunk_size, by providing a different … But we have nothing about this officially in the docs, so I would say we can solve this by mentioning it in the release notes. We need to bump the version to 3.X anyway.
@max-wittig I tried earlier, and at least in some cases it can even be just:

```python
project = gl.projects.get(1)
export = project.exports.create({})
dl = export.download(chunk_size=512)
```

From what I see in the current solution by @vishes-shell, this would still work and wouldn't break anything; the argument would just be ignored by
requests>=2.22.0
respx>=0.12.1,<0.13
pytest
pytest-asyncio
graingert commented Jun 21, 2021 • edited
Can you use the anyio pytest plugin so you test on both trio and asyncio? The latest httpx depends on anyio, which bundles this plugin.
darkdragon-001 commented Sep 16, 2021 • edited
What's the current state?
darkdragon-001 commented Sep 16, 2021 • edited
When using your library, I get the following warning:
Apart from that, it works like a charm 😍
darkdragon-001 commented Sep 16, 2021
Btw:
graingert commented Sep 16, 2021
You'll need to use the client with
darkdragon-001 commented Dec 21, 2021
I have been using it for a few months without any problems.
lmilbaum commented Oct 13, 2022
As part of onboarding myself into this project, I stumbled upon this PR. Could someone provide a use case where the async wrapper would be used?
darkdragon-001 commented Oct 13, 2022
@lmilbaum We are evaluating the jobs from the last day via a script, so that our QA team has a good overview of possible regressions or problems with our hardware test boxes (e.g. when the same code runs on one runner but not on another). As the network is slow, waiting for the answer to the previous request before submitting the next one takes quite a lot of time. If we can submit all requests asynchronously and just wait for all of them to finish in parallel, we reduce the run time of our script many times over. Unfortunately, this MR is outdated and newer features are not available yet...
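The speedup described above comes from submitting all requests at once and awaiting them together. A minimal sketch with `asyncio.gather`; `fetch_job` is a stand-in for an awaitable call on the async client (e.g. fetching a CI job), not the wrapper's real API:

```python
import asyncio

async def fetch_job(client, job_id):
    # Placeholder for an awaitable API call; sleep(0) simulates
    # yielding to the event loop during network I/O.
    await asyncio.sleep(0)
    return {"id": job_id, "status": "success"}

async def fetch_all(client, job_ids):
    # gather() runs the coroutines concurrently instead of awaiting
    # each response before submitting the next request, and returns
    # the results in submission order.
    return await asyncio.gather(*(fetch_job(client, j) for j in job_ids))

results = asyncio.run(fetch_all(None, range(5)))
```

With a sync client the total latency is roughly the sum of all round trips; with concurrent submission it approaches the latency of the slowest single request.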
The wrapper here https://github.com/pan-net-security/aio-gitlab/ has a few points similar to what @darkdragon-001 described above. But true, this PR would need quite a bit of rework, or could be used as a starting point for a new one, since the library has changed a bit since then. I personally have not yet had enough need for this to work on it. Another option would be to first switch from requests to httpx for the sync client (with a breaking change, because we're currently allowing people to bring their own requests sessions) and smooth out any kinks, then add async later, to make the transition less painful.
lmilbaum commented Oct 13, 2022
@darkdragon-001 and @nejch, thanks for your feedback. Now I can better understand the need for an async capability.
@lmilbaum thanks for looking into this! I'd keep this PR open, even if only as a draft, just so it stays on our radar, as there's a lot of useful discussion and work done here. And just so we remember to credit @vishes-shell as author, whether they want to continue working on this or not. I'd maybe open a new issue to plan the switch to httpx, as that would be for v4.0.0. We could then get rid of
This Pull Request (PR) was marked stale because it has been open 90 days with no activity. Please remove the stale label or comment on this PR. Otherwise, it will be closed in 15 days.
I've rebuilt the wrapper to support async and sync interfaces on top of httpx. There are two gitlab clients: Gitlab and AsyncGitlab. The interfaces are the same, except that most methods of AsyncGitlab return awaitable objects. Related #1025
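The "same interface, awaitable results" idea described above can be sketched like this. The manager classes and method names here are illustrative mocks, not the PR's actual implementation:

```python
import asyncio

class ProjectManager:
    def get(self, project_id):
        return {"id": project_id}

class AsyncProjectManager:
    # Same method name and signature, but the call returns an awaitable.
    async def get(self, project_id):
        return {"id": project_id}

class Gitlab:
    def __init__(self):
        self.projects = ProjectManager()

class AsyncGitlab:
    def __init__(self):
        self.projects = AsyncProjectManager()

# Sync usage: the call returns the result directly.
project = Gitlab().projects.get(1)

# Async usage: the same call must be awaited.
async_project = asyncio.run(AsyncGitlab().projects.get(1))
```

Keeping the two interfaces identical apart from `await` means existing sync code maps one-to-one onto the async client.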