Set upload limits consistently #43
Open
QuLogic wants to merge 1 commit into matplotlib:main from QuLogic:limits
Conversation
We previously checked that the content was below GitHub's 25M limit, but this was done in the request handler. `aiohttp` _already_ checks the content size and has a limit of 1 MiB.

Instead, set the limit for `aiohttp` and for Caddy directly. Though the latter is redundant, it's possibly a bit more secure. Limiting upload to the regular site is also probably redundant since it goes to `file_server`, which supports no uploads, but better to cut that off early. CloudFlare also has a limit set, but it's at its minimum allowed value, which is 100MB.
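For reference, a minimal sketch of what raising the `aiohttp` limit looks like; the `MAX_BODY_SIZE` value, route path, and handler name here are illustrative, not the actual code changed by this PR.

```python
from aiohttp import web

# Illustrative limit; the value actually used by the server may differ.
MAX_BODY_SIZE = 25 * 1024 * 1024  # 25 MiB

async def handle_webhook(request: web.Request) -> web.Response:
    # By the time this handler runs, aiohttp has already rejected any
    # request body larger than the application's client_max_size with
    # a 413 response, so no manual size check is needed here.
    payload = await request.json()
    return web.json_response({"received": len(payload)})

# client_max_size defaults to 1 MiB, which is why the oversized payload
# was rejected before reaching the handler's own check.
app = web.Application(client_max_size=MAX_BODY_SIZE)
app.add_routes([web.post("/webhook", handle_webhook)])

if __name__ == "__main__":
    web.run_app(app)
```

On the Caddy side, the analogous setting would presumably be the `request_body` directive's `max_size` option on the upload route, with a smaller cap on the routes that only serve static files.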
I noticed this with pushing the Plausible changes, where the list of files ended up being a 2.5MB JSON document, which failed before it even touched our code.