From 7366eb5dda0a127c2157e9ead92a34c3cc1e01bc Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 5 Jan 2026 10:42:46 +0000
Subject: [PATCH 001/141] Bump filelock from 3.20.1 to 3.20.2 (#11923)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [filelock](https://github.com/tox-dev/py-filelock) from 3.20.1 to
3.20.2.
Release notes, sourced from filelock's releases:

3.20.2 — Full Changelog: https://github.com/tox-dev/filelock/compare/3.20.1...3.20.2
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 52f834ac9c4..88ec7d2bc5b 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -71,7 +71,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.20.1
+filelock==3.20.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 61437065baa..72ce9da1aeb 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -69,7 +69,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.20.1
+filelock==3.20.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/lint.txt b/requirements/lint.txt
index a2a3710deb2..9f66ff67e30 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -29,7 +29,7 @@ distlib==0.4.0
# via virtualenv
exceptiongroup==1.3.1
# via pytest
-filelock==3.20.1
+filelock==3.20.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
From 951f089b1087e3f99f940664f45e907548972188 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 5 Jan 2026 10:50:22 +0000
Subject: [PATCH 002/141] Bump certifi from 2025.11.12 to 2026.1.4 (#11925)
Bumps [certifi](https://github.com/certifi/python-certifi) from
2025.11.12 to 2026.1.4.
Commits:

- c64d9f3 2026.01.04 (#389)
- 4ac232f Bump actions/download-artifact from 6.0.0 to 7.0.0 (#387)
- 95ae4b2 Update CI workflow to use Ubuntu 24.04 and Python 3.14 stable (#386)
- b72a7b1 Bump dessant/lock-threads from 5.0.1 to 6.0.0 (#385)
- ecc2672 Bump actions/upload-artifact from 5.0.0 to 6.0.0 (#384)
- 6a897db Bump peter-evans/create-pull-request from 7.0.11 to 8.0.0 (#383)
- 27ca98a Bump peter-evans/create-pull-request from 7.0.9 to 7.0.11 (#381)
- 56c59a6 Bump actions/checkout from 6.0.0 to 6.0.1 (#382)
- ae0021c Bump actions/setup-python from 6.0.0 to 6.1.0 (#380)
- ddf5d0b Bump actions/checkout from 5.0.1 to 6.0.0 (#378)
- Additional commits viewable in the compare view.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
4 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 88ec7d2bc5b..9e38e4743ba 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -38,7 +38,7 @@ brotli==1.2.0 ; platform_python_implementation == "CPython"
# via -r requirements/runtime-deps.in
build==1.3.0
# via pip-tools
-certifi==2025.11.12
+certifi==2026.1.4
# via requests
cffi==2.0.0
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 72ce9da1aeb..54b386c959b 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -38,7 +38,7 @@ brotli==1.2.0 ; platform_python_implementation == "CPython"
# via -r requirements/runtime-deps.in
build==1.3.0
# via pip-tools
-certifi==2025.11.12
+certifi==2026.1.4
# via requests
cffi==2.0.0
# via
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index ab23d91c402..108b6e8502e 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -10,7 +10,7 @@ alabaster==1.0.0
# via sphinx
babel==2.17.0
# via sphinx
-certifi==2025.11.12
+certifi==2026.1.4
# via requests
charset-normalizer==3.4.4
# via requests
diff --git a/requirements/doc.txt b/requirements/doc.txt
index 0efefba2adb..a04ef426099 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -10,7 +10,7 @@ alabaster==1.0.0
# via sphinx
babel==2.17.0
# via sphinx
-certifi==2025.11.12
+certifi==2026.1.4
# via requests
charset-normalizer==3.4.4
# via requests
From 7a833e561915fcac5d99225e84f0b7c2bd369b1f Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 8 Jan 2026 10:58:38 +0000
Subject: [PATCH 003/141] Bump pathspec from 0.12.1 to 1.0.2 (#11934)
Bumps [pathspec](https://github.com/cpburnz/python-pathspec) from 0.12.1
to 1.0.2.
Release notes, sourced from pathspec's releases:

- v1.0.2 — Release v1.0.2. See CHANGES.rst.
- v1.0.1 — Release v1.0.1. See CHANGES.rst.
- v1.0.0 — Release v1.0.0. See CHANGES.rst.

Changelog, sourced from pathspec's changelog:
1.0.2 (2026-01-07)

Bug fixes:

- Type hint `collections.abc.Callable` does not properly replace
  `typing.Callable` until Python 3.9.2.

1.0.1 (2026-01-06)

Bug fixes:

- Issue [#100](https://github.com/cpburnz/python-pathspec/issues/100):
  `ValueError(f"{patterns=!r} cannot be empty.")` when using black.
1.0.0 (2026-01-05)

Major changes:

- Issue [#91](https://github.com/cpburnz/python-pathspec/issues/91):
  Dropped support of EoL Python 3.8.
- Added the concept of backends to allow for faster regular expression
  matching. The backend can be controlled using the `backend` argument
  to `PathSpec()`, `PathSpec.from_lines()`, `GitIgnoreSpec()`, and
  `GitIgnoreSpec.from_lines()`.
- Renamed the "gitwildmatch" pattern back to "gitignore". The
  "gitignore" pattern behaves slightly differently when used with
  `PathSpec` (gitignore as documented) than with `GitIgnoreSpec`
  (replicates Git's edge cases).
API changes:

- Breaking: the protected method
  `pathspec.pathspec.PathSpec._match_file()` (with a leading
  underscore) has been removed and replaced by backends. This does not
  affect normal usage of `PathSpec` or `GitIgnoreSpec`; only custom
  subclasses will be affected. If this breaks your usage, open an
  issue at <https://github.com/cpburnz/python-pathspec/issues>.
- Deprecated: "gitwildmatch" is now an alias for "gitignore".
- Deprecated: `pathspec.patterns.GitWildMatchPattern` is now an alias
  for `pathspec.patterns.gitignore.spec.GitIgnoreSpecPattern`.
- Deprecated: the `pathspec.patterns.gitwildmatch` module has been
  replaced by the `pathspec.patterns.gitignore` package.
- Deprecated: `pathspec.patterns.gitwildmatch.GitWildMatchPattern` is
  now an alias for `pathspec.patterns.gitignore.spec.GitIgnoreSpecPattern`.
- Deprecated: `pathspec.patterns.gitwildmatch.GitWildMatchPatternError`
  is now an alias for `pathspec.patterns.gitignore.GitIgnorePatternError`.
- Removed: `pathspec.patterns.gitwildmatch.GitIgnorePattern`, which had
  been deprecated since v0.4 (2016-07-15).
- The signature of `pathspec.pattern.RegexPattern.match_file()` has
  changed from `def match_file(self, file: str) -> RegexMatchResult | None`
  to `def match_file(self, file: AnyStr) -> RegexMatchResult | None`
  to reflect usage.
- The signature of the class method
  `pathspec.pattern.RegexPattern.pattern_to_regex()` has changed from
  `def pattern_to_regex(cls, pattern: str) -> tuple[str, bool]` to
  `def pattern_to_regex(cls, pattern: AnyStr) -> tuple[AnyStr | None, bool | None]`
  to reflect usage and documentation.
New features:

- Added an optional "hyperscan" backend using the hyperscan library.
  It will automatically be used when installed. This dependency can be
  installed with `pip install 'pathspec[hyperscan]'`.
- Added an optional "re2" backend using the google-re2 library. It
  will automatically be used when installed. This dependency can be
  installed with `pip install 'pathspec[re2]'`.
- Added an optional dependency on the typing-extensions library to
  improve some type hints.

Bug fixes:

- Issue [#93](https://github.com/cpburnz/python-pathspec/issues/93):
  Do not remove leading spaces.
- Issue [#95](https://github.com/cpburnz/python-pathspec/issues/95):
  Matching for files inside a folder does not seem to behave like
  .gitignore's.
... (truncated)
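The gitignore semantics referenced in the bug fixes above (last matching pattern wins, `!` negates) can be sketched with the standard library alone. This is a rough approximation for illustration, not pathspec's implementation — real gitignore matching has many more edge cases (directory-only patterns, anchoring, `**`), which is exactly what pathspec handles.

```python
from fnmatch import fnmatch

def gitignore_match(patterns: list[str], path: str) -> bool:
    """Rough sketch of gitignore-style matching: scan patterns in
    order, the last one that matches wins, and a leading '!' negates
    a previous match. Covers only a subset of real gitignore rules."""
    ignored = False
    for pat in patterns:
        if pat.startswith("!"):
            if fnmatch(path, pat[1:]):
                ignored = False
        elif fnmatch(path, pat):
            ignored = True
    return ignored
```

For example, `["*.pyc", "!keep.pyc"]` ignores `cache.pyc` but re-includes `keep.pyc`, mirroring how Git reads `.gitignore` lines top to bottom.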
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 9e38e4743ba..abffa30d8b5 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -131,7 +131,7 @@ packaging==25.0
# gunicorn
# pytest
# sphinx
-pathspec==0.12.1
+pathspec==1.0.2
# via mypy
pip-tools==7.5.2
# via -r requirements/dev.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 54b386c959b..e689db06bc7 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -128,7 +128,7 @@ packaging==25.0
# gunicorn
# pytest
# sphinx
-pathspec==0.12.1
+pathspec==1.0.2
# via mypy
pip-tools==7.5.2
# via -r requirements/dev.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 9f66ff67e30..33aff888d0a 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -57,7 +57,7 @@ nodeenv==1.10.0
# via pre-commit
packaging==25.0
# via pytest
-pathspec==0.12.1
+pathspec==1.0.2
# via mypy
platformdirs==4.5.1
# via virtualenv
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index e3fd94fb778..34ce106f32a 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -46,7 +46,7 @@ mypy-extensions==1.1.0
# via mypy
packaging==25.0
# via pytest
-pathspec==0.12.1
+pathspec==1.0.2
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index fab9c52071c..f66243ea413 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -75,7 +75,7 @@ packaging==25.0
# via
# gunicorn
# pytest
-pathspec==0.12.1
+pathspec==1.0.2
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index cbc6806568f..d49bf0b3dc9 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -75,7 +75,7 @@ packaging==25.0
# via
# gunicorn
# pytest
-pathspec==0.12.1
+pathspec==1.0.2
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
From 02fcae8981fce63a23134b31db83dbba23496c2b Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 8 Jan 2026 11:10:22 +0000
Subject: [PATCH 004/141] Bump urllib3 from 2.6.2 to 2.6.3 (#11936)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.6.2 to 2.6.3.
Release notes, sourced from urllib3's releases:
2.6.3
🚀 urllib3 is fundraising for HTTP/2 support
urllib3
is raising ~$40,000 USD to release HTTP/2 support and ensure
long-term sustainable maintenance of the project after a sharp decline
in financial support. If your company or organization uses Python and
would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and
thousands of other projects please consider contributing
financially to ensure HTTP/2 support is developed sustainably and
maintained for the long-haul.
Thank you for your support.
Changes:

- Fixed a security issue where decompression-bomb safeguards of the
  streaming API were bypassed when HTTP redirects were followed.
  (CVE-2026-21441, reported by @D47A, 8.9 High, GHSA-38jv-5279-wg99)
- Started treating Retry-After times greater than 6 hours as 6 hours
  by default. (urllib3/urllib3#3743)
- Fixed `urllib3.connection.VerifiedHTTPSConnection` on Emscripten.
  (urllib3/urllib3#3752)
Changelog, sourced from urllib3's changelog:

2.6.3 (2026-01-07)

- Fixed a high-severity security issue where decompression-bomb
  safeguards of the streaming API were bypassed when HTTP redirects
  were followed.
  ([GHSA-38jv-5279-wg99](https://github.com/urllib3/urllib3/security/advisories/GHSA-38jv-5279-wg99))
- Started treating Retry-After times greater than 6 hours as 6 hours
  by default. ([#3743](https://github.com/urllib3/urllib3/issues/3743))
- Fixed `urllib3.connection.VerifiedHTTPSConnection` on Emscripten.
  ([#3752](https://github.com/urllib3/urllib3/issues/3752))
Commits
0248277
Release 2.6.3
8864ac4
Merge commit from fork
70cecb2
Fix Scorecard issues related to vulnerable dev dependencies (#3755)
41f249a
Move "v2.0 Migration Guide" to the end of the table of
contents (#3747)
fd4dffd
Patch VerifiedHTTPSConnection for Emscripten (#3752)
13f0bfd
Handle massive values in Retry-After when calculating time to sleep for
(#3743)
8c480bf
Bump actions/upload-artifact from 5.0.0 to 6.0.0 (#3748)
4b40616
Bump actions/cache from 4.3.0 to 5.0.1 (#3750)
82b8479
Bump actions/download-artifact from 6.0.0 to 7.0.0 (#3749)
34284cb
Mention experimental features in the security policy (#3746)
- Additional commits viewable in compare
view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
4 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index abffa30d8b5..1845dd1d9c7 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -270,7 +270,7 @@ typing-extensions==4.15.0 ; python_version < "3.13"
# virtualenv
typing-inspection==0.4.2
# via pydantic
-urllib3==2.6.2
+urllib3==2.6.3
# via requests
uvloop==0.21.0 ; platform_system != "Windows"
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index e689db06bc7..ab58796932b 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -260,7 +260,7 @@ typing-extensions==4.15.0 ; python_version < "3.13"
# virtualenv
typing-inspection==0.4.2
# via pydantic
-urllib3==2.6.2
+urllib3==2.6.3
# via requests
uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython"
# via
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index 108b6e8502e..64d01879cca 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -69,5 +69,5 @@ towncrier==25.8.0
# via
# -r requirements/doc.in
# sphinxcontrib-towncrier
-urllib3==2.6.2
+urllib3==2.6.3
# via requests
diff --git a/requirements/doc.txt b/requirements/doc.txt
index a04ef426099..7a7545efbe2 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -62,5 +62,5 @@ towncrier==25.8.0
# via
# -r requirements/doc.in
# sphinxcontrib-towncrier
-urllib3==2.6.2
+urllib3==2.6.3
# via requests
From 3a097bff308170adafbdde223aaaea569c34d773 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 9 Jan 2026 20:30:50 +0000
Subject: [PATCH 005/141] Bump cython from 3.2.3 to 3.2.4 (#11924)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [cython](https://github.com/cython/cython) from 3.2.3 to 3.2.4.
Changelog, sourced from cython's changelog:

3.2.4 (2026-01-04)

Features added:

- In preparation for Cython 3.3, a new decorator
  `@collection_type(tname)` can be used to advertise an extension type
  as being a 'sequence' or 'mapping'. This currently only has the
  effect of setting the `Py_TPFLAGS_SEQUENCE` flag on the type or not,
  but is provided for convenience to allow using the new decorator
  already in Cython 3.2 code.
- Several C++ exception declarations were added to `libcpp.exceptions`.
  (Github issue #7389)

Bugs fixed:

- Pseudo-literal default values of function arguments like `arg=str()`
  could generate invalid C code when internally converted into a real
  literal. (Github issue #6192)
- The pickle serialisation of extension types using the `auto_pickle`
  feature was larger than necessary since 3.2.0 for types without
  Python object attributes. It is now back to the state before 3.2.0
  again. (Github issue #7443)
- Constants are now only made immortal on freethreading Python if they
  are not shared. (Github issue #7439)
- `PyDict_SetDefaultRef()` is now used when available to avoid
  temporary borrowed references. (Github issue #7347)
- Includes all fixes as of Cython 3.1.8.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/cython.txt | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 1845dd1d9c7..b6c2a5993a6 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -61,7 +61,7 @@ coverage==7.13.1
# pytest-cov
cryptography==46.0.3
# via trustme
-cython==3.2.3
+cython==3.2.4
# via -r requirements/cython.in
distlib==0.4.0
# via virtualenv
diff --git a/requirements/cython.txt b/requirements/cython.txt
index 667d8f52cd0..03fa291992c 100644
--- a/requirements/cython.txt
+++ b/requirements/cython.txt
@@ -4,7 +4,7 @@
#
# pip-compile --allow-unsafe --output-file=requirements/cython.txt --resolver=backtracking --strip-extras requirements/cython.in
#
-cython==3.2.3
+cython==3.2.4
# via -r requirements/cython.in
multidict==6.7.0
# via -r requirements/multidict.in
From 2f9f56e9c5c3081bf06673d1441778595fbc56de Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 9 Jan 2026 20:32:58 +0000
Subject: [PATCH 006/141] Bump pypa/cibuildwheel from 3.3.0 to 3.3.1 (#11927)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [pypa/cibuildwheel](https://github.com/pypa/cibuildwheel) from
3.3.0 to 3.3.1.
Release notes, sourced from pypa/cibuildwheel's releases:

v3.3.1

- 🛠 Update dependencies and container pins, including updating to
  CPython 3.14.2. (#2708)

Changelog, sourced from pypa/cibuildwheel's changelog:

v3.3.1 (5 January 2026)

- 🛠 Update dependencies and container pins, including updating to
  CPython 3.14.2. (#2708)
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index cc36f6f7239..8ab50b9a235 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -426,7 +426,7 @@ jobs:
run: |
make cythonize
- name: Build wheels
- uses: pypa/cibuildwheel@v3.3.0
+ uses: pypa/cibuildwheel@v3.3.1
env:
CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
CIBW_ARCHS_MACOS: x86_64 arm64 universal2
From 016e652d7a9b01cbf34247ad6f1dcaec4ffa11e6 Mon Sep 17 00:00:00 2001
From: "J. Nick Koston"
Date: Sat, 10 Jan 2026 13:05:02 -1000
Subject: [PATCH 007/141] [PR #11940/a47740c backport][3.13] Restore
BodyPartReader.decode() as sync method, add decode_async() for non-blocking
decompression (#11944)
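The split this backport restores — a synchronous `decode()` for backward compatibility plus a `decode_async()` that offloads work to an executor — follows a common asyncio pattern. The sketch below illustrates that pattern with `zlib`; it is not aiohttp's implementation, and the function names are reused here only to mirror the idea.

```python
import asyncio
import zlib

def decode(data: bytes) -> bytes:
    """Synchronous decompression: simple, but blocks the event loop
    for large payloads."""
    return zlib.decompress(data)

async def decode_async(data: bytes) -> bytes:
    """Run the same decompression in the default thread-pool executor
    so the event loop stays responsive while CPU-bound work runs."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, decode, data)
```

Keeping both variants lets existing callers of the sync method keep working while internal code paths use the async variant, which is the compromise the changelog entry describes.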
---
CHANGES/11898.bugfix.rst | 6 ++++
aiohttp/multipart.py | 60 +++++++++++++++++++++++++++++-------
aiohttp/web_request.py | 2 +-
docs/multipart_reference.rst | 30 +++++++++++++++++-
tests/test_multipart.py | 58 +++++++++++++++++++++++++++-------
5 files changed, 132 insertions(+), 24 deletions(-)
create mode 100644 CHANGES/11898.bugfix.rst
diff --git a/CHANGES/11898.bugfix.rst b/CHANGES/11898.bugfix.rst
new file mode 100644
index 00000000000..f430bcce997
--- /dev/null
+++ b/CHANGES/11898.bugfix.rst
@@ -0,0 +1,6 @@
+Restored :py:meth:`~aiohttp.BodyPartReader.decode` as a synchronous method
+for backward compatibility. The method was inadvertently changed to async
+in 3.13.3 as part of the decompression bomb security fix. A new
+:py:meth:`~aiohttp.BodyPartReader.decode_async` method is now available
+for non-blocking decompression of large payloads. Internal aiohttp code
+uses the async variant to maintain security protections -- by :user:`bdraco`.
diff --git a/aiohttp/multipart.py b/aiohttp/multipart.py
index 9c37f0bb716..d3a6ae7d604 100644
--- a/aiohttp/multipart.py
+++ b/aiohttp/multipart.py
@@ -325,7 +325,7 @@ async def read(self, *, decode: bool = False) -> bytes:
while not self._at_eof:
data.extend(await self.read_chunk(self.chunk_size))
if decode:
- return await self.decode(data)
+ return await self.decode_async(data)
return data
async def read_chunk(self, size: int = chunk_size) -> bytes:
@@ -503,20 +503,58 @@ def at_eof(self) -> bool:
"""Returns True if the boundary was reached or False otherwise."""
return self._at_eof
- async def decode(self, data: bytes) -> bytes:
- """Decodes data.
+ def _apply_content_transfer_decoding(self, data: bytes) -> bytes:
+ """Apply Content-Transfer-Encoding decoding if header is present."""
+ if CONTENT_TRANSFER_ENCODING in self.headers:
+ return self._decode_content_transfer(data)
+ return data
+
+ def _needs_content_decoding(self) -> bool:
+ """Check if Content-Encoding decoding should be applied."""
+ # https://datatracker.ietf.org/doc/html/rfc7578#section-4.8
+ return not self._is_form_data and CONTENT_ENCODING in self.headers
+
+ def decode(self, data: bytes) -> bytes:
+ """Decodes data synchronously.
- Decoding is done according the specified Content-Encoding
+ Decodes data according the specified Content-Encoding
or Content-Transfer-Encoding headers value.
+
+ Note: For large payloads, consider using decode_async() instead
+ to avoid blocking the event loop during decompression.
"""
- if CONTENT_TRANSFER_ENCODING in self.headers:
- data = self._decode_content_transfer(data)
- # https://datatracker.ietf.org/doc/html/rfc7578#section-4.8
- if not self._is_form_data and CONTENT_ENCODING in self.headers:
- return await self._decode_content(data)
+ data = self._apply_content_transfer_decoding(data)
+ if self._needs_content_decoding():
+ return self._decode_content(data)
return data
- async def _decode_content(self, data: bytes) -> bytes:
+ async def decode_async(self, data: bytes) -> bytes:
+ """Decodes data asynchronously.
+
+ Decodes data according to the specified Content-Encoding
+ or Content-Transfer-Encoding headers value.
+
+ This method offloads decompression to an executor for large payloads
+ to avoid blocking the event loop.
+ """
+ data = self._apply_content_transfer_decoding(data)
+ if self._needs_content_decoding():
+ return await self._decode_content_async(data)
+ return data
+
+ def _decode_content(self, data: bytes) -> bytes:
+ encoding = self.headers.get(CONTENT_ENCODING, "").lower()
+ if encoding == "identity":
+ return data
+ if encoding in {"deflate", "gzip"}:
+ return ZLibDecompressor(
+ encoding=encoding,
+ suppress_deflate_header=True,
+ ).decompress_sync(data, max_length=self._max_decompress_size)
+
+ raise RuntimeError(f"unknown content encoding: {encoding}")
+
+ async def _decode_content_async(self, data: bytes) -> bytes:
encoding = self.headers.get(CONTENT_ENCODING, "").lower()
if encoding == "identity":
return data
@@ -599,7 +637,7 @@ async def write(self, writer: AbstractStreamWriter) -> None:
field = self._value
chunk = await field.read_chunk(size=2**16)
while chunk:
- await writer.write(await field.decode(chunk))
+ await writer.write(await field.decode_async(chunk))
chunk = await field.read_chunk(size=2**16)
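The `_decode_content()` hunk above dispatches on the `Content-Encoding` header: identity passes through, deflate/gzip decompress with a size cap, anything else raises. A hedged sketch of that dispatch, substituting stdlib `zlib` for aiohttp's `ZLibDecompressor` (this version only handles zlib/gzip-wrapped streams, which `wbits=47` auto-detects, and approximates the decompression-bomb cap with `max_length`):

```python
import zlib


def decode_content(encoding: str, data: bytes, max_length: int = 1 << 20) -> bytes:
    """Sketch of the Content-Encoding dispatch; not aiohttp's implementation."""
    encoding = encoding.lower()
    if encoding == "identity":
        return data
    if encoding in {"deflate", "gzip"}:
        d = zlib.decompressobj(wbits=47)  # auto-detect zlib or gzip header
        out = d.decompress(data, max_length)
        if d.unconsumed_tail:
            # Input left over means the output hit the cap: bomb guard.
            raise ValueError("decompressed payload exceeds max_length")
        return out + d.flush()
    raise RuntimeError(f"unknown content encoding: {encoding}")


assert decode_content("identity", b"abc") == b"abc"
assert decode_content("deflate", zlib.compress(b"Time to Relax!")) == b"Time to Relax!"
```

The `max_length` argument is what makes the sync path safe to keep: the cap is enforced during decompression, not after.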
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index 0eafcd6e34c..9163d3c34fd 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -740,7 +740,7 @@ async def post(self) -> "MultiDictProxy[Union[str, bytes, FileField]]":
)
chunk = await field.read_chunk(size=2**16)
while chunk:
- chunk = await field.decode(chunk)
+ chunk = await field.decode_async(chunk)
await self._loop.run_in_executor(None, tmp.write, chunk)
size += len(chunk)
if 0 < max_size < size:
diff --git a/docs/multipart_reference.rst b/docs/multipart_reference.rst
index e0f6e4a0162..2c13c8cfec9 100644
--- a/docs/multipart_reference.rst
+++ b/docs/multipart_reference.rst
@@ -102,7 +102,7 @@ Multipart reference
.. method:: decode(data)
- Decodes data according the specified ``Content-Encoding``
+ Decodes data synchronously according to the specified ``Content-Encoding``
or ``Content-Transfer-Encoding`` headers value.
Supports ``gzip``, ``deflate`` and ``identity`` encodings for
@@ -117,6 +117,34 @@ Multipart reference
:rtype: bytes
+ .. note::
+
+ For large payloads, consider using :meth:`decode_async` instead
+ to avoid blocking the event loop during decompression.
+
+ .. method:: decode_async(data)
+ :async:
+
+ Decodes data asynchronously according to the specified ``Content-Encoding``
+ or ``Content-Transfer-Encoding`` headers value.
+
+ This method offloads decompression to an executor for large payloads
+ to avoid blocking the event loop.
+
+ Supports ``gzip``, ``deflate`` and ``identity`` encodings for
+ ``Content-Encoding`` header.
+
+ Supports ``base64``, ``quoted-printable``, ``binary`` encodings for
+ ``Content-Transfer-Encoding`` header.
+
+ :param bytearray data: Data to decode.
+
+ :raises: :exc:`RuntimeError` - if encoding is unknown.
+
+ :rtype: bytes
+
+ .. versionadded:: 3.13.4
+
.. method:: get_charset(default=None)
Returns charset parameter from ``Content-Type`` header or default.
diff --git a/tests/test_multipart.py b/tests/test_multipart.py
index 4e52a3cba1c..4dbba7abcdf 100644
--- a/tests/test_multipart.py
+++ b/tests/test_multipart.py
@@ -392,6 +392,53 @@ async def test_read_with_content_transfer_encoding_base64(self) -> None:
result = await obj.read(decode=True)
assert b"Time to Relax!" == result
+ async def test_decode_with_content_transfer_encoding_base64(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_TRANSFER_ENCODING: "base64"}))
+ with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ result = b""
+ while not obj.at_eof():
+ chunk = await obj.read_chunk(size=6)
+ result += obj.decode(chunk)
+ assert b"Time to Relax!" == result
+
+ async def test_decode_async_with_content_transfer_encoding_base64(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_TRANSFER_ENCODING: "base64"}))
+ with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ result = b""
+ while not obj.at_eof():
+ chunk = await obj.read_chunk(size=6)
+ result += await obj.decode_async(chunk)
+ assert b"Time to Relax!" == result
+
+ async def test_decode_with_content_encoding_deflate(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_ENCODING: "deflate"}))
+ data = b"\x0b\xc9\xccMU(\xc9W\x08J\xcdI\xacP\x04\x00"
+ with Stream(data + b"\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ chunk = await obj.read_chunk(size=len(data))
+ result = obj.decode(chunk)
+ assert b"Time to Relax!" == result
+
+ async def test_decode_with_content_encoding_identity(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_ENCODING: "identity"}))
+ data = b"Time to Relax!"
+ with Stream(data + b"\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ chunk = await obj.read_chunk(size=len(data))
+ result = obj.decode(chunk)
+ assert data == result
+
+ async def test_decode_with_content_encoding_unknown(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_ENCODING: "snappy"}))
+ data = b"Time to Relax!"
+ with Stream(data + b"\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ chunk = await obj.read_chunk(size=len(data))
+ with pytest.raises(RuntimeError, match="unknown content encoding"):
+ obj.decode(chunk)
+
async def test_read_with_content_transfer_encoding_quoted_printable(self) -> None:
with Stream(
b"=D0=9F=D1=80=D0=B8=D0=B2=D0=B5=D1=82,"
@@ -409,17 +456,6 @@ async def test_read_with_content_transfer_encoding_quoted_printable(self) -> Non
)
assert result == expected
- async def test_decode_with_content_transfer_encoding_base64(self) -> None:
- with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
- obj = aiohttp.BodyPartReader(
- BOUNDARY, {CONTENT_TRANSFER_ENCODING: "base64"}, stream
- )
- result = b""
- while not obj.at_eof():
- chunk = await obj.read_chunk(size=6)
- result += await obj.decode(chunk)
- assert b"Time to Relax!" == result
-
@pytest.mark.parametrize("encoding", ("binary", "8bit", "7bit"))
async def test_read_with_content_transfer_encoding_binary(
self, encoding: str
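The base64 tests above feed `decode()` six-byte chunks whose boundaries do not line up with base64's four-character quanta. A small illustrative sketch of why that requires buffering (aiohttp's `_decode_content_transfer` buffers similarly internally; `decode_base64_chunks` is a hypothetical helper, not aiohttp API):

```python
import base64


def decode_base64_chunks(chunks):
    """Buffer chunked base64 input to 4-char boundaries before decoding."""
    buf = b""
    out = b""
    for chunk in chunks:
        # Drop soft line breaks, as present in the test payload above.
        buf += chunk.replace(b"\r", b"").replace(b"\n", b"")
        usable = len(buf) - len(buf) % 4  # decode only whole quanta
        out += base64.b64decode(buf[:usable])
        buf = buf[usable:]
    return out + base64.b64decode(buf)


encoded = base64.b64encode(b"Time to Relax!")
chunks = [encoded[i : i + 5] for i in range(0, len(encoded), 5)]
assert decode_base64_chunks(chunks) == b"Time to Relax!"
```

Decoding each raw chunk independently would raise a padding error for most chunk sizes, which is exactly what the chunked tests exercise.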
From 2fc38fe163a29f7b84eaad0c1e3fba14fa079d93 Mon Sep 17 00:00:00 2001
From: "J. Nick Koston"
Date: Sat, 10 Jan 2026 13:05:15 -1000
Subject: [PATCH 008/141] [PR #11940/a47740c backport][3.14] Restore
BodyPartReader.decode() as sync method, add decode_async() for non-blocking
decompression (#11943)
---
CHANGES/11898.bugfix.rst | 6 ++++
aiohttp/multipart.py | 60 +++++++++++++++++++++++++++++-------
aiohttp/web_request.py | 2 +-
docs/multipart_reference.rst | 30 +++++++++++++++++-
tests/test_multipart.py | 58 +++++++++++++++++++++++++++-------
5 files changed, 132 insertions(+), 24 deletions(-)
create mode 100644 CHANGES/11898.bugfix.rst
diff --git a/CHANGES/11898.bugfix.rst b/CHANGES/11898.bugfix.rst
new file mode 100644
index 00000000000..f430bcce997
--- /dev/null
+++ b/CHANGES/11898.bugfix.rst
@@ -0,0 +1,6 @@
+Restored :py:meth:`~aiohttp.BodyPartReader.decode` as a synchronous method
+for backward compatibility. The method was inadvertently changed to async
+in 3.13.3 as part of the decompression bomb security fix. A new
+:py:meth:`~aiohttp.BodyPartReader.decode_async` method is now available
+for non-blocking decompression of large payloads. Internal aiohttp code
+uses the async variant to maintain security protections -- by :user:`bdraco`.
diff --git a/aiohttp/multipart.py b/aiohttp/multipart.py
index ba41659d90c..0736e8090a2 100644
--- a/aiohttp/multipart.py
+++ b/aiohttp/multipart.py
@@ -313,7 +313,7 @@ async def read(self, *, decode: bool = False) -> bytes:
while not self._at_eof:
data.extend(await self.read_chunk(self.chunk_size))
if decode:
- return await self.decode(data)
+ return await self.decode_async(data)
return data
async def read_chunk(self, size: int = chunk_size) -> bytes:
@@ -491,20 +491,58 @@ def at_eof(self) -> bool:
"""Returns True if the boundary was reached or False otherwise."""
return self._at_eof
- async def decode(self, data: bytes) -> bytes:
- """Decodes data.
+ def _apply_content_transfer_decoding(self, data: bytes) -> bytes:
+ """Apply Content-Transfer-Encoding decoding if header is present."""
+ if CONTENT_TRANSFER_ENCODING in self.headers:
+ return self._decode_content_transfer(data)
+ return data
+
+ def _needs_content_decoding(self) -> bool:
+ """Check if Content-Encoding decoding should be applied."""
+ # https://datatracker.ietf.org/doc/html/rfc7578#section-4.8
+ return not self._is_form_data and CONTENT_ENCODING in self.headers
+
+ def decode(self, data: bytes) -> bytes:
+ """Decodes data synchronously.
- Decoding is done according the specified Content-Encoding
+ Decodes data according to the specified Content-Encoding
or Content-Transfer-Encoding headers value.
+
+ Note: For large payloads, consider using decode_async() instead
+ to avoid blocking the event loop during decompression.
"""
- if CONTENT_TRANSFER_ENCODING in self.headers:
- data = self._decode_content_transfer(data)
- # https://datatracker.ietf.org/doc/html/rfc7578#section-4.8
- if not self._is_form_data and CONTENT_ENCODING in self.headers:
- return await self._decode_content(data)
+ data = self._apply_content_transfer_decoding(data)
+ if self._needs_content_decoding():
+ return self._decode_content(data)
return data
- async def _decode_content(self, data: bytes) -> bytes:
+ async def decode_async(self, data: bytes) -> bytes:
+ """Decodes data asynchronously.
+
+ Decodes data according to the specified Content-Encoding
+ or Content-Transfer-Encoding headers value.
+
+ This method offloads decompression to an executor for large payloads
+ to avoid blocking the event loop.
+ """
+ data = self._apply_content_transfer_decoding(data)
+ if self._needs_content_decoding():
+ return await self._decode_content_async(data)
+ return data
+
+ def _decode_content(self, data: bytes) -> bytes:
+ encoding = self.headers.get(CONTENT_ENCODING, "").lower()
+ if encoding == "identity":
+ return data
+ if encoding in {"deflate", "gzip"}:
+ return ZLibDecompressor(
+ encoding=encoding,
+ suppress_deflate_header=True,
+ ).decompress_sync(data, max_length=self._max_decompress_size)
+
+ raise RuntimeError(f"unknown content encoding: {encoding}")
+
+ async def _decode_content_async(self, data: bytes) -> bytes:
encoding = self.headers.get(CONTENT_ENCODING, "").lower()
if encoding == "identity":
return data
@@ -587,7 +625,7 @@ async def write(self, writer: AbstractStreamWriter) -> None:
field = self._value
chunk = await field.read_chunk(size=2**16)
while chunk:
- await writer.write(await field.decode(chunk))
+ await writer.write(await field.decode_async(chunk))
chunk = await field.read_chunk(size=2**16)
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index c2185a7a214..f5bb35a87fc 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -749,7 +749,7 @@ async def post(self) -> "MultiDictProxy[str | bytes | FileField]":
)
chunk = await field.read_chunk(size=2**16)
while chunk:
- chunk = await field.decode(chunk)
+ chunk = await field.decode_async(chunk)
await self._loop.run_in_executor(None, tmp.write, chunk)
size += len(chunk)
if 0 < max_size < size:
diff --git a/docs/multipart_reference.rst b/docs/multipart_reference.rst
index e0f6e4a0162..2c13c8cfec9 100644
--- a/docs/multipart_reference.rst
+++ b/docs/multipart_reference.rst
@@ -102,7 +102,7 @@ Multipart reference
.. method:: decode(data)
- Decodes data according the specified ``Content-Encoding``
+ Decodes data synchronously according to the specified ``Content-Encoding``
or ``Content-Transfer-Encoding`` headers value.
Supports ``gzip``, ``deflate`` and ``identity`` encodings for
@@ -117,6 +117,34 @@ Multipart reference
:rtype: bytes
+ .. note::
+
+ For large payloads, consider using :meth:`decode_async` instead
+ to avoid blocking the event loop during decompression.
+
+ .. method:: decode_async(data)
+ :async:
+
+ Decodes data asynchronously according to the specified ``Content-Encoding``
+ or ``Content-Transfer-Encoding`` headers value.
+
+ This method offloads decompression to an executor for large payloads
+ to avoid blocking the event loop.
+
+ Supports ``gzip``, ``deflate`` and ``identity`` encodings for
+ ``Content-Encoding`` header.
+
+ Supports ``base64``, ``quoted-printable``, ``binary`` encodings for
+ ``Content-Transfer-Encoding`` header.
+
+ :param bytearray data: Data to decode.
+
+ :raises: :exc:`RuntimeError` - if encoding is unknown.
+
+ :rtype: bytes
+
+ .. versionadded:: 3.13.4
+
.. method:: get_charset(default=None)
Returns charset parameter from ``Content-Type`` header or default.
diff --git a/tests/test_multipart.py b/tests/test_multipart.py
index af091e2947f..05d9aad009f 100644
--- a/tests/test_multipart.py
+++ b/tests/test_multipart.py
@@ -391,6 +391,53 @@ async def test_read_with_content_transfer_encoding_base64(self) -> None:
result = await obj.read(decode=True)
assert b"Time to Relax!" == result
+ async def test_decode_with_content_transfer_encoding_base64(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_TRANSFER_ENCODING: "base64"}))
+ with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ result = b""
+ while not obj.at_eof():
+ chunk = await obj.read_chunk(size=6)
+ result += obj.decode(chunk)
+ assert b"Time to Relax!" == result
+
+ async def test_decode_async_with_content_transfer_encoding_base64(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_TRANSFER_ENCODING: "base64"}))
+ with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ result = b""
+ while not obj.at_eof():
+ chunk = await obj.read_chunk(size=6)
+ result += await obj.decode_async(chunk)
+ assert b"Time to Relax!" == result
+
+ async def test_decode_with_content_encoding_deflate(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_ENCODING: "deflate"}))
+ data = b"\x0b\xc9\xccMU(\xc9W\x08J\xcdI\xacP\x04\x00"
+ with Stream(data + b"\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ chunk = await obj.read_chunk(size=len(data))
+ result = obj.decode(chunk)
+ assert b"Time to Relax!" == result
+
+ async def test_decode_with_content_encoding_identity(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_ENCODING: "identity"}))
+ data = b"Time to Relax!"
+ with Stream(data + b"\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ chunk = await obj.read_chunk(size=len(data))
+ result = obj.decode(chunk)
+ assert data == result
+
+ async def test_decode_with_content_encoding_unknown(self) -> None:
+ h = CIMultiDictProxy(CIMultiDict({CONTENT_ENCODING: "snappy"}))
+ data = b"Time to Relax!"
+ with Stream(data + b"\r\n--:--") as stream:
+ obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
+ chunk = await obj.read_chunk(size=len(data))
+ with pytest.raises(RuntimeError, match="unknown content encoding"):
+ obj.decode(chunk)
+
async def test_read_with_content_transfer_encoding_quoted_printable(self) -> None:
with Stream(
b"=D0=9F=D1=80=D0=B8=D0=B2=D0=B5=D1=82,"
@@ -408,17 +455,6 @@ async def test_read_with_content_transfer_encoding_quoted_printable(self) -> Non
)
assert result == expected
- async def test_decode_with_content_transfer_encoding_base64(self) -> None:
- with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
- obj = aiohttp.BodyPartReader(
- BOUNDARY, {CONTENT_TRANSFER_ENCODING: "base64"}, stream
- )
- result = b""
- while not obj.at_eof():
- chunk = await obj.read_chunk(size=6)
- result += await obj.decode(chunk)
- assert b"Time to Relax!" == result
-
@pytest.mark.parametrize("encoding", ("binary", "8bit", "7bit"))
async def test_read_with_content_transfer_encoding_binary(
self, encoding: str
From a6de40dd733b4a906aba0f3f3785811a84212363 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Sat, 10 Jan 2026 23:43:52 +0000
Subject: [PATCH 009/141] Bump pathspec from 1.0.2 to 1.0.3 (#11949)
Bumps [pathspec](https://github.com/cpburnz/python-pathspec) from 1.0.2
to 1.0.3.
Release notes
Sourced from pathspec's
releases.
v1.0.3
Release v1.0.3. See CHANGES.rst.
Changelog
Sourced from pathspec's
changelog.
1.0.3 (2026-01-09)
Bug fixes:
- Issue [#101](https://github.com/cpburnz/python-pathspec/issues/101):
  pyright strict errors with pathspec >= 1.0.0.
- Issue [#102](https://github.com/cpburnz/python-pathspec/issues/102):
  No module named 'tomllib'.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index b6c2a5993a6..55bfa7af130 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -131,7 +131,7 @@ packaging==25.0
# gunicorn
# pytest
# sphinx
-pathspec==1.0.2
+pathspec==1.0.3
# via mypy
pip-tools==7.5.2
# via -r requirements/dev.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index ab58796932b..f6ac5338e67 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -128,7 +128,7 @@ packaging==25.0
# gunicorn
# pytest
# sphinx
-pathspec==1.0.2
+pathspec==1.0.3
# via mypy
pip-tools==7.5.2
# via -r requirements/dev.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 33aff888d0a..6ccb4c39b6a 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -57,7 +57,7 @@ nodeenv==1.10.0
# via pre-commit
packaging==25.0
# via pytest
-pathspec==1.0.2
+pathspec==1.0.3
# via mypy
platformdirs==4.5.1
# via virtualenv
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 34ce106f32a..d9d99533c78 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -46,7 +46,7 @@ mypy-extensions==1.1.0
# via mypy
packaging==25.0
# via pytest
-pathspec==1.0.2
+pathspec==1.0.3
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index f66243ea413..5ee94137e74 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -75,7 +75,7 @@ packaging==25.0
# via
# gunicorn
# pytest
-pathspec==1.0.2
+pathspec==1.0.3
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index d49bf0b3dc9..41784454907 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -75,7 +75,7 @@ packaging==25.0
# via
# gunicorn
# pytest
-pathspec==1.0.2
+pathspec==1.0.3
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
From b6e4b4af56c20b74c1edcfeab355bc447dd68f0d Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Sun, 11 Jan 2026 00:09:47 +0000
Subject: [PATCH 010/141] Bump filelock from 3.20.2 to 3.20.3 (#11952)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [filelock](https://github.com/tox-dev/py-filelock) from 3.20.2 to
3.20.3.
Release notes
Sourced from filelock's
releases.
3.20.3
What's Changed
Full Changelog: https://github.com/tox-dev/filelock/compare/3.20.2...3.20.3
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 55bfa7af130..ecd51f80047 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -71,7 +71,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.20.2
+filelock==3.20.3
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/dev.txt b/requirements/dev.txt
index f6ac5338e67..5553ae8e16b 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -69,7 +69,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.20.2
+filelock==3.20.3
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 6ccb4c39b6a..911eb446b2c 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -29,7 +29,7 @@ distlib==0.4.0
# via virtualenv
exceptiongroup==1.3.1
# via pytest
-filelock==3.20.2
+filelock==3.20.3
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
From 7f1cf5e79ae30aee360c5221f60a3ffdec3e1cdc Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Sun, 11 Jan 2026 00:26:31 +0000
Subject: [PATCH 011/141] Bump virtualenv from 20.35.4 to 20.36.1 (#11953)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [virtualenv](https://github.com/pypa/virtualenv) from 20.35.4 to
20.36.1.
Release notes
Sourced from virtualenv's
releases.
20.36.1
What's Changed
Full Changelog: https://github.com/pypa/virtualenv/compare/20.36.0...20.36.1
20.36.0
What's Changed
New Contributors
Full Changelog: https://github.com/pypa/virtualenv/compare/20.35.3...20.36.0
Changelog
Sourced from virtualenv's
changelog.
v20.36.1 (2026-01-09)
Bugfixes - 20.36.1
- Fix TOCTOU vulnerabilities in app_data and lock directory
creation that could be exploited via symlink attacks - reported by
:user:`tsigouris007`, fixed by :user:`gaborbernat`. (:issue:`3013`)
v20.36.0 (2026-01-07)
Features - 20.36.0
- Add support for PEP 440 version specifiers in the
--python flag. Users can now specify Python versions using
operators like >=, <=, ~=,
etc. For example: virtualenv --python=">=3.12"
myenv . (:issue:2994`)
Commits
- d0ad11d: release 20.36.1
- dec4cec: Merge pull request #3013 from gaborbernat/fix-sec
- 5fe5d38: release 20.36.0 (#3011)
- 9719376: release 20.36.0
- 0276db6: Add support for PEP 440 version specifiers in the --python flag. (#3008)
- 4f900c2: Fix Interpreter discovery bug wrt. Microsoft Store shortcut using Latin-1 (#3...
- 13afcc6: fix: resolve EncodingWarning in tox upgrade environment (#3007)
- 31b5d31: [pre-commit.ci] pre-commit autoupdate (#2997)
- 7c28422: fix: update filelock dependency version to 3.20.1 to fix CVE
  CVE-2025-68146 (...
- 365628c: test_too_many_open_files: assert on errno.EMFILE instead of strerror (#3001)
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index ecd51f80047..ac16f46d94c 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -278,7 +278,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.35.4
+virtualenv==20.36.1
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 5553ae8e16b..ce5c8588cd9 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -268,7 +268,7 @@ uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpytho
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.35.4
+virtualenv==20.36.1
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 911eb446b2c..189198fe34c 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -121,7 +121,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# via -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.35.4
+virtualenv==20.36.1
# via pre-commit
zlib-ng==1.0.0
# via -r requirements/lint.in
From 4171f86acc85aa03049400a50958df2803d9d954 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 12 Jan 2026 13:02:54 +0000
Subject: [PATCH 012/141] Bump tomli from 2.3.0 to 2.4.0 (#11957)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [tomli](https://github.com/hukkin/tomli) from 2.3.0 to 2.4.0.
Changelog
Sourced from tomli's
changelog.
2.4.0
- Added
- TOML v1.1.0 compatibility
- Binary wheels for Windows arm64
Commits
- a678e6f: Bump version: 2.3.0 → 2.4.0
- b8a1358: Tests: remove now needless "TOML compliance"->"burntsushi" format conversion
- 4979375: Update GitHub actions
- f890dd1: Update pre-commit hooks
- d9c65c3: Add 2.4.0 change log
- 0efe49d: Update README for v2.4.0
- 9eb2125: TOML 1.1: Make seconds optional in Date-Time and Time (#203)
- 12314bd: TOML 1.1: Add \xHH Unicode escape code to basic strings (#202)
- 2a2aa62: TOML 1.1: Allow newlines and trailing comma in inline tables (#200)
- 38297f8: Xfail on tests for TOML 1.1 features not yet supported
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
8 files changed, 8 insertions(+), 8 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index ac16f46d94c..5d502b221b4 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -237,7 +237,7 @@ sphinxcontrib-spelling==8.0.2 ; platform_system != "Windows"
# via -r requirements/doc-spelling.in
sphinxcontrib-towncrier==0.5.0a0
# via -r requirements/doc.in
-tomli==2.3.0
+tomli==2.4.0
# via
# build
# coverage
diff --git a/requirements/dev.txt b/requirements/dev.txt
index ce5c8588cd9..61c50af190d 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -227,7 +227,7 @@ sphinxcontrib-serializinghtml==2.0.0
# via sphinx
sphinxcontrib-towncrier==0.5.0a0
# via -r requirements/doc.in
-tomli==2.3.0
+tomli==2.4.0
# via
# build
# coverage
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index 64d01879cca..f8e52b40ecc 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -61,7 +61,7 @@ sphinxcontrib-spelling==8.0.2 ; platform_system != "Windows"
# via -r requirements/doc-spelling.in
sphinxcontrib-towncrier==0.5.0a0
# via -r requirements/doc.in
-tomli==2.3.0
+tomli==2.4.0
# via
# sphinx
# towncrier
diff --git a/requirements/doc.txt b/requirements/doc.txt
index 7a7545efbe2..b9dcb92a428 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -54,7 +54,7 @@ sphinxcontrib-serializinghtml==2.0.0
# via sphinx
sphinxcontrib-towncrier==0.5.0a0
# via -r requirements/doc.in
-tomli==2.3.0
+tomli==2.4.0
# via
# sphinx
# towncrier
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 189198fe34c..01a01448106 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -98,7 +98,7 @@ six==1.17.0
# via python-dateutil
slotscheck==0.19.1
# via -r requirements/lint.in
-tomli==2.3.0
+tomli==2.4.0
# via
# mypy
# pytest
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index d9d99533c78..8416cd7ab40 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -95,7 +95,7 @@ setuptools-git==1.2
# via -r requirements/test-common.in
six==1.17.0
# via python-dateutil
-tomli==2.3.0
+tomli==2.4.0
# via
# coverage
# mypy
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 5ee94137e74..f32450f9d9c 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -130,7 +130,7 @@ setuptools-git==1.2
# via -r requirements/test-common.in
six==1.17.0
# via python-dateutil
-tomli==2.3.0
+tomli==2.4.0
# via
# coverage
# mypy
diff --git a/requirements/test.txt b/requirements/test.txt
index 41784454907..40d54aa47d6 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -130,7 +130,7 @@ setuptools-git==1.2
# via -r requirements/test-common.in
six==1.17.0
# via python-dateutil
-tomli==2.3.0
+tomli==2.4.0
# via
# coverage
# mypy
From 75c3ef22c464eceedb279fd7ac433c99da8f10ae Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Mon, 12 Jan 2026 14:53:02 +0000
Subject: [PATCH 013/141] [PR #11938/d09103a5 backport][3.14] Add Windows 11
ARM64 to CI/CD workflow matrix (#11958)
**This is a backport of PR #11938 as merged into master
(d09103a5baed801a0d475a79270cc08b205110bb).**
Co-authored-by: AraHaan
---
.github/workflows/ci-cd.yml | 2 +-
CHANGES/11937.misc.rst | 2 ++
2 files changed, 3 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/11937.misc.rst
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 8ab50b9a235..173b97fad57 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -350,7 +350,7 @@ jobs:
needs: pre-deploy
strategy:
matrix:
- os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
+ os: ["ubuntu-latest", "windows-latest", "windows-11-arm", "macos-latest", "ubuntu-24.04-arm"]
qemu: ['']
musl: [""]
include:
diff --git a/CHANGES/11937.misc.rst b/CHANGES/11937.misc.rst
new file mode 100644
index 00000000000..e8435d14618
--- /dev/null
+++ b/CHANGES/11937.misc.rst
@@ -0,0 +1,2 @@
+Added win_arm64 to the wheels that gets pushed to PyPI
+-- by :user:`AraHaan`.
From 30ec25f8a58c5dc3f8fdb3eec31f555eeaabd30a Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Mon, 12 Jan 2026 18:16:47 +0000
Subject: [PATCH 014/141] Add max_headers parameter (#11955) (#11959)
(cherry picked from commit ed6440ca49ef4907ab9d99ba7e329aab702b7173)
---
CHANGES/11955.feature.rst | 1 +
aiohttp/_http_parser.pyx | 31 +++++----
aiohttp/client.py | 9 +++
aiohttp/client_proto.py | 2 +
aiohttp/http_exceptions.py | 9 +--
aiohttp/http_parser.py | 84 +++++++++++++----------
aiohttp/web_protocol.py | 2 +-
docs/client_reference.rst | 17 +++--
docs/web_reference.rst | 5 +-
tests/test_client_functional.py | 117 +++++++++++++++++++++++---------
tests/test_http_exceptions.py | 18 ++---
tests/test_http_parser.py | 112 ++++++++++++++++++++++--------
12 files changed, 277 insertions(+), 130 deletions(-)
create mode 100644 CHANGES/11955.feature.rst
diff --git a/CHANGES/11955.feature.rst b/CHANGES/11955.feature.rst
new file mode 100644
index 00000000000..eaea1016e60
--- /dev/null
+++ b/CHANGES/11955.feature.rst
@@ -0,0 +1 @@
+Added ``max_headers`` parameter to limit the number of headers that should be read from a response -- by :user:`Dreamsorcerer`.
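The new parameter follows the same session-default/per-request-override pattern as `max_line_size` and `max_field_size`: a per-request value of `None` falls back to the session's setting. A standalone sketch of that resolution logic (illustrative names, not aiohttp code):

```python
SESSION_DEFAULT_MAX_HEADERS = 128  # the new ClientSession default

def resolve_max_headers(request_value, session_value=SESSION_DEFAULT_MAX_HEADERS):
    """A per-request value of None falls back to the session setting,
    mirroring the `if max_headers is None:` branch added in _request()."""
    return session_value if request_value is None else request_value

print(resolve_max_headers(None))  # 128
print(resolve_max_headers(140))   # 140
```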
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index 4a7101edbcb..cc8b13cfbe8 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -279,6 +279,7 @@ cdef class HttpParser:
object _name
bytes _raw_value
bint _has_value
+ int _header_name_size
object _protocol
object _loop
@@ -329,7 +330,7 @@ cdef class HttpParser:
self, cparser.llhttp_type mode,
object protocol, object loop, int limit,
object timer=None,
- size_t max_line_size=8190, size_t max_headers=32768,
+ size_t max_line_size=8190, size_t max_headers=128,
size_t max_field_size=8190, payload_exception=None,
bint response_with_body=True, bint read_until_eof=False,
bint auto_decompress=True,
@@ -352,6 +353,7 @@ cdef class HttpParser:
self._raw_name = EMPTY_BYTES
self._raw_value = EMPTY_BYTES
self._has_value = False
+ self._header_name_size = 0
self._max_line_size = max_line_size
self._max_headers = max_headers
@@ -383,11 +385,14 @@ cdef class HttpParser:
value = self._raw_value.decode('utf-8', 'surrogateescape')
self._headers.append((name, value))
+ if len(self._headers) > self._max_headers:
+ raise BadHttpMessage("Too many headers received")
if name is CONTENT_ENCODING:
self._content_encoding = value
self._has_value = False
+ self._header_name_size = 0
self._raw_headers.append((self._raw_name, self._raw_value))
self._raw_name = EMPTY_BYTES
self._raw_value = EMPTY_BYTES
@@ -574,7 +579,7 @@ cdef class HttpRequestParser(HttpParser):
def __init__(
self, protocol, loop, int limit, timer=None,
- size_t max_line_size=8190, size_t max_headers=32768,
+ size_t max_line_size=8190, size_t max_headers=128,
size_t max_field_size=8190, payload_exception=None,
bint response_with_body=True, bint read_until_eof=False,
bint auto_decompress=True,
@@ -638,7 +643,7 @@ cdef class HttpResponseParser(HttpParser):
def __init__(
self, protocol, loop, int limit, timer=None,
- size_t max_line_size=8190, size_t max_headers=32768,
+ size_t max_line_size=8190, size_t max_headers=128,
size_t max_field_size=8190, payload_exception=None,
bint response_with_body=True, bint read_until_eof=False,
bint auto_decompress=True
@@ -677,8 +682,8 @@ cdef int cb_on_url(cparser.llhttp_t* parser,
cdef HttpParser pyparser = parser.data
try:
if length > pyparser._max_line_size:
- raise LineTooLong(
- 'Status line is too long', pyparser._max_line_size, length)
+ status = pyparser._buf + at[:length]
+ raise LineTooLong(status[:100] + b"...", pyparser._max_line_size)
extend(pyparser._buf, at, length)
except BaseException as ex:
pyparser._last_error = ex
@@ -690,11 +695,10 @@ cdef int cb_on_url(cparser.llhttp_t* parser,
cdef int cb_on_status(cparser.llhttp_t* parser,
const char *at, size_t length) except -1:
cdef HttpParser pyparser = parser.data
- cdef str reason
try:
if length > pyparser._max_line_size:
- raise LineTooLong(
- 'Status line is too long', pyparser._max_line_size, length)
+ reason = pyparser._buf + at[:length]
+ raise LineTooLong(reason[:100] + b"...", pyparser._max_line_size)
extend(pyparser._buf, at, length)
except BaseException as ex:
pyparser._last_error = ex
@@ -711,8 +715,9 @@ cdef int cb_on_header_field(cparser.llhttp_t* parser,
pyparser._on_status_complete()
size = len(pyparser._raw_name) + length
if size > pyparser._max_field_size:
- raise LineTooLong(
- 'Header name is too long', pyparser._max_field_size, size)
+ name = pyparser._raw_name + at[:length]
+ raise LineTooLong(name[:100] + b"...", pyparser._max_field_size)
+ pyparser._header_name_size = size
pyparser._on_header_field(at, length)
except BaseException as ex:
pyparser._last_error = ex
@@ -727,9 +732,9 @@ cdef int cb_on_header_value(cparser.llhttp_t* parser,
cdef Py_ssize_t size
try:
size = len(pyparser._raw_value) + length
- if size > pyparser._max_field_size:
- raise LineTooLong(
- 'Header value is too long', pyparser._max_field_size, size)
+ if pyparser._header_name_size + size > pyparser._max_field_size:
+ value = pyparser._raw_value + at[:length]
+ raise LineTooLong(value[:100] + b"...", pyparser._max_field_size)
pyparser._on_header_value(at, length)
except BaseException as ex:
pyparser._last_error = ex
diff --git a/aiohttp/client.py b/aiohttp/client.py
index fda3cb4f5df..2a3823be6cb 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -192,6 +192,7 @@ class _RequestOptions(TypedDict, total=False):
auto_decompress: bool | None
max_line_size: int | None
max_field_size: int | None
+ max_headers: int | None
middlewares: Sequence[ClientMiddlewareType] | None
@@ -284,6 +285,7 @@ class ClientSession:
"_read_bufsize",
"_max_line_size",
"_max_field_size",
+ "_max_headers",
"_resolve_charset",
"_default_proxy",
"_default_proxy_auth",
@@ -326,6 +328,7 @@ def __init__(
read_bufsize: int = 2**16,
max_line_size: int = 8190,
max_field_size: int = 8190,
+ max_headers: int = 128,
fallback_charset_resolver: _CharsetResolver = lambda r, b: "utf-8",
middlewares: Sequence[ClientMiddlewareType] = (),
ssl_shutdown_timeout: _SENTINEL | None | float = sentinel,
@@ -425,6 +428,7 @@ def __init__(
self._read_bufsize = read_bufsize
self._max_line_size = max_line_size
self._max_field_size = max_field_size
+ self._max_headers = max_headers
# Convert to list of tuples
if headers:
@@ -539,6 +543,7 @@ async def _request(
auto_decompress: bool | None = None,
max_line_size: int | None = None,
max_field_size: int | None = None,
+ max_headers: int | None = None,
middlewares: Sequence[ClientMiddlewareType] | None = None,
) -> ClientResponse:
@@ -628,6 +633,9 @@ async def _request(
if max_field_size is None:
max_field_size = self._max_field_size
+ if max_headers is None:
+ max_headers = self._max_headers
+
traces = [
Trace(
self,
@@ -771,6 +779,7 @@ async def _connect_and_send_request(
timeout_ceil_threshold=self._connector._timeout_ceil_threshold,
max_line_size=max_line_size,
max_field_size=max_field_size,
+ max_headers=max_headers,
)
try:
resp = await req.send(conn)
diff --git a/aiohttp/client_proto.py b/aiohttp/client_proto.py
index 6eb6ffffed3..a078068eb05 100644
--- a/aiohttp/client_proto.py
+++ b/aiohttp/client_proto.py
@@ -230,6 +230,7 @@ def set_response_params(
timeout_ceil_threshold: float = 5,
max_line_size: int = 8190,
max_field_size: int = 8190,
+ max_headers: int = 128,
) -> None:
self._skip_payload = skip_payload
@@ -248,6 +249,7 @@ def set_response_params(
auto_decompress=auto_decompress,
max_line_size=max_line_size,
max_field_size=max_field_size,
+ max_headers=max_headers,
)
if self._tail:
diff --git a/aiohttp/http_exceptions.py b/aiohttp/http_exceptions.py
index ac6745acf21..daa8da81a3c 100644
--- a/aiohttp/http_exceptions.py
+++ b/aiohttp/http_exceptions.py
@@ -79,11 +79,12 @@ class DecompressSizeError(PayloadEncodingError):
class LineTooLong(BadHttpMessage):
def __init__(
- self, line: str, limit: str = "Unknown", actual_size: str = "Unknown"
+ self,
+ line: str | bytes,
+ limit: str | int = "Unknown",
+ actual_size: str = "Unknown",
) -> None:
- super().__init__(
- f"Got more than {limit} bytes ({actual_size}) when reading {line}."
- )
+ super().__init__(f"Got more than {limit} bytes when reading: {line!r}.")
self.args = (line, limit, actual_size)
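The reworked constructor embeds the (already truncated) offending data via `!r` instead of reporting byte counts. A minimal standalone sketch of the new message format, assuming only what the diff above shows (this is not aiohttp's actual class hierarchy):

```python
class LineTooLongSketch(Exception):
    """Standalone model of the reworked message format (not aiohttp's class)."""

    def __init__(self, line, limit="Unknown", actual_size="Unknown"):
        # New format: embed the offending data via !r instead of
        # appending the "(actual_size)" figure as before.
        super().__init__(f"Got more than {limit} bytes when reading: {line!r}.")

err = LineTooLongSketch(b"spam", 10)
print(err)  # Got more than 10 bytes when reading: b'spam'.
```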
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index 6fbe04a3203..9e7fd84bd75 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -155,20 +155,10 @@ def parse_headers(
raise InvalidHeader(line)
bvalue = bvalue.lstrip(b" \t")
- if len(bname) > self.max_field_size:
- raise LineTooLong(
- "request header name {}".format(
- bname.decode("utf8", "backslashreplace")
- ),
- str(self.max_field_size),
- str(len(bname)),
- )
name = bname.decode("utf-8", "surrogateescape")
if not TOKENRE.fullmatch(name):
raise InvalidHeader(bname)
- header_length = len(bvalue)
-
# next line
lines_idx += 1
line = lines[lines_idx]
@@ -178,16 +168,14 @@ def parse_headers(
# Deprecated: https://www.rfc-editor.org/rfc/rfc9112.html#name-obsolete-line-folding
if continuation:
+ header_length = len(bvalue)
bvalue_lst = [bvalue]
while continuation:
header_length += len(line)
if header_length > self.max_field_size:
+ header_line = bname + b": " + b"".join(bvalue_lst)
raise LineTooLong(
- "request header field {}".format(
- bname.decode("utf8", "backslashreplace")
- ),
- str(self.max_field_size),
- str(header_length),
+ header_line[:100] + b"...", self.max_field_size
)
bvalue_lst.append(line)
@@ -201,15 +189,6 @@ def parse_headers(
line = b""
break
bvalue = b"".join(bvalue_lst)
- else:
- if header_length > self.max_field_size:
- raise LineTooLong(
- "request header field {}".format(
- bname.decode("utf8", "backslashreplace")
- ),
- str(self.max_field_size),
- str(header_length),
- )
bvalue = bvalue.strip(b" \t")
value = bvalue.decode("utf-8", "surrogateescape")
@@ -240,7 +219,7 @@ def __init__(
loop: asyncio.AbstractEventLoop | None = None,
limit: int = 2**16,
max_line_size: int = 8190,
- max_headers: int = 32768,
+ max_headers: int = 128,
max_field_size: int = 8190,
timer: BaseTimerContext | None = None,
code: int | None = None,
@@ -255,6 +234,7 @@ def __init__(
self.max_line_size = max_line_size
self.max_headers = max_headers
self.max_field_size = max_field_size
+ self.max_headers = max_headers
self.timer = timer
self.code = code
self.method = method
@@ -313,6 +293,7 @@ def feed_data(
data_len = len(data)
start_pos = 0
loop = self.loop
+ max_line_length = self.max_line_size
should_close = False
while start_pos < data_len:
@@ -334,11 +315,21 @@ def feed_data(
line = data[start_pos:pos]
if SEP == b"\n": # For lax response parsing
line = line.rstrip(b"\r")
+ if len(line) > max_line_length:
+ raise LineTooLong(line[:100] + b"...", max_line_length)
+
self._lines.append(line)
+ # After processing the status/request line, everything is a header.
+ max_line_length = self.max_field_size
+
+ if len(self._lines) > self.max_headers:
+ raise BadHttpMessage("Too many headers received")
+
start_pos = pos + len(SEP)
# \r\n\r\n found
if self._lines[-1] == EMPTY:
+ max_trailers = self.max_headers - len(self._lines)
try:
msg: _MsgT = self.parse_message(self._lines)
finally:
@@ -397,6 +388,9 @@ def get_content_length() -> int | None:
auto_decompress=self._auto_decompress,
lax=self.lax,
headers_parser=self._headers_parser,
+ max_line_size=self.max_line_size,
+ max_field_size=self.max_field_size,
+ max_trailers=max_trailers,
)
if not payload_parser.done:
self._payload_parser = payload_parser
@@ -416,6 +410,9 @@ def get_content_length() -> int | None:
auto_decompress=self._auto_decompress,
lax=self.lax,
headers_parser=self._headers_parser,
+ max_line_size=self.max_line_size,
+ max_field_size=self.max_field_size,
+ max_trailers=max_trailers,
)
elif not empty_body and length is None and self.read_until_eof:
payload = StreamReader(
@@ -435,6 +432,9 @@ def get_content_length() -> int | None:
auto_decompress=self._auto_decompress,
lax=self.lax,
headers_parser=self._headers_parser,
+ max_line_size=self.max_line_size,
+ max_field_size=self.max_field_size,
+ max_trailers=max_trailers,
)
if not payload_parser.done:
self._payload_parser = payload_parser
@@ -445,6 +445,8 @@ def get_content_length() -> int | None:
should_close = msg.should_close
else:
self._tail = data[start_pos:]
+ if len(self._tail) > self.max_line_size:
+ raise LineTooLong(self._tail[:100] + b"...", self.max_line_size)
data = EMPTY
break
@@ -580,11 +582,6 @@ def parse_message(self, lines: list[bytes]) -> RawRequestMessage:
except ValueError:
raise BadHttpMethod(line) from None
- if len(path) > self.max_line_size:
- raise LineTooLong(
- "Status line is too long", str(self.max_line_size), str(len(path))
- )
-
# method
if not TOKENRE.fullmatch(method):
raise BadHttpMethod(method)
@@ -700,11 +697,6 @@ def parse_message(self, lines: list[bytes]) -> RawResponseMessage:
status = status.strip()
reason = ""
- if len(reason) > self.max_line_size:
- raise LineTooLong(
- "Status line is too long", str(self.max_line_size), str(len(reason))
- )
-
# version
match = VERSRE.fullmatch(version)
if match is None:
@@ -769,6 +761,9 @@ def __init__(
lax: bool = False,
*,
headers_parser: HeadersParser,
+ max_line_size: int = 8190,
+ max_field_size: int = 8190,
+ max_trailers: int = 128,
) -> None:
self._length = 0
self._type = ParseState.PARSE_UNTIL_EOF
@@ -778,6 +773,9 @@ def __init__(
self._auto_decompress = auto_decompress
self._lax = lax
self._headers_parser = headers_parser
+ self._max_line_size = max_line_size
+ self._max_field_size = max_field_size
+ self._max_trailers = max_trailers
self._trailer_lines: list[bytes] = []
self.done = False
@@ -841,6 +839,15 @@ def feed_data(
# Chunked transfer encoding parser
elif self._type == ParseState.PARSE_CHUNKED:
if self._chunk_tail:
+ # We should never have a tail if we're inside the payload body.
+ assert self._chunk != ChunkState.PARSE_CHUNKED_CHUNK
+ # We should check the length is sane.
+ max_line_length = self._max_line_size
+ if self._chunk == ChunkState.PARSE_TRAILERS:
+ max_line_length = self._max_field_size
+ if len(self._chunk_tail) > max_line_length:
+ raise LineTooLong(self._chunk_tail[:100] + b"...", max_line_length)
+
chunk = self._chunk_tail + chunk
self._chunk_tail = b""
@@ -924,8 +931,15 @@ def feed_data(
chunk = chunk[pos + len(SEP) :]
if SEP == b"\n": # For lax response parsing
line = line.rstrip(b"\r")
+
+ if len(line) > self._max_field_size:
+ raise LineTooLong(line[:100] + b"...", self._max_field_size)
+
self._trailer_lines.append(line)
+ if len(self._trailer_lines) > self._max_trailers:
+ raise BadHttpMessage("Too many trailers received")
+
# \r\n\r\n found, end of stream
if self._trailer_lines[-1] == b"":
# Headers and trailers are defined the same way,
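The `feed_data` changes above enforce two caps while splitting the header block: a per-line byte limit and a total line-count limit. A simplified standalone sketch of that counting logic (the function name and error type are illustrative, not aiohttp's internals):

```python
MAX_HEADERS = 128     # new default, down from 32768
MAX_LINE_SIZE = 8190

def split_header_block(data: bytes) -> list[bytes]:
    """Split a raw header block on CRLF, enforcing the two caps the
    patch adds: a per-line byte limit and a total line-count limit."""
    lines = []
    for line in data.split(b"\r\n"):
        if len(line) > MAX_LINE_SIZE:
            # Like the patch, report only a truncated prefix of the line.
            raise ValueError(
                f"Got more than {MAX_LINE_SIZE} bytes when reading: "
                f"{line[:100] + b'...'!r}."
            )
        lines.append(line)
        if len(lines) > MAX_HEADERS:
            raise ValueError("Too many headers received")
    return lines

block = b"\r\n".join(b"X-Pad-%d: x" % i for i in range(300))
try:
    split_header_block(block)
except ValueError as exc:
    print(exc)  # Too many headers received
```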
diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py
index 843d5504b8e..0f87304a0ef 100644
--- a/aiohttp/web_protocol.py
+++ b/aiohttp/web_protocol.py
@@ -177,7 +177,7 @@ def __init__(
access_log_format: str = AccessLogger.LOG_FORMAT,
debug: bool = False,
max_line_size: int = 8190,
- max_headers: int = 32768,
+ max_headers: int = 128,
max_field_size: int = 8190,
lingering_time: float = 10.0,
read_bufsize: int = 2**16,
diff --git a/docs/client_reference.rst b/docs/client_reference.rst
index 9a9506da382..e96f56bb0df 100644
--- a/docs/client_reference.rst
+++ b/docs/client_reference.rst
@@ -57,6 +57,7 @@ The client session supports the context manager protocol for self closing.
read_bufsize=2**16, \
max_line_size=8190, \
max_field_size=8190, \
+ max_headers=128, \
fallback_charset_resolver=lambda r, b: "utf-8", \
ssl_shutdown_timeout=0)
@@ -245,7 +246,9 @@ The client session supports the context manager protocol for self closing.
:param int max_line_size: Maximum allowed size of lines in responses.
- :param int max_field_size: Maximum allowed size of header fields in responses.
+ :param int max_field_size: Maximum allowed size of header name and value combined in responses.
+
+ :param int max_headers: Maximum number of headers and trailers combined in responses.
:param Callable[[ClientResponse,bytes],str] fallback_charset_resolver:
A :term:`callable` that accepts a :class:`ClientResponse` and the
@@ -425,7 +428,8 @@ The client session supports the context manager protocol for self closing.
read_bufsize=None, \
auto_decompress=None, \
max_line_size=None, \
- max_field_size=None)
+ max_field_size=None, \
+ max_headers=None)
:async:
:noindexentry:
@@ -589,7 +593,9 @@ The client session supports the context manager protocol for self closing.
:param int max_line_size: Maximum allowed size of lines in responses.
- :param int max_field_size: Maximum allowed size of header fields in responses.
+ :param int max_field_size: Maximum allowed size of header name and value combined in responses.
+
+ :param int max_headers: Maximum number of headers and trailers combined in responses.
:return ClientResponse: a :class:`client response <ClientResponse>`
object.
@@ -918,6 +924,7 @@ certification chaining.
auto_decompress=None, \
max_line_size=None, \
max_field_size=None, \
+ max_headers=None, \
version=aiohttp.HttpVersion11, \
connector=None)
:async:
@@ -1055,7 +1062,9 @@ certification chaining.
:param int max_line_size: Maximum allowed size of lines in responses.
- :param int max_field_size: Maximum allowed size of header fields in responses.
+ :param int max_field_size: Maximum allowed size of header name and value combined in responses.
+
+ :param int max_headers: Maximum number of headers and trailers combined in responses.
:param aiohttp.protocol.HttpVersion version: Request HTTP version,
``HTTP 1.1`` by default. (optional)
diff --git a/docs/web_reference.rst b/docs/web_reference.rst
index 4a130e32f1d..3f1fd6aa9da 100644
--- a/docs/web_reference.rst
+++ b/docs/web_reference.rst
@@ -2863,9 +2863,10 @@ application on specific TCP or Unix socket, e.g.::
:attr:`helpers.AccessLogger.LOG_FORMAT`.
:param int max_line_size: Optional maximum header line size. Default:
``8190``.
- :param int max_headers: Optional maximum header size. Default: ``32768``.
- :param int max_field_size: Optional maximum header field size. Default:
+ :param int max_field_size: Optional maximum header combined name and value size. Default:
``8190``.
+ :param int max_headers: Optional maximum number of headers and trailers combined. Default:
+ ``128``.
:param float lingering_time: Maximum time during which the server
reads and ignores additional data coming from the client when
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index ae584a1f737..2d18a27d99a 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -4409,17 +4409,17 @@ async def handler(request):
assert resp.headers["Content-Type"] == "text/plain; charset=utf-8"
-async def test_max_field_size_session_default(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(headers={"Custom": "x" * 8190})
+async def test_max_field_size_session_default(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={"Custom": "x" * 8182})
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
- async with await client.get("/") as resp:
- assert resp.headers["Custom"] == "x" * 8190
+ async with client.get("/") as resp:
+ assert resp.headers["Custom"] == "x" * 8182
async def test_max_field_size_session_default_fail(aiohttp_client) -> None:
@@ -4434,43 +4434,96 @@ async def handler(request):
await client.get("/")
-async def test_max_field_size_session_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(headers={"Custom": "x" * 8191})
+async def test_max_field_size_session_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={"Custom": "x" * 8192})
app = web.Application()
app.add_routes([web.get("/", handler)])
- client = await aiohttp_client(app, max_field_size=8191)
+ client = await aiohttp_client(app, max_field_size=8200)
- async with await client.get("/") as resp:
- assert resp.headers["Custom"] == "x" * 8191
+ async with client.get("/") as resp:
+ assert resp.headers["Custom"] == "x" * 8192
-async def test_max_field_size_request_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(headers={"Custom": "x" * 8191})
+async def test_max_headers_session_default(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(120)})
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
- async with await client.get("/", max_field_size=8191) as resp:
- assert resp.headers["Custom"] == "x" * 8191
+ async with client.get("/") as resp:
+ assert resp.headers["Custom-119"] == "x"
-async def test_max_line_size_session_default(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(status=200, reason="x" * 8190)
+async def test_max_headers_session_default_fail(
+ aiohttp_client: AiohttpClient,
+) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(129)})
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
+ with pytest.raises(aiohttp.ClientResponseError):
+ await client.get("/")
- async with await client.get("/") as resp:
- assert resp.reason == "x" * 8190
+
+async def test_max_headers_session_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(130)})
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app, max_headers=140)
+
+ async with client.get("/") as resp:
+ assert resp.headers["Custom-129"] == "x"
+
+
+async def test_max_headers_request_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(130)})
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app)
+
+ async with client.get("/", max_headers=140) as resp:
+ assert resp.headers["Custom-129"] == "x"
+
+
+async def test_max_field_size_request_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={"Custom": "x" * 8192})
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app)
+
+ async with client.get("/", max_field_size=8200) as resp:
+ assert resp.headers["Custom"] == "x" * 8192
+
+
+async def test_max_line_size_session_default(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(status=200, reason="x" * 8177)
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app)
+
+ async with client.get("/") as resp:
+ assert resp.reason == "x" * 8177
async def test_max_line_size_session_default_fail(aiohttp_client) -> None:
@@ -4485,30 +4538,30 @@ async def handler(request):
await client.get("/")
-async def test_max_line_size_session_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(status=200, reason="x" * 8191)
+async def test_max_line_size_session_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(status=200, reason="x" * 8197)
app = web.Application()
app.add_routes([web.get("/", handler)])
- client = await aiohttp_client(app, max_line_size=8191)
+ client = await aiohttp_client(app, max_line_size=8210)
- async with await client.get("/") as resp:
- assert resp.reason == "x" * 8191
+ async with client.get("/") as resp:
+ assert resp.reason == "x" * 8197
-async def test_max_line_size_request_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(status=200, reason="x" * 8191)
+async def test_max_line_size_request_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(status=200, reason="x" * 8197)
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
- async with await client.get("/", max_line_size=8191) as resp:
- assert resp.reason == "x" * 8191
+ async with client.get("/", max_line_size=8210) as resp:
+ assert resp.reason == "x" * 8197
async def test_rejected_upload(
diff --git a/tests/test_http_exceptions.py b/tests/test_http_exceptions.py
index cd3b08f59db..6186a71c681 100644
--- a/tests/test_http_exceptions.py
+++ b/tests/test_http_exceptions.py
@@ -69,32 +69,32 @@ def test_repr(self) -> None:
class TestLineTooLong:
def test_ctor(self) -> None:
- err = http_exceptions.LineTooLong("spam", "10", "12")
+ err = http_exceptions.LineTooLong(b"spam", 10)
assert err.code == 400
- assert err.message == "Got more than 10 bytes (12) when reading spam."
+ assert err.message == "Got more than 10 bytes when reading: b'spam'."
assert err.headers is None
def test_pickle(self) -> None:
- err = http_exceptions.LineTooLong(line="spam", limit="10", actual_size="12")
+ err = http_exceptions.LineTooLong(line=b"spam", limit=10, actual_size="12")
err.foo = "bar"
for proto in range(pickle.HIGHEST_PROTOCOL + 1):
pickled = pickle.dumps(err, proto)
err2 = pickle.loads(pickled)
assert err2.code == 400
- assert err2.message == ("Got more than 10 bytes (12) when reading spam.")
+ assert err2.message == ("Got more than 10 bytes when reading: b'spam'.")
assert err2.headers is None
assert err2.foo == "bar"
def test_str(self) -> None:
- err = http_exceptions.LineTooLong(line="spam", limit="10", actual_size="12")
- expected = "400, message:\n Got more than 10 bytes (12) when reading spam."
+ err = http_exceptions.LineTooLong(line=b"spam", limit=10)
+ expected = "400, message:\n Got more than 10 bytes when reading: b'spam'."
assert str(err) == expected
def test_repr(self) -> None:
- err = http_exceptions.LineTooLong(line="spam", limit="10", actual_size="12")
+ err = http_exceptions.LineTooLong(line=b"spam", limit=10)
assert repr(err) == (
- "<LineTooLong: 400, message='Got more than 10 bytes (12) when reading spam.'>"
+ '<LineTooLong: 400, message="Got more than 10 bytes when reading: b\'spam\'.">'
)
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 1a426825ede..ffb12b51195 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -23,6 +23,7 @@
HttpPayloadParser,
HttpRequestParser,
HttpRequestParserPy,
+ HttpResponseParser,
HttpResponseParserPy,
HttpVersion,
)
@@ -75,7 +76,7 @@ def parser(loop: Any, protocol: Any, request: Any):
loop,
2**16,
max_line_size=8190,
- max_headers=32768,
+ max_headers=128,
max_field_size=8190,
)
@@ -94,7 +95,7 @@ def response(loop: Any, protocol: Any, request: Any):
loop,
2**16,
max_line_size=8190,
- max_headers=32768,
+ max_headers=128,
max_field_size=8190,
read_until_eof=True,
)
@@ -270,15 +271,6 @@ def test_whitespace_before_header(parser: Any) -> None:
parser.feed_data(text)
-def test_parse_headers_longline(parser: Any) -> None:
- invalid_unicode_byte = b"\xd9"
- header_name = b"Test" + invalid_unicode_byte + b"Header" + b"A" * 8192
- text = b"GET /test HTTP/1.1\r\n" + header_name + b": test\r\n" + b"\r\n" + b"\r\n"
- with pytest.raises((http_exceptions.LineTooLong, http_exceptions.BadHttpMessage)):
- # FIXME: `LineTooLong` doesn't seem to actually be happening
- parser.feed_data(text)
-
-
@pytest.fixture
def xfail_c_parser_status(request) -> None:
if isinstance(request.getfixturevalue("parser"), HttpRequestParserPy):
@@ -710,13 +702,14 @@ def test_max_header_field_size(parser, size) -> None:
name = b"t" * size
text = b"GET /test HTTP/1.1\r\n" + name + b":data\r\n\r\n"
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
- parser.feed_data(text)
+ for i in range(0, len(text), 5000): # pragma: no branch
+ parser.feed_data(text[i : i + 5000])
-def test_max_header_field_size_under_limit(parser) -> None:
- name = b"t" * 8190
+def test_max_header_size_under_limit(parser: HttpRequestParser) -> None:
+ name = b"t" * 8185
text = b"GET /test HTTP/1.1\r\n" + name + b":data\r\n\r\n"
messages, upgrade, tail = parser.feed_data(text)
@@ -738,13 +731,67 @@ def test_max_header_value_size(parser, size) -> None:
name = b"t" * size
text = b"GET /test HTTP/1.1\r\ndata:" + name + b"\r\n\r\n"
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
+ with pytest.raises(http_exceptions.LineTooLong, match=match):
+ for i in range(0, len(text), 4000): # pragma: no branch
+ parser.feed_data(text[i : i + 4000])
+
+
+def test_max_header_combined_size(parser: HttpRequestParser) -> None:
+ k = b"t" * 4100
+ text = b"GET /test HTTP/1.1\r\n" + k + b":" + k + b"\r\n\r\n"
+
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
parser.feed_data(text)
-def test_max_header_value_size_under_limit(parser) -> None:
- value = b"A" * 8190
+@pytest.mark.parametrize("size", [40960, 8191])
+async def test_max_trailer_size(parser: HttpRequestParser, size: int) -> None:
+ value = b"t" * size
+ text = (
+ b"GET /test HTTP/1.1\r\nTransfer-Encoding: chunked\r\n\r\n"
+ + hex(4000)[2:].encode()
+ + b"\r\n"
+ + b"b" * 4000
+ + b"\r\n0\r\ntest: "
+ + value
+ + b"\r\n\r\n"
+ )
+
+ match = "400, message:\n Got more than 8190 bytes when reading"
+ with pytest.raises(http_exceptions.LineTooLong, match=match):
+ payload = None
+ for i in range(0, len(text), 3000): # pragma: no branch
+ messages, upgrade, tail = parser.feed_data(text[i : i + 3000])
+ if messages:
+ payload = messages[0][-1]
+ # Trailers are not seen until payload is read.
+ assert payload is not None
+ await payload.read()
+
+
+@pytest.mark.parametrize("headers,trailers", ((129, 0), (0, 129), (64, 65)))
+async def test_max_headers(
+ parser: HttpRequestParser, headers: int, trailers: int
+) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\nTransfer-Encoding: chunked"
+ + b"".join(b"\r\nHeader-%d: Value" % i for i in range(headers))
+ + b"\r\n\r\n4\r\ntest\r\n0"
+ + b"".join(b"\r\nTrailer-%d: Value" % i for i in range(trailers))
+ + b"\r\n\r\n"
+ )
+
+ match = "Too many (headers|trailers) received"
+ with pytest.raises(http_exceptions.BadHttpMessage, match=match):
+ messages, upgrade, tail = parser.feed_data(text)
+ # Trailers are not seen until payload is read.
+ await messages[0][-1].read()
+
+
+def test_max_header_value_size_under_limit(parser: HttpRequestParser) -> None:
+ value = b"A" * 8185
text = b"GET /test HTTP/1.1\r\ndata:" + value + b"\r\n\r\n"
messages, upgrade, tail = parser.feed_data(text)
@@ -766,13 +813,16 @@ def test_max_header_value_size_continuation(response, size) -> None:
name = b"T" * (size - 5)
text = b"HTTP/1.1 200 Ok\r\ndata: test\r\n " + name + b"\r\n\r\n"
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
- response.feed_data(text)
+ for i in range(0, len(text), 9000): # pragma: no branch
+ response.feed_data(text[i : i + 9000])
-def test_max_header_value_size_continuation_under_limit(response) -> None:
- value = b"A" * 8185
+def test_max_header_value_size_continuation_under_limit(
+ response: HttpResponseParser,
+) -> None:
+ value = b"A" * 8179
text = b"HTTP/1.1 200 Ok\r\ndata: test\r\n " + value + b"\r\n\r\n"
messages, upgrade, tail = response.feed_data(text)
@@ -1011,13 +1061,13 @@ def test_http_request_parser_bad_nonascii_uri(parser: Any) -> None:
@pytest.mark.parametrize("size", [40965, 8191])
def test_http_request_max_status_line(parser, size) -> None:
path = b"t" * (size - 5)
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
parser.feed_data(b"GET /path" + path + b" HTTP/1.1\r\n\r\n")
-def test_http_request_max_status_line_under_limit(parser) -> None:
- path = b"t" * (8190 - 5)
+def test_http_request_max_status_line_under_limit(parser: HttpRequestParser) -> None:
+ path = b"t" * 8172
messages, upgraded, tail = parser.feed_data(
b"GET /path" + path + b" HTTP/1.1\r\n\r\n"
)
@@ -1094,13 +1144,15 @@ def test_http_response_parser_strict_obs_line_folding(response: Any) -> None:
@pytest.mark.parametrize("size", [40962, 8191])
def test_http_response_parser_bad_status_line_too_long(response, size) -> None:
reason = b"t" * (size - 2)
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
response.feed_data(b"HTTP/1.1 200 Ok" + reason + b"\r\n\r\n")
-def test_http_response_parser_status_line_under_limit(response) -> None:
- reason = b"O" * 8190
+def test_http_response_parser_status_line_under_limit(
+ response: HttpResponseParser,
+) -> None:
+ reason = b"O" * 8177
messages, upgraded, tail = response.feed_data(
b"HTTP/1.1 200 " + reason + b"\r\n\r\n"
)
@@ -1610,7 +1662,7 @@ def test_parse_bad_method_for_c_parser_raises(loop, protocol):
loop,
2**16,
max_line_size=8190,
- max_headers=32768,
+ max_headers=128,
max_field_size=8190,
)
@@ -1943,7 +1995,7 @@ async def test_streaming_decompress_large_payload(
dbuf = DeflateBuffer(buf, "deflate")
# Feed compressed data in chunks (simulating network streaming)
- for i in range(0, len(compressed), chunk_size):
+ for i in range(0, len(compressed), chunk_size): # pragma: no branch
chunk = compressed[i : i + chunk_size]
dbuf.feed_data(chunk, len(chunk))
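Several of the updated tests above feed the wire bytes to the parser in fixed-size slices rather than all at once, to exercise the incremental buffering paths (`_tail`/`_chunk_tail`). The slicing idiom itself is plain Python and lossless:

```python
# Reassembling fixed-size slices recovers the original byte string, so
# feeding slice-by-slice exercises only the parser's buffering, not the data.
text = b"GET /test HTTP/1.1\r\n" + b"t" * 40960 + b":data\r\n\r\n"
chunks = [text[i : i + 5000] for i in range(0, len(text), 5000)]
assert all(len(c) <= 5000 for c in chunks)
assert b"".join(chunks) == text
```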
From 0c2e9da51126238a421568eb7c5b53e5b5d17b36 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Mon, 12 Jan 2026 18:49:19 +0000
Subject: [PATCH 015/141] Add max_headers parameter (#11955) (#11959) (#11960)
(cherry picked from commit ed6440ca49ef4907ab9d99ba7e329aab702b7173)
(cherry picked from commit 30ec25f8a58c5dc3f8fdb3eec31f555eeaabd30a)
---
CHANGES/11955.feature.rst | 1 +
aiohttp/_http_parser.pyx | 31 +++++----
aiohttp/client.py | 9 +++
aiohttp/client_proto.py | 2 +
aiohttp/http_exceptions.py | 9 +--
aiohttp/http_parser.py | 84 +++++++++++++----------
aiohttp/web_protocol.py | 2 +-
docs/client_reference.rst | 17 +++--
docs/web_reference.rst | 5 +-
tests/test_client_functional.py | 117 +++++++++++++++++++++++---------
tests/test_http_exceptions.py | 18 ++---
tests/test_http_parser.py | 112 ++++++++++++++++++++++--------
12 files changed, 277 insertions(+), 130 deletions(-)
create mode 100644 CHANGES/11955.feature.rst
diff --git a/CHANGES/11955.feature.rst b/CHANGES/11955.feature.rst
new file mode 100644
index 00000000000..eaea1016e60
--- /dev/null
+++ b/CHANGES/11955.feature.rst
@@ -0,0 +1 @@
+Added ``max_headers`` parameter to limit the number of headers that should be read from a response -- by :user:`Dreamsorcerer`.
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index 4a7101edbcb..cc8b13cfbe8 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -279,6 +279,7 @@ cdef class HttpParser:
object _name
bytes _raw_value
bint _has_value
+ int _header_name_size
object _protocol
object _loop
@@ -329,7 +330,7 @@ cdef class HttpParser:
self, cparser.llhttp_type mode,
object protocol, object loop, int limit,
object timer=None,
- size_t max_line_size=8190, size_t max_headers=32768,
+ size_t max_line_size=8190, size_t max_headers=128,
size_t max_field_size=8190, payload_exception=None,
bint response_with_body=True, bint read_until_eof=False,
bint auto_decompress=True,
@@ -352,6 +353,7 @@ cdef class HttpParser:
self._raw_name = EMPTY_BYTES
self._raw_value = EMPTY_BYTES
self._has_value = False
+ self._header_name_size = 0
self._max_line_size = max_line_size
self._max_headers = max_headers
@@ -383,11 +385,14 @@ cdef class HttpParser:
value = self._raw_value.decode('utf-8', 'surrogateescape')
self._headers.append((name, value))
+ if len(self._headers) > self._max_headers:
+ raise BadHttpMessage("Too many headers received")
if name is CONTENT_ENCODING:
self._content_encoding = value
self._has_value = False
+ self._header_name_size = 0
self._raw_headers.append((self._raw_name, self._raw_value))
self._raw_name = EMPTY_BYTES
self._raw_value = EMPTY_BYTES
@@ -574,7 +579,7 @@ cdef class HttpRequestParser(HttpParser):
def __init__(
self, protocol, loop, int limit, timer=None,
- size_t max_line_size=8190, size_t max_headers=32768,
+ size_t max_line_size=8190, size_t max_headers=128,
size_t max_field_size=8190, payload_exception=None,
bint response_with_body=True, bint read_until_eof=False,
bint auto_decompress=True,
@@ -638,7 +643,7 @@ cdef class HttpResponseParser(HttpParser):
def __init__(
self, protocol, loop, int limit, timer=None,
- size_t max_line_size=8190, size_t max_headers=32768,
+ size_t max_line_size=8190, size_t max_headers=128,
size_t max_field_size=8190, payload_exception=None,
bint response_with_body=True, bint read_until_eof=False,
bint auto_decompress=True
@@ -677,8 +682,8 @@ cdef int cb_on_url(cparser.llhttp_t* parser,
cdef HttpParser pyparser = parser.data
try:
if length > pyparser._max_line_size:
- raise LineTooLong(
- 'Status line is too long', pyparser._max_line_size, length)
+ status = pyparser._buf + at[:length]
+ raise LineTooLong(status[:100] + b"...", pyparser._max_line_size)
extend(pyparser._buf, at, length)
except BaseException as ex:
pyparser._last_error = ex
@@ -690,11 +695,10 @@ cdef int cb_on_url(cparser.llhttp_t* parser,
cdef int cb_on_status(cparser.llhttp_t* parser,
const char *at, size_t length) except -1:
cdef HttpParser pyparser = parser.data
- cdef str reason
try:
if length > pyparser._max_line_size:
- raise LineTooLong(
- 'Status line is too long', pyparser._max_line_size, length)
+ reason = pyparser._buf + at[:length]
+ raise LineTooLong(reason[:100] + b"...", pyparser._max_line_size)
extend(pyparser._buf, at, length)
except BaseException as ex:
pyparser._last_error = ex
@@ -711,8 +715,9 @@ cdef int cb_on_header_field(cparser.llhttp_t* parser,
pyparser._on_status_complete()
size = len(pyparser._raw_name) + length
if size > pyparser._max_field_size:
- raise LineTooLong(
- 'Header name is too long', pyparser._max_field_size, size)
+ name = pyparser._raw_name + at[:length]
+ raise LineTooLong(name[:100] + b"...", pyparser._max_field_size)
+ pyparser._header_name_size = size
pyparser._on_header_field(at, length)
except BaseException as ex:
pyparser._last_error = ex
@@ -727,9 +732,9 @@ cdef int cb_on_header_value(cparser.llhttp_t* parser,
cdef Py_ssize_t size
try:
size = len(pyparser._raw_value) + length
- if size > pyparser._max_field_size:
- raise LineTooLong(
- 'Header value is too long', pyparser._max_field_size, size)
+ if pyparser._header_name_size + size > pyparser._max_field_size:
+ value = pyparser._raw_value + at[:length]
+ raise LineTooLong(value[:100] + b"...", pyparser._max_field_size)
pyparser._on_header_value(at, length)
except BaseException as ex:
pyparser._last_error = ex
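The Cython callbacks above now track the header name's size so that `max_field_size` bounds the name and value combined, which is what `test_max_header_combined_size` checks. A minimal pure-Python sketch of that check (an assumption-level simplification: the real check runs incrementally across llhttp callbacks, not on complete fields):

```python
def check_field_size(name: bytes, value: bytes, max_field_size: int = 8190) -> None:
    """Reject a header whose name, or name plus value, exceeds the limit."""
    if len(name) > max_field_size:
        raise ValueError("header name too long")
    if len(name) + len(value) > max_field_size:
        raise ValueError("header too long")

check_field_size(b"t" * 4100, b"t" * 4090)  # 8190 bytes combined: allowed
try:
    check_field_size(b"t" * 4100, b"t" * 4100)  # 8200 bytes combined: rejected
    raise AssertionError("expected ValueError")
except ValueError:
    pass
```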
diff --git a/aiohttp/client.py b/aiohttp/client.py
index bc4ee17caf0..e1ea692be71 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -195,6 +195,7 @@ class _RequestOptions(TypedDict, total=False):
auto_decompress: Union[bool, None]
max_line_size: Union[int, None]
max_field_size: Union[int, None]
+ max_headers: Union[int, None]
middlewares: Optional[Sequence[ClientMiddlewareType]]
@@ -259,6 +260,7 @@ class ClientSession:
"_read_bufsize",
"_max_line_size",
"_max_field_size",
+ "_max_headers",
"_resolve_charset",
"_default_proxy",
"_default_proxy_auth",
@@ -303,6 +305,7 @@ def __init__(
read_bufsize: int = 2**16,
max_line_size: int = 8190,
max_field_size: int = 8190,
+ max_headers: int = 128,
fallback_charset_resolver: _CharsetResolver = lambda r, b: "utf-8",
middlewares: Sequence[ClientMiddlewareType] = (),
ssl_shutdown_timeout: Union[_SENTINEL, None, float] = sentinel,
@@ -402,6 +405,7 @@ def __init__(
self._read_bufsize = read_bufsize
self._max_line_size = max_line_size
self._max_field_size = max_field_size
+ self._max_headers = max_headers
# Convert to list of tuples
if headers:
@@ -518,6 +522,7 @@ async def _request(
auto_decompress: Optional[bool] = None,
max_line_size: Optional[int] = None,
max_field_size: Optional[int] = None,
+ max_headers: Optional[int] = None,
middlewares: Optional[Sequence[ClientMiddlewareType]] = None,
) -> ClientResponse:
@@ -607,6 +612,9 @@ async def _request(
if max_field_size is None:
max_field_size = self._max_field_size
+ if max_headers is None:
+ max_headers = self._max_headers
+
traces = [
Trace(
self,
@@ -750,6 +758,7 @@ async def _connect_and_send_request(
timeout_ceil_threshold=self._connector._timeout_ceil_threshold,
max_line_size=max_line_size,
max_field_size=max_field_size,
+ max_headers=max_headers,
)
try:
resp = await req.send(conn)
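The client.py changes above follow the session's existing override pattern: a per-request `max_headers=None` falls back to the session-level value. A trivial stand-in for that resolution logic (simplified, not imported from aiohttp):

```python
# Per-request overrides fall back to the session default when None,
# mirroring the pattern the patch adds for max_headers.
SESSION_DEFAULT_MAX_HEADERS = 128

def resolve_max_headers(per_request=None, session_default=SESSION_DEFAULT_MAX_HEADERS):
    return session_default if per_request is None else per_request

assert resolve_max_headers() == 128       # session default applies
assert resolve_max_headers(140) == 140    # explicit request value wins
```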
diff --git a/aiohttp/client_proto.py b/aiohttp/client_proto.py
index e2fb1ce64cb..aea31917f48 100644
--- a/aiohttp/client_proto.py
+++ b/aiohttp/client_proto.py
@@ -230,6 +230,7 @@ def set_response_params(
timeout_ceil_threshold: float = 5,
max_line_size: int = 8190,
max_field_size: int = 8190,
+ max_headers: int = 128,
) -> None:
self._skip_payload = skip_payload
@@ -248,6 +249,7 @@ def set_response_params(
auto_decompress=auto_decompress,
max_line_size=max_line_size,
max_field_size=max_field_size,
+ max_headers=max_headers,
)
if self._tail:
diff --git a/aiohttp/http_exceptions.py b/aiohttp/http_exceptions.py
index 0b5867c7861..a389e94c0b5 100644
--- a/aiohttp/http_exceptions.py
+++ b/aiohttp/http_exceptions.py
@@ -80,11 +80,12 @@ class DecompressSizeError(PayloadEncodingError):
class LineTooLong(BadHttpMessage):
def __init__(
- self, line: str, limit: str = "Unknown", actual_size: str = "Unknown"
+ self,
+ line: Union[str, bytes],
+ limit: Union[str, int] = "Unknown",
+ actual_size: str = "Unknown",
) -> None:
- super().__init__(
- f"Got more than {limit} bytes ({actual_size}) when reading {line}."
- )
+ super().__init__(f"Got more than {limit} bytes when reading: {line!r}.")
self.args = (line, limit, actual_size)
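The callers updated in this patch now pass the offending bytes themselves, truncated to the first 100 bytes plus an ellipsis, instead of a description string with byte counts. A standalone sketch of the resulting message (an assumption-level stand-in copying the format from the diff above; the real class derives from `BadHttpMessage`):

```python
# Simplified stand-in for the patched LineTooLong message format.
class LineTooLong(Exception):
    def __init__(self, line, limit="Unknown", actual_size="Unknown"):
        self.message = f"Got more than {limit} bytes when reading: {line!r}."
        super().__init__(self.message)

# Callers truncate the offending line before raising:
line = b"x" * 8200
err = LineTooLong(line[:100] + b"...", 8190)
assert err.message.startswith("Got more than 8190 bytes when reading: b'xxx")
assert err.message.endswith("...'.")
```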
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index 393e76a1586..d7f127896a8 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -169,20 +169,10 @@ def parse_headers(
raise InvalidHeader(line)
bvalue = bvalue.lstrip(b" \t")
- if len(bname) > self.max_field_size:
- raise LineTooLong(
- "request header name {}".format(
- bname.decode("utf8", "backslashreplace")
- ),
- str(self.max_field_size),
- str(len(bname)),
- )
name = bname.decode("utf-8", "surrogateescape")
if not TOKENRE.fullmatch(name):
raise InvalidHeader(bname)
- header_length = len(bvalue)
-
# next line
lines_idx += 1
line = lines[lines_idx]
@@ -192,16 +182,14 @@ def parse_headers(
# Deprecated: https://www.rfc-editor.org/rfc/rfc9112.html#name-obsolete-line-folding
if continuation:
+ header_length = len(bvalue)
bvalue_lst = [bvalue]
while continuation:
header_length += len(line)
if header_length > self.max_field_size:
+ header_line = bname + b": " + b"".join(bvalue_lst)
raise LineTooLong(
- "request header field {}".format(
- bname.decode("utf8", "backslashreplace")
- ),
- str(self.max_field_size),
- str(header_length),
+ header_line[:100] + b"...", self.max_field_size
)
bvalue_lst.append(line)
@@ -215,15 +203,6 @@ def parse_headers(
line = b""
break
bvalue = b"".join(bvalue_lst)
- else:
- if header_length > self.max_field_size:
- raise LineTooLong(
- "request header field {}".format(
- bname.decode("utf8", "backslashreplace")
- ),
- str(self.max_field_size),
- str(header_length),
- )
bvalue = bvalue.strip(b" \t")
value = bvalue.decode("utf-8", "surrogateescape")
@@ -254,7 +233,7 @@ def __init__(
loop: Optional[asyncio.AbstractEventLoop] = None,
limit: int = 2**16,
max_line_size: int = 8190,
- max_headers: int = 32768,
+ max_headers: int = 128,
max_field_size: int = 8190,
timer: Optional[BaseTimerContext] = None,
code: Optional[int] = None,
@@ -269,6 +248,7 @@ def __init__(
self.max_line_size = max_line_size
self.max_headers = max_headers
self.max_field_size = max_field_size
+ self.max_headers = max_headers
self.timer = timer
self.code = code
self.method = method
@@ -327,6 +307,7 @@ def feed_data(
data_len = len(data)
start_pos = 0
loop = self.loop
+ max_line_length = self.max_line_size
should_close = False
while start_pos < data_len:
@@ -348,11 +329,21 @@ def feed_data(
line = data[start_pos:pos]
if SEP == b"\n": # For lax response parsing
line = line.rstrip(b"\r")
+ if len(line) > max_line_length:
+ raise LineTooLong(line[:100] + b"...", max_line_length)
+
self._lines.append(line)
+ # After processing the status/request line, everything is a header.
+ max_line_length = self.max_field_size
+
+ if len(self._lines) > self.max_headers:
+ raise BadHttpMessage("Too many headers received")
+
start_pos = pos + len(SEP)
# \r\n\r\n found
if self._lines[-1] == EMPTY:
+ max_trailers = self.max_headers - len(self._lines)
try:
msg: _MsgT = self.parse_message(self._lines)
finally:
@@ -411,6 +402,9 @@ def get_content_length() -> Optional[int]:
auto_decompress=self._auto_decompress,
lax=self.lax,
headers_parser=self._headers_parser,
+ max_line_size=self.max_line_size,
+ max_field_size=self.max_field_size,
+ max_trailers=max_trailers,
)
if not payload_parser.done:
self._payload_parser = payload_parser
@@ -430,6 +424,9 @@ def get_content_length() -> Optional[int]:
auto_decompress=self._auto_decompress,
lax=self.lax,
headers_parser=self._headers_parser,
+ max_line_size=self.max_line_size,
+ max_field_size=self.max_field_size,
+ max_trailers=max_trailers,
)
elif not empty_body and length is None and self.read_until_eof:
payload = StreamReader(
@@ -449,6 +446,9 @@ def get_content_length() -> Optional[int]:
auto_decompress=self._auto_decompress,
lax=self.lax,
headers_parser=self._headers_parser,
+ max_line_size=self.max_line_size,
+ max_field_size=self.max_field_size,
+ max_trailers=max_trailers,
)
if not payload_parser.done:
self._payload_parser = payload_parser
@@ -459,6 +459,8 @@ def get_content_length() -> Optional[int]:
should_close = msg.should_close
else:
self._tail = data[start_pos:]
+ if len(self._tail) > self.max_line_size:
+ raise LineTooLong(self._tail[:100] + b"...", self.max_line_size)
data = EMPTY
break
@@ -594,11 +596,6 @@ def parse_message(self, lines: List[bytes]) -> RawRequestMessage:
except ValueError:
raise BadHttpMethod(line) from None
- if len(path) > self.max_line_size:
- raise LineTooLong(
- "Status line is too long", str(self.max_line_size), str(len(path))
- )
-
# method
if not TOKENRE.fullmatch(method):
raise BadHttpMethod(method)
@@ -714,11 +711,6 @@ def parse_message(self, lines: List[bytes]) -> RawResponseMessage:
status = status.strip()
reason = ""
- if len(reason) > self.max_line_size:
- raise LineTooLong(
- "Status line is too long", str(self.max_line_size), str(len(reason))
- )
-
# version
match = VERSRE.fullmatch(version)
if match is None:
@@ -783,6 +775,9 @@ def __init__(
lax: bool = False,
*,
headers_parser: HeadersParser,
+ max_line_size: int = 8190,
+ max_field_size: int = 8190,
+ max_trailers: int = 128,
) -> None:
self._length = 0
self._type = ParseState.PARSE_UNTIL_EOF
@@ -792,6 +787,9 @@ def __init__(
self._auto_decompress = auto_decompress
self._lax = lax
self._headers_parser = headers_parser
+ self._max_line_size = max_line_size
+ self._max_field_size = max_field_size
+ self._max_trailers = max_trailers
self._trailer_lines: list[bytes] = []
self.done = False
@@ -855,6 +853,15 @@ def feed_data(
# Chunked transfer encoding parser
elif self._type == ParseState.PARSE_CHUNKED:
if self._chunk_tail:
+ # We should never have a tail if we're inside the payload body.
+ assert self._chunk != ChunkState.PARSE_CHUNKED_CHUNK
+ # We should check the length is sane.
+ max_line_length = self._max_line_size
+ if self._chunk == ChunkState.PARSE_TRAILERS:
+ max_line_length = self._max_field_size
+ if len(self._chunk_tail) > max_line_length:
+ raise LineTooLong(self._chunk_tail[:100] + b"...", max_line_length)
+
chunk = self._chunk_tail + chunk
self._chunk_tail = b""
@@ -938,8 +945,15 @@ def feed_data(
chunk = chunk[pos + len(SEP) :]
if SEP == b"\n": # For lax response parsing
line = line.rstrip(b"\r")
+
+ if len(line) > self._max_field_size:
+ raise LineTooLong(line[:100] + b"...", self._max_field_size)
+
self._trailer_lines.append(line)
+ if len(self._trailer_lines) > self._max_trailers:
+ raise BadHttpMessage("Too many trailers received")
+
# \r\n\r\n found, end of stream
if self._trailer_lines[-1] == b"":
# Headers and trailers are defined the same way,
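As the diff above shows, the message parser hands the payload parser whatever header slots remain: `max_trailers = self.max_headers - len(self._lines)`, so headers and trailers share one budget. A minimal pure-Python sketch of that accounting (simplified: the real counts include the request line and blank terminators):

```python
def check_counts(header_lines, trailer_lines, max_headers=128):
    """Headers and trailers share one budget of max_headers lines."""
    if len(header_lines) > max_headers:
        raise ValueError("Too many headers received")
    max_trailers = max_headers - len(header_lines)
    if len(trailer_lines) > max_trailers:
        raise ValueError("Too many trailers received")

# 64 headers + 64 trailers fits the shared budget; 64 + 65 exceeds it,
# matching the (64, 65) case in test_max_headers above.
check_counts(["h"] * 64, ["t"] * 64)
try:
    check_counts(["h"] * 64, ["t"] * 65)
    raise AssertionError("expected ValueError")
except ValueError as e:
    assert "trailers" in str(e)
```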
diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py
index 1bd344ae42a..58bbdfd849b 100644
--- a/aiohttp/web_protocol.py
+++ b/aiohttp/web_protocol.py
@@ -188,7 +188,7 @@ def __init__(
access_log_format: str = AccessLogger.LOG_FORMAT,
debug: bool = False,
max_line_size: int = 8190,
- max_headers: int = 32768,
+ max_headers: int = 128,
max_field_size: int = 8190,
lingering_time: float = 10.0,
read_bufsize: int = 2**16,
diff --git a/docs/client_reference.rst b/docs/client_reference.rst
index d8b36b95c91..a1fc98b3c0f 100644
--- a/docs/client_reference.rst
+++ b/docs/client_reference.rst
@@ -57,6 +57,7 @@ The client session supports the context manager protocol for self closing.
read_bufsize=2**16, \
max_line_size=8190, \
max_field_size=8190, \
+ max_headers=128, \
fallback_charset_resolver=lambda r, b: "utf-8", \
ssl_shutdown_timeout=0)
@@ -245,7 +246,9 @@ The client session supports the context manager protocol for self closing.
:param int max_line_size: Maximum allowed size of lines in responses.
- :param int max_field_size: Maximum allowed size of header fields in responses.
+ :param int max_field_size: Maximum allowed size of header name and value combined in responses.
+
+ :param int max_headers: Maximum number of headers and trailers combined in responses.
:param Callable[[ClientResponse,bytes],str] fallback_charset_resolver:
A :term:`callable` that accepts a :class:`ClientResponse` and the
@@ -425,7 +428,8 @@ The client session supports the context manager protocol for self closing.
read_bufsize=None, \
auto_decompress=None, \
max_line_size=None, \
- max_field_size=None)
+ max_field_size=None, \
+ max_headers=None)
:async:
:noindexentry:
@@ -589,7 +593,9 @@ The client session supports the context manager protocol for self closing.
:param int max_line_size: Maximum allowed size of lines in responses.
- :param int max_field_size: Maximum allowed size of header fields in responses.
+ :param int max_field_size: Maximum allowed size of header name and value combined in responses.
+
+ :param int max_headers: Maximum number of headers and trailers combined in responses.
:return ClientResponse: a :class:`client response <ClientResponse>`
object.
@@ -909,6 +915,7 @@ certification chaining.
auto_decompress=None, \
max_line_size=None, \
max_field_size=None, \
+ max_headers=None, \
version=aiohttp.HttpVersion11, \
connector=None)
:async:
@@ -1046,7 +1053,9 @@ certification chaining.
:param int max_line_size: Maximum allowed size of lines in responses.
- :param int max_field_size: Maximum allowed size of header fields in responses.
+ :param int max_field_size: Maximum allowed size of header name and value combined in responses.
+
+ :param int max_headers: Maximum number of headers and trailers combined in responses.
:param aiohttp.protocol.HttpVersion version: Request HTTP version,
``HTTP 1.1`` by default. (optional)
diff --git a/docs/web_reference.rst b/docs/web_reference.rst
index 5ae15478b4f..3ff3299b316 100644
--- a/docs/web_reference.rst
+++ b/docs/web_reference.rst
@@ -2837,9 +2837,10 @@ application on specific TCP or Unix socket, e.g.::
:attr:`helpers.AccessLogger.LOG_FORMAT`.
:param int max_line_size: Optional maximum header line size. Default:
``8190``.
- :param int max_headers: Optional maximum header size. Default: ``32768``.
- :param int max_field_size: Optional maximum header field size. Default:
+ :param int max_field_size: Optional maximum header combined name and value size. Default:
``8190``.
+ :param int max_headers: Optional maximum number of headers and trailers combined. Default:
+ ``128``.
:param float lingering_time: Maximum time during which the server
reads and ignores additional data coming from the client when
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index ebada9f572f..1ae7e1b4cb3 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -4417,17 +4417,17 @@ async def handler(request):
assert resp.headers["Content-Type"] == "text/plain; charset=utf-8"
-async def test_max_field_size_session_default(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(headers={"Custom": "x" * 8190})
+async def test_max_field_size_session_default(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={"Custom": "x" * 8182})
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
- async with await client.get("/") as resp:
- assert resp.headers["Custom"] == "x" * 8190
+ async with client.get("/") as resp:
+ assert resp.headers["Custom"] == "x" * 8182
async def test_max_field_size_session_default_fail(aiohttp_client) -> None:
@@ -4442,43 +4442,96 @@ async def handler(request):
await client.get("/")
-async def test_max_field_size_session_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(headers={"Custom": "x" * 8191})
+async def test_max_field_size_session_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={"Custom": "x" * 8192})
app = web.Application()
app.add_routes([web.get("/", handler)])
- client = await aiohttp_client(app, max_field_size=8191)
+ client = await aiohttp_client(app, max_field_size=8200)
- async with await client.get("/") as resp:
- assert resp.headers["Custom"] == "x" * 8191
+ async with client.get("/") as resp:
+ assert resp.headers["Custom"] == "x" * 8192
-async def test_max_field_size_request_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(headers={"Custom": "x" * 8191})
+async def test_max_headers_session_default(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(120)})
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
- async with await client.get("/", max_field_size=8191) as resp:
- assert resp.headers["Custom"] == "x" * 8191
+ async with client.get("/") as resp:
+ assert resp.headers["Custom-119"] == "x"
-async def test_max_line_size_session_default(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(status=200, reason="x" * 8190)
+async def test_max_headers_session_default_fail(
+ aiohttp_client: AiohttpClient,
+) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(129)})
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
+ with pytest.raises(aiohttp.ClientResponseError):
+ await client.get("/")
- async with await client.get("/") as resp:
- assert resp.reason == "x" * 8190
+
+async def test_max_headers_session_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(130)})
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app, max_headers=140)
+
+ async with client.get("/") as resp:
+ assert resp.headers["Custom-129"] == "x"
+
+
+async def test_max_headers_request_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={f"Custom-{i}": "x" for i in range(130)})
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app)
+
+ async with client.get("/", max_headers=140) as resp:
+ assert resp.headers["Custom-129"] == "x"
+
+
+async def test_max_field_size_request_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(headers={"Custom": "x" * 8192})
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app)
+
+ async with client.get("/", max_field_size=8200) as resp:
+ assert resp.headers["Custom"] == "x" * 8192
+
+
+async def test_max_line_size_session_default(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(status=200, reason="x" * 8177)
+
+ app = web.Application()
+ app.add_routes([web.get("/", handler)])
+
+ client = await aiohttp_client(app)
+
+ async with client.get("/") as resp:
+ assert resp.reason == "x" * 8177
async def test_max_line_size_session_default_fail(aiohttp_client) -> None:
@@ -4493,30 +4546,30 @@ async def handler(request):
await client.get("/")
-async def test_max_line_size_session_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(status=200, reason="x" * 8191)
+async def test_max_line_size_session_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(status=200, reason="x" * 8197)
app = web.Application()
app.add_routes([web.get("/", handler)])
- client = await aiohttp_client(app, max_line_size=8191)
+ client = await aiohttp_client(app, max_line_size=8210)
- async with await client.get("/") as resp:
- assert resp.reason == "x" * 8191
+ async with client.get("/") as resp:
+ assert resp.reason == "x" * 8197
-async def test_max_line_size_request_explicit(aiohttp_client) -> None:
- async def handler(request):
- return web.Response(status=200, reason="x" * 8191)
+async def test_max_line_size_request_explicit(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(status=200, reason="x" * 8197)
app = web.Application()
app.add_routes([web.get("/", handler)])
client = await aiohttp_client(app)
- async with await client.get("/", max_line_size=8191) as resp:
- assert resp.reason == "x" * 8191
+ async with client.get("/", max_line_size=8210) as resp:
+ assert resp.reason == "x" * 8197
async def test_rejected_upload(
diff --git a/tests/test_http_exceptions.py b/tests/test_http_exceptions.py
index cd3b08f59db..6186a71c681 100644
--- a/tests/test_http_exceptions.py
+++ b/tests/test_http_exceptions.py
@@ -69,32 +69,32 @@ def test_repr(self) -> None:
class TestLineTooLong:
def test_ctor(self) -> None:
- err = http_exceptions.LineTooLong("spam", "10", "12")
+ err = http_exceptions.LineTooLong(b"spam", 10)
assert err.code == 400
- assert err.message == "Got more than 10 bytes (12) when reading spam."
+ assert err.message == "Got more than 10 bytes when reading: b'spam'."
assert err.headers is None
def test_pickle(self) -> None:
- err = http_exceptions.LineTooLong(line="spam", limit="10", actual_size="12")
+ err = http_exceptions.LineTooLong(line=b"spam", limit=10, actual_size="12")
err.foo = "bar"
for proto in range(pickle.HIGHEST_PROTOCOL + 1):
pickled = pickle.dumps(err, proto)
err2 = pickle.loads(pickled)
assert err2.code == 400
- assert err2.message == ("Got more than 10 bytes (12) when reading spam.")
+ assert err2.message == ("Got more than 10 bytes when reading: b'spam'.")
assert err2.headers is None
assert err2.foo == "bar"
def test_str(self) -> None:
- err = http_exceptions.LineTooLong(line="spam", limit="10", actual_size="12")
- expected = "400, message:\n Got more than 10 bytes (12) when reading spam."
+ err = http_exceptions.LineTooLong(line=b"spam", limit=10)
+ expected = "400, message:\n Got more than 10 bytes when reading: b'spam'."
assert str(err) == expected
def test_repr(self) -> None:
- err = http_exceptions.LineTooLong(line="spam", limit="10", actual_size="12")
+ err = http_exceptions.LineTooLong(line=b"spam", limit=10)
assert repr(err) == (
- ""
+ '"
)
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index bf3428e770b..84cdeb729e5 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -23,6 +23,7 @@
HttpPayloadParser,
HttpRequestParser,
HttpRequestParserPy,
+ HttpResponseParser,
HttpResponseParserPy,
HttpVersion,
)
@@ -75,7 +76,7 @@ def parser(loop: Any, protocol: Any, request: Any):
loop,
2**16,
max_line_size=8190,
- max_headers=32768,
+ max_headers=128,
max_field_size=8190,
)
@@ -94,7 +95,7 @@ def response(loop: Any, protocol: Any, request: Any):
loop,
2**16,
max_line_size=8190,
- max_headers=32768,
+ max_headers=128,
max_field_size=8190,
read_until_eof=True,
)
@@ -270,15 +271,6 @@ def test_whitespace_before_header(parser: Any) -> None:
parser.feed_data(text)
-def test_parse_headers_longline(parser: Any) -> None:
- invalid_unicode_byte = b"\xd9"
- header_name = b"Test" + invalid_unicode_byte + b"Header" + b"A" * 8192
- text = b"GET /test HTTP/1.1\r\n" + header_name + b": test\r\n" + b"\r\n" + b"\r\n"
- with pytest.raises((http_exceptions.LineTooLong, http_exceptions.BadHttpMessage)):
- # FIXME: `LineTooLong` doesn't seem to actually be happening
- parser.feed_data(text)
-
-
@pytest.fixture
def xfail_c_parser_status(request) -> None:
if isinstance(request.getfixturevalue("parser"), HttpRequestParserPy):
@@ -710,13 +702,14 @@ def test_max_header_field_size(parser, size) -> None:
name = b"t" * size
text = b"GET /test HTTP/1.1\r\n" + name + b":data\r\n\r\n"
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
- parser.feed_data(text)
+ for i in range(0, len(text), 5000): # pragma: no branch
+ parser.feed_data(text[i : i + 5000])
-def test_max_header_field_size_under_limit(parser) -> None:
- name = b"t" * 8190
+def test_max_header_size_under_limit(parser: HttpRequestParser) -> None:
+ name = b"t" * 8185
text = b"GET /test HTTP/1.1\r\n" + name + b":data\r\n\r\n"
messages, upgrade, tail = parser.feed_data(text)
@@ -738,13 +731,67 @@ def test_max_header_value_size(parser, size) -> None:
name = b"t" * size
text = b"GET /test HTTP/1.1\r\ndata:" + name + b"\r\n\r\n"
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
+ with pytest.raises(http_exceptions.LineTooLong, match=match):
+ for i in range(0, len(text), 4000): # pragma: no branch
+ parser.feed_data(text[i : i + 4000])
+
+
+def test_max_header_combined_size(parser: HttpRequestParser) -> None:
+ k = b"t" * 4100
+ text = b"GET /test HTTP/1.1\r\n" + k + b":" + k + b"\r\n\r\n"
+
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
parser.feed_data(text)
-def test_max_header_value_size_under_limit(parser) -> None:
- value = b"A" * 8190
+@pytest.mark.parametrize("size", [40960, 8191])
+async def test_max_trailer_size(parser: HttpRequestParser, size: int) -> None:
+ value = b"t" * size
+ text = (
+ b"GET /test HTTP/1.1\r\nTransfer-Encoding: chunked\r\n\r\n"
+ + hex(4000)[2:].encode()
+ + b"\r\n"
+ + b"b" * 4000
+ + b"\r\n0\r\ntest: "
+ + value
+ + b"\r\n\r\n"
+ )
+
+ match = "400, message:\n Got more than 8190 bytes when reading"
+ with pytest.raises(http_exceptions.LineTooLong, match=match):
+ payload = None
+ for i in range(0, len(text), 3000): # pragma: no branch
+ messages, upgrade, tail = parser.feed_data(text[i : i + 3000])
+ if messages:
+ payload = messages[0][-1]
+ # Trailers are not seen until payload is read.
+ assert payload is not None
+ await payload.read()
+
+
+@pytest.mark.parametrize("headers,trailers", ((129, 0), (0, 129), (64, 65)))
+async def test_max_headers(
+ parser: HttpRequestParser, headers: int, trailers: int
+) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\nTransfer-Encoding: chunked"
+ + b"".join(b"\r\nHeader-%d: Value" % i for i in range(headers))
+ + b"\r\n\r\n4\r\ntest\r\n0"
+ + b"".join(b"\r\nTrailer-%d: Value" % i for i in range(trailers))
+ + b"\r\n\r\n"
+ )
+
+ match = "Too many (headers|trailers) received"
+ with pytest.raises(http_exceptions.BadHttpMessage, match=match):
+ messages, upgrade, tail = parser.feed_data(text)
+ # Trailers are not seen until payload is read.
+ await messages[0][-1].read()
+
+
+def test_max_header_value_size_under_limit(parser: HttpRequestParser) -> None:
+ value = b"A" * 8185
text = b"GET /test HTTP/1.1\r\ndata:" + value + b"\r\n\r\n"
messages, upgrade, tail = parser.feed_data(text)
@@ -766,13 +813,16 @@ def test_max_header_value_size_continuation(response, size) -> None:
name = b"T" * (size - 5)
text = b"HTTP/1.1 200 Ok\r\ndata: test\r\n " + name + b"\r\n\r\n"
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
- response.feed_data(text)
+ for i in range(0, len(text), 9000): # pragma: no branch
+ response.feed_data(text[i : i + 9000])
-def test_max_header_value_size_continuation_under_limit(response) -> None:
- value = b"A" * 8185
+def test_max_header_value_size_continuation_under_limit(
+ response: HttpResponseParser,
+) -> None:
+ value = b"A" * 8179
text = b"HTTP/1.1 200 Ok\r\ndata: test\r\n " + value + b"\r\n\r\n"
messages, upgrade, tail = response.feed_data(text)
@@ -1011,13 +1061,13 @@ def test_http_request_parser_bad_nonascii_uri(parser: Any) -> None:
@pytest.mark.parametrize("size", [40965, 8191])
def test_http_request_max_status_line(parser, size) -> None:
path = b"t" * (size - 5)
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
parser.feed_data(b"GET /path" + path + b" HTTP/1.1\r\n\r\n")
-def test_http_request_max_status_line_under_limit(parser) -> None:
- path = b"t" * (8190 - 5)
+def test_http_request_max_status_line_under_limit(parser: HttpRequestParser) -> None:
+ path = b"t" * 8172
messages, upgraded, tail = parser.feed_data(
b"GET /path" + path + b" HTTP/1.1\r\n\r\n"
)
@@ -1094,13 +1144,15 @@ def test_http_response_parser_strict_obs_line_folding(response: Any) -> None:
@pytest.mark.parametrize("size", [40962, 8191])
def test_http_response_parser_bad_status_line_too_long(response, size) -> None:
reason = b"t" * (size - 2)
- match = f"400, message:\n Got more than 8190 bytes \\({size}\\) when reading"
+ match = "400, message:\n Got more than 8190 bytes when reading"
with pytest.raises(http_exceptions.LineTooLong, match=match):
response.feed_data(b"HTTP/1.1 200 Ok" + reason + b"\r\n\r\n")
-def test_http_response_parser_status_line_under_limit(response) -> None:
- reason = b"O" * 8190
+def test_http_response_parser_status_line_under_limit(
+ response: HttpResponseParser,
+) -> None:
+ reason = b"O" * 8177
messages, upgraded, tail = response.feed_data(
b"HTTP/1.1 200 " + reason + b"\r\n\r\n"
)
@@ -1610,7 +1662,7 @@ def test_parse_bad_method_for_c_parser_raises(loop, protocol):
loop,
2**16,
max_line_size=8190,
- max_headers=32768,
+ max_headers=128,
max_field_size=8190,
)
@@ -1943,7 +1995,7 @@ async def test_streaming_decompress_large_payload(
dbuf = DeflateBuffer(buf, "deflate")
# Feed compressed data in chunks (simulating network streaming)
- for i in range(0, len(compressed), chunk_size):
+ for i in range(0, len(compressed), chunk_size): # pragma: no branch
chunk = compressed[i : i + chunk_size]
dbuf.feed_data(chunk, len(chunk))
From a2979cef34b20fa08c28a25721297520e065ba61 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 13 Jan 2026 11:05:45 +0000
Subject: [PATCH 016/141] Bump identify from 2.6.15 to 2.6.16 (#11961)
Bumps [identify](https://github.com/pre-commit/identify) from 2.6.15 to
2.6.16.
Commits
e31a62b
v2.6.16
de8beb6
Merge pull request #558
from seanbudd/patch-1
b5574ac
Add support for '.xliff' file extension
059831f
Merge pull request #555
from Roxedus/feat/ipxe
7e6b541
Add .ipxe extension
9e78792
Merge pull request #554
from pre-commit/pre-commit-ci-update-config
a35c416
[pre-commit.ci] pre-commit autoupdate
5cab69e
Merge pull request #553
from pre-commit/pre-commit-ci-update-config
c8edd7e
[pre-commit.ci] pre-commit autoupdate
47d582b
Merge pull request #551
from pre-commit/pre-commit-ci-update-config
- Additional commits viewable in compare
view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 5d502b221b4..25707bac869 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -85,7 +85,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==23.0.0
# via -r requirements/base.in
-identify==2.6.15
+identify==2.6.16
# via pre-commit
idna==3.10
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 61c50af190d..7c2d91c6b7e 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -83,7 +83,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==23.0.0
# via -r requirements/base.in
-identify==2.6.15
+identify==2.6.16
# via pre-commit
idna==3.10
# via
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 01a01448106..58ae5f48bcf 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -35,7 +35,7 @@ forbiddenfruit==0.1.4
# via blockbuster
freezegun==1.5.5
# via -r requirements/lint.in
-identify==2.6.15
+identify==2.6.16
# via pre-commit
idna==3.10
# via trustme
From 07eb251590d09a8d9714e1d4abc7c34bdc8495d4 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 15 Jan 2026 10:59:09 +0000
Subject: [PATCH 017/141] Bump librt from 0.7.7 to 0.7.8 (#11964)
Bumps [librt](https://github.com/mypyc/librt) from 0.7.7 to 0.7.8.
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 25707bac869..0df58d0e1d5 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -104,7 +104,7 @@ jinja2==3.1.6
# via
# sphinx
# towncrier
-librt==0.7.7
+librt==0.7.8
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 7c2d91c6b7e..120523b9159 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -102,7 +102,7 @@ jinja2==3.1.6
# via
# sphinx
# towncrier
-librt==0.7.7
+librt==0.7.8
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 58ae5f48bcf..98462f86bf8 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -43,7 +43,7 @@ iniconfig==2.3.0
# via pytest
isal==1.7.2
# via -r requirements/lint.in
-librt==0.7.7
+librt==0.7.8
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 8416cd7ab40..62a8395e749 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -34,7 +34,7 @@ iniconfig==2.3.0
# via pytest
isal==1.8.0 ; python_version < "3.14"
# via -r requirements/test-common.in
-librt==0.7.7
+librt==0.7.8
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index f32450f9d9c..d7c5f8c933c 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -57,7 +57,7 @@ iniconfig==2.3.0
# via pytest
isal==1.8.0 ; python_version < "3.14"
# via -r requirements/test-common.in
-librt==0.7.7
+librt==0.7.8
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/test.txt b/requirements/test.txt
index 40d54aa47d6..2979d8216d4 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -57,7 +57,7 @@ iniconfig==2.3.0
# via pytest
isal==1.7.2 ; python_version < "3.14"
# via -r requirements/test-common.in
-librt==0.7.7
+librt==0.7.8
# via mypy
markdown-it-py==4.0.0
# via rich
From 131cb734836bb7f55e74d9a61bf75ff9f715eb5e Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 15 Jan 2026 11:06:20 +0000
Subject: [PATCH 018/141] Bump regex from 2025.11.3 to 2026.1.15 (#11965)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [regex](https://github.com/mrabarnett/mrab-regex) from 2025.11.3
to 2026.1.15.
Changelog
Sourced from regex's
changelog.
Version: 2026.1.15
Re-uploaded.
Version: 2026.1.14
Git issue 596: Specifying {e<=0} causes ca 210× slow-down.
Added RISC-V wheels.
Version: 2025.11.3
Git issue 594: Support relative PARNO in recursive
subpatterns.
Version: 2025.10.23
'setup.py' was missing from the source distribution.
Version: 2025.10.22
Fixed test in main.yml.
Version: 2025.10.21
Moved tests into subfolder.
Version: 2025.10.20
Re-organised files.
Updated to Unicode 17.0.0.
Version: 2025.9.20
Enable free-threading support in cibuildwheel in another
place.
Version: 2025.9.19
Enable free-threading support in cibuildwheel.
Version: 2025.9.18
Git issue 565: Support the free-threaded build of CPython
3.13
Version: 2025.9.1
Git PR 585: Fix AttributeError: 'AnyAll' object has no
attribute '_key'
Version: 2025.8.29
... (truncated)
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
5 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 0df58d0e1d5..9ac0de17cc1 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -200,7 +200,7 @@ pyyaml==6.0.3
# via pre-commit
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2025.11.3
+regex==2026.1.15
# via re-assert
requests==2.32.5
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 120523b9159..f1a48a347e5 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -195,7 +195,7 @@ pyyaml==6.0.3
# via pre-commit
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2025.11.3
+regex==2026.1.15
# via re-assert
requests==2.32.5
# via sphinx
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 62a8395e749..70782b04f24 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -87,7 +87,7 @@ python-on-whales==0.79.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2025.11.3
+regex==2026.1.15
# via re-assert
rich==14.2.0
# via pytest-codspeed
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index d7c5f8c933c..77a735dfeb1 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -122,7 +122,7 @@ python-on-whales==0.79.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2025.11.3
+regex==2026.1.15
# via re-assert
rich==14.2.0
# via pytest-codspeed
diff --git a/requirements/test.txt b/requirements/test.txt
index 2979d8216d4..712b921cdcf 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -122,7 +122,7 @@ python-on-whales==0.79.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2025.11.3
+regex==2026.1.15
# via re-assert
rich==14.2.0
# via pytest-codspeed
From bd919070be28a1fdeb7f078a25de69cf022778f6 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 15 Jan 2026 11:19:54 +0000
Subject: [PATCH 019/141] Bump python-on-whales from 0.79.0 to 0.80.0 (#11950)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps
[python-on-whales](https://github.com/gabrieldemarmiesse/python-on-whales)
from 0.79.0 to 0.80.0.
Release notes
Sourced from python-on-whales's
releases.
v0.80.0
What's Changed
New Contributors
Full Changelog: https://github.com/gabrieldemarmiesse/python-on-whales/compare/v0.79.0...v0.80.0
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 9ac0de17cc1..b47d82936a7 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -192,7 +192,7 @@ pytest-xdist==3.8.0
# via -r requirements/test-common.in
python-dateutil==2.9.0.post0
# via freezegun
-python-on-whales==0.79.0
+python-on-whales==0.80.0
# via
# -r requirements/lint.in
# -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index f1a48a347e5..0a77027b229 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -187,7 +187,7 @@ pytest-xdist==3.8.0
# via -r requirements/test-common.in
python-dateutil==2.9.0.post0
# via freezegun
-python-on-whales==0.79.0
+python-on-whales==0.80.0
# via
# -r requirements/lint.in
# -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 98462f86bf8..8b00f162980 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -88,7 +88,7 @@ pytest-mock==3.15.1
# via -r requirements/lint.in
python-dateutil==2.9.0.post0
# via freezegun
-python-on-whales==0.79.0
+python-on-whales==0.80.0
# via -r requirements/lint.in
pyyaml==6.0.3
# via pre-commit
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 70782b04f24..f98a5a24ec4 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -83,7 +83,7 @@ pytest-xdist==3.8.0
# via -r requirements/test-common.in
python-dateutil==2.9.0.post0
# via freezegun
-python-on-whales==0.79.0
+python-on-whales==0.80.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 77a735dfeb1..c4815ef0e05 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -118,7 +118,7 @@ pytest-xdist==3.8.0
# via -r requirements/test-common.in
python-dateutil==2.9.0.post0
# via freezegun
-python-on-whales==0.79.0
+python-on-whales==0.80.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index 712b921cdcf..71c5d8adcc4 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -118,7 +118,7 @@ pytest-xdist==3.8.0
# via -r requirements/test-common.in
python-dateutil==2.9.0.post0
# via freezegun
-python-on-whales==0.79.0
+python-on-whales==0.80.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
From c172aab3c4a227e091ea49e76120f4655739773b Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 19 Jan 2026 12:24:58 +0000
Subject: [PATCH 020/141] Bump actions/cache from 5.0.1 to 5.0.2 (#11971)
Bumps [actions/cache](https://github.com/actions/cache) from 5.0.1 to
5.0.2.
Release notes
Sourced from actions/cache's
releases.
v5.0.2
What's Changed
When creating cache entries, 429s returned from the cache service
will not be retried.
Changelog
Sourced from actions/cache's
changelog.
5.0.2
- Bump
@actions/cache to v5.0.3 #1692
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 173b97fad57..4ebed113f06 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
with:
python-version: 3.11
- name: Cache PyPI
- uses: actions/cache@v5.0.1
+ uses: actions/cache@v5.0.2
with:
key: pip-lint-${{ hashFiles('requirements/*.txt') }}
path: ~/.cache/pip
@@ -96,7 +96,7 @@ jobs:
with:
submodules: true
- name: Cache llhttp generated files
- uses: actions/cache@v5.0.1
+ uses: actions/cache@v5.0.2
id: cache
with:
key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -160,7 +160,7 @@ jobs:
echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
shell: bash
- name: Cache PyPI
- uses: actions/cache@v5.0.1
+ uses: actions/cache@v5.0.2
with:
key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
path: ${{ steps.pip-cache.outputs.dir }}
From d3405d99cc9d9260bcff8890642d1641b231c1c4 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed, 21 Jan 2026 10:53:05 +0000
Subject: [PATCH 021/141] Bump setuptools from 80.9.0 to 80.10.1 (#11974)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [setuptools](https://github.com/pypa/setuptools) from 80.9.0 to
80.10.1.
Changelog
Sourced from setuptools's
changelog.
v80.10.1
Misc
v80.10.0
Features
- Remove post-release tags on setuptools' own build. (#4530)
- Refreshed vendored dependencies. (#5139)
Misc
Commits
- adfb0c9 Bump version: 80.10.0 → 80.10.1
- 8535d10 docs: Link pyproject.toml to ext_modules (#5125)
- fafbe2c [CI] Workaround for GHA handling of 'skipped' in job dependency chain (#5152)
- d171023 Add news fragment
- 3dbba06 Refine comment to reference issue
- e4922c8 Apply suggestion from @webknjaz
- 218c146 [CI] Workaround for GHA handling of 'skipped' in job dependency chain
- 2903171 Bump version: 80.9.0 → 80.10.0
- 23a2b18 [CI] Allow the action check-changed-folders to be skipped in the check ac...
- 660e581 [CI] Allow the action check-changed-folders to be skipped in the check ac...
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index b47d82936a7..dffe6770496 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -294,5 +294,5 @@ zlib-ng==1.0.0
# The following packages are considered to be unsafe in a requirements file:
pip==25.3
# via pip-tools
-setuptools==80.9.0
+setuptools==80.10.1
# via pip-tools
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 0a77027b229..48d7e5196dd 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -284,5 +284,5 @@ zlib-ng==1.0.0
# The following packages are considered to be unsafe in a requirements file:
pip==25.3
# via pip-tools
-setuptools==80.9.0
+setuptools==80.10.1
# via pip-tools
From 2b6bb4b9927d7ee5c75441284c82c16539e9f2de Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Wed, 21 Jan 2026 16:53:36 +0000
Subject: [PATCH 022/141] [PR #11972/4a51c4ba backport][3.14]
NEEDS_CLEANUP_CLOSED fixed in Python 3.12.8 (#11976)
**This is a backport of PR #11972 as merged into master
(4a51c4ba30b886b5d1505e4aed790df0cc7cd8d2).**
Co-authored-by: Robsdedude
---
CHANGES/11972.bugfix.rst | 2 ++
CONTRIBUTORS.txt | 1 +
aiohttp/connector.py | 4 ++--
3 files changed, 5 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/11972.bugfix.rst
diff --git a/CHANGES/11972.bugfix.rst b/CHANGES/11972.bugfix.rst
new file mode 100644
index 00000000000..8a6c2f56f28
--- /dev/null
+++ b/CHANGES/11972.bugfix.rst
@@ -0,0 +1,2 @@
+Fixed false-positive :py:class:`DeprecationWarning` for passing ``enable_cleanup_closed=True`` to :py:class:`~aiohttp.TCPConnector` specifically on Python 3.12.7.
+-- by :user:`Robsdedude`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index f31a3623140..5664ed517b6 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -313,6 +313,7 @@ Robert Nikolich
Roman Podoliaka
Roman Postnov
Rong Zhang
+Rouven Bauer
Samir Akarioh
Samuel Colvin
Samuel Gaist
diff --git a/aiohttp/connector.py b/aiohttp/connector.py
index 93cdfa57d24..d6b0abefc69 100644
--- a/aiohttp/connector.py
+++ b/aiohttp/connector.py
@@ -75,9 +75,9 @@
3,
13,
1,
-) or sys.version_info < (3, 12, 7)
+) or sys.version_info < (3, 12, 8)
# Cleanup closed is no longer needed after https://github.com/python/cpython/pull/118960
-# which first appeared in Python 3.12.7 and 3.13.1
+# which first appeared in Python 3.12.8 and 3.13.1
__all__ = (
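The version gate this backport adjusts can be sketched outside the diff as a standalone helper (hypothetical name; the real constant lives in aiohttp/connector.py):

```python
import sys

def needs_cleanup_closed(version_info=tuple(sys.version_info[:3])):
    """Hypothetical helper mirroring the gate this patch adjusts:
    the cleanup-closed workaround is only needed on interpreters
    missing the fix from python/cpython#118960, which first shipped
    in Python 3.12.8 and 3.13.1."""
    return (3, 13, 0) <= version_info < (3, 13, 1) or version_info < (3, 12, 8)
```

The patch changes the lower bound from 3.12.7 to 3.12.8, because the CPython fix landed in 3.12.8, so 3.12.7 still needs the workaround and previously raised a false-positive `DeprecationWarning`.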
From e03822d594ceaa436d247dc25d6c6bc8ac2b1282 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Wed, 21 Jan 2026 16:53:50 +0000
Subject: [PATCH 023/141] [PR #11972/4a51c4ba backport][3.13]
NEEDS_CLEANUP_CLOSED fixed in Python 3.12.8 (#11975)
**This is a backport of PR #11972 as merged into master
(4a51c4ba30b886b5d1505e4aed790df0cc7cd8d2).**
Co-authored-by: Robsdedude
---
CHANGES/11972.bugfix.rst | 2 ++
CONTRIBUTORS.txt | 1 +
aiohttp/connector.py | 4 ++--
3 files changed, 5 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/11972.bugfix.rst
diff --git a/CHANGES/11972.bugfix.rst b/CHANGES/11972.bugfix.rst
new file mode 100644
index 00000000000..8a6c2f56f28
--- /dev/null
+++ b/CHANGES/11972.bugfix.rst
@@ -0,0 +1,2 @@
+Fixed false-positive :py:class:`DeprecationWarning` for passing ``enable_cleanup_closed=True`` to :py:class:`~aiohttp.TCPConnector` specifically on Python 3.12.7.
+-- by :user:`Robsdedude`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index 0c506668d90..a5c94854ee2 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -311,6 +311,7 @@ Robert Nikolich
Roman Podoliaka
Roman Postnov
Rong Zhang
+Rouven Bauer
Samir Akarioh
Samuel Colvin
Samuel Gaist
diff --git a/aiohttp/connector.py b/aiohttp/connector.py
index 290a42400f9..384741ed994 100644
--- a/aiohttp/connector.py
+++ b/aiohttp/connector.py
@@ -92,9 +92,9 @@
3,
13,
1,
-) or sys.version_info < (3, 12, 7)
+) or sys.version_info < (3, 12, 8)
# Cleanup closed is no longer needed after https://github.com/python/cpython/pull/118960
-# which first appeared in Python 3.12.7 and 3.13.1
+# which first appeared in Python 3.12.8 and 3.13.1
__all__ = (
From 4511f9fa8bb356f42b9605e933e345772e8881b5 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 22 Jan 2026 11:06:58 +0000
Subject: [PATCH 024/141] Bump wheel from 0.45.1 to 0.46.2 (#11979)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [wheel](https://github.com/pypa/wheel) from 0.45.1 to 0.46.2.
Release notes
Sourced from wheel's releases.

0.46.2
- Restored the `bdist_wheel` command for compatibility with setuptools older than v70.1
- Importing `wheel.bdist_wheel` now emits a `FutureWarning` instead of a `DeprecationWarning`
- Fixed `wheel unpack` potentially altering the permissions of files outside of the destination tree with maliciously crafted wheels (CVE-2026-24049)

0.46.1
- Temporarily restored the `wheel.macosx_libfile` module (#659)

0.46.0
- Dropped support for Python 3.8
- Removed the `bdist_wheel` setuptools command implementation and entry point. The `wheel.bdist_wheel` module is now just an alias to `setuptools.command.bdist_wheel`, emitting a deprecation warning on import.
- Removed vendored `packaging` in favor of a run-time dependency on it
- Made the `wheel.metadata` module private (with a deprecation warning if it's imported)
- Made the `wheel.cli` package private (no deprecation warning)
- Fixed an exception when calling the `convert` command with an empty description field
Changelog
Sourced from wheel's changelog.

Release Notes

0.46.2 (2026-01-22)
- Restored the `bdist_wheel` command for compatibility with setuptools older than v70.1
- Importing `wheel.bdist_wheel` now emits a `FutureWarning` instead of a `DeprecationWarning`
- Fixed `wheel unpack` potentially altering the permissions of files outside of the destination tree with maliciously crafted wheels (CVE-2026-24049)

0.46.1 (2025-04-08)
- Temporarily restored the `wheel.macosx_libfile` module (#659)

0.46.0 (2025-04-03)
- Dropped support for Python 3.8
- Removed the `bdist_wheel` setuptools command implementation and entry point. The `wheel.bdist_wheel` module is now just an alias to `setuptools.command.bdist_wheel`, emitting a deprecation warning on import.
- Removed vendored `packaging` in favor of a run-time dependency on it
- Made the `wheel.metadata` module private (with a deprecation warning if it's imported)
- Made the `wheel.cli` package private (no deprecation warning)
- Fixed an exception when calling the `convert` command with an empty description field

0.45.1 (2024-11-23)
- Fixed pure Python wheels converted from eggs and wininst files having the ABI tag in the file name

0.45.0 (2024-11-08)
- Refactored the `convert` command to not need setuptools to be installed
- Don't configure setuptools logging unless running `bdist_wheel`
- Added a redirection from `wheel.bdist_wheel.bdist_wheel` to `setuptools.command.bdist_wheel.bdist_wheel` to improve compatibility with setuptools' latest fixes. Projects are still advised to migrate away from the deprecated module and import the setuptools' implementation explicitly. (PR by @abravalheri)

0.44.0 (2024-08-04)
- Canonicalized requirements in METADATA file (PR by Wim Jeantine-Glenn)
- Deprecated the `bdist_wheel` module, as the code was migrated to setuptools
... (truncated)
Commits
- eba4036 Updated the version number for v0.46.2
- 557fb54 Created a new release
- 7a7d2de Fixed security issue around wheel unpack (#675)
- 41418fa Fixed test failures due to metadata normalization changes
- c1d442b [pre-commit.ci] pre-commit autoupdate (#674)
- 0bac882 Update github actions environments (#673)
- be9f45b [pre-commit.ci] pre-commit autoupdate (#667)
- 6244f08 Update pre-commit ruff legacy alias (#668)
- 15b7577 PEP 639 compliance (#670)
- fc8cb41 Revert "Removed redundant Python version from the publish workflow (#666)"
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 3 ++-
requirements/dev.txt | 3 ++-
2 files changed, 4 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index dffe6770496..e3712d36cd6 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -131,6 +131,7 @@ packaging==25.0
# gunicorn
# pytest
# sphinx
+ # wheel
pathspec==1.0.3
# via mypy
pip-tools==7.5.2
@@ -282,7 +283,7 @@ virtualenv==20.36.1
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
-wheel==0.45.1
+wheel==0.46.2
# via pip-tools
yarl==1.22.0
# via -r requirements/runtime-deps.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 48d7e5196dd..89b75df9f78 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -128,6 +128,7 @@ packaging==25.0
# gunicorn
# pytest
# sphinx
+ # wheel
pathspec==1.0.3
# via mypy
pip-tools==7.5.2
@@ -272,7 +273,7 @@ virtualenv==20.36.1
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
-wheel==0.45.1
+wheel==0.46.2
# via pip-tools
yarl==1.22.0
# via -r requirements/runtime-deps.in
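The CVE-2026-24049 entry quoted in the release notes above concerns archive members reaching files outside the destination tree during `wheel unpack`. A generic containment check of the kind used to guard extraction can be sketched as follows (hypothetical helper; this is not wheel's actual fix):

```python
import os

def is_within_directory(directory, target):
    """Return True if `target`, joined onto `directory`, resolves to a
    path inside `directory`. A generic guard against path-traversal
    archive members (hypothetical sketch, not wheel's actual code)."""
    directory = os.path.realpath(directory)
    target = os.path.realpath(os.path.join(directory, target))
    return os.path.commonpath([directory, target]) == directory
```

An extractor would call such a check for every archive member before writing it or touching its permissions, and refuse members that escape the destination.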
From a61ace9a40b46d7dd0c9dcd0958bb176b3dace22 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 22 Jan 2026 11:27:03 +0000
Subject: [PATCH 025/141] Bump pycparser from 2.23 to 3.0 (#11982)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [pycparser](https://github.com/eliben/pycparser) from 2.23 to 3.0.
Release notes
Sourced from pycparser's releases.

release_v3.00
What's Changed
- Removed dependency on PLY, by rewriting pycparser to use a hand-written lexer and recursive-descent parser for C. No API changes / functionality changes intended - the same AST is produced.
- Add support for Python 3.14 and drop EOL 3.8 by @hugovk in eliben/pycparser#581
- Update _ast_gen.py to be in sync with c_ast.py by @simonlindholm in eliben/pycparser#582
Full Changelog: https://github.com/eliben/pycparser/compare/release_v2.23...release_v3.00
Commits
- 77de509 Prepare for release 3.00
- e57ccd1 Update README
- 230e12d disable uv caching in CI
- 9c52f40 Update CI to run make check+test via uvx
- 6b8f064 Use dataclass where applicable; add 'make test' to Makefile
- 25376cb Use f-strings instead of older formatting in other auxiliary files
- 9bd8997 Use f-strings instead of older formatting in core code + tests
- 664eac2 Modernize some code with pattern matching
- 842f064 Add type annotations to more examples
- 076f374 Add types to several examples
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/runtime-deps.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
9 files changed, 9 insertions(+), 9 deletions(-)
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index 3e57dbe03ae..58b6fb38047 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -40,7 +40,7 @@ propcache==0.4.1
# yarl
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
typing-extensions==4.15.0 ; python_version < "3.13"
# via
diff --git a/requirements/base.txt b/requirements/base.txt
index 22ed18b1e71..98387e13764 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -40,7 +40,7 @@ propcache==0.4.1
# yarl
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
typing-extensions==4.15.0 ; python_version < "3.13"
# via
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index e3712d36cd6..bbe4db027b3 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -154,7 +154,7 @@ proxy-py==2.4.10
# via -r requirements/test-common.in
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
pydantic==2.12.5
# via python-on-whales
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 89b75df9f78..fc6c575bf7f 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -151,7 +151,7 @@ proxy-py==2.4.10
# via -r requirements/test-common.in
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
pydantic==2.12.5
# via python-on-whales
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 8b00f162980..98730165d1b 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -67,7 +67,7 @@ pre-commit==4.5.1
# via -r requirements/lint.in
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
pydantic==2.12.5
# via python-on-whales
diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt
index bace48ca565..8ff95aee594 100644
--- a/requirements/runtime-deps.txt
+++ b/requirements/runtime-deps.txt
@@ -36,7 +36,7 @@ propcache==0.4.1
# yarl
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
typing-extensions==4.15.0 ; python_version < "3.13"
# via
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index f98a5a24ec4..c6e62352862 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -56,7 +56,7 @@ pluggy==1.6.0
# pytest-cov
proxy-py==2.4.10
# via -r requirements/test-common.in
-pycparser==2.23
+pycparser==3.0
# via cffi
pydantic==2.12.5
# via python-on-whales
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index c4815ef0e05..fdb34860c93 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -91,7 +91,7 @@ proxy-py==2.4.10
# via -r requirements/test-common.in
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
pydantic==2.12.5
# via python-on-whales
diff --git a/requirements/test.txt b/requirements/test.txt
index 71c5d8adcc4..724764aee76 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -91,7 +91,7 @@ proxy-py==2.4.10
# via -r requirements/test-common.in
pycares==4.11.0
# via aiodns
-pycparser==2.23
+pycparser==3.0
# via cffi
pydantic==2.12.5
# via python-on-whales
From 5ecc08a2282e376f3e0136794d8b8f8d3f7d21b9 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 23 Jan 2026 10:53:06 +0000
Subject: [PATCH 026/141] Bump gunicorn from 23.0.0 to 24.0.0 (#11983)
Bumps [gunicorn](https://github.com/benoitc/gunicorn) from 23.0.0 to
24.0.0.
Release notes
Sourced from gunicorn's releases.

24.0.0
New Features
- ASGI Worker (Beta): Native asyncio-based ASGI support for running async Python frameworks like FastAPI, Starlette, and Quart without external dependencies
  - HTTP/1.1 with keepalive connections
  - WebSocket support
  - Lifespan protocol for startup/shutdown hooks
  - Optional uvloop for improved performance
- uWSGI Binary Protocol: Support for receiving requests from nginx via the uwsgi_pass directive
- Documentation Migration: Migrated to MkDocs with Material theme
Security
- eventlet: Require eventlet >= 0.40.3 (CVE-2021-21419, CVE-2025-58068)
- gevent: Require gevent >= 24.10.1 (CVE-2023-41419, CVE-2024-3219)
- tornado: Require tornado >= 6.5.0 (CVE-2025-47287)
Install
pip install gunicorn==24.0.0
Commits
- 3960372 Merge pull request #3426 from benoitc/website-2025
- d34d3de docs: Set release date for 24.0.0
- 066e6d8 docs: Move ASGI worker tab after Gthread
- c6b1159 docs: Add Tornado worker to design page
- c959dae docs: Redesign architecture page with visual components
- 571bc12 docs: Add punchy theme with vibrant colors and modern features
- 73adc7c docs: Add collapsible TOC for settings reference
- dcec6e7 docs: Modern landing page with custom template
- 5ea4eb3 docs: Add 2026 changelog and modernize README
- 0b96103 docs: Configure GitHub Pages deployment with custom domain
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index 58b6fb38047..f9d327925b9 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==23.0.0
+gunicorn==24.0.0
# via -r requirements/base-ft.in
idna==3.10
# via yarl
diff --git a/requirements/base.txt b/requirements/base.txt
index 98387e13764..8c4a8bfbb6b 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==23.0.0
+gunicorn==24.0.0
# via -r requirements/base.in
idna==3.10
# via yarl
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index bbe4db027b3..0bee85cfc1f 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -83,7 +83,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==23.0.0
+gunicorn==24.0.0
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/dev.txt b/requirements/dev.txt
index fc6c575bf7f..00fb1024679 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -81,7 +81,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==23.0.0
+gunicorn==24.0.0
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index fdb34860c93..09b98535a9d 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==23.0.0
+gunicorn==24.0.0
# via -r requirements/base-ft.in
idna==3.10
# via
diff --git a/requirements/test.txt b/requirements/test.txt
index 724764aee76..fd7d2429ea5 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==23.0.0
+gunicorn==24.0.0
# via -r requirements/base.in
idna==3.10
# via
From 500b2b5cf31c4f0d879d258acda25c59cf3a4a51 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 23 Jan 2026 11:09:01 +0000
Subject: [PATCH 027/141] Bump wheel from 0.46.2 to 0.46.3 (#11985)
Bumps [wheel](https://github.com/pypa/wheel) from 0.46.2 to 0.46.3.
Release notes
Sourced from wheel's releases.

0.46.3
- Fixed `ImportError: cannot import name '_setuptools_logging' from 'wheel'` when installed alongside an old version of setuptools and running the `bdist_wheel` command (#676)

Changelog
Sourced from wheel's changelog.

Release Notes

0.46.3 (2026-01-22)
- Fixed `ImportError: cannot import name '_setuptools_logging' from 'wheel'` when installed alongside an old version of setuptools and running the `bdist_wheel` command (#676)

0.46.2 (2026-01-22)
- Restored the `bdist_wheel` command for compatibility with setuptools older than v70.1
- Importing `wheel.bdist_wheel` now emits a `FutureWarning` instead of a `DeprecationWarning`
- Fixed `wheel unpack` potentially altering the permissions of files outside of the destination tree with maliciously crafted wheels (CVE-2026-24049)

0.46.1 (2025-04-08)
- Temporarily restored the `wheel.macosx_libfile` module (#659)

0.46.0 (2025-04-03)
- Dropped support for Python 3.8
- Removed the `bdist_wheel` setuptools command implementation and entry point. The `wheel.bdist_wheel` module is now just an alias to `setuptools.command.bdist_wheel`, emitting a deprecation warning on import.
- Removed vendored `packaging` in favor of a run-time dependency on it
- Made the `wheel.metadata` module private (with a deprecation warning if it's imported)
- Made the `wheel.cli` package private (no deprecation warning)
- Fixed an exception when calling the `convert` command with an empty description field

0.45.1 (2024-11-23)
- Fixed pure Python wheels converted from eggs and wininst files having the ABI tag in the file name

0.45.0 (2024-11-08)
- Refactored the `convert` command to not need setuptools to be installed
- Don't configure setuptools logging unless running `bdist_wheel`
- Added a redirection from `wheel.bdist_wheel.bdist_wheel` to `setuptools.command.bdist_wheel.bdist_wheel` to improve compatibility with setuptools' latest fixes. Projects are still advised to migrate away from the deprecated module and import
... (truncated)
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 0bee85cfc1f..1d046f098c9 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -283,7 +283,7 @@ virtualenv==20.36.1
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
-wheel==0.46.2
+wheel==0.46.3
# via pip-tools
yarl==1.22.0
# via -r requirements/runtime-deps.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 00fb1024679..59776e286e5 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -273,7 +273,7 @@ virtualenv==20.36.1
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
-wheel==0.46.2
+wheel==0.46.3
# via pip-tools
yarl==1.22.0
# via -r requirements/runtime-deps.in
From bca4bcf2b9195a46a7997be300c35f4d747f485c Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Sat, 24 Jan 2026 12:51:53 -0300
Subject: [PATCH 028/141] [PR #11737/f30f43e6 backport][3.14] Skip benchmarks
in ci when running in fork repositories (#11991)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Co-authored-by: Rui Xi
Co-authored-by: 🇺🇦 Sviatoslav Sydorenko
---
.github/workflows/ci-cd.yml | 46 +++++++++++++++++++++++++++++++------
CHANGES/11737.contrib.rst | 3 +++
2 files changed, 42 insertions(+), 7 deletions(-)
create mode 100644 CHANGES/11737.contrib.rst
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 4ebed113f06..2e78b5080eb 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -23,9 +23,33 @@ env:
FORCE_COLOR: 1 # Request colored output from CLI tools supporting it
MYPY_FORCE_COLOR: 1
PY_COLORS: 1
+ UPSTREAM_REPOSITORY_ID: >-
+ 13258039
+
+permissions: {}
jobs:
+ pre-setup:
+ name: Pre-Setup global build settings
+ runs-on: ubuntu-latest
+ outputs:
+ upstream-repository-id: ${{ env.UPSTREAM_REPOSITORY_ID }}
+ release-requested: >-
+ ${{
+ (
+ github.event_name == 'push'
+ && github.ref_type == 'tag'
+ )
+ && true
+ || false
+ }}
+ steps:
+ - name: Dummy
+ if: false
+ run: |
+ echo "Pre-setup step"
+
lint:
name: Linter
runs-on: ubuntu-latest
@@ -243,8 +267,11 @@ jobs:
benchmark:
name: Benchmark
- needs: gen_llhttp
-
+ needs:
+ - gen_llhttp
+ - pre-setup # transitive, for accessing settings
+ if: >-
+ needs.pre-setup.outputs.upstream-repository-id == github.repository_id
runs-on: ubuntu-latest
timeout-minutes: 12
steps:
@@ -279,7 +306,6 @@ jobs:
uses: CodSpeedHQ/action@v4
with:
mode: instrumentation
- token: ${{ secrets.CODSPEED_TOKEN }}
run: python -Im pytest --no-cov --numprocesses=0 -vvvvv --codspeed
@@ -301,9 +327,10 @@ jobs:
pre-deploy:
name: Pre-Deploy
runs-on: ubuntu-latest
- needs: check
- # Run only on pushing a tag
- if: github.event_name == 'push' && contains(github.ref, 'refs/tags/')
+ needs:
+ - check
+ - pre-setup # transitive, for accessing settings
+ if: fromJSON(needs.pre-setup.outputs.release-requested)
steps:
- name: Dummy
run: |
@@ -443,8 +470,13 @@ jobs:
deploy:
name: Deploy
- needs: [build-tarball, build-wheels]
+ needs:
+ - build-tarball
+ - build-wheels
+ - pre-setup # transitive, for accessing settings
runs-on: ubuntu-latest
+ if: >-
+ needs.pre-setup.outputs.upstream-repository-id == github.repository_id
permissions:
contents: write # IMPORTANT: mandatory for making GitHub Releases
diff --git a/CHANGES/11737.contrib.rst b/CHANGES/11737.contrib.rst
new file mode 100644
index 00000000000..2b793f41d6c
--- /dev/null
+++ b/CHANGES/11737.contrib.rst
@@ -0,0 +1,3 @@
+The benchmark CI job now runs only in the upstream repository -- by :user:`Cycloctane`.
+
+It used to always fail in forks, which this change fixed.
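The workflow change above publishes `release-requested` through the `(cond) && true || false` idiom because GitHub Actions job outputs are always strings; the downstream `if: fromJSON(needs.pre-setup.outputs.release-requested)` parses that string back into a real boolean. A minimal Python sketch of that round-trip (the function name is illustrative, not part of the workflow):

```python
import json

def release_requested(event_name: str, ref_type: str) -> str:
    # Mimics the GitHub Actions expression:
    #   (github.event_name == 'push' && github.ref_type == 'tag') && true || false
    # Job outputs are strings, so the boolean is serialized as "true"/"false".
    result = event_name == "push" and ref_type == "tag"
    return json.dumps(result)

# Downstream, fromJSON(...) deserializes the string back into a boolean:
assert json.loads(release_requested("push", "tag")) is True
assert json.loads(release_requested("push", "branch")) is False
```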
From 4a6936d54510d60b46c71b77ecafffd41df07ace Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Sat, 24 Jan 2026 12:52:14 -0300
Subject: [PATCH 029/141] [PR #11737/f30f43e6 backport][3.13] Skip benchmarks
in ci when running in fork repositories (#11990)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Co-authored-by: Rui Xi
Co-authored-by: 🇺🇦 Sviatoslav Sydorenko
Co-authored-by: rodrigo.nogueira
---
.github/workflows/ci-cd.yml | 46 +++++++++++++++++++++++++++++++------
CHANGES/11737.contrib.rst | 3 +++
2 files changed, 42 insertions(+), 7 deletions(-)
create mode 100644 CHANGES/11737.contrib.rst
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 51f34c8b908..df1b0cf2719 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -23,9 +23,33 @@ env:
FORCE_COLOR: 1 # Request colored output from CLI tools supporting it
MYPY_FORCE_COLOR: 1
PY_COLORS: 1
+ UPSTREAM_REPOSITORY_ID: >-
+ 13258039
+
+permissions: {}
jobs:
+ pre-setup:
+ name: Pre-Setup global build settings
+ runs-on: ubuntu-latest
+ outputs:
+ upstream-repository-id: ${{ env.UPSTREAM_REPOSITORY_ID }}
+ release-requested: >-
+ ${{
+ (
+ github.event_name == 'push'
+ && github.ref_type == 'tag'
+ )
+ && true
+ || false
+ }}
+ steps:
+ - name: Dummy
+ if: false
+ run: |
+ echo "Pre-setup step"
+
lint:
name: Linter
runs-on: ubuntu-latest
@@ -243,8 +267,11 @@ jobs:
benchmark:
name: Benchmark
- needs: gen_llhttp
-
+ needs:
+ - gen_llhttp
+ - pre-setup # transitive, for accessing settings
+ if: >-
+ needs.pre-setup.outputs.upstream-repository-id == github.repository_id
runs-on: ubuntu-latest
timeout-minutes: 12
steps:
@@ -279,7 +306,6 @@ jobs:
uses: CodSpeedHQ/action@v4
with:
mode: instrumentation
- token: ${{ secrets.CODSPEED_TOKEN }}
run: python -Im pytest --no-cov --numprocesses=0 -vvvvv --codspeed
@@ -301,9 +327,10 @@ jobs:
pre-deploy:
name: Pre-Deploy
runs-on: ubuntu-latest
- needs: check
- # Run only on pushing a tag
- if: github.event_name == 'push' && contains(github.ref, 'refs/tags/')
+ needs:
+ - check
+ - pre-setup # transitive, for accessing settings
+ if: fromJSON(needs.pre-setup.outputs.release-requested)
steps:
- name: Dummy
run: |
@@ -443,8 +470,13 @@ jobs:
deploy:
name: Deploy
- needs: [build-tarball, build-wheels]
+ needs:
+ - build-tarball
+ - build-wheels
+ - pre-setup # transitive, for accessing settings
runs-on: ubuntu-latest
+ if: >-
+ needs.pre-setup.outputs.upstream-repository-id == github.repository_id
permissions:
contents: write # IMPORTANT: mandatory for making GitHub Releases
diff --git a/CHANGES/11737.contrib.rst b/CHANGES/11737.contrib.rst
new file mode 100644
index 00000000000..2b793f41d6c
--- /dev/null
+++ b/CHANGES/11737.contrib.rst
@@ -0,0 +1,3 @@
+The benchmark CI job now runs only in the upstream repository -- by :user:`Cycloctane`.
+
+It used to always fail in forks, which this change fixed.
From 8c59acdbd3002191026ecb186c5dc8afad37368d Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 26 Jan 2026 12:37:12 +0000
Subject: [PATCH 030/141] Bump rich from 14.2.0 to 14.3.1 (#12000)
Bumps [rich](https://github.com/Textualize/rich) from 14.2.0 to 14.3.1.
Release notes
Sourced from rich's
releases.
The Nerdy Fix release
Fixed issue with characters outside of unicode range reporting 0 cell size.
[14.3.1] - 2026-01-24
The more emojis release
Rich now has support for multi-codepoint emojis. There have also been
some Markdown improvements, and a number of fixes. See the release notes
below for details.
[14.3.0] - 2026-01-24
Changelog
Sourced from rich's
changelog.
[14.3.1] - 2026-01-24
[14.3.0] - 2026-01-24
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 1d046f098c9..43c8730283e 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -207,7 +207,7 @@ requests==2.32.5
# via
# sphinx
# sphinxcontrib-spelling
-rich==14.2.0
+rich==14.3.1
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 59776e286e5..3b10061ce51 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -200,7 +200,7 @@ regex==2026.1.15
# via re-assert
requests==2.32.5
# via sphinx
-rich==14.2.0
+rich==14.3.1
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 98730165d1b..c1bd00d7ad3 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -92,7 +92,7 @@ python-on-whales==0.80.0
# via -r requirements/lint.in
pyyaml==6.0.3
# via pre-commit
-rich==14.2.0
+rich==14.3.1
# via pytest-codspeed
six==1.17.0
# via python-dateutil
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index c6e62352862..be213c58414 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -89,7 +89,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.1.15
# via re-assert
-rich==14.2.0
+rich==14.3.1
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 09b98535a9d..3d1e50f7f28 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -124,7 +124,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.1.15
# via re-assert
-rich==14.2.0
+rich==14.3.1
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index fd7d2429ea5..5c8de1ff872 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -124,7 +124,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.1.15
# via re-assert
-rich==14.2.0
+rich==14.3.1
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
From e6532c254be644b86293e580e518b03529e4c626 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 26 Jan 2026 12:43:54 +0000
Subject: [PATCH 031/141] Bump gunicorn from 24.0.0 to 24.1.1 (#12003)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [gunicorn](https://github.com/benoitc/gunicorn) from 24.0.0 to
24.1.1.
Release notes
Sourced from gunicorn's
releases.
24.1.1
Bug Fixes
- Fix forwarded_allow_ips and proxy_allow_ips to remain as strings for
  backward compatibility with external tools like uvicorn. Network validation
  now uses strict mode to detect invalid CIDR notation (e.g., 192.168.1.1/24,
  where host bits are set)
  (#3458, [PR #3459](benoitc/gunicorn#3459))
Full Changelog: https://github.com/benoitc/gunicorn/compare/24.1.0...24.1.1
Gunicorn 24.1.0
New Features
- Official Docker Image: Gunicorn now publishes official Docker images to
  GitHub Container Registry ([PR #3454](benoitc/gunicorn#3454))
  - Available at ghcr.io/benoitc/gunicorn
  - Based on Python 3.12 slim image
  - Uses recommended worker formula (2 × CPU + 1)
  - Configurable via environment variables
- PROXY Protocol v2 Support: Extended PROXY protocol implementation to
  support the binary v2 format in addition to the existing text-based v1
  format ([PR #3451](benoitc/gunicorn#3451))
  - New --proxy-protocol modes: off, v1, v2, auto
  - auto mode (default when enabled) detects v1 or v2 automatically
  - v2 binary format is more efficient and supports additional metadata
  - Works with HAProxy, AWS NLB/ALB, and other PROXY protocol v2 sources
- CIDR Network Support: --forwarded-allow-ips and --proxy-allow-from now
  accept CIDR notation (e.g., 192.168.0.0/16) for specifying trusted
  networks ([PR #3449](benoitc/gunicorn#3449))
- Socket Backlog Metric: New gunicorn.socket.backlog gauge metric reports
  the current socket backlog size on Linux systems
  ([PR #3450](benoitc/gunicorn#3450))
- InotifyReloader Enhancement: The inotify-based reloader now watches newly
  imported modules, not just those loaded at startup
  ([PR #3447](benoitc/gunicorn#3447))
Bug Fixes
Installation
pip install gunicorn==24.1.0
... (truncated)
Commits
- 375e79e release: bump version to 24.1.1
- ad0c12d docs: add sponsors section to README
- 70200ee chore: add GitHub Sponsors funding configuration
- 6841804 docs: remove incorrect PR reference from Docker changelog entry
- abce0ca docs: add 24.1.1 changelog entry for forwarded_allow_ips fix
- e9a3f30 fix: keep forwarded_allow_ips as strings for backward compatibility (#3459)
- d73ff4b docs: update main changelog with 24.1.0
- 53f2c31 ci: allow docs deploy on workflow_dispatch
- eab5f0b ci: trigger Docker publish on tags with or without v prefix
- a20d3fb docs: add Docker image to 24.1.0 changelog
- Additional commits viewable in compare view
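The "strict mode" CIDR validation described in the 24.1.1 notes can be sketched with the standard-library `ipaddress` module, whose `strict=True` flag rejects exactly the case the changelog mentions (`192.168.1.1/24`, host bits set). This is an assumption-level illustration of the behavior, not gunicorn's actual code:

```python
import ipaddress

def validate_cidr(value: str) -> bool:
    # strict=True rejects CIDR strings whose host bits are set
    # (e.g. 192.168.1.1/24); non-strict mode would silently mask
    # them down to the network address 192.168.1.0/24.
    try:
        ipaddress.ip_network(value, strict=True)
        return True
    except ValueError:
        return False

assert validate_cidr("192.168.0.0/16")       # valid network
assert not validate_cidr("192.168.1.1/24")   # host bits set -> rejected
assert validate_cidr("192.168.1.1")          # bare host -> /32, valid
```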
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index f9d327925b9..e0bb14436e0 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.0.0
+gunicorn==24.1.1
# via -r requirements/base-ft.in
idna==3.10
# via yarl
diff --git a/requirements/base.txt b/requirements/base.txt
index 8c4a8bfbb6b..b4fcd840159 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.0.0
+gunicorn==24.1.1
# via -r requirements/base.in
idna==3.10
# via yarl
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 43c8730283e..99e6625f45d 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -83,7 +83,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.0.0
+gunicorn==24.1.1
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 3b10061ce51..b54a0b151f3 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -81,7 +81,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.0.0
+gunicorn==24.1.1
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 3d1e50f7f28..35cd1b024ac 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.0.0
+gunicorn==24.1.1
# via -r requirements/base-ft.in
idna==3.10
# via
diff --git a/requirements/test.txt b/requirements/test.txt
index 5c8de1ff872..cef64b68d14 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.0.0
+gunicorn==24.1.1
# via -r requirements/base.in
idna==3.10
# via
From 8e423f44b4821030860e0519fc9d37a2ad125aa2 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 27 Jan 2026 10:55:46 +0000
Subject: [PATCH 032/141] Bump coverage from 7.13.1 to 7.13.2 (#12004)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [coverage](https://github.com/coveragepy/coveragepy) from 7.13.1
to 7.13.2.
Changelog
Sourced from coverage's
changelog.
Version 7.13.2 — 2026-01-25
- Fix: when Python is installed via symlinks, for example with Homebrew,
  the standard library files could be incorrectly included in coverage
  reports. This is now fixed, closing issue 2115_.
- Fix: if a data file is created with no read permissions, the combine
  step would fail completely. Now a warning is issued and the file is
  skipped. Closes issue 2117_.
.. _issue 2115: coveragepy/coveragepy#2115
.. _issue 2117: coveragepy/coveragepy#2117
.. _changes_7-13-1:
Commits
- 513e971 docs: sample HTML for 7.13.2
- 27a8230 docs: prep for 7.13.2
- 27d8daa refactor: plural does more
- a2f248c fix: stdlib might be through a symlink. #2115
- bc52a22 debug: re-organize Matchers to show more of what they do
- f338d81 debug: build is a tuple, don't show it on two lines
- 92020e4 refactor(test): convert to parametrized
- 6387d0a test: let (most) tests run with no network
- 1d31e33 build: workflows sometimes need more than 10 min
- 6294978 refactor: an error message is now uniform across versions
- Additional commits viewable in compare view
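The symlink fix described above (issue 2115) boils down to comparing resolved paths when classifying a file as stdlib. A hypothetical sketch of the idea, not coverage.py's actual implementation:

```python
import sysconfig
from pathlib import Path

def is_stdlib_file(filename: str) -> bool:
    # Resolve both sides so a Python installed via symlinks (e.g. Homebrew)
    # maps stdlib files to the same real directory before comparing;
    # comparing unresolved paths is what let stdlib files leak into reports.
    stdlib_dir = Path(sysconfig.get_path("stdlib")).resolve()
    return Path(filename).resolve().is_relative_to(stdlib_dir)
```

`Path.is_relative_to` requires Python 3.9+; on older versions the same check can be done with `os.path.commonpath`.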
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
5 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 99e6625f45d..4bcfe89eb07 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -55,7 +55,7 @@ click==8.3.1
# slotscheck
# towncrier
# wait-for-it
-coverage==7.13.1
+coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/dev.txt b/requirements/dev.txt
index b54a0b151f3..02f4b804eda 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -55,7 +55,7 @@ click==8.3.1
# slotscheck
# towncrier
# wait-for-it
-coverage==7.13.1
+coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index be213c58414..36db8713082 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -14,7 +14,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.1
+coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 35cd1b024ac..c00ba42807d 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -29,7 +29,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.1
+coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test.txt b/requirements/test.txt
index cef64b68d14..fedc7e83ed6 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -29,7 +29,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.1
+coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
From 1597af69d6f56ebfa2a308ca00a2e50fe15afa1b Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed, 28 Jan 2026 10:54:36 +0000
Subject: [PATCH 033/141] Bump cryptography from 46.0.3 to 46.0.4 (#12008)
Bumps [cryptography](https://github.com/pyca/cryptography) from 46.0.3
to 46.0.4.
Changelog
Sourced from cryptography's changelog.
46.0.4 - 2026-01-27
* `Dropped support for win_arm64 wheels`_.
* Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL 3.5.5.
.. _v46-0-3:
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 4bcfe89eb07..896b0a5d17e 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -59,7 +59,7 @@ coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
-cryptography==46.0.3
+cryptography==46.0.4
# via trustme
cython==3.2.4
# via -r requirements/cython.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 02f4b804eda..23aea1024e4 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -59,7 +59,7 @@ coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
-cryptography==46.0.3
+cryptography==46.0.4
# via trustme
distlib==0.4.0
# via virtualenv
diff --git a/requirements/lint.txt b/requirements/lint.txt
index c1bd00d7ad3..35a3478de9d 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -23,7 +23,7 @@ cfgv==3.5.0
# via pre-commit
click==8.3.1
# via slotscheck
-cryptography==46.0.3
+cryptography==46.0.4
# via trustme
distlib==0.4.0
# via virtualenv
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 36db8713082..053fa43a726 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -18,7 +18,7 @@ coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
-cryptography==46.0.3
+cryptography==46.0.4
# via trustme
exceptiongroup==1.3.1
# via pytest
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index c00ba42807d..ba666876332 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -33,7 +33,7 @@ coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
-cryptography==46.0.3
+cryptography==46.0.4
# via trustme
exceptiongroup==1.3.1
# via pytest
diff --git a/requirements/test.txt b/requirements/test.txt
index fedc7e83ed6..9f36de602a0 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -33,7 +33,7 @@ coverage==7.13.2
# via
# -r requirements/test-common.in
# pytest-cov
-cryptography==46.0.3
+cryptography==46.0.4
# via trustme
exceptiongroup==1.3.1
# via pytest
From a396ec56796a5abb17aa5d8512e1739aeaf2d9e6 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 29 Jan 2026 10:52:05 +0000
Subject: [PATCH 034/141] Bump actions/cache from 5.0.2 to 5.0.3 (#12010)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [actions/cache](https://github.com/actions/cache) from 5.0.2 to
5.0.3.
Release notes
Sourced from actions/cache's
releases.
v5.0.3
What's Changed
Full Changelog: https://github.com/actions/cache/compare/v5...v5.0.3
Changelog
Sourced from actions/cache's
changelog.
5.0.3
Commits
- cdf6c1f Merge pull request #1695 from actions/Link-/prepare-5.0.3
- a1bee22 Add review for the @actions/http-client license
- 4695763 Add licensed output
- dc73bb9 Upgrade dependencies and address security warnings
- 345d5c2 Add 5.0.3 builds
- See full diff in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 2e78b5080eb..f669369990e 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -71,7 +71,7 @@ jobs:
with:
python-version: 3.11
- name: Cache PyPI
- uses: actions/cache@v5.0.2
+ uses: actions/cache@v5.0.3
with:
key: pip-lint-${{ hashFiles('requirements/*.txt') }}
path: ~/.cache/pip
@@ -120,7 +120,7 @@ jobs:
with:
submodules: true
- name: Cache llhttp generated files
- uses: actions/cache@v5.0.2
+ uses: actions/cache@v5.0.3
id: cache
with:
key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -184,7 +184,7 @@ jobs:
echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
shell: bash
- name: Cache PyPI
- uses: actions/cache@v5.0.2
+ uses: actions/cache@v5.0.3
with:
key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
path: ${{ steps.pip-cache.outputs.dir }}
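The cache keys in the hunks above embed `hashFiles('requirements/*.txt')` so the cache invalidates whenever any pinned version changes. A rough Python approximation of that idea (not GitHub's exact hashing algorithm, and the function name is illustrative):

```python
import hashlib
from pathlib import Path

def cache_key(prefix: str, paths: list[str]) -> str:
    # One digest over the sorted files' contents: any change to any
    # pinned requirement changes the key, forcing a fresh cache entry.
    digest = hashlib.sha256()
    for p in sorted(paths):
        digest.update(Path(p).read_bytes())
    return f"{prefix}-{digest.hexdigest()}"
```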
From bcf869138092ce6b958f0c48f592f91a35b426da Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 2 Feb 2026 12:33:55 +0000
Subject: [PATCH 035/141] Bump rich from 14.3.1 to 14.3.2 (#12015)
Bumps [rich](https://github.com/Textualize/rich) from 14.3.1 to 14.3.2.
Release notes
Sourced from rich's
releases.
The ZWJy release
A fix for cell_len edge cases.
[14.3.2] - 2026-02-01
Changelog
Sourced from rich's changelog.
[14.3.2] - 2026-02-01
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 896b0a5d17e..acd0e6b4330 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -207,7 +207,7 @@ requests==2.32.5
# via
# sphinx
# sphinxcontrib-spelling
-rich==14.3.1
+rich==14.3.2
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 23aea1024e4..dc171022631 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -200,7 +200,7 @@ regex==2026.1.15
# via re-assert
requests==2.32.5
# via sphinx
-rich==14.3.1
+rich==14.3.2
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 35a3478de9d..e8a4ee26512 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -92,7 +92,7 @@ python-on-whales==0.80.0
# via -r requirements/lint.in
pyyaml==6.0.3
# via pre-commit
-rich==14.3.1
+rich==14.3.2
# via pytest-codspeed
six==1.17.0
# via python-dateutil
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 053fa43a726..a4c832d4896 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -89,7 +89,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.1.15
# via re-assert
-rich==14.3.1
+rich==14.3.2
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index ba666876332..46afbd433f9 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -124,7 +124,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.1.15
# via re-assert
-rich==14.3.1
+rich==14.3.2
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index 9f36de602a0..b26f5fdb5dc 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -124,7 +124,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.1.15
# via re-assert
-rich==14.3.1
+rich==14.3.2
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
From f48cb1f8a56ec8d67987d890c08c7220f21f41ae Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 2 Feb 2026 13:00:21 +0000
Subject: [PATCH 036/141] Bump setuptools from 80.10.1 to 80.10.2 (#12002)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [setuptools](https://github.com/pypa/setuptools) from 80.10.1 to
80.10.2.
Changelog
Sourced from setuptools's changelog.
v80.10.2
Bugfixes
- Update vendored dependencies. (#5159)
Misc
Commits
- 5cf2d08 Bump version: 80.10.1 → 80.10.2
- 852cd5e Merge pull request #5166 from pypa/bugfix/5159-vendor-bin-free
- 11115ee Suppress deprecation warning.
- 5cf9185 Update vendored dependencies.
- cf59f41 Delete all binaries generated by vendored package install.
- 89a5981 Add missing newsfragments
- c0114af Postpone deprecation warnings related to PEP 639 to 2027-Feb-18 (#5115)
- de07603 Revert "[CI] Constraint transient test dependency on pyobjc" (#5128)
- 3afd5d6 Revert "[CI] Constraint transient test dependency on pyobjc"
- See full diff in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index acd0e6b4330..77ab30acedf 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -295,5 +295,5 @@ zlib-ng==1.0.0
# The following packages are considered to be unsafe in a requirements file:
pip==25.3
# via pip-tools
-setuptools==80.10.1
+setuptools==80.10.2
# via pip-tools
diff --git a/requirements/dev.txt b/requirements/dev.txt
index dc171022631..91dde8fbcff 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -285,5 +285,5 @@ zlib-ng==1.0.0
# The following packages are considered to be unsafe in a requirements file:
pip==25.3
# via pip-tools
-setuptools==80.10.1
+setuptools==80.10.2
# via pip-tools
From d187ece51b4ed40d7998cdbf29d4bae1c406426b Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 3 Feb 2026 10:56:47 +0000
Subject: [PATCH 037/141] Bump gunicorn from 24.1.1 to 25.0.1 (#12019)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [gunicorn](https://github.com/benoitc/gunicorn) from 24.1.1 to
25.0.1.
Release notes
Sourced from gunicorn's releases.
25.0.1
Bug Fixes
- Fix ASGI streaming responses (SSE) hanging: add chunked transfer encoding for HTTP/1.1 responses without a Content-Length header. Without chunked encoding, clients wait for connection close to determine end-of-response.
Changes
- Update celery_alternative example to use FastAPI with native ASGI worker and uvloop for async task execution.
Testing
- Add ASGI compliance test suite with Docker-based integration tests covering HTTP, WebSocket, streaming, lifespan, framework integration (Starlette, FastAPI), HTTP/2, and concurrency scenarios.
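The chunked-encoding fix matters because HTTP/1.1 gives a server only two ways to delimit a response body: a Content-Length header or chunked transfer encoding. The framing itself is simple; a minimal sketch of the wire format (helper names are illustrative, not gunicorn's code):

```python
def encode_chunk(data: bytes) -> bytes:
    """Frame one chunk per RFC 9112: hex length, CRLF, data, CRLF."""
    return b"%x\r\n" % len(data) + data + b"\r\n"

LAST_CHUNK = b"0\r\n\r\n"  # zero-length chunk terminates the body

# An SSE-style stream: each event is framed as it is produced, so the
# client can detect end-of-response without waiting for connection close.
frames = [encode_chunk(b"data: tick\n\n"), LAST_CHUNK]
```

Without that final `0\r\n\r\n` terminator (or a Content-Length up front), a client has no in-band signal that the stream is complete, which is the hang the 25.0.1 fix addresses.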
Gunicorn 25.0.0
New Features
- Dirty Arbiters: separate process pool for executing long-running, blocking operations (AI model loading, heavy computation) without blocking HTTP workers ([PR #3460](benoitc/gunicorn#3460))
  - Inspired by Erlang's dirty schedulers
  - Asyncio-based with Unix socket IPC
  - Stateful workers that persist loaded resources
  - New settings: --dirty-app, --dirty-workers, --dirty-timeout, --dirty-threads, --dirty-graceful-timeout
  - Lifecycle hooks: on_dirty_starting, dirty_post_fork, dirty_worker_init, dirty_worker_exit
- Per-App Worker Allocation for Dirty Arbiters: control how many dirty workers load each app for memory optimization with heavy models ([PR #3473](benoitc/gunicorn#3473))
  - Set the workers class attribute on DirtyApp (e.g., workers = 2)
  - Or use the config format module:class:N (e.g., myapp:HeavyModel:2)
  - Requests automatically routed to workers with the target app
  - New exception DirtyNoWorkersAvailableError for graceful error handling
  - Example: 8 workers × 10GB model = 80GB → with workers=2: 20GB (75% savings)
- HTTP/2 Support (Beta): native HTTP/2 (RFC 7540) support for improved performance with modern clients ([PR #3468](benoitc/gunicorn#3468))
  - Multiplexed streams over a single connection
  - Header compression (HPACK)
  - Flow control and stream prioritization
  - Works with gthread, gevent, and ASGI workers
  - New settings: --http-protocols, --http2-max-concurrent-streams, --http2-initial-window-size, --http2-max-frame-size, --http2-max-header-list-size
  - Requires SSL/TLS and the h2 library: pip install gunicorn[http2]
... (truncated)
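The memory figure in the per-app allocation example is plain arithmetic: every worker that loads the app holds its own copy of the model. A sketch with the numbers from the release notes (the helper function is mine, for illustration only):

```python
def resident_model_memory_gb(workers: int, model_gb: float) -> float:
    """Each worker that loads the app keeps its own copy of the model."""
    return workers * model_gb

all_workers = resident_model_memory_gb(8, 10)  # every HTTP worker loads it: 80 GB
limited = resident_model_memory_gb(2, 10)      # workers = 2 on the DirtyApp: 20 GB
savings_pct = 100 * (all_workers - limited) / all_workers
```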
Commits
- 3bf529f docs: sync news.md with 2026-news.md
- 1f4f245 Merge pull request #3478 from benoitc/feature/asgi-compliance-testbed
- e1519c0 docs: add ASGI compliance test suite to changelog
- 0885005 fix(tests): correct assertions in ASGI compliance tests
- 658924c docs: update changelog for 25.0.1
- c5b6e82 chore: bump version to 25.0.1
- ce352dc fix(asgi): add chunked transfer encoding for streaming responses
- 29b8a3a Merge pull request #3476 from benoitc/dependabot/github_actions/actions/check...
- 791ab46 chore(deps): bump actions/checkout from 4 to 6
- 9235b72 Merge pull request #3475 from benoitc/dependabot/github_actions/actions/uploa...
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index e0bb14436e0..fcdf5f38898 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.1.1
+gunicorn==25.0.1
# via -r requirements/base-ft.in
idna==3.10
# via yarl
diff --git a/requirements/base.txt b/requirements/base.txt
index b4fcd840159..f643e6d8f6e 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.1.1
+gunicorn==25.0.1
# via -r requirements/base.in
idna==3.10
# via yarl
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 77ab30acedf..a8db8a92947 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -83,7 +83,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.1.1
+gunicorn==25.0.1
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 91dde8fbcff..7072457e396 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -81,7 +81,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.1.1
+gunicorn==25.0.1
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 46afbd433f9..9069fd70483 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.1.1
+gunicorn==25.0.1
# via -r requirements/base-ft.in
idna==3.10
# via
diff --git a/requirements/test.txt b/requirements/test.txt
index b26f5fdb5dc..bced139e0df 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==24.1.1
+gunicorn==25.0.1
# via -r requirements/base.in
idna==3.10
# via
From c643a563fd73cf87b876a419871a29bafe362d31 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed, 4 Feb 2026 11:04:35 +0000
Subject: [PATCH 038/141] Bump babel from 2.17.0 to 2.18.0 (#12026)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [babel](https://github.com/python-babel/babel) from 2.17.0 to
2.18.0.
Release notes
Sourced from babel's releases.
v2.18.0
Happy 2026! Like last year's release (ahem...), this one too is being made from FOSDEM 2026, in Brussels, Belgium. 🇧🇪
We'll aspire for a less glacial release cycle for 2.19. 😁
Please see CHANGELOG.rst for the detailed change log.
Full Changelog: https://github.com/python-babel/babel/compare/v2.17.0...v2.18.0
Changelog
Sourced from babel's changelog.
Version 2.18.0
Happy 2026! This release is, coincidentally, also being made from FOSDEM.
We will aspire for a slightly less glacial release cadence in this year; there are interesting features in the pipeline.
Features
* Core: Add `babel.core.get_cldr_version()` by @akx in :gh:`1242`
* Core: Use CLDR 47 by @tomasr8 in :gh:`1210`
* Core: Use canonical IANA zone names in zone_territories by @akx in :gh:`1220`
* Messages: Improve extract performance via ignoring directories early during os.walk by @akx in :gh:`968`
* Messages: Merge in per-format keywords and auto_comments by @akx in :gh:`1243`
* Messages: Update keywords for extraction of dpgettext and dnpgettext by @mardiros in :gh:`1235`
* Messages: Validate all plurals in Python format checker by @tomasr8 in :gh:`1188`
* Time: Use standard library `timezone` instead of `FixedOffsetTimezone` by @akx in :gh:`1203`
Bugfixes
* Core: Fix formatting for "Empty locale identifier" exception added in #1164 by @akx in :gh:`1184`
* Core: Improve handling of no-inheritance-marker in timezone data by @akx in :gh:`1194`
* Core: Make the number pattern regular expression more efficient by @akx in :gh:`1213`
* Messages: Keep translator comments next to the translation function call by @akx in :gh:`1196`
* Numbers: Fix KeyError that occurred when formatting compact currencies of exactly one thousand in several locales by @bartbroere in :gh:`1246`
Other improvements
* Core: Avoid unnecessary uses of `map()` by @akx in :gh:`1180`
* Messages: Have init-catalog create directories too by @akx in :gh:`1244`
* Messages: Optimizations for read_po by @akx in :gh:`1200`
* Messages: Use pathlib.Path() in catalog frontend; improve test coverage by @akx in :gh:`1204`
Infrastructure and documentation
* CI: Renovate CI & lint tools by @akx in :gh:`1228`
* CI: Tighten up CI with Zizmor by @akx in :gh:`1230`
* CI: make job permissions explicit by @akx in :gh:`1227`
* Docs: Add SECURITY.md by @akx in :gh:`1229`
* Docs: Remove u string prefix from docs by @verhovsky in :gh:`1174`
* Docs: Update dates.rst with current unicode.org tr35 link by @clach04 in :gh:`1189`
* General: Add some PyPI classifiers by @tomasr8 in :gh:`1186`
* General: Apply reformatting by hand and with Ruff by @akx in :gh:`1202`
* General: Test on and declare support for Python 3.14 by @akx in :gh:`1233`
... (truncated)
Commits
- 56c63ca Prepare for 2.18.0 (#1248)
- 73015a1 Add user-agent to CLDR downloader (#1247)
- 29bd362 Fix formatting compact currencies of exactly one thousand in several locales ...
- 851db43 Reuse InitCatalog's guts in UpdateCatalog (#1244)
- fd00e60 Extract: Merge in per-format keywords and auto_comments (#1243)
- 12a14b6 Add dpgettext and dnpgettext support (#1235)
- 7110e62 Use canonical IANA zone names in zone_territories (#1220)
- e91c346 Improve extract performance via ignoring directories early during os.walk (#968)
- 0c4f378 Convert Unittest testcases with setup/teardown to fixtures (#1240)
- 218c96e Add babel.core.get_cldr_version() (#1242)
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
4 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index a8db8a92947..f0d89cf217a 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -24,7 +24,7 @@ async-timeout==5.0.1 ; python_version < "3.11"
# valkey
attrs==25.4.0
# via -r requirements/runtime-deps.in
-babel==2.17.0
+babel==2.18.0
# via sphinx
backports-zstd==1.3.0 ; implementation_name == "cpython"
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 7072457e396..481e3b5cc99 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -24,7 +24,7 @@ async-timeout==5.0.1 ; python_version < "3.11"
# valkey
attrs==25.4.0
# via -r requirements/runtime-deps.in
-babel==2.17.0
+babel==2.18.0
# via sphinx
backports-zstd==1.3.0 ; platform_python_implementation == "CPython" and python_version < "3.14"
# via
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index f8e52b40ecc..03af40c6bee 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -8,7 +8,7 @@ aiohttp-theme==0.1.7
# via -r requirements/doc.in
alabaster==1.0.0
# via sphinx
-babel==2.17.0
+babel==2.18.0
# via sphinx
certifi==2026.1.4
# via requests
diff --git a/requirements/doc.txt b/requirements/doc.txt
index b9dcb92a428..b9f1465e865 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -8,7 +8,7 @@ aiohttp-theme==0.1.7
# via -r requirements/doc.in
alabaster==1.0.0
# via sphinx
-babel==2.17.0
+babel==2.18.0
# via sphinx
certifi==2026.1.4
# via requests
From 543ed1f8f89301a6e6ab603d1f5818369d07a948 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 5 Feb 2026 11:04:50 +0000
Subject: [PATCH 039/141] Bump pip from 25.3 to 26.0.1 (#12033)
Bumps [pip](https://github.com/pypa/pip) from 25.3 to 26.0.1.
Changelog
Sourced from pip's changelog.
26.0.1 (2026-02-04)
Bug Fixes
- Fix --pre not being respected from the command line when a requirements file includes an option, e.g. --extra-index-url. (#13788)
26.0 (2026-01-30)
Deprecations and Removals
- Remove support for non-bare project names in egg fragments. Affected users should use the Direct URL requirement syntax (https://packaging.python.org/en/latest/specifications/version-specifiers/#direct-references). (#13157)
Features
- Display pip's command-line help in colour, if possible. (#12134)
- Support installing dependencies declared with inline script metadata (:pep:`723`) with --requirements-from-script. (#12891)
- Add --all-releases and --only-final options to control pre-release and final release selection during package installation. (#13221)
- Add --uploaded-prior-to option to only consider packages uploaded prior to a given datetime when the upload-time field is available from a remote index. (#13625)
- Add --use-feature inprocess-build-deps to request that build dependencies are installed within the same pip install process. This new mechanism is faster, supports --no-clean and --no-cache-dir reliably, and supports prompting for authentication. Enabling this feature will also enable --use-feature build-constraints. This feature will become the default in a future pip version. (#9081)
- pip cache purge and pip cache remove now clean up empty directories and legacy files left by older pip versions. (#9058)
Bug Fixes
- Fix selecting pre-release versions when only pre-releases match. For example, package>1.0 with versions 1.0, 2.0rc1 now installs 2.0rc1 instead of failing. (#13746)
- Revisions in version control URLs now must be percent-encoded. For example, use git+https://example.com/repo.git@issue%231 to specify the branch issue#1. If you previously used a branch name containing a % character in a version control URL, you now need to replace it with %25 to ensure correct percent-encoding. (#13407)
- Preserve original casing when a path is displayed. (#6823)
- Fix bash completion when the $IFS variable has been modified from its default. (#13555)
- Precompute Python requirements on each candidate, reducing the time of long resolutions. (#13656)
- Skip redundant work converting version objects to strings when using the
... (truncated)
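The percent-encoding change for VCS revisions can be reproduced with the standard library: pip 26 expects the revision part of a version control URL to be percent-encoded, which `urllib.parse.quote` produces (the example repository URL follows the one in the changelog):

```python
from urllib.parse import quote, unquote

branch = "issue#1"
encoded = quote(branch, safe="")  # '#' becomes %23
url = f"git+https://example.com/repo.git@{encoded}"

# Round-trips back to the original branch name.
assert unquote(encoded) == branch
```

A literal `%` in a branch name must itself be written as `%25`, since decoding is now applied to the revision.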
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index f0d89cf217a..448500ba6c0 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -293,7 +293,7 @@ zlib-ng==1.0.0
# -r requirements/test-common.in
# The following packages are considered to be unsafe in a requirements file:
-pip==25.3
+pip==26.0.1
# via pip-tools
setuptools==80.10.2
# via pip-tools
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 481e3b5cc99..3dd8ab4e1ea 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -283,7 +283,7 @@ zlib-ng==1.0.0
# -r requirements/test-common.in
# The following packages are considered to be unsafe in a requirements file:
-pip==25.3
+pip==26.0.1
# via pip-tools
setuptools==80.10.2
# via pip-tools
From b7b17ff900420298c01ae3f2e51e9f2f8d716e1f Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 5 Feb 2026 11:08:42 +0000
Subject: [PATCH 040/141] Bump coverage from 7.13.2 to 7.13.3 (#12032)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [coverage](https://github.com/coveragepy/coveragepy) from 7.13.2
to 7.13.3.
Changelog
Sourced from coverage's changelog.
Version 7.13.3 — 2026-02-03
- Fix: in some situations, third-party code was measured when it shouldn't have been, slowing down test execution. This happened with layered virtual environments such as uv sometimes makes. The problem is fixed, closing issue 2082_. Now any directory on sys.path that is inside a virtualenv is considered third-party code.

.. _issue 2082: coveragepy/coveragepy#2082
.. _changes_7-13-2:
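The new rule, "any directory on sys.path that is inside a virtualenv is considered third-party", reduces to a path-containment check. A stdlib sketch of the idea (the function name and this simplification are mine, not coverage.py's actual code):

```python
from pathlib import PurePosixPath

def is_third_party(sys_path_entry: str, venv_root: str) -> bool:
    """Treat a sys.path entry as third-party if it lives inside the venv."""
    return PurePosixPath(sys_path_entry).is_relative_to(venv_root)

# A layered-venv layout like the ones uv can create: site-packages inside
# the venv is third-party code; the project checkout next to it is not.
inside = is_third_party("/app/.venv/lib/python3.12/site-packages", "/app/.venv")
outside = is_third_party("/app/src", "/app/.venv")
```

Excluding such entries keeps the tracer from instrumenting installed dependencies, which is where the slowdown in issue 2082 came from.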
Commits
- 6bf962f docs: sample HTML for 7.13.3
- 9f2e54c docs: prep for 7.13.3
- 6208c42 fix: find third-party packages in more locations. #2082
- edb5016 refactor: make dataclass imports uniform
- b05826a chore: bump actions/setup-python in the action-dependencies group (#2126)
- b519e17 refactor: no need for ox_profile connection
- 775f1cb build: remove pudb, I can install it if I need it
- 0ccb1fe chore: make upgrade
- e9e2a0e chore: bump actions/checkout in the action-dependencies group (#2122)
- 77e1a04 chore: make upgrade
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
5 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 448500ba6c0..defc27c1995 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -55,7 +55,7 @@ click==8.3.1
# slotscheck
# towncrier
# wait-for-it
-coverage==7.13.2
+coverage==7.13.3
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 3dd8ab4e1ea..8e303980915 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -55,7 +55,7 @@ click==8.3.1
# slotscheck
# towncrier
# wait-for-it
-coverage==7.13.2
+coverage==7.13.3
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index a4c832d4896..2c49452ac8f 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -14,7 +14,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.2
+coverage==7.13.3
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 9069fd70483..f56b40e86e5 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -29,7 +29,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.2
+coverage==7.13.3
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test.txt b/requirements/test.txt
index bced139e0df..8bdbf9e4816 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -29,7 +29,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.2
+coverage==7.13.3
# via
# -r requirements/test-common.in
# pytest-cov
From 6f0ae2901ef7e2b4919bc9634d63c91717b0db1c Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Thu, 5 Feb 2026 16:47:05 +0000
Subject: [PATCH 041/141] Change .decode_async() to .decode_iter() (#12028)
(#12034)
(cherry picked from commit 4bb9e6e375c60ef305a860646df45de9155b3c7b)
---
CHANGES/11898.bugfix.rst | 10 ++++++---
aiohttp/multipart.py | 41 ++++++++++++++++++++----------------
aiohttp/web_request.py | 20 +++++++++---------
docs/multipart_reference.rst | 7 ++++--
tests/test_multipart.py | 5 +++--
5 files changed, 48 insertions(+), 35 deletions(-)
diff --git a/CHANGES/11898.bugfix.rst b/CHANGES/11898.bugfix.rst
index f430bcce997..2a2e41d037b 100644
--- a/CHANGES/11898.bugfix.rst
+++ b/CHANGES/11898.bugfix.rst
@@ -1,6 +1,10 @@
Restored :py:meth:`~aiohttp.BodyPartReader.decode` as a synchronous method
for backward compatibility. The method was inadvertently changed to async
in 3.13.3 as part of the decompression bomb security fix. A new
-:py:meth:`~aiohttp.BodyPartReader.decode_async` method is now available
-for non-blocking decompression of large payloads. Internal aiohttp code
-uses the async variant to maintain security protections -- by :user:`bdraco`.
+:py:meth:`~aiohttp.BodyPartReader.decode_iter` method is now available
+for non-blocking decompression of large payloads using an async generator.
+Internal aiohttp code uses the async variant to maintain security protections.
+
+Changed multipart processing chunk sizes from 64 KiB to 256 KiB to better
+match aiohttp internals
+-- by :user:`bdraco` and :user:`Dreamsorcerer`.
diff --git a/aiohttp/multipart.py b/aiohttp/multipart.py
index 0736e8090a2..bcc01b352ef 100644
--- a/aiohttp/multipart.py
+++ b/aiohttp/multipart.py
@@ -6,7 +6,7 @@
import uuid
import warnings
from collections import deque
-from collections.abc import Iterator, Mapping, Sequence
+from collections.abc import AsyncIterator, Iterator, Mapping, Sequence
from types import TracebackType
from typing import TYPE_CHECKING, Any, Union, cast
from urllib.parse import parse_qsl, unquote, urlencode
@@ -313,7 +313,10 @@ async def read(self, *, decode: bool = False) -> bytes:
while not self._at_eof:
data.extend(await self.read_chunk(self.chunk_size))
if decode:
- return await self.decode_async(data)
+ decoded_data = bytearray()
+ async for d in self.decode_iter(data):
+ decoded_data.extend(d)
+ return decoded_data
return data
async def read_chunk(self, size: int = chunk_size) -> bytes:
@@ -508,7 +511,7 @@ def decode(self, data: bytes) -> bytes:
Decodes data according the specified Content-Encoding
or Content-Transfer-Encoding headers value.
- Note: For large payloads, consider using decode_async() instead
+ Note: For large payloads, consider using decode_iter() instead
to avoid blocking the event loop during decompression.
"""
data = self._apply_content_transfer_decoding(data)
@@ -516,8 +519,8 @@ def decode(self, data: bytes) -> bytes:
return self._decode_content(data)
return data
- async def decode_async(self, data: bytes) -> bytes:
- """Decodes data asynchronously.
+ async def decode_iter(self, data: bytes) -> AsyncIterator[bytes]:
+ """Async generator that yields decoded data chunks.
Decodes data according the specified Content-Encoding
or Content-Transfer-Encoding headers value.
@@ -527,8 +530,10 @@ async def decode_async(self, data: bytes) -> bytes:
"""
data = self._apply_content_transfer_decoding(data)
if self._needs_content_decoding():
- return await self._decode_content_async(data)
- return data
+ async for d in self._decode_content_async(data):
+ yield d
+ else:
+ yield data
def _decode_content(self, data: bytes) -> bytes:
encoding = self.headers.get(CONTENT_ENCODING, "").lower()
@@ -542,17 +547,18 @@ def _decode_content(self, data: bytes) -> bytes:
raise RuntimeError(f"unknown content encoding: {encoding}")
- async def _decode_content_async(self, data: bytes) -> bytes:
+ async def _decode_content_async(self, data: bytes) -> AsyncIterator[bytes]:
encoding = self.headers.get(CONTENT_ENCODING, "").lower()
if encoding == "identity":
- return data
- if encoding in {"deflate", "gzip"}:
- return await ZLibDecompressor(
+ yield data
+ elif encoding in {"deflate", "gzip"}:
+ d = ZLibDecompressor(
encoding=encoding,
suppress_deflate_header=True,
- ).decompress(data, max_length=self._max_decompress_size)
-
- raise RuntimeError(f"unknown content encoding: {encoding}")
+ )
+ yield await d.decompress(data, max_length=self._max_decompress_size)
+ else:
+ raise RuntimeError(f"unknown content encoding: {encoding}")
def _decode_content_transfer(self, data: bytes) -> bytes:
encoding = self.headers.get(CONTENT_TRANSFER_ENCODING, "").lower()
@@ -623,10 +629,9 @@ async def as_bytes(self, encoding: str = "utf-8", errors: str = "strict") -> byt
async def write(self, writer: AbstractStreamWriter) -> None:
field = self._value
- chunk = await field.read_chunk(size=2**16)
- while chunk:
- await writer.write(await field.decode_async(chunk))
- chunk = await field.read_chunk(size=2**16)
+ while chunk := await field.read_chunk(size=2**18):
+ async for d in field.decode_iter(chunk):
+ await writer.write(d)
class MultipartReader:
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index f5bb35a87fc..38eb0de2905 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -747,17 +747,17 @@ async def post(self) -> "MultiDictProxy[str | bytes | FileField]":
tmp = await self._loop.run_in_executor(
None, tempfile.TemporaryFile
)
- chunk = await field.read_chunk(size=2**16)
- while chunk:
- chunk = await field.decode_async(chunk)
- await self._loop.run_in_executor(None, tmp.write, chunk)
- size += len(chunk)
- if 0 < max_size < size:
- await self._loop.run_in_executor(None, tmp.close)
- raise HTTPRequestEntityTooLarge(
- max_size=max_size, actual_size=size
+ while chunk := await field.read_chunk(size=2**18):
+ async for decoded_chunk in field.decode_iter(chunk):
+ await self._loop.run_in_executor(
+ None, tmp.write, decoded_chunk
)
- chunk = await field.read_chunk(size=2**16)
+ size += len(decoded_chunk)
+ if 0 < max_size < size:
+ await self._loop.run_in_executor(None, tmp.close)
+ raise HTTPRequestEntityTooLarge(
+ max_size=max_size, actual_size=size
+ )
await self._loop.run_in_executor(None, tmp.seek, 0)
if field_ct is None:
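The web_request.py hunk above streams decoded multipart chunks to a temporary file through the executor while enforcing max_size. The control flow can be sketched without aiohttp; `write_chunks` and `EntityTooLarge` below are hypothetical stand-ins for the patched code, not aiohttp's API:

```python
import asyncio
import tempfile


class EntityTooLarge(Exception):
    """Stand-in for aiohttp's HTTPRequestEntityTooLarge."""


async def write_chunks(chunks, max_size: int) -> int:
    """Stream chunks to a temp file off the event loop, enforcing max_size.

    A max_size of 0 disables the limit, mirroring the 0 < max_size < size check.
    """
    loop = asyncio.get_running_loop()
    # File creation and writes are blocking, so they run in the default executor.
    tmp = await loop.run_in_executor(None, tempfile.TemporaryFile)
    size = 0
    for chunk in chunks:  # aiohttp obtains these via field.read_chunk(size=2**18)
        await loop.run_in_executor(None, tmp.write, chunk)
        size += len(chunk)
        if 0 < max_size < size:
            await loop.run_in_executor(None, tmp.close)
            raise EntityTooLarge(size)
    await loop.run_in_executor(None, tmp.seek, 0)
    tmp.close()
    return size
```

The close-before-raise mirrors the patched loop: the temp file is released in the executor before the size error propagates.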
diff --git a/docs/multipart_reference.rst b/docs/multipart_reference.rst
index 2c13c8cfec9..e116ee879ed 100644
--- a/docs/multipart_reference.rst
+++ b/docs/multipart_reference.rst
@@ -119,15 +119,18 @@ Multipart reference
.. note::
- For large payloads, consider using :meth:`decode_async` instead
+ For large payloads, consider using :meth:`decode_iter` instead
to avoid blocking the event loop during decompression.
- .. method:: decode_async(data)
+ .. method:: decode_iter(data)
:async:
Decodes data asynchronously according the specified ``Content-Encoding``
or ``Content-Transfer-Encoding`` headers value.
+ This is an async iterator and will return decoded data in chunks. This
+ can be used to avoid loading large payloads into memory.
+
This method offloads decompression to an executor for large payloads
to avoid blocking the event loop.
diff --git a/tests/test_multipart.py b/tests/test_multipart.py
index 05d9aad009f..94a3986acc4 100644
--- a/tests/test_multipart.py
+++ b/tests/test_multipart.py
@@ -401,14 +401,15 @@ async def test_decode_with_content_transfer_encoding_base64(self) -> None:
result += obj.decode(chunk)
assert b"Time to Relax!" == result
- async def test_decode_async_with_content_transfer_encoding_base64(self) -> None:
+ async def test_decode_iter_with_content_transfer_encoding_base64(self) -> None:
h = CIMultiDictProxy(CIMultiDict({CONTENT_TRANSFER_ENCODING: "base64"}))
with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
result = b""
while not obj.at_eof():
chunk = await obj.read_chunk(size=6)
- result += await obj.decode_async(chunk)
+ async for decoded_chunk in obj.decode_iter(chunk):
+ result += decoded_chunk
assert b"Time to Relax!" == result
async def test_decode_with_content_encoding_deflate(self) -> None:
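The calling convention this patch introduces, iterating decoded chunks from an async generator rather than awaiting one fully decoded bytes object, can be sketched with the standard library alone. This is a simplified illustration, not aiohttp's BodyPartReader; `decode_iter` and `main` here are hypothetical:

```python
import asyncio
import zlib
from collections.abc import AsyncIterator


async def decode_iter(data: bytes) -> AsyncIterator[bytes]:
    """Yield decompressed chunks, offloading CPU-bound work to the executor."""
    # wbits=MAX_WBITS | 32 auto-detects zlib (deflate) or gzip headers.
    decompressor = zlib.decompressobj(wbits=zlib.MAX_WBITS | 32)
    loop = asyncio.get_running_loop()
    yield await loop.run_in_executor(None, decompressor.decompress, data)


async def main() -> bytearray:
    compressed = zlib.compress(b"Time to Relax!")
    decoded = bytearray()
    async for chunk in decode_iter(compressed):  # the new calling convention
        decoded.extend(chunk)
    return decoded
```

Callers that previously wrote `result = await obj.decode_async(chunk)` now accumulate chunks from the iterator, exactly as the updated test does.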
From 9d8dcec2624eba0240c5088d380be797149196a6 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Thu, 5 Feb 2026 16:58:13 +0000
Subject: [PATCH 042/141] Change .decode_async() to .decode_iter() (#12028)
(#12035)
(cherry picked from commit 4bb9e6e375c60ef305a860646df45de9155b3c7b)
---
CHANGES/11898.bugfix.rst | 10 ++++++---
aiohttp/multipart.py | 40 +++++++++++++++++++++---------------
aiohttp/web_request.py | 20 +++++++++---------
docs/multipart_reference.rst | 7 +++++--
tests/test_multipart.py | 5 +++--
5 files changed, 48 insertions(+), 34 deletions(-)
diff --git a/CHANGES/11898.bugfix.rst b/CHANGES/11898.bugfix.rst
index f430bcce997..2a2e41d037b 100644
--- a/CHANGES/11898.bugfix.rst
+++ b/CHANGES/11898.bugfix.rst
@@ -1,6 +1,10 @@
Restored :py:meth:`~aiohttp.BodyPartReader.decode` as a synchronous method
for backward compatibility. The method was inadvertently changed to async
in 3.13.3 as part of the decompression bomb security fix. A new
-:py:meth:`~aiohttp.BodyPartReader.decode_async` method is now available
-for non-blocking decompression of large payloads. Internal aiohttp code
-uses the async variant to maintain security protections -- by :user:`bdraco`.
+:py:meth:`~aiohttp.BodyPartReader.decode_iter` method is now available
+for non-blocking decompression of large payloads using an async generator.
+Internal aiohttp code uses the async variant to maintain security protections.
+
+Changed multipart processing chunk sizes from 64 KiB to 256 KiB to better
+match aiohttp internals
+-- by :user:`bdraco` and :user:`Dreamsorcerer`.
diff --git a/aiohttp/multipart.py b/aiohttp/multipart.py
index d3a6ae7d604..8f1f9c93937 100644
--- a/aiohttp/multipart.py
+++ b/aiohttp/multipart.py
@@ -11,6 +11,7 @@
from typing import (
TYPE_CHECKING,
Any,
+ AsyncIterator,
Deque,
Dict,
Iterator,
@@ -325,7 +326,10 @@ async def read(self, *, decode: bool = False) -> bytes:
while not self._at_eof:
data.extend(await self.read_chunk(self.chunk_size))
if decode:
- return await self.decode_async(data)
+ decoded_data = bytearray()
+ async for d in self.decode_iter(data):
+ decoded_data.extend(d)
+ return decoded_data
return data
async def read_chunk(self, size: int = chunk_size) -> bytes:
@@ -520,7 +524,7 @@ def decode(self, data: bytes) -> bytes:
Decodes data according the specified Content-Encoding
or Content-Transfer-Encoding headers value.
- Note: For large payloads, consider using decode_async() instead
+ Note: For large payloads, consider using decode_iter() instead
to avoid blocking the event loop during decompression.
"""
data = self._apply_content_transfer_decoding(data)
@@ -528,8 +532,8 @@ def decode(self, data: bytes) -> bytes:
return self._decode_content(data)
return data
- async def decode_async(self, data: bytes) -> bytes:
- """Decodes data asynchronously.
+ async def decode_iter(self, data: bytes) -> AsyncIterator[bytes]:
+ """Async generator that yields decoded data chunks.
Decodes data according the specified Content-Encoding
or Content-Transfer-Encoding headers value.
@@ -539,8 +543,10 @@ async def decode_async(self, data: bytes) -> bytes:
"""
data = self._apply_content_transfer_decoding(data)
if self._needs_content_decoding():
- return await self._decode_content_async(data)
- return data
+ async for d in self._decode_content_async(data):
+ yield d
+ else:
+ yield data
def _decode_content(self, data: bytes) -> bytes:
encoding = self.headers.get(CONTENT_ENCODING, "").lower()
@@ -554,17 +560,18 @@ def _decode_content(self, data: bytes) -> bytes:
raise RuntimeError(f"unknown content encoding: {encoding}")
- async def _decode_content_async(self, data: bytes) -> bytes:
+ async def _decode_content_async(self, data: bytes) -> AsyncIterator[bytes]:
encoding = self.headers.get(CONTENT_ENCODING, "").lower()
if encoding == "identity":
- return data
- if encoding in {"deflate", "gzip"}:
- return await ZLibDecompressor(
+ yield data
+ elif encoding in {"deflate", "gzip"}:
+ d = ZLibDecompressor(
encoding=encoding,
suppress_deflate_header=True,
- ).decompress(data, max_length=self._max_decompress_size)
-
- raise RuntimeError(f"unknown content encoding: {encoding}")
+ )
+ yield await d.decompress(data, max_length=self._max_decompress_size)
+ else:
+ raise RuntimeError(f"unknown content encoding: {encoding}")
def _decode_content_transfer(self, data: bytes) -> bytes:
encoding = self.headers.get(CONTENT_TRANSFER_ENCODING, "").lower()
@@ -635,10 +642,9 @@ async def as_bytes(self, encoding: str = "utf-8", errors: str = "strict") -> byt
async def write(self, writer: AbstractStreamWriter) -> None:
field = self._value
- chunk = await field.read_chunk(size=2**16)
- while chunk:
- await writer.write(await field.decode_async(chunk))
- chunk = await field.read_chunk(size=2**16)
+ while chunk := await field.read_chunk(size=2**18):
+ async for d in field.decode_iter(chunk):
+ await writer.write(d)
class MultipartReader:
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index 9163d3c34fd..c652cdb4e97 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -738,17 +738,17 @@ async def post(self) -> "MultiDictProxy[Union[str, bytes, FileField]]":
tmp = await self._loop.run_in_executor(
None, tempfile.TemporaryFile
)
- chunk = await field.read_chunk(size=2**16)
- while chunk:
- chunk = await field.decode_async(chunk)
- await self._loop.run_in_executor(None, tmp.write, chunk)
- size += len(chunk)
- if 0 < max_size < size:
- await self._loop.run_in_executor(None, tmp.close)
- raise HTTPRequestEntityTooLarge(
- max_size=max_size, actual_size=size
+ while chunk := await field.read_chunk(size=2**18):
+ async for decoded_chunk in field.decode_iter(chunk):
+ await self._loop.run_in_executor(
+ None, tmp.write, decoded_chunk
)
- chunk = await field.read_chunk(size=2**16)
+ size += len(decoded_chunk)
+ if 0 < max_size < size:
+ await self._loop.run_in_executor(None, tmp.close)
+ raise HTTPRequestEntityTooLarge(
+ max_size=max_size, actual_size=size
+ )
await self._loop.run_in_executor(None, tmp.seek, 0)
if field_ct is None:
diff --git a/docs/multipart_reference.rst b/docs/multipart_reference.rst
index 2c13c8cfec9..e116ee879ed 100644
--- a/docs/multipart_reference.rst
+++ b/docs/multipart_reference.rst
@@ -119,15 +119,18 @@ Multipart reference
.. note::
- For large payloads, consider using :meth:`decode_async` instead
+ For large payloads, consider using :meth:`decode_iter` instead
to avoid blocking the event loop during decompression.
- .. method:: decode_async(data)
+ .. method:: decode_iter(data)
:async:
Decodes data asynchronously according the specified ``Content-Encoding``
or ``Content-Transfer-Encoding`` headers value.
+ This is an async iterator and will return decoded data in chunks. This
+ can be used to avoid loading large payloads into memory.
+
This method offloads decompression to an executor for large payloads
to avoid blocking the event loop.
diff --git a/tests/test_multipart.py b/tests/test_multipart.py
index 4dbba7abcdf..3e18bd93f97 100644
--- a/tests/test_multipart.py
+++ b/tests/test_multipart.py
@@ -402,14 +402,15 @@ async def test_decode_with_content_transfer_encoding_base64(self) -> None:
result += obj.decode(chunk)
assert b"Time to Relax!" == result
- async def test_decode_async_with_content_transfer_encoding_base64(self) -> None:
+ async def test_decode_iter_with_content_transfer_encoding_base64(self) -> None:
h = CIMultiDictProxy(CIMultiDict({CONTENT_TRANSFER_ENCODING: "base64"}))
with Stream(b"VG\r\r\nltZSB0byBSZ\r\nWxheCE=\r\n--:--") as stream:
obj = aiohttp.BodyPartReader(BOUNDARY, h, stream)
result = b""
while not obj.at_eof():
chunk = await obj.read_chunk(size=6)
- result += await obj.decode_async(chunk)
+ async for decoded_chunk in obj.decode_iter(chunk):
+ result += decoded_chunk
assert b"Time to Relax!" == result
async def test_decode_with_content_encoding_deflate(self) -> None:
From f688860d750650d106a3b9af32b11a5a34871adf Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 9 Feb 2026 12:19:45 +0000
Subject: [PATCH 043/141] Bump gunicorn from 25.0.1 to 25.0.3 (#12047)
Bumps [gunicorn](https://github.com/benoitc/gunicorn) from 25.0.1 to
25.0.3.
Release notes
Sourced from gunicorn's
releases.
25.0.3
What's Changed
Bug Fixes
- Fix RuntimeError when StopIteration raised in ASGI coroutine (#3484)
- Fix passing maxsplit in re.split() as positional argument
(deprecated in Python 3.13)
Documentation
- Updated sponsorship section and homepage
Full Changelog: https://github.com/benoitc/gunicorn/compare/25.0.2...25.0.3
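The re.split() fix above refers to Python 3.13 deprecating positional maxsplit; passing it by keyword avoids the DeprecationWarning:

```python
import re

# Deprecated since Python 3.13: re.split(",", "a,b,c,d", 2)
# Preferred: pass maxsplit by keyword.
parts = re.split(",", "a,b,c,d", maxsplit=2)
# parts == ["a", "b", "c,d"]
```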
25.0.2
What's Changed
Bug Fixes
- Fix ASGI concurrent request failures through nginx proxy
- Graceful disconnect handling for ASGI worker
- Lazy import dirty module for gevent compatibility
Other
- Increase CI timeout for signal tests on PyPy
- Remove trailing blank line in
instrument/init.py
Full Changelog: https://github.com/benoitc/gunicorn/compare/25.0.1...25.0.2
Commits
- fc85068 chore: prepare release 25.0.3
- 787602f docs: shorten README sponsor section
- a52c81f Merge pull request #3492 from benoitc/feature/improve-sponsorship
- 1afb2b7 Merge pull request #3490 from tyrossel/master
- 2ead70d docs: add Sponsor to top-level navigation
- 874e8be docs: shorten sponsor section on homepage
- 7293b21 docs: add prominent sponsorship options
- 70d571b docs: update homepage title and remove version display
- d301d53 fix: resolve RuntimeError when StopIteration raised in ASGI coroutine
- 9508df6 test: increase CI timeout for signal tests on PyPy
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index fcdf5f38898..d3b5840177e 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.1
+gunicorn==25.0.3
# via -r requirements/base-ft.in
idna==3.10
# via yarl
diff --git a/requirements/base.txt b/requirements/base.txt
index f643e6d8f6e..1e71d53943f 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.1
+gunicorn==25.0.3
# via -r requirements/base.in
idna==3.10
# via yarl
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index defc27c1995..bd33eae0851 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -83,7 +83,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.1
+gunicorn==25.0.3
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 8e303980915..89d99c7a1dc 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -81,7 +81,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.1
+gunicorn==25.0.3
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index f56b40e86e5..8df401e52f9 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.1
+gunicorn==25.0.3
# via -r requirements/base-ft.in
idna==3.10
# via
diff --git a/requirements/test.txt b/requirements/test.txt
index 8bdbf9e4816..8403798a330 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.1
+gunicorn==25.0.3
# via -r requirements/base.in
idna==3.10
# via
From 207632e030a93c3a3e5c17369ed932aaf6157d7b Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 9 Feb 2026 12:25:05 +0000
Subject: [PATCH 044/141] Bump setuptools from 80.10.2 to 82.0.0 (#12048)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [setuptools](https://github.com/pypa/setuptools) from 80.10.2 to
82.0.0.
Changelog
Sourced from setuptools's
changelog.
v82.0.0
Deprecations and Removals
pkg_resources has been removed from Setuptools. Most
common uses of pkg_resources have been superseded by the
importlib.resources
<https://docs.python.org/3/library/importlib.resources.html>_
and importlib.metadata
<https://docs.python.org/3/library/importlib.metadata.html>_
projects. Projects and environments relying on
pkg_resources for namespace packages or other behavior
should depend on older versions of setuptools. (#3085)
v81.0.0
Deprecations and Removals
- Removed support for the --dry-run parameter to setup.py. This one
feature by its nature threads through lots of core and ancillary
functionality, adding complexity and friction. Removal of this parameter
will help decouple the compiler functionality from distutils and thus
the eventual full integration of distutils. These changes do affect some
class and function signatures, so any derivative functionality may
require some compatibility shims to support their expected interface.
Please report any issues to the Setuptools project for investigation.
(#4872)
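For code still importing pkg_resources, the superseding stdlib modules named above cover the common cases. A minimal sketch of two frequent migrations (the json package and the setuptools distribution are used purely as examples):

```python
from importlib import metadata, resources

# pkg_resources.resource_string("json", "__init__.py") becomes:
source = resources.files("json").joinpath("__init__.py").read_bytes()

# pkg_resources.get_distribution("setuptools").version becomes:
try:
    version = metadata.version("setuptools")
except metadata.PackageNotFoundError:  # the distribution may not be installed
    version = None
```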
Commits
- 03f3615 Bump version: 81.0.0 → 82.0.0
- 530d114 Merge pull request #5007 from pypa/feature/remove-more-pkg_resources
- 11efe9f Merge branch 'maint/75.3'
- 118f129 Bump version: 75.3.3 → 75.3.4
- 90561ff Merge pull request #5150 from UladzimirTrehubenka/backport_cve_47273
- 4595034 Add news fragment.
- fc00800 Merge pull request #5171 from cclauss/ruff-v0.15.0
- 127e561 Remove tests reliant on pkg_resources, rather than xfailing them.
- 64bc21e Reference the superseding libraries.
- cf1ff45 Merge branch 'main' into debt/pbr-without-pkg_resources
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index bd33eae0851..f05d9e5f459 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -295,5 +295,5 @@ zlib-ng==1.0.0
# The following packages are considered to be unsafe in a requirements file:
pip==26.0.1
# via pip-tools
-setuptools==80.10.2
+setuptools==82.0.0
# via pip-tools
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 89d99c7a1dc..9e4759d6544 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -285,5 +285,5 @@ zlib-ng==1.0.0
# The following packages are considered to be unsafe in a requirements file:
pip==26.0.1
# via pip-tools
-setuptools==80.10.2
+setuptools==82.0.0
# via pip-tools
From 3254ba1a63e7c0e979dcffc97dbbbfb5998aecc8 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 10 Feb 2026 11:03:21 +0000
Subject: [PATCH 045/141] Bump pytest-codspeed from 4.2.0 to 4.3.0 (#12050)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [pytest-codspeed](https://github.com/CodSpeedHQ/pytest-codspeed)
from 4.2.0 to 4.3.0.
Release notes
Sourced from pytest-codspeed's
releases.
v4.3.0
What's Changed
This release brings support for the memory
instrument, which enables you to track memory usage, heap
allocations, and memory leaks in your benchmarks.
Full Changelog: https://github.com/CodSpeedHQ/pytest-codspeed/compare/v4.2.0...v4.3.0
Changelog
Sourced from pytest-codspeed's
changelog.
[4.3.0] - 2026-02-09
Commits
- a24abfe Release v4.3.0 🚀
- 748f2dd ci: use github runner instead of buildjet
- 66e54c8 feat: add .gitignore to .codspeed folder on creation (#107)
- 36f5930 chore(ci): enable memory profiling
- 8ce65af feat: support memory profiling
- 4df4f8c fix(ci): switch to OIDC token
- c3a194a chore: pin python to 3.14.2 in CI to prevent walltime crashes
- adee8a1 feat: rename instrumentation to simulation
- 08e0519 chore: add comment about uv pinning
- bb84077 chore: add comment to explain results storing in .codspeed folder
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index f05d9e5f459..380650cbbff 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -179,7 +179,7 @@ pytest==9.0.2
# pytest-cov
# pytest-mock
# pytest-xdist
-pytest-codspeed==4.2.0
+pytest-codspeed==4.3.0
# via
# -r requirements/lint.in
# -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 9e4759d6544..146394da15e 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -174,7 +174,7 @@ pytest==9.0.2
# pytest-cov
# pytest-mock
# pytest-xdist
-pytest-codspeed==4.2.0
+pytest-codspeed==4.3.0
# via
# -r requirements/lint.in
# -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index e8a4ee26512..c152511f997 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -82,7 +82,7 @@ pytest==9.0.2
# -r requirements/lint.in
# pytest-codspeed
# pytest-mock
-pytest-codspeed==4.2.0
+pytest-codspeed==4.3.0
# via -r requirements/lint.in
pytest-mock==3.15.1
# via -r requirements/lint.in
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 2c49452ac8f..f9d42fef9c4 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -73,7 +73,7 @@ pytest==9.0.2
# pytest-cov
# pytest-mock
# pytest-xdist
-pytest-codspeed==4.2.0
+pytest-codspeed==4.3.0
# via -r requirements/test-common.in
pytest-cov==7.0.0
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 8df401e52f9..eb853147de0 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -108,7 +108,7 @@ pytest==9.0.2
# pytest-cov
# pytest-mock
# pytest-xdist
-pytest-codspeed==4.2.0
+pytest-codspeed==4.3.0
# via -r requirements/test-common.in
pytest-cov==7.0.0
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index 8403798a330..2a0fedf8de0 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -108,7 +108,7 @@ pytest==9.0.2
# pytest-cov
# pytest-mock
# pytest-xdist
-pytest-codspeed==4.2.0
+pytest-codspeed==4.3.0
# via -r requirements/test-common.in
pytest-cov==7.0.0
# via -r requirements/test-common.in
From 5627d8325fc23444503edc601cea932f9889aeef Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 10 Feb 2026 11:10:04 +0000
Subject: [PATCH 046/141] Bump coverage from 7.13.3 to 7.13.4 (#12051)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [coverage](https://github.com/coveragepy/coveragepy) from 7.13.3
to 7.13.4.
Changelog
Sourced from coverage's
changelog.
Version 7.13.4 — 2026-02-09
- Fix: the third-party code fix in 7.13.3 required examining the parent
  directories where coverage was run. In the unusual situation that one of the
  parent directories is unreadable, a PermissionError would occur, as
  described in issue 2129_. This is now fixed.
- Fix: in test suites that change sys.path, coverage.py could fail with
  "RuntimeError: Set changed size during iteration" as described
  and fixed in pull 2130_. Thanks, Noah Fatsi.
- We now publish ppc64le wheels, thanks to Pankhudi Jain <pull 2121_>_.
.. _pull 2121: coveragepy/coveragepy#2121
.. _issue 2129: coveragepy/coveragepy#2129
.. _pull 2130: coveragepy/coveragepy#2130
.. _changes_7-13-3:
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
5 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 380650cbbff..0b23a736a90 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -55,7 +55,7 @@ click==8.3.1
# slotscheck
# towncrier
# wait-for-it
-coverage==7.13.3
+coverage==7.13.4
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 146394da15e..146ce953ee5 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -55,7 +55,7 @@ click==8.3.1
# slotscheck
# towncrier
# wait-for-it
-coverage==7.13.3
+coverage==7.13.4
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index f9d42fef9c4..e251e0a1548 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -14,7 +14,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.3
+coverage==7.13.4
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index eb853147de0..40fa3a39b87 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -29,7 +29,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.3
+coverage==7.13.4
# via
# -r requirements/test-common.in
# pytest-cov
diff --git a/requirements/test.txt b/requirements/test.txt
index 2a0fedf8de0..19d4cbd2f83 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -29,7 +29,7 @@ cffi==2.0.0
# pytest-codspeed
click==8.3.1
# via wait-for-it
-coverage==7.13.3
+coverage==7.13.4
# via
# -r requirements/test-common.in
# pytest-cov
From 543d8171d741dfc37ca5428c4bbc1ed5496dfd71 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 12 Feb 2026 11:04:57 +0000
Subject: [PATCH 047/141] Bump pip-tools from 7.5.2 to 7.5.3 (#12060)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [pip-tools](https://github.com/jazzband/pip-tools) from 7.5.2 to 7.5.3.

Release notes (sourced from pip-tools's releases):

v7.5.3 (2026-02-11)

Bug fixes:
- The option `--unsafe-package` is now normalized -- by @shifqu. PRs and issues: #2150
- Fixed a bug in which pip-compile lost any index URL options when looking up hashes -- by @sirosen. This caused errors when a package was only available from an extra index, and caused pip-compile to incorrectly drop index URL options from output, even when they were present in the input requirements. PRs and issues: #2220, #2294, #2305
- Fixed removal of temporary files used when reading requirements from stdin -- by @sirosen.

Features:
- pip-tools is now tested against Python 3.14 and 3.14t in CI, and marks them as supported in the core packaging metadata -- by @webknjaz. PRs and issues: #2255
- pip-tools is now compatible with pip 26.0 -- by @sirosen. PRs and issues: #2319, #2320

Removals and backward incompatible breaking changes:
- Removed support for Python 3.8 -- by @sirosen.

Improved documentation:
- The change log management infra now allows the maintainers to add notes before and after the regular categories -- by @webknjaz. PRs and issues: #2287, #2322
- Added documentation clarifying that pip-compile reads the existing output file as a constraint source, and how to use `--upgrade` to refresh dependencies -- by @maliktafheem. PRs and issues: #2307
... (truncated)
Commits:
- 5f31d8a Merge pull request #2332 from sirosen/fix-release-version-normalization
- 106f1d6 Fix CI workflow to normalize versions (for release)
- 3a0f5ed Merge pull request #2329 from sirosen/release/v7.5.3
- e4bd31d Merge pull request #2328 from jazzband/pre-commit-ci-update-config
- 08107ab Update changelog for version 7.5.3
- 5b4d130 Merge pull request #2325 from sirosen/ensure-tmpfile-cleanup
- cc6a2b9 Apply feedback/suggestions from review
- fc53265 [pre-commit.ci] pre-commit autoupdate
- 6c27507 Add 'tempfile_compat' to handle windows tmp files
- 9ac94db Fix leak of temp files when reading from stdin
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 0b23a736a90..35c13319169 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -134,7 +134,7 @@ packaging==25.0
# wheel
pathspec==1.0.3
# via mypy
-pip-tools==7.5.2
+pip-tools==7.5.3
# via -r requirements/dev.in
pkgconfig==1.5.5
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 146ce953ee5..967e891d64e 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -131,7 +131,7 @@ packaging==25.0
# wheel
pathspec==1.0.3
# via mypy
-pip-tools==7.5.2
+pip-tools==7.5.3
# via -r requirements/dev.in
pkgconfig==1.5.5
# via -r requirements/test-common.in
From a4b6a03881da6e6c3aa964fec044d2799c6da134 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 12 Feb 2026 11:32:03 +0000
Subject: [PATCH 048/141] Bump packaging from 25.0 to 26.0 (#11980)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [packaging](https://github.com/pypa/packaging) from 25.0 to 26.0.
Release notes (sourced from packaging's releases):
26.0
Read about the performance improvements here: https://iscinumpy.dev/post/packaging-faster.
What's Changed
Features:
Behavior adaptations:
Fixes:
Performance:
... (truncated)
Changelog (sourced from packaging's changelog):

26.0 - 2026-01-20

Features:
- PEP 751: support pylock (pull 900)
- PEP 794: import name metadata (pull 948)
- Support for writing metadata to a file (pull 846)
- Support `__replace__` on `Version` (pull 1003)
- Support positional pattern matching for `Version` and `SpecifierSet` (pull 1004)

Behavior adaptations:
- PEP 440 handling of prereleases for `Specifier.contains`, `SpecifierSet.contains`, and `SpecifierSet.filter` (pull 897)
- Handle PEP 440 edge case in `SpecifierSet.filter` (pull 942)
- Adjust arbitrary equality intersection preservation in `SpecifierSet` (pull 951)
- Return `False` instead of raising for `.contains` with an invalid version (pull 932)
- Support arbitrary equality on arbitrary strings for `Specifier` and `SpecifierSet`'s `filter` and `contains` methods (pull 954)
- Only try to parse as `Version` on certain marker keys; return `False` on unequal ordered comparisons (pull 939)

Fixes:
- Update `_hash` when unpickling `Tag()` (pull 860)
- Correct comment and simplify implicit prerelease handling in `Specifier.prereleases` (pull 896)
- Use explicit `_GLibCVersion` NamedTuple in `_manylinux` (pull 868)
- Detect invalid license expressions containing `()` (pull 879)
- Correct regex for metadata 'name' format (pull 925)
- Improve the message around expecting a semicolon (pull 833)
- Support nested parens in license expressions (pull 931)
- Add space before at symbol in `Requirement` string (pull 953)
- Replace a root logger use with a `packaging` logger (pull 965)
- Better support for subclassing `Marker` and `Requirement` (pull 1022)
- Normalize all extras, not just the first (pull 1024)
- Don't produce a broken repr if `Marker` fails to construct (pull 1033)

Performance:
- Avoid recompiling regexes in the tokenizer for a 3x speedup (pull 1019)
- Improve performance in `_manylinux.py` (pull 869)
- Minor cleanups to `Version` (pull 913)
- Skip redundant creation of `Version`s in specifier comparison (pull 986)
- Cache the `Specifier`'s `Version` (pull 985)
- Make `Version` a little faster (pull 987)
- Minor `Version` regex cleanup (pull 990)
- Faster regex on Python 3.11.5+ for `Version` (pull 988, pull 1055)
- Lazily calculate `_key` in `Version` (pull 989, pull 1048)
- Faster `canonicalize_version` (pull 993)
- Use `re.fullmatch` in a couple more places (pull 992, pull 1029)
- Use `map` instead of generator (pull 996)
- Deprecate `._version` (`_Version`, a NamedTuple) (pull 995, pull 1062)
... (truncated)
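Several of the behavior adaptations above concern prerelease handling in `Specifier.contains` and `SpecifierSet.contains`. A minimal sketch of that API, assuming the `packaging` library is installed:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">=1.0")

# Final releases match as usual.
assert Version("1.2.3") in spec

# Prereleases are excluded by default under PEP 440...
assert not spec.contains("2.0.0a1")

# ...but can be opted into explicitly.
assert spec.contains("2.0.0a1", prereleases=True)
```

`contains` also accepts plain version strings, which it parses before matching.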
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
10 files changed, 10 insertions(+), 10 deletions(-)
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index d3b5840177e..3f994357afe 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -32,7 +32,7 @@ multidict==6.7.0
# via
# -r requirements/runtime-deps.in
# yarl
-packaging==25.0
+packaging==26.0
# via gunicorn
propcache==0.4.1
# via
diff --git a/requirements/base.txt b/requirements/base.txt
index 1e71d53943f..4ee2e7d4394 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -32,7 +32,7 @@ multidict==6.7.0
# via
# -r requirements/runtime-deps.in
# yarl
-packaging==25.0
+packaging==26.0
# via gunicorn
propcache==0.4.1
# via
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 35c13319169..c78980f1c32 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -125,7 +125,7 @@ mypy-extensions==1.1.0
# via mypy
nodeenv==1.10.0
# via pre-commit
-packaging==25.0
+packaging==26.0
# via
# build
# gunicorn
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 967e891d64e..548113434bc 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -122,7 +122,7 @@ mypy-extensions==1.1.0
# via mypy
nodeenv==1.10.0
# via pre-commit
-packaging==25.0
+packaging==26.0
# via
# build
# gunicorn
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index 03af40c6bee..fa87bdec99e 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -28,7 +28,7 @@ jinja2==3.1.6
# towncrier
markupsafe==3.0.3
# via jinja2
-packaging==25.0
+packaging==26.0
# via sphinx
pyenchant==3.3.0
# via sphinxcontrib-spelling
diff --git a/requirements/doc.txt b/requirements/doc.txt
index b9f1465e865..973e97b2719 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -28,7 +28,7 @@ jinja2==3.1.6
# towncrier
markupsafe==3.0.3
# via jinja2
-packaging==25.0
+packaging==26.0
# via sphinx
pygments==2.19.2
# via sphinx
diff --git a/requirements/lint.txt b/requirements/lint.txt
index c152511f997..3cebdc5d62a 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.1.0
# via mypy
nodeenv==1.10.0
# via pre-commit
-packaging==25.0
+packaging==26.0
# via pytest
pathspec==1.0.3
# via mypy
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index e251e0a1548..f21c9c1b341 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -44,7 +44,7 @@ mypy==1.19.1 ; implementation_name == "cpython"
# via -r requirements/test-common.in
mypy-extensions==1.1.0
# via mypy
-packaging==25.0
+packaging==26.0
# via pytest
pathspec==1.0.3
# via mypy
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 40fa3a39b87..97f325009c2 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -71,7 +71,7 @@ mypy==1.19.1 ; implementation_name == "cpython"
# via -r requirements/test-common.in
mypy-extensions==1.1.0
# via mypy
-packaging==25.0
+packaging==26.0
# via
# gunicorn
# pytest
diff --git a/requirements/test.txt b/requirements/test.txt
index 19d4cbd2f83..2dde29d7e6e 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -71,7 +71,7 @@ mypy==1.19.1 ; implementation_name == "cpython"
# via -r requirements/test-common.in
mypy-extensions==1.1.0
# via mypy
-packaging==25.0
+packaging==26.0
# via
# gunicorn
# pytest
From 7dcc7ba3376a8a252a5dbfbf91d324601be6c9e7 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 12 Feb 2026 11:59:32 +0000
Subject: [PATCH 049/141] Bump pathspec from 1.0.3 to 1.0.4 (#12006)
Bumps [pathspec](https://github.com/cpburnz/python-pathspec) from 1.0.3 to 1.0.4.

Release notes (sourced from pathspec's releases):

v1.0.4: Release v1.0.4. See CHANGES.rst.
Changelog (sourced from pathspec's changelog):

1.0.4 (2026-01-26)
- Issue [#103](https://github.com/cpburnz/python-pathspec/issues/103): Using re2 fails if pyre2 is also installed.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index c78980f1c32..9f8b6fc8c85 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -132,7 +132,7 @@ packaging==26.0
# pytest
# sphinx
# wheel
-pathspec==1.0.3
+pathspec==1.0.4
# via mypy
pip-tools==7.5.3
# via -r requirements/dev.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 548113434bc..eb1dc773794 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -129,7 +129,7 @@ packaging==26.0
# pytest
# sphinx
# wheel
-pathspec==1.0.3
+pathspec==1.0.4
# via mypy
pip-tools==7.5.3
# via -r requirements/dev.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 3cebdc5d62a..b7be9db6e16 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -57,7 +57,7 @@ nodeenv==1.10.0
# via pre-commit
packaging==26.0
# via pytest
-pathspec==1.0.3
+pathspec==1.0.4
# via mypy
platformdirs==4.5.1
# via virtualenv
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index f21c9c1b341..5091e21519e 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -46,7 +46,7 @@ mypy-extensions==1.1.0
# via mypy
packaging==26.0
# via pytest
-pathspec==1.0.3
+pathspec==1.0.4
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 97f325009c2..b1fe76b012e 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -75,7 +75,7 @@ packaging==26.0
# via
# gunicorn
# pytest
-pathspec==1.0.3
+pathspec==1.0.4
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index 2dde29d7e6e..c1afdbc2d97 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -75,7 +75,7 @@ packaging==26.0
# via
# gunicorn
# pytest
-pathspec==1.0.3
+pathspec==1.0.4
# via mypy
pkgconfig==1.5.5
# via -r requirements/test-common.in
From 10fbddaeb70ddc401892b2c77ec06b8154e307ec Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 13 Feb 2026 11:08:23 +0000
Subject: [PATCH 050/141] Bump filelock from 3.20.3 to 3.21.2 (#12065)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [filelock](https://github.com/tox-dev/py-filelock) from 3.20.3 to 3.21.2.

Release notes (sourced from filelock's releases):

3.21.2 (full changelog: https://github.com/tox-dev/filelock/compare/3.21.1...3.21.2)
3.21.1 (full changelog: https://github.com/tox-dev/filelock/compare/3.21.0...3.21.1)
3.21.0 (full changelog: https://github.com/tox-dev/filelock/compare/3.20.3...3.21.0)

Changelog (sourced from filelock's changelog):
3.21.2 (2026-02-13)
- 🐛 fix: catch ImportError for missing sqlite3 C library (pr 475)

3.21.1 (2026-02-12)
- 🐛 fix: gracefully handle missing sqlite3 when importing ReadWriteLock (pr 473) - by @bayandin
- 🐛 fix(ci): make release workflow robust

3.21.0 (2026-02-12)
- 🐛 fix(ci): make release workflow robust
- 👷 ci(release): commit changelog and use release config (pr 472)
- 👷 ci(release): consolidate to two jobs (pr 471)
- ✨ feat(unix): delete lock file on release (pr 408) - by @sbc100
- ✨ feat(lock): add SQLite-based ReadWriteLock (pr 399) - by @leventov
- 🔧 chore: modernize tooling and bump deps (pr 470)
Commits:
- 9678acc Release 3.21.2
- 4194182 🐛 fix: catch ImportError for missing sqlite3 C library (#475)
- 9b2d086 Release 3.21.1
- eff3991 🐛 fix: gracefully handle missing sqlite3 when importing ReadWriteLock (#473)
- 3692417 🐛 fix(ci): make release workflow robust
- 053168f Release 3.21.0
- c2f5754 🐛 fix(ci): make release workflow robust
- 0dc277c 👷 ci(release): commit changelog and use release config (#472)
- be8fe1a 👷 ci(release): consolidate to two jobs (#471)
- 03b0ab7 ✨ feat(unix): delete lock file on release (#408)
- Additional commits viewable in compare view
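The 3.21.0 entry above changes Unix behavior so the lock file is deleted on release. Basic `FileLock` usage looks like the sketch below (the lock path is illustrative, not part of this project):

```python
import os
import tempfile

from filelock import FileLock

# Hypothetical lock path for demonstration purposes.
lock_path = os.path.join(tempfile.mkdtemp(), "demo.lock")
lock = FileLock(lock_path)

with lock:  # blocks until the lock is acquired
    assert lock.is_locked
assert not lock.is_locked  # released on context exit
```

On filelock 3.21+ on Unix, releasing the lock should also remove the lock file itself.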
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 9f8b6fc8c85..6265f0c041c 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -71,7 +71,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.20.3
+filelock==3.21.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/dev.txt b/requirements/dev.txt
index eb1dc773794..789238b8b24 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -69,7 +69,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.20.3
+filelock==3.21.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/lint.txt b/requirements/lint.txt
index b7be9db6e16..dd8abff2fbd 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -29,7 +29,7 @@ distlib==0.4.0
# via virtualenv
exceptiongroup==1.3.1
# via pytest
-filelock==3.20.3
+filelock==3.21.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
From 47d045d7f7001f9bfa3847ee6136346595520188 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Vojt=C4=9Bch=20Bo=C4=8Dek?=
Date: Fri, 13 Feb 2026 19:22:34 +0100
Subject: [PATCH 051/141] fix: do not use `get_event_loop` in GunicornWebWorker
- 3.14 (#12058)
---
CHANGES/11701.bugfix.rst | 3 +++
aiohttp/worker.py | 13 ++++++++++---
tests/test_worker.py | 19 ++++++++++++++++++-
3 files changed, 31 insertions(+), 4 deletions(-)
create mode 100644 CHANGES/11701.bugfix.rst
diff --git a/CHANGES/11701.bugfix.rst b/CHANGES/11701.bugfix.rst
new file mode 100644
index 00000000000..adf1e529d57
--- /dev/null
+++ b/CHANGES/11701.bugfix.rst
@@ -0,0 +1,3 @@
+Fixed ``RuntimeError: An event loop is running`` error when using ``aiohttp.GunicornWebWorker``
+or ``aiohttp.GunicornUVLoopWebWorker`` on Python >=3.14.
+-- by :user:`Tasssadar`.
diff --git a/aiohttp/worker.py b/aiohttp/worker.py
index 2c0e59530a2..090a7b3b13d 100644
--- a/aiohttp/worker.py
+++ b/aiohttp/worker.py
@@ -36,7 +36,6 @@
class GunicornWebWorker(base.Worker): # type: ignore[misc,no-any-unimported]
-
DEFAULT_AIOHTTP_LOG_FORMAT = AccessLogger.LOG_FORMAT
DEFAULT_GUNICORN_LOG_FORMAT = GunicornAccessLogFormat.default
@@ -49,7 +48,11 @@ def __init__(self, *args: Any, **kw: Any) -> None: # pragma: no cover
def init_process(self) -> None:
# create new event_loop after fork
- asyncio.get_event_loop().close()
+ try:
+ asyncio.get_event_loop().close()
+ except RuntimeError:
+ # No loop was running
+ pass
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
@@ -245,7 +248,11 @@ def init_process(self) -> None:
# Close any existing event loop before setting a
# new policy.
- asyncio.get_event_loop().close()
+ try:
+ asyncio.get_event_loop().close()
+ except RuntimeError:
+ # No loop was running
+ pass
# Setup uvloop policy, so that every
# asyncio.get_event_loop() will create an instance
diff --git a/tests/test_worker.py b/tests/test_worker.py
index 94eb8967bd4..9be9e41c20c 100644
--- a/tests/test_worker.py
+++ b/tests/test_worker.py
@@ -46,7 +46,8 @@ class AsyncioWorker(BaseTestWorker, base_worker.GunicornWebWorker): # type: ign
if uvloop is not None:
class UvloopWorker(
- BaseTestWorker, base_worker.GunicornUVLoopWebWorker # type: ignore
+ BaseTestWorker,
+ base_worker.GunicornUVLoopWebWorker, # type: ignore
):
pass
@@ -75,6 +76,22 @@ def test_init_process(worker: base_worker.GunicornWebWorker) -> None:
assert m_asyncio.set_event_loop.called
+def test_init_process_no_loop(worker: base_worker.GunicornWebWorker) -> None:
+ with mock.patch("aiohttp.worker.asyncio") as m_asyncio:
+ m_asyncio.get_event_loop.side_effect = RuntimeError(
+ "There is no current event loop in thread 'MainThread'"
+ )
+ try:
+ worker.init_process()
+ except TypeError:
+ pass
+
+ assert m_asyncio.get_event_loop.called
+ assert not m_asyncio.get_event_loop.return_value.close.called
+ assert m_asyncio.new_event_loop.called
+ assert m_asyncio.set_event_loop.called
+
+
def test_run(
worker: base_worker.GunicornWebWorker, loop: asyncio.AbstractEventLoop
) -> None:
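The hunks above wrap `asyncio.get_event_loop().close()` in a `try`/`except RuntimeError` because on Python 3.14+ `get_event_loop()` raises when no current loop exists instead of creating one. A standalone sketch of the same pattern (the function name is illustrative, not aiohttp API):

```python
import asyncio


def reset_event_loop() -> asyncio.AbstractEventLoop:
    """Close any inherited event loop and install a fresh one."""
    try:
        # On Python <3.14 this returns (or creates) the current loop;
        # on 3.14+ it raises RuntimeError when no loop is set.
        asyncio.get_event_loop().close()
    except RuntimeError:
        pass  # no loop was set for this thread
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    return loop
```

In the worker this runs right after fork, so the child never serves requests on a loop inherited from the parent process.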
From d99ba3b68f03ab098bbe7bbfc93f2fee24893737 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Vojt=C4=9Bch=20Bo=C4=8Dek?=
Date: Fri, 13 Feb 2026 19:22:45 +0100
Subject: [PATCH 052/141] fix: do not use `get_event_loop` in GunicornWebWorker
- 3.13 (#12057)
---
CHANGES/11701.bugfix.rst | 3 +++
aiohttp/worker.py | 13 ++++++++++---
tests/test_worker.py | 19 ++++++++++++++++++-
3 files changed, 31 insertions(+), 4 deletions(-)
create mode 100644 CHANGES/11701.bugfix.rst
diff --git a/CHANGES/11701.bugfix.rst b/CHANGES/11701.bugfix.rst
new file mode 100644
index 00000000000..adf1e529d57
--- /dev/null
+++ b/CHANGES/11701.bugfix.rst
@@ -0,0 +1,3 @@
+Fixed ``RuntimeError: An event loop is running`` error when using ``aiohttp.GunicornWebWorker``
+or ``aiohttp.GunicornUVLoopWebWorker`` on Python >=3.14.
+-- by :user:`Tasssadar`.
diff --git a/aiohttp/worker.py b/aiohttp/worker.py
index f7281bfde75..00ec79ce131 100644
--- a/aiohttp/worker.py
+++ b/aiohttp/worker.py
@@ -36,7 +36,6 @@
class GunicornWebWorker(base.Worker): # type: ignore[misc,no-any-unimported]
-
DEFAULT_AIOHTTP_LOG_FORMAT = AccessLogger.LOG_FORMAT
DEFAULT_GUNICORN_LOG_FORMAT = GunicornAccessLogFormat.default
@@ -49,7 +48,11 @@ def __init__(self, *args: Any, **kw: Any) -> None: # pragma: no cover
def init_process(self) -> None:
# create new event_loop after fork
- asyncio.get_event_loop().close()
+ try:
+ asyncio.get_event_loop().close()
+ except RuntimeError:
+ # No loop was running
+ pass
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
@@ -245,7 +248,11 @@ def init_process(self) -> None:
# Close any existing event loop before setting a
# new policy.
- asyncio.get_event_loop().close()
+ try:
+ asyncio.get_event_loop().close()
+ except RuntimeError:
+ # No loop was running
+ pass
# Setup uvloop policy, so that every
# asyncio.get_event_loop() will create an instance
diff --git a/tests/test_worker.py b/tests/test_worker.py
index 60d1e8b088b..9d4baab8025 100644
--- a/tests/test_worker.py
+++ b/tests/test_worker.py
@@ -46,7 +46,8 @@ class AsyncioWorker(BaseTestWorker, base_worker.GunicornWebWorker): # type: ign
if uvloop is not None:
class UvloopWorker(
- BaseTestWorker, base_worker.GunicornUVLoopWebWorker # type: ignore
+ BaseTestWorker,
+ base_worker.GunicornUVLoopWebWorker, # type: ignore
):
pass
@@ -75,6 +76,22 @@ def test_init_process(worker: base_worker.GunicornWebWorker) -> None:
assert m_asyncio.set_event_loop.called
+def test_init_process_no_loop(worker: base_worker.GunicornWebWorker) -> None:
+ with mock.patch("aiohttp.worker.asyncio") as m_asyncio:
+ m_asyncio.get_event_loop.side_effect = RuntimeError(
+ "There is no current event loop in thread 'MainThread'"
+ )
+ try:
+ worker.init_process()
+ except TypeError:
+ pass
+
+ assert m_asyncio.get_event_loop.called
+ assert not m_asyncio.get_event_loop.return_value.close.called
+ assert m_asyncio.new_event_loop.called
+ assert m_asyncio.set_event_loop.called
+
+
def test_run(
worker: base_worker.GunicornWebWorker, loop: asyncio.AbstractEventLoop
) -> None:
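The worker fix above follows a pattern that is useful anywhere a process must discard an inherited event loop before creating its own: starting with Python 3.12, `asyncio.get_event_loop()` is deprecated when no loop is set, and on Python 3.14 it raises `RuntimeError`, so the close has to be guarded. A minimal standalone sketch of the same pattern (the helper name is illustrative, not from aiohttp):

```python
import asyncio


def fresh_event_loop() -> asyncio.AbstractEventLoop:
    """Close any previously-set event loop, then install a new one.

    Mirrors the guarded-close pattern from the worker fix: on Python >= 3.14,
    asyncio.get_event_loop() raises RuntimeError when no loop is set, so the
    close must be wrapped in try/except.
    """
    try:
        asyncio.get_event_loop().close()
    except RuntimeError:
        # No loop was set in this thread -- nothing to close.
        pass
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    return loop
```

Calling this twice closes the first loop and installs a second one, which is exactly what `init_process` needs after a fork.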
From dc89aecd330e9cdf3004572eae13bc0c4d3d9a4b Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Mon, 16 Feb 2026 01:50:36 +0000
Subject: [PATCH 053/141] Stabilize test suite: fix flaky timing tests, and handle Windows reso… (#12075)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
…urce leaks (#11992)
(cherry picked from commit 32924e8fa4c9085dcbaf41b3b3c27fc773bfa213)
Co-authored-by: Rodrigo Nogueira
---
CHANGES/11992.contrib.rst | 1 +
CONTRIBUTORS.txt | 2 ++
tests/conftest.py | 12 ++++++++
tests/test_client_middleware_digest_auth.py | 14 +++++++---
tests/test_cookie_helpers.py | 11 ++++++--
tests/test_imports.py | 31 ++++-----------------
tests/test_proxy_functional.py | 19 ++++++++++---
tests/test_web_request.py | 10 +++++--
8 files changed, 61 insertions(+), 39 deletions(-)
create mode 100644 CHANGES/11992.contrib.rst
diff --git a/CHANGES/11992.contrib.rst b/CHANGES/11992.contrib.rst
new file mode 100644
index 00000000000..c56c2ab7059
--- /dev/null
+++ b/CHANGES/11992.contrib.rst
@@ -0,0 +1 @@
+Fixed flaky performance tests by using appropriate fixed thresholds that account for CI variability -- by :user:`rodrigobnogueira`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index 5664ed517b6..79f3c8e3bfd 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -310,6 +310,8 @@ Raúl Cumplido
Required Field
Robert Lu
Robert Nikolich
+Rodrigo Nogueira
+Roman Markeloff
Roman Podoliaka
Roman Postnov
Rong Zhang
diff --git a/tests/conftest.py b/tests/conftest.py
index d3c4f237fb1..661f539a632 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -41,6 +41,18 @@
TRUSTME = False
+def pytest_configure(config: pytest.Config) -> None:
+ # On Windows with Python 3.10/3.11, proxy.py's threaded mode can leave
+ # sockets not fully released by the time pytest's unraisableexception
+ # plugin collects warnings during teardown. Suppress these warnings
+ # since they are not actionable and only affect older Python versions.
+ if os.name == "nt" and sys.version_info < (3, 12):
+ config.addinivalue_line(
+ "filterwarnings",
+ "ignore:Exception ignored in.*socket.*:pytest.PytestUnraisableExceptionWarning",
+ )
+
+
try:
if sys.platform == "win32":
import winloop as uvloop
diff --git a/tests/test_client_middleware_digest_auth.py b/tests/test_client_middleware_digest_auth.py
index c15bf1b422e..52e6f97050a 100644
--- a/tests/test_client_middleware_digest_auth.py
+++ b/tests/test_client_middleware_digest_auth.py
@@ -1332,12 +1332,18 @@ async def handler(request: Request) -> Response:
def test_regex_performance() -> None:
+ """Test that the regex pattern doesn't suffer from ReDoS issues."""
+ REGEX_TIME_THRESHOLD_SECONDS = 0.08
value = "0" * 54773 + "\\0=a"
+
start = time.perf_counter()
matches = _HEADER_PAIRS_PATTERN.findall(value)
- end = time.perf_counter()
+ elapsed = time.perf_counter() - start
- # If this is taking more than 10ms, there's probably a performance/ReDoS issue.
- assert (end - start) < 0.01
- # This example probably shouldn't produce a match either.
+ # If this is taking more time, there's probably a performance/ReDoS issue.
+ assert elapsed < REGEX_TIME_THRESHOLD_SECONDS, (
+ f"Regex took {elapsed * 1000:.1f}ms, "
+ f"expected <{REGEX_TIME_THRESHOLD_SECONDS * 1000:.0f}ms - potential ReDoS issue"
+ )
+ # This example shouldn't produce a match either.
assert not matches
diff --git a/tests/test_cookie_helpers.py b/tests/test_cookie_helpers.py
index 38a44972c09..767e1eaa34a 100644
--- a/tests/test_cookie_helpers.py
+++ b/tests/test_cookie_helpers.py
@@ -638,13 +638,18 @@ def test_cookie_pattern_matches_partitioned_attribute(test_string: str) -> None:
def test_cookie_pattern_performance() -> None:
+ """Test that the cookie pattern doesn't suffer from ReDoS issues."""
+ COOKIE_PATTERN_TIME_THRESHOLD_SECONDS = 0.08
value = "a" + "=" * 21651 + "\x00"
start = time.perf_counter()
match = helpers._COOKIE_PATTERN.match(value)
- end = time.perf_counter()
+ elapsed = time.perf_counter() - start
- # If this is taking more than 10ms, there's probably a performance/ReDoS issue.
- assert (end - start) < 0.01
+ # If this is taking more time, there's probably a performance/ReDoS issue.
+ assert elapsed < COOKIE_PATTERN_TIME_THRESHOLD_SECONDS, (
+ f"Pattern took {elapsed * 1000:.1f}ms, "
+ f"expected <{COOKIE_PATTERN_TIME_THRESHOLD_SECONDS * 1000:.0f}ms - potential ReDoS issue"
+ )
# This example shouldn't produce a match either.
assert match is None
diff --git a/tests/test_imports.py b/tests/test_imports.py
index b3f545ad900..340698531de 100644
--- a/tests/test_imports.py
+++ b/tests/test_imports.py
@@ -28,22 +28,6 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
result.assert_outcomes(passed=0, errors=0)
-_IS_CI_ENV = os.getenv("CI") == "true"
-_XDIST_WORKER_COUNT = int(os.getenv("PYTEST_XDIST_WORKER_COUNT", 0))
-_IS_XDIST_RUN = _XDIST_WORKER_COUNT > 1
-
-_TARGET_TIMINGS_BY_PYTHON_VERSION = {
- "3.12": (
- # 3.12+ is expected to be a bit slower due to performance trade-offs,
- # and even slower under pytest-xdist, especially in CI
- _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
- if _IS_XDIST_RUN
- else 295
- ),
-}
-_TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
-
-
@pytest.mark.internal
@pytest.mark.dev_mode
@pytest.mark.skipif(
@@ -57,7 +41,10 @@ def test_import_time(pytester: pytest.Pytester) -> None:
Obviously, the time may vary on different machines and may need to be adjusted
from time to time, but this should provide an early warning if something is
added that significantly increases import time.
+
+ Runs 3 times and keeps the minimum time to reduce flakiness.
"""
+ IMPORT_TIME_THRESHOLD_MS = 300 if sys.version_info >= (3, 12) else 200
root = Path(__file__).parent.parent
old_path = os.environ.get("PYTHONPATH")
os.environ["PYTHONPATH"] = os.pathsep.join([str(root)] + sys.path)
@@ -67,18 +54,12 @@ def test_import_time(pytester: pytest.Pytester) -> None:
try:
for _ in range(3):
r = pytester.run(sys.executable, "-We", "-c", cmd)
-
- assert not r.stderr.str()
- runtime_ms = int(r.stdout.str())
- if runtime_ms < best_time_ms:
- best_time_ms = runtime_ms
+ assert not r.stderr.str(), r.stderr.str()
+ best_time_ms = min(best_time_ms, int(r.stdout.str()))
finally:
if old_path is None:
os.environ.pop("PYTHONPATH")
else:
os.environ["PYTHONPATH"] = old_path
- expected_time = _TARGET_TIMINGS_BY_PYTHON_VERSION.get(
- f"{sys.version_info.major}.{sys.version_info.minor}", 200
- )
- assert best_time_ms < expected_time
+ assert best_time_ms < IMPORT_TIME_THRESHOLD_MS
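The rewritten `test_import_time` above replaces the environment-dependent timing table with a fixed cap plus a best-of-three measurement. Taking the minimum of several runs is what makes a fixed threshold practical, since it filters out one-off slowness from cold caches or CI noise. A minimal sketch of that measurement strategy, independent of pytester (the helper name is illustrative, not from aiohttp):

```python
import subprocess
import sys


def best_import_time_ms(module: str, runs: int = 3) -> float:
    """Return the minimum wall-clock import time (ms) over several runs.

    Each run uses a fresh interpreter so the import is genuinely cold;
    the minimum over runs discards transient slowness.
    """
    cmd = (
        "import timeit; print(timeit.timeit("
        f"'import {module}', number=1) * 1000)"
    )
    best = float("inf")
    for _ in range(runs):
        out = subprocess.run(
            [sys.executable, "-c", cmd],
            capture_output=True,
            text=True,
            check=True,
        )
        best = min(best, float(out.stdout))
    return best
```

A test can then assert `best_import_time_ms("aiohttp") < THRESHOLD` against one fixed number instead of a per-environment table.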
diff --git a/tests/test_proxy_functional.py b/tests/test_proxy_functional.py
index f1ae83ee7c6..bf6ecc9a933 100644
--- a/tests/test_proxy_functional.py
+++ b/tests/test_proxy_functional.py
@@ -19,6 +19,7 @@
from aiohttp.client_exceptions import ClientConnectionError
from aiohttp.helpers import IS_MACOS, IS_WINDOWS
from aiohttp.pytest_plugin import AiohttpServer
+from aiohttp.test_utils import TestServer
ASYNCIO_SUPPORTS_TLS_IN_TLS = sys.version_info >= (3, 11)
@@ -216,24 +217,34 @@ async def test_https_proxy_unsupported_tls_in_tls(
# otherwise this test will fail because the proxy will die with an error.
async def test_uvloop_secure_https_proxy(
client_ssl_ctx: ssl.SSLContext,
+ ssl_ctx: ssl.SSLContext,
secure_proxy_url: URL,
uvloop_loop: asyncio.AbstractEventLoop,
) -> None:
"""Ensure HTTPS sites are accessible through a secure proxy without warning when using uvloop."""
+ payload = str(uuid4())
+
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(text=payload)
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ server = TestServer(app, host="127.0.0.1")
+ await server.start_server(ssl=ssl_ctx)
+
+ url = URL.build(scheme="https", host=server.host, port=server.port)
conn = aiohttp.TCPConnector(force_close=True)
sess = aiohttp.ClientSession(connector=conn)
try:
- url = URL("https://example.com")
-
async with sess.get(
url, proxy=secure_proxy_url, ssl=client_ssl_ctx
) as response:
assert response.status == 200
- # Ensure response body is read to completion
- await response.read()
+ assert await response.text() == payload
finally:
await sess.close()
await conn.close()
+ await server.close()
await asyncio.sleep(0)
await asyncio.sleep(0.1)
diff --git a/tests/test_web_request.py b/tests/test_web_request.py
index 0e40b0dfea6..3a6e17a0a74 100644
--- a/tests/test_web_request.py
+++ b/tests/test_web_request.py
@@ -641,13 +641,17 @@ def test_single_forwarded_header() -> None:
def test_forwarded_re_performance() -> None:
+ FORWARDED_RE_TIME_THRESHOLD_SECONDS = 0.08
value = "{" + "f" * 54773 + "z\x00a=v"
start = time.perf_counter()
match = _FORWARDED_PAIR_RE.match(value)
- end = time.perf_counter()
+ elapsed = time.perf_counter() - start
- # If this is taking more than 10ms, there's probably a performance/ReDoS issue.
- assert (end - start) < 0.01
+ # If this is taking more time, there's probably a performance/ReDoS issue.
+ assert elapsed < FORWARDED_RE_TIME_THRESHOLD_SECONDS, (
+ f"Regex took {elapsed * 1000:.1f}ms, "
+ f"expected <{FORWARDED_RE_TIME_THRESHOLD_SECONDS * 1000:.0f}ms - potential ReDoS issue"
+ )
# This example shouldn't produce a match either.
assert match is None
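The three performance tests in this patch share one shape: build an adversarial input, time a single regex call against a fixed wall-clock bound, and assert the input does not match. A generic sketch of that shape, under the assumption that the threshold sits well above normal match time but far below catastrophic-backtracking time (the helper name is illustrative):

```python
import re
import time


def assert_no_redos(
    pattern: "re.Pattern[str]", value: str, threshold_s: float = 0.08
) -> None:
    """Fail if matching `value` takes longer than `threshold_s` seconds.

    A fixed threshold keeps the check stable across CI hosts while still
    catching catastrophic backtracking, which is orders of magnitude slower.
    """
    start = time.perf_counter()
    match = pattern.match(value)
    elapsed = time.perf_counter() - start
    assert elapsed < threshold_s, (
        f"match took {elapsed * 1000:.1f}ms, "
        f"expected <{threshold_s * 1000:.0f}ms - potential ReDoS"
    )
    assert match is None, "adversarial input unexpectedly matched"


# A linear-time pattern passes even on a long adversarial input:
assert_no_redos(re.compile(r"[a-z]+="), "a" * 50_000 + "\x00")
```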
From 93225827bdcd6f09f64e0e82665bf6d4ced747ee Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Mon, 16 Feb 2026 02:04:11 +0000
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
…urce leaks (#11992)
(cherry picked from commit 32924e8fa4c9085dcbaf41b3b3c27fc773bfa213)
Co-authored-by: Rodrigo Nogueira
---
CHANGES/11992.contrib.rst | 1 +
CONTRIBUTORS.txt | 2 ++
tests/conftest.py | 12 ++++++++
tests/test_client_middleware_digest_auth.py | 14 +++++++---
tests/test_cookie_helpers.py | 11 ++++++--
tests/test_imports.py | 31 ++++-----------------
tests/test_proxy_functional.py | 19 ++++++++++---
tests/test_web_request.py | 10 +++++--
8 files changed, 61 insertions(+), 39 deletions(-)
create mode 100644 CHANGES/11992.contrib.rst
diff --git a/CHANGES/11992.contrib.rst b/CHANGES/11992.contrib.rst
new file mode 100644
index 00000000000..c56c2ab7059
--- /dev/null
+++ b/CHANGES/11992.contrib.rst
@@ -0,0 +1 @@
+Fixed flaky performance tests by using appropriate fixed thresholds that account for CI variability -- by :user:`rodrigobnogueira`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index a5c94854ee2..f310e2e5cca 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -308,6 +308,8 @@ Raúl Cumplido
Required Field
Robert Lu
Robert Nikolich
+Rodrigo Nogueira
+Roman Markeloff
Roman Podoliaka
Roman Postnov
Rong Zhang
diff --git a/tests/conftest.py b/tests/conftest.py
index 6d91d08f10a..c299df5aa3f 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -41,6 +41,18 @@
TRUSTME = False
+def pytest_configure(config: pytest.Config) -> None:
+ # On Windows with Python 3.10/3.11, proxy.py's threaded mode can leave
+ # sockets not fully released by the time pytest's unraisableexception
+ # plugin collects warnings during teardown. Suppress these warnings
+ # since they are not actionable and only affect older Python versions.
+ if os.name == "nt" and sys.version_info < (3, 12):
+ config.addinivalue_line(
+ "filterwarnings",
+ "ignore:Exception ignored in.*socket.*:pytest.PytestUnraisableExceptionWarning",
+ )
+
+
try:
if sys.platform == "win32":
import winloop as uvloop
diff --git a/tests/test_client_middleware_digest_auth.py b/tests/test_client_middleware_digest_auth.py
index 40ebadf6e37..453e78ab4c1 100644
--- a/tests/test_client_middleware_digest_auth.py
+++ b/tests/test_client_middleware_digest_auth.py
@@ -1331,12 +1331,18 @@ async def handler(request: Request) -> Response:
def test_regex_performance() -> None:
+ """Test that the regex pattern doesn't suffer from ReDoS issues."""
+ REGEX_TIME_THRESHOLD_SECONDS = 0.08
value = "0" * 54773 + "\\0=a"
+
start = time.perf_counter()
matches = _HEADER_PAIRS_PATTERN.findall(value)
- end = time.perf_counter()
+ elapsed = time.perf_counter() - start
- # If this is taking more than 10ms, there's probably a performance/ReDoS issue.
- assert (end - start) < 0.01
- # This example probably shouldn't produce a match either.
+ # If this is taking more time, there's probably a performance/ReDoS issue.
+ assert elapsed < REGEX_TIME_THRESHOLD_SECONDS, (
+ f"Regex took {elapsed * 1000:.1f}ms, "
+ f"expected <{REGEX_TIME_THRESHOLD_SECONDS * 1000:.0f}ms - potential ReDoS issue"
+ )
+ # This example shouldn't produce a match either.
assert not matches
diff --git a/tests/test_cookie_helpers.py b/tests/test_cookie_helpers.py
index 38a44972c09..767e1eaa34a 100644
--- a/tests/test_cookie_helpers.py
+++ b/tests/test_cookie_helpers.py
@@ -638,13 +638,18 @@ def test_cookie_pattern_matches_partitioned_attribute(test_string: str) -> None:
def test_cookie_pattern_performance() -> None:
+ """Test that the cookie pattern doesn't suffer from ReDoS issues."""
+ COOKIE_PATTERN_TIME_THRESHOLD_SECONDS = 0.08
value = "a" + "=" * 21651 + "\x00"
start = time.perf_counter()
match = helpers._COOKIE_PATTERN.match(value)
- end = time.perf_counter()
+ elapsed = time.perf_counter() - start
- # If this is taking more than 10ms, there's probably a performance/ReDoS issue.
- assert (end - start) < 0.01
+ # If this is taking more time, there's probably a performance/ReDoS issue.
+ assert elapsed < COOKIE_PATTERN_TIME_THRESHOLD_SECONDS, (
+ f"Pattern took {elapsed * 1000:.1f}ms, "
+ f"expected <{COOKIE_PATTERN_TIME_THRESHOLD_SECONDS * 1000:.0f}ms - potential ReDoS issue"
+ )
# This example shouldn't produce a match either.
assert match is None
diff --git a/tests/test_imports.py b/tests/test_imports.py
index b3f545ad900..340698531de 100644
--- a/tests/test_imports.py
+++ b/tests/test_imports.py
@@ -28,22 +28,6 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
result.assert_outcomes(passed=0, errors=0)
-_IS_CI_ENV = os.getenv("CI") == "true"
-_XDIST_WORKER_COUNT = int(os.getenv("PYTEST_XDIST_WORKER_COUNT", 0))
-_IS_XDIST_RUN = _XDIST_WORKER_COUNT > 1
-
-_TARGET_TIMINGS_BY_PYTHON_VERSION = {
- "3.12": (
- # 3.12+ is expected to be a bit slower due to performance trade-offs,
- # and even slower under pytest-xdist, especially in CI
- _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
- if _IS_XDIST_RUN
- else 295
- ),
-}
-_TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
-
-
@pytest.mark.internal
@pytest.mark.dev_mode
@pytest.mark.skipif(
@@ -57,7 +41,10 @@ def test_import_time(pytester: pytest.Pytester) -> None:
Obviously, the time may vary on different machines and may need to be adjusted
from time to time, but this should provide an early warning if something is
added that significantly increases import time.
+
+ Runs 3 times and keeps the minimum time to reduce flakiness.
"""
+ IMPORT_TIME_THRESHOLD_MS = 300 if sys.version_info >= (3, 12) else 200
root = Path(__file__).parent.parent
old_path = os.environ.get("PYTHONPATH")
os.environ["PYTHONPATH"] = os.pathsep.join([str(root)] + sys.path)
@@ -67,18 +54,12 @@ def test_import_time(pytester: pytest.Pytester) -> None:
try:
for _ in range(3):
r = pytester.run(sys.executable, "-We", "-c", cmd)
-
- assert not r.stderr.str()
- runtime_ms = int(r.stdout.str())
- if runtime_ms < best_time_ms:
- best_time_ms = runtime_ms
+ assert not r.stderr.str(), r.stderr.str()
+ best_time_ms = min(best_time_ms, int(r.stdout.str()))
finally:
if old_path is None:
os.environ.pop("PYTHONPATH")
else:
os.environ["PYTHONPATH"] = old_path
- expected_time = _TARGET_TIMINGS_BY_PYTHON_VERSION.get(
- f"{sys.version_info.major}.{sys.version_info.minor}", 200
- )
- assert best_time_ms < expected_time
+ assert best_time_ms < IMPORT_TIME_THRESHOLD_MS
diff --git a/tests/test_proxy_functional.py b/tests/test_proxy_functional.py
index f4bc020d1f0..8b5289d5798 100644
--- a/tests/test_proxy_functional.py
+++ b/tests/test_proxy_functional.py
@@ -19,6 +19,7 @@
from aiohttp.client_exceptions import ClientConnectionError
from aiohttp.helpers import IS_MACOS, IS_WINDOWS
from aiohttp.pytest_plugin import AiohttpServer
+from aiohttp.test_utils import TestServer
ASYNCIO_SUPPORTS_TLS_IN_TLS = sys.version_info >= (3, 11)
@@ -216,24 +217,34 @@ async def test_https_proxy_unsupported_tls_in_tls(
# otherwise this test will fail because the proxy will die with an error.
async def test_uvloop_secure_https_proxy(
client_ssl_ctx: ssl.SSLContext,
+ ssl_ctx: ssl.SSLContext,
secure_proxy_url: URL,
uvloop_loop: asyncio.AbstractEventLoop,
) -> None:
"""Ensure HTTPS sites are accessible through a secure proxy without warning when using uvloop."""
+ payload = str(uuid4())
+
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(text=payload)
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ server = TestServer(app, host="127.0.0.1")
+ await server.start_server(ssl=ssl_ctx)
+
+ url = URL.build(scheme="https", host=server.host, port=server.port)
conn = aiohttp.TCPConnector(force_close=True)
sess = aiohttp.ClientSession(connector=conn)
try:
- url = URL("https://example.com")
-
async with sess.get(
url, proxy=secure_proxy_url, ssl=client_ssl_ctx
) as response:
assert response.status == 200
- # Ensure response body is read to completion
- await response.read()
+ assert await response.text() == payload
finally:
await sess.close()
await conn.close()
+ await server.close()
await asyncio.sleep(0)
await asyncio.sleep(0.1)
diff --git a/tests/test_web_request.py b/tests/test_web_request.py
index dffc691dff0..e4b3979e67b 100644
--- a/tests/test_web_request.py
+++ b/tests/test_web_request.py
@@ -557,13 +557,17 @@ def test_single_forwarded_header() -> None:
def test_forwarded_re_performance() -> None:
+ FORWARDED_RE_TIME_THRESHOLD_SECONDS = 0.08
value = "{" + "f" * 54773 + "z\x00a=v"
start = time.perf_counter()
match = _FORWARDED_PAIR_RE.match(value)
- end = time.perf_counter()
+ elapsed = time.perf_counter() - start
- # If this is taking more than 10ms, there's probably a performance/ReDoS issue.
- assert (end - start) < 0.01
+ # If this is taking more time, there's probably a performance/ReDoS issue.
+ assert elapsed < FORWARDED_RE_TIME_THRESHOLD_SECONDS, (
+ f"Regex took {elapsed * 1000:.1f}ms, "
+ f"expected <{FORWARDED_RE_TIME_THRESHOLD_SECONDS * 1000:.0f}ms - potential ReDoS issue"
+ )
# This example shouldn't produce a match either.
assert match is None
From f8b1e8fec985226ab77733093156a83f161c6339 Mon Sep 17 00:00:00 2001
From: Gene Hoffman <30377676+hoffmang9@users.noreply.github.com>
Date: Sun, 15 Feb 2026 18:27:15 -0800
Subject: [PATCH 055/141] Fix/ws heartbeat reset on data (#12030) - fix
backport merge (#12074)
(cherry picked from commit a640f4f2c42d9d92267340f6f5db510f5b1b5b66)
Co-authored-by: Sam Bull
---
CHANGES/12030.bugfix.rst | 2 +
CONTRIBUTORS.txt | 2 +
aiohttp/client.py | 9 +-
aiohttp/client_proto.py | 13 ++-
aiohttp/client_ws.py | 30 ++++++-
aiohttp/web_protocol.py | 9 +-
aiohttp/web_ws.py | 46 ++++++++--
docs/client_reference.rst | 5 +-
docs/web_reference.rst | 3 +-
tests/test_client_ws.py | 130 ++++++++++++++++++++++++++++-
tests/test_client_ws_functional.py | 60 +++++++++++++
tests/test_web_protocol.py | 48 +++++++++++
tests/test_web_websocket.py | 66 +++++++++++++++
13 files changed, 403 insertions(+), 20 deletions(-)
create mode 100644 CHANGES/12030.bugfix.rst
create mode 100644 tests/test_web_protocol.py
diff --git a/CHANGES/12030.bugfix.rst b/CHANGES/12030.bugfix.rst
new file mode 100644
index 00000000000..5f2f8ba5c3c
--- /dev/null
+++ b/CHANGES/12030.bugfix.rst
@@ -0,0 +1,2 @@
+Reset the WebSocket heartbeat timer on inbound data to avoid false ping/pong timeouts while receiving large frames
+-- by :user:`hoffmang9`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index 79f3c8e3bfd..cd47a312eeb 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -115,6 +115,7 @@ Dmitry Trofimov
Dmytro Bohomiakov
Dmytro Kuznetsov
Dustin J. Mitchell
+Earle Lowe
Eduard Iskandarov
Eli Ribble
Elizabeth Leddy
@@ -139,6 +140,7 @@ Gabriel Tremblay
Gang Ji
Gary Leung
Gary Wilson Jr.
+Gene Hoffman
Gennady Andreyev
Georges Dubus
Greg Holt
diff --git a/aiohttp/client.py b/aiohttp/client.py
index 2a3823be6cb..7a6df06c427 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -1269,9 +1269,6 @@ async def _ws_connect(
transport = conn.transport
assert transport is not None
reader = WebSocketDataQueue(conn_proto, 2**16, loop=self._loop)
- conn_proto.set_parser(
- WebSocketReader(reader, max_msg_size, decode_text=decode_text), reader
- )
writer = WebSocketWriter(
conn_proto,
transport,
@@ -1283,7 +1280,7 @@ async def _ws_connect(
resp.close()
raise
else:
- return self._ws_response_class(
+ ws_resp = self._ws_response_class(
reader,
writer,
protocol,
@@ -1296,6 +1293,10 @@ async def _ws_connect(
compress=compress,
client_notakeover=notakeover,
)
+ parser = WebSocketReader(reader, max_msg_size, decode_text=decode_text)
+ cb = None if heartbeat is None else ws_resp._on_data_received
+ conn_proto.set_parser(parser, reader, data_received_cb=cb)
+ return ws_resp
def _prepare_headers(self, headers: LooseHeaders | None) -> "CIMultiDict[str]":
"""Add default headers and transform it to CIMultiDict"""
diff --git a/aiohttp/client_proto.py b/aiohttp/client_proto.py
index a078068eb05..08c03c192fe 100644
--- a/aiohttp/client_proto.py
+++ b/aiohttp/client_proto.py
@@ -1,6 +1,6 @@
import asyncio
from contextlib import suppress
-from typing import Any
+from typing import Any, Callable
from .base_protocol import BaseProtocol
from .client_exceptions import (
@@ -34,6 +34,7 @@ def __init__(self, loop: asyncio.AbstractEventLoop) -> None:
self._payload: StreamReader | None = None
self._skip_payload = False
self._payload_parser = None
+ self._data_received_cb: Callable[[], None] | None = None
self._timer = None
@@ -203,7 +204,12 @@ def set_exception(
self._drop_timeout()
super().set_exception(exc, exc_cause)
- def set_parser(self, parser: Any, payload: Any) -> None:
+ def set_parser(
+ self,
+ parser: Any,
+ payload: Any,
+ data_received_cb: Callable[[], None] | None = None,
+ ) -> None:
# TODO: actual types are:
# parser: WebSocketReader
# payload: WebSocketDataQueue
@@ -211,6 +217,7 @@ def set_parser(self, parser: Any, payload: Any) -> None:
# Need an ABC for both types
self._payload = payload
self._payload_parser = parser
+ self._data_received_cb = data_received_cb
self._drop_timeout()
@@ -298,6 +305,8 @@ def data_received(self, data: bytes) -> None:
# custom payload parser - currently always WebSocketReader
if self._payload_parser is not None:
+ if self._data_received_cb is not None:
+ self._data_received_cb()
eof, tail = self._payload_parser.feed_data(data)
if eof:
self._payload = None
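The `client_proto.py` change above threads an optional `data_received_cb` through `set_parser` so a higher layer can observe raw bytes arriving before the parser consumes them. Reduced to a toy `asyncio.Protocol`, the shape looks like this (class and attribute names are illustrative, not aiohttp's):

```python
import asyncio
from typing import Callable, Optional


class NotifyingProtocol(asyncio.Protocol):
    """Invokes an optional callback before handing bytes to the parser.

    This mirrors the set_parser(..., data_received_cb=...) hook: the
    callback runs on every data_received() call, letting a higher layer
    (e.g. a heartbeat timer) react to inbound traffic.
    """

    def __init__(self) -> None:
        self._parser_feed: Optional[Callable[[bytes], None]] = None
        self._data_received_cb: Optional[Callable[[], None]] = None

    def set_parser(
        self,
        feed: Callable[[bytes], None],
        data_received_cb: Optional[Callable[[], None]] = None,
    ) -> None:
        self._parser_feed = feed
        self._data_received_cb = data_received_cb

    def data_received(self, data: bytes) -> None:
        if self._parser_feed is not None:
            # Notify the observer first, then feed the parser.
            if self._data_received_cb is not None:
                self._data_received_cb()
            self._parser_feed(data)
```

Keeping the callback optional (`None` when no heartbeat is configured) means the hot path pays nothing when the feature is unused.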
diff --git a/aiohttp/client_ws.py b/aiohttp/client_ws.py
index eca6927b38e..ee50f1df6eb 100644
--- a/aiohttp/client_ws.py
+++ b/aiohttp/client_ws.py
@@ -98,11 +98,17 @@ def __init__(
self._compress = compress
self._client_notakeover = client_notakeover
self._ping_task: asyncio.Task[None] | None = None
+ self._need_heartbeat_reset = False
+ self._heartbeat_reset_handle: asyncio.Handle | None = None
self._reset_heartbeat()
def _cancel_heartbeat(self) -> None:
self._cancel_pong_response_cb()
+ if self._heartbeat_reset_handle is not None:
+ self._heartbeat_reset_handle.cancel()
+ self._heartbeat_reset_handle = None
+ self._need_heartbeat_reset = False
if self._heartbeat_cb is not None:
self._heartbeat_cb.cancel()
self._heartbeat_cb = None
@@ -115,6 +121,23 @@ def _cancel_pong_response_cb(self) -> None:
self._pong_response_cb.cancel()
self._pong_response_cb = None
+ def _on_data_received(self) -> None:
+ if self._heartbeat is None or self._need_heartbeat_reset:
+ return
+ loop = self._loop
+ assert loop is not None
+ # Coalesce multiple chunks received in the same loop tick into a single
+ # heartbeat reset. Resetting immediately per chunk increases timer churn.
+ self._need_heartbeat_reset = True
+ self._heartbeat_reset_handle = loop.call_soon(self._flush_heartbeat_reset)
+
+ def _flush_heartbeat_reset(self) -> None:
+ self._heartbeat_reset_handle = None
+ if not self._need_heartbeat_reset:
+ return
+ self._reset_heartbeat()
+ self._need_heartbeat_reset = False
+
def _reset_heartbeat(self) -> None:
if self._heartbeat is None:
return
@@ -138,6 +161,12 @@ def _reset_heartbeat(self) -> None:
def _send_heartbeat(self) -> None:
self._heartbeat_cb = None
+
+ # If heartbeat reset is pending (data is being received), skip sending
+ # the ping and let the reset callback handle rescheduling the heartbeat.
+ if self._need_heartbeat_reset:
+ return
+
loop = self._loop
now = loop.time()
if now < self._heartbeat_when:
@@ -365,7 +394,6 @@ async def receive(
msg = await self._reader.read()
else:
msg = await self._reader.read()
- self._reset_heartbeat()
finally:
self._waiting = False
if self._close_wait:
diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py
index 0f87304a0ef..eb7aea87a2c 100644
--- a/aiohttp/web_protocol.py
+++ b/aiohttp/web_protocol.py
@@ -148,6 +148,7 @@ class RequestHandler(BaseProtocol):
"_task_handler",
"_upgrade",
"_payload_parser",
+ "_data_received_cb",
"_request_parser",
"_reading_paused",
"logger",
@@ -203,6 +204,7 @@ def __init__(
self._messages: deque[_MsgType] = deque()
self._message_tail = b""
+ self._data_received_cb: Callable[[], None] | None = None
self._waiter: asyncio.Future[None] | None = None
self._handler_waiter: asyncio.Future[None] | None = None
@@ -373,11 +375,14 @@ def connection_lost(self, exc: BaseException | None) -> None:
self._payload_parser.feed_eof()
self._payload_parser = None
- def set_parser(self, parser: Any) -> None:
+ def set_parser(
+ self, parser: Any, data_received_cb: Callable[[], None] | None = None
+ ) -> None:
# Actual type is WebReader
assert self._payload_parser is None
self._payload_parser = parser
+ self._data_received_cb = data_received_cb
if self._message_tail:
self._payload_parser.feed_data(self._message_tail)
@@ -421,6 +426,8 @@ def data_received(self, data: bytes) -> None:
# feed payload
elif data:
+ if self._data_received_cb is not None:
+ self._data_received_cb()
eof, tail = self._payload_parser.feed_data(data)
if eof:
self.close()
diff --git a/aiohttp/web_ws.py b/aiohttp/web_ws.py
index 227b8962d59..0be33c6cd8d 100644
--- a/aiohttp/web_ws.py
+++ b/aiohttp/web_ws.py
@@ -90,6 +90,8 @@ class WebSocketResponse(StreamResponse, Generic[_DecodeText]):
_heartbeat_cb: asyncio.TimerHandle | None = None
_pong_response_cb: asyncio.TimerHandle | None = None
_ping_task: asyncio.Task[None] | None = None
+ _need_heartbeat_reset: bool = False
+ _heartbeat_reset_handle: asyncio.Handle | None = None
def __init__(
self,
@@ -118,9 +120,15 @@ def __init__(
self._max_msg_size = max_msg_size
self._writer_limit = writer_limit
self._decode_text = decode_text
+ self._need_heartbeat_reset = False
+ self._heartbeat_reset_handle = None
def _cancel_heartbeat(self) -> None:
self._cancel_pong_response_cb()
+ if self._heartbeat_reset_handle is not None:
+ self._heartbeat_reset_handle.cancel()
+ self._heartbeat_reset_handle = None
+ self._need_heartbeat_reset = False
if self._heartbeat_cb is not None:
self._heartbeat_cb.cancel()
self._heartbeat_cb = None
@@ -133,6 +141,23 @@ def _cancel_pong_response_cb(self) -> None:
self._pong_response_cb.cancel()
self._pong_response_cb = None
+ def _on_data_received(self) -> None:
+ if self._heartbeat is None or self._need_heartbeat_reset:
+ return
+ loop = self._loop
+ assert loop is not None
+ # Coalesce multiple chunks received in the same loop tick into a single
+ # heartbeat reset. Resetting immediately per chunk increases timer churn.
+ self._need_heartbeat_reset = True
+ self._heartbeat_reset_handle = loop.call_soon(self._flush_heartbeat_reset)
+
+ def _flush_heartbeat_reset(self) -> None:
+ self._heartbeat_reset_handle = None
+ if not self._need_heartbeat_reset:
+ return
+ self._reset_heartbeat()
+ self._need_heartbeat_reset = False
+
def _reset_heartbeat(self) -> None:
if self._heartbeat is None:
return
@@ -156,6 +181,12 @@ def _reset_heartbeat(self) -> None:
def _send_heartbeat(self) -> None:
self._heartbeat_cb = None
+
+ # If heartbeat reset is pending (data is being received), skip sending
+ # the ping and let the reset callback handle rescheduling the heartbeat.
+ if self._need_heartbeat_reset:
+ return
+
loop = self._loop
assert loop is not None and self._writer is not None
now = loop.time()
@@ -349,14 +380,14 @@ def _post_start(
loop = self._loop
assert loop is not None
self._reader = WebSocketDataQueue(request._protocol, 2**16, loop=loop)
- request.protocol.set_parser(
- WebSocketReader(
- self._reader,
- self._max_msg_size,
- compress=bool(self._compress),
- decode_text=self._decode_text,
- )
+ parser = WebSocketReader(
+ self._reader,
+ self._max_msg_size,
+ compress=bool(self._compress),
+ decode_text=self._decode_text,
)
+ cb = None if self._heartbeat is None else self._on_data_received
+ request.protocol.set_parser(parser, data_received_cb=cb)
# disable HTTP keepalive for WebSocket
request.protocol.keep_alive(False)
@@ -576,7 +607,6 @@ async def receive(
msg = await self._reader.read()
else:
msg = await self._reader.read()
- self._reset_heartbeat()
finally:
self._waiting = False
if self._close_wait:
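Both WebSocket classes in this patch coalesce per-chunk heartbeat resets into a single reset per event-loop tick: the first chunk sets a flag and schedules a flush with `loop.call_soon`, and later chunks in the same tick are absorbed by the flag. A standalone sketch of that coalescing idea, assuming a generic action rather than aiohttp's `_reset_heartbeat` (names are illustrative):

```python
import asyncio


class Coalescer:
    """Collapse many notifications within one event-loop tick into one action.

    Same idea as the heartbeat reset above: the first notify() in a tick
    schedules a flush with call_soon; later notify() calls in the same
    tick are absorbed by the pending flag, avoiding timer churn.
    """

    def __init__(self, action) -> None:
        self._action = action
        self._pending = False
        self._handle: "asyncio.Handle | None" = None

    def notify(self) -> None:
        if self._pending:
            return
        self._pending = True
        self._handle = asyncio.get_running_loop().call_soon(self._flush)

    def _flush(self) -> None:
        self._handle = None
        self._pending = False
        self._action()


async def main() -> None:
    resets = []
    c = Coalescer(lambda: resets.append(1))
    for _ in range(5):  # five chunks arriving in the same tick
        c.notify()
    await asyncio.sleep(0)  # let the scheduled flush run
    assert resets == [1]  # coalesced into a single reset


asyncio.run(main())
```

Cancelling `self._handle` on teardown (as `_cancel_heartbeat` does above) prevents the flush from firing after the connection is gone.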
diff --git a/docs/client_reference.rst b/docs/client_reference.rst
index e96f56bb0df..0b1ae5dff19 100644
--- a/docs/client_reference.rst
+++ b/docs/client_reference.rst
@@ -777,8 +777,9 @@ The client session supports the context manager protocol for self closing.
:param float heartbeat: Send *ping* message every *heartbeat*
seconds and wait *pong* response, if
*pong* response is not received then
- close connection. The timer is reset on any data
- reception.(optional)
+ close connection. The timer is reset on any
+ inbound data reception (coalesced per event loop
+ iteration). (optional)
:param str origin: Origin header to send to server(optional)
diff --git a/docs/web_reference.rst b/docs/web_reference.rst
index 3f1fd6aa9da..e8ca31f8614 100644
--- a/docs/web_reference.rst
+++ b/docs/web_reference.rst
@@ -999,7 +999,8 @@ and :ref:`aiohttp-web-signals` handlers::
:param float heartbeat: Send `ping` message every `heartbeat`
seconds and wait `pong` response, close
connection if `pong` response is not
- received. The timer is reset on any data reception.
+ received. The timer is reset on any inbound data
+ reception (coalesced per event loop iteration).
:param float timeout: Timeout value for the ``close``
operation. After sending the close websocket message,
diff --git a/tests/test_client_ws.py b/tests/test_client_ws.py
index 6f6e75dceae..169330b7d1a 100644
--- a/tests/test_client_ws.py
+++ b/tests/test_client_ws.py
@@ -620,7 +620,135 @@ async def test_receive_runtime_err(loop) -> None:
await resp.receive()
-async def test_ws_connect_close_resp_on_err(loop, ws_key, key_data) -> None:
+async def test_heartbeat_reset_coalesces_on_data(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ response = mock.Mock()
+ response.connection = None
+ resp = client.ClientWebSocketResponse(
+ mock.Mock(),
+ mock.Mock(),
+ None,
+ response,
+ aiohttp.ClientWSTimeout(ws_receive=10.0),
+ True,
+ True,
+ loop,
+ heartbeat=0.05,
+ )
+ with mock.patch.object(resp, "_reset_heartbeat", autospec=True) as reset:
+ resp._on_data_received()
+ resp._on_data_received()
+
+ await asyncio.sleep(0)
+
+ assert reset.call_count == 1
+
+
+async def test_receive_does_not_reset_heartbeat(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ response = mock.Mock()
+ response.connection = None
+ msg = mock.Mock(type=aiohttp.WSMsgType.TEXT)
+ reader = mock.Mock()
+ reader.read = mock.AsyncMock(return_value=msg)
+ resp = client.ClientWebSocketResponse(
+ reader,
+ mock.Mock(),
+ None,
+ response,
+ aiohttp.ClientWSTimeout(ws_receive=10.0),
+ True,
+ True,
+ loop,
+ heartbeat=0.05,
+ )
+ with mock.patch.object(resp, "_reset_heartbeat", autospec=True) as reset:
+ received = await resp.receive()
+
+ assert received is msg
+ reset.assert_not_called()
+
+
+async def test_cancel_heartbeat_cancels_pending_heartbeat_reset_handle(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ response = mock.Mock()
+ response.connection = None
+ resp = client.ClientWebSocketResponse(
+ mock.Mock(),
+ mock.Mock(),
+ None,
+ response,
+ aiohttp.ClientWSTimeout(ws_receive=10.0),
+ True,
+ True,
+ loop,
+ heartbeat=0.05,
+ )
+
+ resp._on_data_received()
+ handle = resp._heartbeat_reset_handle
+ assert handle is not None
+
+ resp._cancel_heartbeat()
+
+ assert resp._heartbeat_reset_handle is None
+ assert resp._need_heartbeat_reset is False
+ assert handle.cancelled()
+
+
+async def test_flush_heartbeat_reset_returns_early_when_not_needed(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ response = mock.Mock()
+ response.connection = None
+ resp = client.ClientWebSocketResponse(
+ mock.Mock(),
+ mock.Mock(),
+ None,
+ response,
+ aiohttp.ClientWSTimeout(ws_receive=10.0),
+ True,
+ True,
+ loop,
+ heartbeat=0.05,
+ )
+ resp._need_heartbeat_reset = False
+
+ with mock.patch.object(resp, "_reset_heartbeat", autospec=True) as reset:
+ resp._flush_heartbeat_reset()
+ reset.assert_not_called()
+
+
+async def test_send_heartbeat_returns_early_when_reset_is_pending(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ response = mock.Mock()
+ response.connection = None
+ writer = mock.Mock()
+ resp = client.ClientWebSocketResponse(
+ mock.Mock(),
+ writer,
+ None,
+ response,
+ aiohttp.ClientWSTimeout(ws_receive=10.0),
+ True,
+ True,
+ loop,
+ heartbeat=0.05,
+ )
+ resp._need_heartbeat_reset = True
+
+ resp._send_heartbeat()
+
+ writer.send_frame.assert_not_called()
+
+
+async def test_ws_connect_close_resp_on_err(
+ loop: asyncio.AbstractEventLoop, ws_key: str, key_data: bytes
+) -> None:
resp = mock.Mock()
resp.status = 500
resp.headers = {
diff --git a/tests/test_client_ws_functional.py b/tests/test_client_ws_functional.py
index 47e293e5f70..13c03fe0563 100644
--- a/tests/test_client_ws_functional.py
+++ b/tests/test_client_ws_functional.py
@@ -1,6 +1,8 @@
import asyncio
import json
+import struct
import sys
+from contextlib import suppress
from typing import Any, Literal, NoReturn
from unittest import mock
@@ -798,6 +800,64 @@ async def handler(request):
assert resp.close_code is WSCloseCode.ABNORMAL_CLOSURE
+async def test_heartbeat_does_not_timeout_while_receiving_large_frame(
+ aiohttp_client: AiohttpClient,
+) -> None:
+ """Slowly receiving a single large frame should not trip heartbeat.
+
+ Regression test for the behavior described in
+ https://github.com/aio-libs/aiohttp/discussions/12023: on slow connections,
+ the websocket heartbeat used to be reset only after a full message was read,
+ which could cause a ping/pong timeout while bytes were still being received.
+ """
+ payload = b"x" * 2048
+ heartbeat = 0.05
+ chunk_size = 64
+ delay = 0.01
+
+ async def handler(request: web.Request) -> web.WebSocketResponse:
+ ws = web.WebSocketResponse()
+ await ws.prepare(request)
+
+ assert ws._writer is not None
+ transport = ws._writer.transport
+
+ # Server-to-client frames are not masked.
+ length = len(payload) # payload is fixed length of 2048 bytes
+ header = bytes((0x82, 126)) + struct.pack("!H", length)
+
+ frame = header + payload
+ for i in range(0, len(frame), chunk_size):
+ transport.write(frame[i : i + chunk_size])
+ await asyncio.sleep(delay)
+
+ # Ensure the server side is cleaned up.
+ with suppress(asyncio.TimeoutError):
+ await ws.receive(timeout=1.0)
+ with suppress(Exception):
+ await ws.close()
+ return ws
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ client = await aiohttp_client(app)
+
+ async with client.ws_connect("/", heartbeat=heartbeat) as resp:
+ # If heartbeat was not reset on any incoming bytes, the client would start
+ # sending PINGs while we're still streaming the message body, and since the
+ # server handler never calls receive(), no PONG would be produced and the
+ # client would close with a ping/pong timeout.
+ with mock.patch.object(
+ resp._writer, "send_frame", wraps=resp._writer.send_frame
+ ) as sf:
+ msg = await resp.receive()
+ assert (
+ sf.call_args_list.count(mock.call(b"", WSMsgType.PING)) == 0
+ ), "Heartbeat PING sent while data was still being received"
+ assert msg.type is WSMsgType.BINARY
+ assert msg.data == payload
+
+
async def test_heartbeat_no_pong_after_receive_many_messages(
aiohttp_client: AiohttpClient,
) -> None:
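The server handler in the test above hand-rolls the binary frame header it streams in chunks. For context, a hedged sketch of the RFC 6455 header rules it relies on (server-to-client frames are unmasked; opcode `0x2` is binary; a second byte of 126 signals a 16-bit extended length field). This is illustrative only, not aiohttp's actual writer:

```python
import struct

def encode_binary_frame_header(payload_len: int) -> bytes:
    """Build an unmasked, FIN-set binary frame header (RFC 6455, server-to-client)."""
    first = 0x80 | 0x2  # FIN bit set, opcode 0x2 (binary) -> 0x82
    if payload_len < 126:
        # Small payloads encode the length directly in the second byte.
        return bytes((first, payload_len))
    if payload_len < 2**16:
        # 126 signals that a 16-bit big-endian extended length follows.
        return bytes((first, 126)) + struct.pack("!H", payload_len)
    # 127 signals that a 64-bit big-endian extended length follows.
    return bytes((first, 127)) + struct.pack("!Q", payload_len)

# Matches the header built inline in the test for a 2048-byte payload:
assert encode_binary_frame_header(2048) == bytes((0x82, 126)) + struct.pack("!H", 2048)
```

Because the test payload is 2048 bytes, it falls into the 16-bit extended-length branch, which is why the test writes `bytes((0x82, 126)) + struct.pack("!H", length)`.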
diff --git a/tests/test_web_protocol.py b/tests/test_web_protocol.py
new file mode 100644
index 00000000000..5ae1e5dd756
--- /dev/null
+++ b/tests/test_web_protocol.py
@@ -0,0 +1,48 @@
+import asyncio
+from typing import Any, cast
+from unittest import mock
+
+from aiohttp.web_protocol import RequestHandler
+
+
+class _DummyManager:
+ def __init__(self) -> None:
+ self.request_handler = mock.Mock()
+ self.request_factory = mock.Mock()
+
+
+class _DummyParser:
+ def __init__(self) -> None:
+ self.received: list[bytes] = []
+
+ def feed_data(self, data: bytes) -> tuple[bool, bytes]:
+ self.received.append(data)
+ return False, b""
+
+
+def test_set_parser_does_not_call_data_received_cb_for_tail(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ handler: RequestHandler[Any] = RequestHandler(cast(Any, _DummyManager()), loop=loop)
+ handler._message_tail = b"tail"
+ cb = mock.Mock()
+ parser = _DummyParser()
+
+ handler.set_parser(parser, data_received_cb=cb)
+
+ cb.assert_not_called()
+ assert parser.received == [b"tail"]
+
+
+def test_data_received_calls_data_received_cb(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ handler: RequestHandler[Any] = RequestHandler(cast(Any, _DummyManager()), loop=loop)
+ cb = mock.Mock()
+ parser = _DummyParser()
+
+ handler.set_parser(parser, data_received_cb=cb)
+ handler.data_received(b"x")
+
+ assert cb.call_count == 1
+ assert parser.received == [b"x"]
diff --git a/tests/test_web_websocket.py b/tests/test_web_websocket.py
index 3a285b76aad..3593936e6eb 100644
--- a/tests/test_web_websocket.py
+++ b/tests/test_web_websocket.py
@@ -29,6 +29,7 @@ def app(loop):
def protocol():
ret = mock.Mock()
ret.set_parser.return_value = ret
+ ret._timeout_ceil_threshold = 5
return ret
@@ -104,6 +105,41 @@ async def test_nonstarted_receive_str() -> None:
await ws.receive_str()
+async def test_cancel_heartbeat_cancels_pending_heartbeat_reset_handle(
+ loop: asyncio.AbstractEventLoop,
+) -> None:
+ ws = web.WebSocketResponse(heartbeat=0.05)
+ ws._loop = loop
+ ws._on_data_received()
+ handle = ws._heartbeat_reset_handle
+ assert handle is not None
+
+ ws._cancel_heartbeat()
+
+ assert ws._heartbeat_reset_handle is None
+ assert ws._need_heartbeat_reset is False
+ assert handle.cancelled()
+
+
+async def test_flush_heartbeat_reset_returns_early_when_not_needed() -> None:
+ ws = web.WebSocketResponse(heartbeat=0.05)
+ ws._need_heartbeat_reset = False
+
+ with mock.patch.object(ws, "_reset_heartbeat") as reset:
+ ws._flush_heartbeat_reset()
+ reset.assert_not_called()
+
+
+async def test_send_heartbeat_returns_early_when_reset_is_pending() -> None:
+ ws = web.WebSocketResponse(heartbeat=0.05)
+ ws._need_heartbeat_reset = True
+
+ ws._send_heartbeat()
+
+ assert ws._pong_response_cb is None
+ assert ws._ping_task is None
+
+
async def test_nonstarted_receive_bytes() -> None:
ws = WebSocketResponse()
with pytest.raises(RuntimeError):
@@ -188,6 +224,36 @@ async def test_heartbeat_timeout(make_request: Any) -> None:
assert ws.closed
+async def test_heartbeat_reset_coalesces_on_data(
+ make_request: Any,
+) -> None:
+ req = make_request("GET", "/")
+ ws = web.WebSocketResponse(heartbeat=0.05)
+ await ws.prepare(req)
+
+ with mock.patch.object(ws, "_reset_heartbeat") as reset:
+ ws._on_data_received()
+ ws._on_data_received()
+
+ await asyncio.sleep(0)
+
+ assert reset.call_count == 1
+
+
+async def test_receive_does_not_reset_heartbeat() -> None:
+ ws = web.WebSocketResponse(heartbeat=0.05)
+ msg = mock.Mock(type=WSMsgType.TEXT)
+ reader = mock.Mock()
+ reader.read = mock.AsyncMock(return_value=msg)
+ ws._reader = reader
+
+ with mock.patch.object(ws, "_reset_heartbeat") as reset:
+ received = await ws.receive()
+
+ assert received is msg
+ reset.assert_not_called()
+
+
def test_websocket_ready() -> None:
websocket_ready = WebSocketReady(True, "chat")
assert websocket_ready.ok is True
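The coalescing behavior pinned down by `test_heartbeat_reset_coalesces_on_data` — many `_on_data_received()` calls in one event loop iteration collapsing into a single `_reset_heartbeat()` — follows a standard asyncio debounce pattern. A minimal standalone sketch under that assumption (the class and method names here are illustrative, not aiohttp's internals):

```python
import asyncio

class HeartbeatCoalescer:
    """Collapse many per-chunk data notifications into one reset per loop iteration."""

    def __init__(self) -> None:
        self.resets = 0
        self._pending: asyncio.Handle | None = None

    def on_data_received(self) -> None:
        # The first call in an iteration schedules a flush via call_soon;
        # subsequent calls see the pending handle and return immediately.
        if self._pending is None:
            self._pending = asyncio.get_running_loop().call_soon(self._flush)

    def _flush(self) -> None:
        self._pending = None
        self.resets += 1  # stands in for rescheduling the heartbeat timer

async def main() -> int:
    c = HeartbeatCoalescer()
    for _ in range(100):  # e.g. 100 TCP chunks arriving back to back
        c.on_data_received()
    await asyncio.sleep(0)  # yield so the scheduled flush can run
    return c.resets

assert asyncio.run(main()) == 1
```

The point of deferring the reset to `call_soon` is that resetting a heartbeat timer is not free; doing it once per loop iteration instead of once per received chunk keeps the cost constant even when a large frame arrives as many small reads.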
From 629db33062cc53c2095091f46e5b0458b3b333a5 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 16 Feb 2026 11:59:19 +0000
Subject: [PATCH 056/141] Bump filelock from 3.21.2 to 3.24.2 (#12079)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [filelock](https://github.com/tox-dev/py-filelock) from 3.21.2 to
3.24.2.
Release notes
Sourced from filelock's releases.
3.24.2
What's Changed
Full Changelog: https://github.com/tox-dev/filelock/compare/3.24.1...3.24.2
3.24.1
What's Changed
Full Changelog: https://github.com/tox-dev/filelock/compare/3.24.0...3.24.1
3.24.0
What's Changed
Full Changelog: https://github.com/tox-dev/filelock/compare/3.23.0...3.24.0
3.23.0
What's Changed
Full Changelog: https://github.com/tox-dev/filelock/compare/3.22.0...3.23.0
3.22.0
What's Changed
... (truncated)
Changelog
Sourced from filelock's changelog.
###########
Changelog
###########
3.24.2 (2026-02-16)
- 🐛 fix(rw): close sqlite3 cursors and skip SoftFileLock Windows race :pr:`491`
- 🐛 fix(test): resolve flaky write non-starvation test :pr:`490`
- 📝 docs: restructure using Diataxis framework :pr:`489`
3.24.1 (2026-02-15)
- 🐛 fix(soft): resolve Windows deadlock and test race condition :pr:`488`
3.24.0 (2026-02-14)
- ✨ feat(lock): add lifetime parameter for lock expiration (#68) :pr:`486`
- ✨ feat(lock): add cancel_check to acquire (#309) :pr:`487`
- 🐛 fix(api): detect same-thread self-deadlock :pr:`481`
- ✨ feat(mode): respect POSIX default ACLs (#378) :pr:`483`
- 🐛 fix(win): eliminate lock file race in threaded usage :pr:`484`
- ✨ feat(lock): add poll_interval to constructor :pr:`482`
- 🐛 fix(unix): auto-fallback to SoftFileLock on ENOSYS :pr:`480`
3.23.0 (2026-02-14)
- 📝 docs: move from Unlicense to MIT :pr:`479`
- 📝 docs: add fasteners to similar libraries :pr:`478`
3.22.0 (2026-02-14)
- 🐛 fix(soft): skip stale detection on Windows :pr:`477`
- ✨ feat(soft): detect and break stale locks :pr:`476`
3.21.2 (2026-02-13)
- 🐛 fix: catch ImportError for missing sqlite3 C library :pr:`475`
... (truncated)
Commits
- db3e05a Release 3.24.2
- ab6b90e 🐛 fix(rw): close sqlite3 cursors and skip SoftFileLock Windows race (#491)
- 98b4ee9 🐛 fix(test): resolve flaky write non-starvation test (#490)
- ef15f6b 📝 docs: restructure using Diataxis framework (#489)
- 0b2f65b Release 3.24.1
- abccdba 🐛 fix(soft): resolve Windows deadlock and test race condition (#488)
- 2de9380 Release 3.24.0
- d429e18 ✨ feat(lock): add lifetime parameter for lock expiration (#68) (#486)
- 6c75bf7 ✨ feat(lock): add cancel_check to acquire (#309) (#487)
- fa6a27b 🐛 fix(api): detect same-thread self-deadlock (#481)
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 6265f0c041c..a7ec463b5eb 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -71,7 +71,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.21.2
+filelock==3.24.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 789238b8b24..ca8582ea94d 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -69,7 +69,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.21.2
+filelock==3.24.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
diff --git a/requirements/lint.txt b/requirements/lint.txt
index dd8abff2fbd..c37cb284bb3 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -29,7 +29,7 @@ distlib==0.4.0
# via virtualenv
exceptiongroup==1.3.1
# via pytest
-filelock==3.21.2
+filelock==3.24.2
# via virtualenv
forbiddenfruit==0.1.4
# via blockbuster
From 19f904e70042acc6dc5d499c5ac800b68429044c Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 16 Feb 2026 12:02:10 +0000
Subject: [PATCH 057/141] Bump platformdirs from 4.5.1 to 4.9.2 (#12080)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [platformdirs](https://github.com/tox-dev/platformdirs) from 4.5.1
to 4.9.2.
Release notes
Sourced from platformdirs's releases.
4.9.2
What's Changed
Full Changelog: https://github.com/tox-dev/platformdirs/compare/4.9.1...4.9.2
4.9.1
What's Changed
Full Changelog: https://github.com/tox-dev/platformdirs/compare/4.9.0...4.9.1
4.9.0
What's Changed
Full Changelog: https://github.com/tox-dev/platformdirs/compare/4.8.0...4.9.0
4.8.0
What's Changed
Full Changelog: https://github.com/tox-dev/platformdirs/compare/4.7.1...4.8.0
... (truncated)
Changelog
Sourced from platformdirs's changelog.
###########
Changelog
###########
4.9.2 (2026-02-16)
- 📝 docs: restructure following Diataxis framework :pr:`448`
- 📝 docs(platforms): fix RST formatting and TOC hierarchy :pr:`447`
4.9.1 (2026-02-14)
- 📝 docs: enhance README, fix issues, and reorganize platforms.rst :pr:`445`
4.9.0 (2026-02-14)
- 📚 docs: split usage guide into tutorial, how-to, and reference :pr:`441`
- ✨ feat(api): add site_bin_dir property :pr:`443`
- ✨ feat(api): add site_applications_dir property :pr:`442`
- 🐛 fix(unix): use correct runtime dir path for OpenBSD :pr:`440`
- 📝 docs(usage): document use_site_for_root parameter :pr:`439`
4.8.0 (2026-02-14)
- 📝 docs(usage): note that home dir is in stdlib :pr:`431`
- ✨ feat(api): add user_applications_dir property :pr:`432`
- ✨ feat(api): add user_bin_dir property :pr:`430`
- 🐛 fix(macos): yield individual site dirs in iter_*_dirs :pr:`429`
- ✨ feat(windows): add WIN_PD_OVERRIDE_* env var overrides :pr:`428`
- ✨ feat(windows): add PLATFORMDIRS_* env var overrides :pr:`427`
- ✨ feat(api): add use_site_for_root parameter :pr:`426`
- ✨ feat(api): add site_state_dir for system-wide state :pr:`425`
- ✨ feat(api): add site_log_dir and document Store Python sandbox :pr:`424`
- 📝 docs(windows): document Store Python sandbox path behavior :pr:`423`
4.7.1 (2026-02-13)
- 🐛 fix(windows): avoid FileNotFoundError in sandboxed environments :pr:`422`
4.7.0 (2026-02-12)
... (truncated)
Commits
- 72271a6 Release 4.9.2
- 3e45fa9 📝 docs: restructure following Diataxis framework (#448)
- 1d8448b 📝 docs(platforms): fix RST formatting and TOC hierarchy (#447)
- f656849 Release 4.9.1
- d983fb1 📝 docs: enhance README, fix issues, and reorganize platforms.rst (#445)
- 685a1d9 Release 4.9.0
- ae73d34 📚 docs: split usage guide into tutorial, how-to, and reference (#441)
- 816747e ✨ feat(api): add site_bin_dir property (#443)
- 7a47ac4 ✨ feat(api): add site_applications_dir property (#442)
- c69a552 🐛 fix(unix): use correct runtime dir path for OpenBSD (#440)
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index a7ec463b5eb..81009436de5 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -138,7 +138,7 @@ pip-tools==7.5.3
# via -r requirements/dev.in
pkgconfig==1.5.5
# via -r requirements/test-common.in
-platformdirs==4.5.1
+platformdirs==4.9.2
# via virtualenv
pluggy==1.6.0
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index ca8582ea94d..4388f1d6e66 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -135,7 +135,7 @@ pip-tools==7.5.3
# via -r requirements/dev.in
pkgconfig==1.5.5
# via -r requirements/test-common.in
-platformdirs==4.5.1
+platformdirs==4.9.2
# via virtualenv
pluggy==1.6.0
# via
diff --git a/requirements/lint.txt b/requirements/lint.txt
index c37cb284bb3..029f9dae23f 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -59,7 +59,7 @@ packaging==26.0
# via pytest
pathspec==1.0.4
# via mypy
-platformdirs==4.5.1
+platformdirs==4.9.2
# via virtualenv
pluggy==1.6.0
# via pytest
From 830b942100330a24c5dc0245f8ce3bdbfc2ef764 Mon Sep 17 00:00:00 2001
From: Varun Chawla <34209028+veeceey@users.noreply.github.com>
Date: Mon, 16 Feb 2026 06:43:18 -0800
Subject: [PATCH 058/141] Fix ValueError with ClientTimeout(total=0) in TLS
connections (#12067)
---
CHANGES/11859.bugfix.rst | 1 +
aiohttp/connector.py | 4 ++--
tests/test_connector.py | 31 +++++++++++++++++++++++++++++++
3 files changed, 34 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/11859.bugfix.rst
diff --git a/CHANGES/11859.bugfix.rst b/CHANGES/11859.bugfix.rst
new file mode 100644
index 00000000000..7e32de1a4fa
--- /dev/null
+++ b/CHANGES/11859.bugfix.rst
@@ -0,0 +1 @@
+Fixed :exc:`ValueError` when creating a TLS connection with ``ClientTimeout(total=0)`` by converting ``0`` to ``None`` before passing to ``ssl_handshake_timeout`` in :py:meth:`asyncio.loop.start_tls` -- by :user:`veeceey`.
diff --git a/aiohttp/connector.py b/aiohttp/connector.py
index d6b0abefc69..6e1e0919393 100644
--- a/aiohttp/connector.py
+++ b/aiohttp/connector.py
@@ -1436,7 +1436,7 @@ async def _start_tls_connection(
tls_proto,
sslcontext,
server_hostname=req.server_hostname or req.host,
- ssl_handshake_timeout=timeout.total,
+ ssl_handshake_timeout=timeout.total or None,
ssl_shutdown_timeout=self._ssl_shutdown_timeout,
)
else:
@@ -1445,7 +1445,7 @@ async def _start_tls_connection(
tls_proto,
sslcontext,
server_hostname=req.server_hostname or req.host,
- ssl_handshake_timeout=timeout.total,
+ ssl_handshake_timeout=timeout.total or None,
)
except BaseException:
# We need to close the underlying transport since
diff --git a/tests/test_connector.py b/tests/test_connector.py
index 15709db9b83..7fd1bd0158b 100644
--- a/tests/test_connector.py
+++ b/tests/test_connector.py
@@ -36,6 +36,7 @@
_ConnectTunnelConnection,
_DNSCacheTable,
)
+from aiohttp.pytest_plugin import AiohttpClient, AiohttpServer
from aiohttp.resolver import ResolveResult
from aiohttp.test_utils import unused_port
from aiohttp.tracing import Trace
@@ -2515,6 +2516,36 @@ async def test_start_tls_exception_with_ssl_shutdown_timeout_nonzero_pre_311(
underlying_transport.abort.assert_not_called()
+async def test_start_tls_with_zero_total_timeout(
+ aiohttp_server: AiohttpServer,
+ aiohttp_client: AiohttpClient,
+ ssl_ctx: ssl.SSLContext,
+ client_ssl_ctx: ssl.SSLContext,
+) -> None:
+ """Test that ClientTimeout(total=0) works with TLS connections.
+
+ Regression test for https://github.com/aio-libs/aiohttp/issues/11859
+ When total=0 (meaning no timeout), ssl_handshake_timeout should receive
+ None instead of 0, as asyncio raises ValueError for 0.
+ """
+
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(text="ok")
+
+ app = web.Application()
+ app.router.add_get("/", handler)
+ server = await aiohttp_server(app, ssl=ssl_ctx)
+
+ connector = aiohttp.TCPConnector(ssl=client_ssl_ctx)
+ client = await aiohttp_client(server, connector=connector)
+
+ # This used to raise ValueError: ssl_handshake_timeout should be a
+ # positive number, got 0
+ async with client.get("/", timeout=ClientTimeout(total=0)) as resp:
+ assert resp.status == 200
+ assert await resp.text() == "ok"
+
+
async def test_invalid_ssl_param() -> None:
with pytest.raises(TypeError):
aiohttp.TCPConnector(ssl=object()) # type: ignore[arg-type]
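The `timeout.total or None` change in this patch works because `loop.start_tls()` requires `ssl_handshake_timeout` to be a positive number or `None`, while aiohttp treats `ClientTimeout(total=0)` as "no timeout". A minimal sketch of the coercion (the helper name is hypothetical; aiohttp inlines the expression):

```python
from typing import Optional

def coerce_handshake_timeout(total: Optional[float]) -> Optional[float]:
    # ClientTimeout(total=0) means "no timeout" in aiohttp, but
    # loop.start_tls() raises ValueError for ssl_handshake_timeout=0,
    # so falsy values (0, 0.0, None) are mapped to None.
    return total or None

assert coerce_handshake_timeout(0) is None
assert coerce_handshake_timeout(None) is None
assert coerce_handshake_timeout(30.0) == 30.0
```

Note the `or None` idiom deliberately conflates `0`, `0.0`, and `None`; that is safe here precisely because aiohttp already defines `total=0` to mean unlimited.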
From b17972f5ea2e8b8cb80486c8a5a5b9e67788d571 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Mon, 16 Feb 2026 15:12:03 +0000
Subject: [PATCH 059/141] [PR #12067/830b9421 backport][3.13] Fix ValueError
with ClientTimeout(total=0) in TLS connections (#12081)
**This is a backport of PR #12067 as merged into 3.14
(830b942100330a24c5dc0245f8ce3bdbfc2ef764).**
Co-authored-by: Varun Chawla <34209028+veeceey@users.noreply.github.com>
---
CHANGES/11859.bugfix.rst | 1 +
aiohttp/connector.py | 4 ++--
tests/test_connector.py | 31 +++++++++++++++++++++++++++++++
3 files changed, 34 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/11859.bugfix.rst
diff --git a/CHANGES/11859.bugfix.rst b/CHANGES/11859.bugfix.rst
new file mode 100644
index 00000000000..7e32de1a4fa
--- /dev/null
+++ b/CHANGES/11859.bugfix.rst
@@ -0,0 +1 @@
+Fixed :exc:`ValueError` when creating a TLS connection with ``ClientTimeout(total=0)`` by converting ``0`` to ``None`` before passing to ``ssl_handshake_timeout`` in :py:meth:`asyncio.loop.start_tls` -- by :user:`veeceey`.
diff --git a/aiohttp/connector.py b/aiohttp/connector.py
index 384741ed994..348f1271a7d 100644
--- a/aiohttp/connector.py
+++ b/aiohttp/connector.py
@@ -1455,7 +1455,7 @@ async def _start_tls_connection(
tls_proto,
sslcontext,
server_hostname=req.server_hostname or req.host,
- ssl_handshake_timeout=timeout.total,
+ ssl_handshake_timeout=timeout.total or None,
ssl_shutdown_timeout=self._ssl_shutdown_timeout,
)
else:
@@ -1464,7 +1464,7 @@ async def _start_tls_connection(
tls_proto,
sslcontext,
server_hostname=req.server_hostname or req.host,
- ssl_handshake_timeout=timeout.total,
+ ssl_handshake_timeout=timeout.total or None,
)
except BaseException:
# We need to close the underlying transport since
diff --git a/tests/test_connector.py b/tests/test_connector.py
index cf187d66b08..0755cff96ff 100644
--- a/tests/test_connector.py
+++ b/tests/test_connector.py
@@ -45,6 +45,7 @@
_ConnectTunnelConnection,
_DNSCacheTable,
)
+from aiohttp.pytest_plugin import AiohttpClient, AiohttpServer
from aiohttp.resolver import ResolveResult
from aiohttp.test_utils import unused_port
from aiohttp.tracing import Trace
@@ -2524,6 +2525,36 @@ async def test_start_tls_exception_with_ssl_shutdown_timeout_nonzero_pre_311(
underlying_transport.abort.assert_not_called()
+async def test_start_tls_with_zero_total_timeout(
+ aiohttp_server: AiohttpServer,
+ aiohttp_client: AiohttpClient,
+ ssl_ctx: ssl.SSLContext,
+ client_ssl_ctx: ssl.SSLContext,
+) -> None:
+ """Test that ClientTimeout(total=0) works with TLS connections.
+
+ Regression test for https://github.com/aio-libs/aiohttp/issues/11859
+ When total=0 (meaning no timeout), ssl_handshake_timeout should receive
+ None instead of 0, as asyncio raises ValueError for 0.
+ """
+
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response(text="ok")
+
+ app = web.Application()
+ app.router.add_get("/", handler)
+ server = await aiohttp_server(app, ssl=ssl_ctx)
+
+ connector = aiohttp.TCPConnector(ssl=client_ssl_ctx)
+ client = await aiohttp_client(server, connector=connector)
+
+ # This used to raise ValueError: ssl_handshake_timeout should be a
+ # positive number, got 0
+ async with client.get("/", timeout=ClientTimeout(total=0)) as resp:
+ assert resp.status == 200
+ assert await resp.text() == "ok"
+
+
async def test_invalid_ssl_param() -> None:
with pytest.raises(TypeError):
aiohttp.TCPConnector(ssl=object()) # type: ignore[arg-type]
From 367d8d1be897e9eb7f42655e19fb134d9d9bc9ef Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 17 Feb 2026 10:59:48 +0000
Subject: [PATCH 060/141] Bump gunicorn from 25.0.3 to 25.1.0 (#12084)
Bumps [gunicorn](https://github.com/benoitc/gunicorn) from 25.0.3 to
25.1.0.
Release notes
Sourced from gunicorn's releases.
Gunicorn 25.1.0
New Features
- Control Interface (gunicornc): Add interactive control interface for managing running Gunicorn instances, similar to birdc for BIRD routing daemon ([PR #3505](benoitc/gunicorn#3505))
  - Unix socket-based communication with JSON protocol
  - Interactive mode with readline support and command history
  - Commands: show all/workers/dirty/config/stats/listeners
  - Worker management: worker add/remove/kill, dirty add/remove
  - Server control: reload, reopen, shutdown
  - New settings: --control-socket, --control-socket-mode, --no-control-socket
  - New CLI tool: gunicornc for connecting to control socket
  - See Control Interface Guide for details
- Dirty Stash: Add global shared state between workers via dirty.stash ([PR #3503](benoitc/gunicorn#3503))
  - In-memory key-value store accessible by all workers
  - Supports get, set, delete, clear, keys, and has operations
  - Useful for sharing state like feature flags, rate limits, or cached data
- Dirty Binary Protocol: Implement efficient binary protocol for dirty arbiter IPC using TLV (Type-Length-Value) encoding ([PR #3500](benoitc/gunicorn#3500))
  - More efficient than JSON for binary data
  - Supports all Python types: str, bytes, int, float, bool, None, list, dict
  - Better performance for large payloads
- Dirty TTIN/TTOU Signals: Add dynamic worker scaling for dirty arbiters ([PR #3504](benoitc/gunicorn#3504))
  - Send SIGTTIN to increase dirty workers
  - Send SIGTTOU to decrease dirty workers
  - Respects minimum worker constraints from app configurations
Changes
- ASGI Worker: Promoted from beta to stable
- Dirty Arbiters: Now marked as beta feature
Documentation
- Fix Markdown formatting in /configure documentation
Commits
- 2d43101 docs: merge gunicornc into 25.1.0 release
- bf4ad8d docs: update 25.1.0 release date to 2026-02-13
- 730350e Merge pull request #3505 from benoitc/feature/gunicornc-control-interface
- 63df19b fix(tests): use process groups for reliable signal handling in PyPy
- cd77bcc fix(tests): increase wait time for all server tests
- 02ea985 fix(tests): improve server test reliability on FreeBSD
- 6d81c9e fix: resolve pylint warnings
- 7486baa fix: remove unused imports
- 3e60d29 docs: add gunicornc control interface guide
- e05e40d feat(ctl): add message-based dirty worker management
- Additional commits viewable in compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index 3f994357afe..dc6e4b907aa 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.3
+gunicorn==25.1.0
# via -r requirements/base-ft.in
idna==3.10
# via yarl
diff --git a/requirements/base.txt b/requirements/base.txt
index 4ee2e7d4394..f45e21291be 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.3
+gunicorn==25.1.0
# via -r requirements/base.in
idna==3.10
# via yarl
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 81009436de5..631c61a2282 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -83,7 +83,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.3
+gunicorn==25.1.0
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 4388f1d6e66..8719f25bbdf 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -81,7 +81,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.3
+gunicorn==25.1.0
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index b1fe76b012e..a5056a1596b 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.3
+gunicorn==25.1.0
# via -r requirements/base-ft.in
idna==3.10
# via
diff --git a/requirements/test.txt b/requirements/test.txt
index c1afdbc2d97..7c9a6553cb3 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-gunicorn==25.0.3
+gunicorn==25.1.0
# via -r requirements/base.in
idna==3.10
# via
From 800360a174b1d8531e432635c66c2e4a10d0b9a8 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Tue, 17 Feb 2026 19:05:48 +0000
Subject: [PATCH 061/141] [PR #12042/a52e149c backport][3.14] docs: Document
asyncio.TimeoutError for WebSocketResponse.receive methods (#12087)
**This is a backport of PR #12042 as merged into master
(a52e149caebe304ae45f7387fa0c7a8dac3b45f1).**
Co-authored-by: Varun Chawla <34209028+veeceey@users.noreply.github.com>
---
CHANGES/12042.doc.rst | 2 ++
docs/web_reference.rst | 7 +++++++
2 files changed, 9 insertions(+)
create mode 100644 CHANGES/12042.doc.rst
diff --git a/CHANGES/12042.doc.rst b/CHANGES/12042.doc.rst
new file mode 100644
index 00000000000..50c30f28cdf
--- /dev/null
+++ b/CHANGES/12042.doc.rst
@@ -0,0 +1,2 @@
+Documented :exc:`asyncio.TimeoutError` for ``WebSocketResponse.receive()``
+and related methods -- by :user:`veeceey`.
diff --git a/docs/web_reference.rst b/docs/web_reference.rst
index e8ca31f8614..c351a447610 100644
--- a/docs/web_reference.rst
+++ b/docs/web_reference.rst
@@ -1306,6 +1306,8 @@ and :ref:`aiohttp-web-signals` handlers::
:raise RuntimeError: if connection is not started
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
+
.. method:: receive_str(*, timeout=None)
:async:
@@ -1324,6 +1326,8 @@ and :ref:`aiohttp-web-signals` handlers::
:raise aiohttp.WSMessageTypeError: if message is not :const:`~aiohttp.WSMsgType.TEXT`.
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
+
.. method:: receive_bytes(*, timeout=None)
:async:
@@ -1343,6 +1347,8 @@ and :ref:`aiohttp-web-signals` handlers::
:raise aiohttp.WSMessageTypeError: if message is not :const:`~aiohttp.WSMsgType.BINARY`.
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
+
.. method:: receive_json(*, loads=json.loads, timeout=None)
:async:
@@ -1366,6 +1372,7 @@ and :ref:`aiohttp-web-signals` handlers::
:raise TypeError: if message is :const:`~aiohttp.WSMsgType.BINARY`.
:raise ValueError: if message is not valid JSON.
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
.. seealso:: :ref:`WebSockets handling`
From 99d1bc1abf7128460246299e8501d37a9e9f9e4e Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Tue, 17 Feb 2026 19:06:06 +0000
Subject: [PATCH 062/141] [PR #12042/a52e149c backport][3.13] docs: Document
asyncio.TimeoutError for WebSocketResponse.receive methods (#12086)
**This is a backport of PR #12042 as merged into master
(a52e149caebe304ae45f7387fa0c7a8dac3b45f1).**
Co-authored-by: Varun Chawla <34209028+veeceey@users.noreply.github.com>
---
CHANGES/12042.doc.rst | 2 ++
docs/web_reference.rst | 7 +++++++
2 files changed, 9 insertions(+)
create mode 100644 CHANGES/12042.doc.rst
diff --git a/CHANGES/12042.doc.rst b/CHANGES/12042.doc.rst
new file mode 100644
index 00000000000..50c30f28cdf
--- /dev/null
+++ b/CHANGES/12042.doc.rst
@@ -0,0 +1,2 @@
+Documented :exc:`asyncio.TimeoutError` for ``WebSocketResponse.receive()``
+and related methods -- by :user:`veeceey`.
diff --git a/docs/web_reference.rst b/docs/web_reference.rst
index 3ff3299b316..83a9c7f0ecd 100644
--- a/docs/web_reference.rst
+++ b/docs/web_reference.rst
@@ -1290,6 +1290,8 @@ and :ref:`aiohttp-web-signals` handlers::
:raise RuntimeError: if connection is not started
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
+
.. method:: receive_str(*, timeout=None)
:async:
@@ -1308,6 +1310,8 @@ and :ref:`aiohttp-web-signals` handlers::
:raise aiohttp.WSMessageTypeError: if message is not :const:`~aiohttp.WSMsgType.TEXT`.
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
+
.. method:: receive_bytes(*, timeout=None)
:async:
@@ -1327,6 +1331,8 @@ and :ref:`aiohttp-web-signals` handlers::
:raise aiohttp.WSMessageTypeError: if message is not :const:`~aiohttp.WSMsgType.BINARY`.
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
+
.. method:: receive_json(*, loads=json.loads, timeout=None)
:async:
@@ -1350,6 +1356,7 @@ and :ref:`aiohttp-web-signals` handlers::
:raise TypeError: if message is :const:`~aiohttp.WSMsgType.BINARY`.
:raise ValueError: if message is not valid JSON.
+ :raise asyncio.TimeoutError: if timeout expires before receiving a message
.. seealso:: :ref:`WebSockets handling`
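The two backports above document that each ``receive*`` method can raise :exc:`asyncio.TimeoutError` when its ``timeout`` expires. A minimal sketch of the caller-side handling pattern, using plain asyncio so it stays self-contained (the sleeping coroutine stands in for a WebSocket peer that never sends; aiohttp's methods apply their ``timeout`` the same way internally):

```python
import asyncio


async def slow_peer() -> str:
    # Stands in for a WebSocket peer that never sends a message.
    await asyncio.sleep(10)
    return "never reached"


async def main() -> str:
    try:
        # receive()/receive_str()/receive_bytes()/receive_json() behave
        # analogously: the wait is cancelled and TimeoutError propagates.
        return await asyncio.wait_for(slow_peer(), timeout=0.05)
    except asyncio.TimeoutError:
        return "timed out"


print(asyncio.run(main()))  # -> timed out
```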
From 564cdbfffde2ca8a3fef858815427ac2de695a6c Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed, 18 Feb 2026 10:54:58 +0000
Subject: [PATCH 063/141] Bump librt from 0.7.8 to 0.8.1 (#12090)
Bumps [librt](https://github.com/mypyc/librt) from 0.7.8 to 0.8.1.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 631c61a2282..1312123dd69 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -104,7 +104,7 @@ jinja2==3.1.6
# via
# sphinx
# towncrier
-librt==0.7.8
+librt==0.8.1
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 8719f25bbdf..771e82994a0 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -102,7 +102,7 @@ jinja2==3.1.6
# via
# sphinx
# towncrier
-librt==0.7.8
+librt==0.8.1
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 029f9dae23f..ecd2c2e3ee4 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -43,7 +43,7 @@ iniconfig==2.3.0
# via pytest
isal==1.7.2
# via -r requirements/lint.in
-librt==0.7.8
+librt==0.8.1
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 5091e21519e..71c2caf7bb1 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -34,7 +34,7 @@ iniconfig==2.3.0
# via pytest
isal==1.8.0 ; python_version < "3.14"
# via -r requirements/test-common.in
-librt==0.7.8
+librt==0.8.1
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index a5056a1596b..891b83979ad 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -57,7 +57,7 @@ iniconfig==2.3.0
# via pytest
isal==1.8.0 ; python_version < "3.14"
# via -r requirements/test-common.in
-librt==0.7.8
+librt==0.8.1
# via mypy
markdown-it-py==4.0.0
# via rich
diff --git a/requirements/test.txt b/requirements/test.txt
index 7c9a6553cb3..97566b1b745 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -57,7 +57,7 @@ iniconfig==2.3.0
# via pytest
isal==1.7.2 ; python_version < "3.14"
# via -r requirements/test-common.in
-librt==0.7.8
+librt==0.8.1
# via mypy
markdown-it-py==4.0.0
# via rich
From e69617b65d67fe63c6af33141e2a123e3993a10a Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 20 Feb 2026 10:53:42 +0000
Subject: [PATCH 064/141] Bump regex from 2026.1.15 to 2026.2.19 (#12100)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [regex](https://github.com/mrabarnett/mrab-regex) from 2026.1.15
to 2026.2.19.
Changelog
Sourced from regex's
changelog.
Version: 2026.2.19
Added \z as alias of \Z, like in re module.
Added prefixmatch as alias of match, like in re module.
Version: 2026.1.15
Re-uploaded.
Version: 2026.1.14
Git issue 596: Specifying {e<=0} causes ca 210× slow-down.
Added RISC-V wheels.
Version: 2025.11.3
Git issue 594: Support relative PARNO in recursive
subpatterns.
Version: 2025.10.23
'setup.py' was missing from the source distribution.
Version: 2025.10.22
Fixed test in main.yml.
Version: 2025.10.21
Moved tests into subfolder.
Version: 2025.10.20
Re-organised files.
Updated to Unicode 17.0.0.
Version: 2025.9.20
Enable free-threading support in cibuildwheel in another
place.
Version: 2025.9.19
Enable free-threading support in cibuildwheel.
Version: 2025.9.18
Git issue 565: Support the free-threaded build of CPython
3.13
... (truncated)
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
5 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 1312123dd69..114fa5b8bbd 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -201,7 +201,7 @@ pyyaml==6.0.3
# via pre-commit
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2026.1.15
+regex==2026.2.19
# via re-assert
requests==2.32.5
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 771e82994a0..18699d44ca4 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -196,7 +196,7 @@ pyyaml==6.0.3
# via pre-commit
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2026.1.15
+regex==2026.2.19
# via re-assert
requests==2.32.5
# via sphinx
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 71c2caf7bb1..e49053e0b67 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -87,7 +87,7 @@ python-on-whales==0.80.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2026.1.15
+regex==2026.2.19
# via re-assert
rich==14.3.2
# via pytest-codspeed
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 891b83979ad..fcf77a79177 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -122,7 +122,7 @@ python-on-whales==0.80.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2026.1.15
+regex==2026.2.19
# via re-assert
rich==14.3.2
# via pytest-codspeed
diff --git a/requirements/test.txt b/requirements/test.txt
index 97566b1b745..cf90673850d 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -122,7 +122,7 @@ python-on-whales==0.80.0
# via -r requirements/test-common.in
re-assert==1.1.0
# via -r requirements/test-common.in
-regex==2026.1.15
+regex==2026.2.19
# via re-assert
rich==14.3.2
# via pytest-codspeed
From bb61c2defb12825cdad6200858080d7e567e3dfe Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 20 Feb 2026 15:53:58 +0000
Subject: [PATCH 065/141] [PR #12101/e458a830 backport][3.14] Strip function
def from code example (#12103)
**This is a backport of PR #12101 as merged into master
(e458a830fcf7eae4fdc2c218c68ead3d1c4d69cb).**
---
docs/client_middleware_cookbook.rst | 2 ++
1 file changed, 2 insertions(+)
diff --git a/docs/client_middleware_cookbook.rst b/docs/client_middleware_cookbook.rst
index e890b02a4bf..5ecce0b4b1b 100644
--- a/docs/client_middleware_cookbook.rst
+++ b/docs/client_middleware_cookbook.rst
@@ -60,6 +60,8 @@ efficient to implement without a middleware:
.. literalinclude:: code/client_middleware_cookbook.py
:pyobject: token_refresh_preemptively_example
+ :lines: 2-
+ :dedent:
Or combine the above approaches to create a more robust solution.
From 8c1c05c5dab9af31b311791cfc2e2dbeccb78da9 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 20 Feb 2026 16:17:58 +0000
Subject: [PATCH 066/141] [PR #12101/e458a830 backport][3.13] Strip function
def from code example (#12102)
**This is a backport of PR #12101 as merged into master
(e458a830fcf7eae4fdc2c218c68ead3d1c4d69cb).**
---
docs/client_middleware_cookbook.rst | 2 ++
1 file changed, 2 insertions(+)
diff --git a/docs/client_middleware_cookbook.rst b/docs/client_middleware_cookbook.rst
index e890b02a4bf..5ecce0b4b1b 100644
--- a/docs/client_middleware_cookbook.rst
+++ b/docs/client_middleware_cookbook.rst
@@ -60,6 +60,8 @@ efficient to implement without a middleware:
.. literalinclude:: code/client_middleware_cookbook.py
:pyobject: token_refresh_preemptively_example
+ :lines: 2-
+ :dedent:
Or combine the above approaches to create a more robust solution.
From b4f67b19bb064781985208c9d874302ae6a7d286 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sat, 21 Feb 2026 00:07:46 +0000
Subject: [PATCH 067/141] [PR #12097/51d5dbac backport][3.14] Fix digest auth
dropping challenge fields with empty string values (#12108)
**This is a backport of PR #12097 as merged into master
(51d5dbac70b5321da631e224c384058948889608).**
Co-authored-by: Kadir Can Ozden <101993364+bysiber@users.noreply.github.com>
---
CHANGES/12097.bugfix.rst | 1 +
aiohttp/client_middleware_digest_auth.py | 2 +-
tests/test_client_middleware_digest_auth.py | 7 +++++++
3 files changed, 9 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12097.bugfix.rst
diff --git a/CHANGES/12097.bugfix.rst b/CHANGES/12097.bugfix.rst
new file mode 100644
index 00000000000..1ea88a8c087
--- /dev/null
+++ b/CHANGES/12097.bugfix.rst
@@ -0,0 +1 @@
+Fixed digest auth dropping challenge fields with empty string values -- by :user:`bysiber`.
diff --git a/aiohttp/client_middleware_digest_auth.py b/aiohttp/client_middleware_digest_auth.py
index 43e398ebd11..64257a65c18 100644
--- a/aiohttp/client_middleware_digest_auth.py
+++ b/aiohttp/client_middleware_digest_auth.py
@@ -414,7 +414,7 @@ def _authenticate(self, response: ClientResponse) -> bool:
# Extract challenge parameters
self._challenge = {}
for field in CHALLENGE_FIELDS:
- if value := header_pairs.get(field):
+ if (value := header_pairs.get(field)) is not None:
self._challenge[field] = value
# Update protection space based on domain parameter or default to origin
diff --git a/tests/test_client_middleware_digest_auth.py b/tests/test_client_middleware_digest_auth.py
index 52e6f97050a..c490fb70d78 100644
--- a/tests/test_client_middleware_digest_auth.py
+++ b/tests/test_client_middleware_digest_auth.py
@@ -114,6 +114,13 @@ def mock_md5_digest() -> Generator[mock.MagicMock, None, None]:
True,
{"realm": "test", "nonce": "abc", "qop": "auth"},
),
+ # Valid digest with empty realm (RFC 7616 Section 3.3 allows this)
+ (
+ 401,
+ {"www-authenticate": 'Digest realm="", nonce="abc", qop="auth"'},
+ True,
+ {"realm": "", "nonce": "abc", "qop": "auth"},
+ ),
# Non-401 status
(200, {}, False, {}), # No challenge should be set
],
From 39ae036bfa222bcaefa9a13cb5760b6523833387 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sat, 21 Feb 2026 00:08:02 +0000
Subject: [PATCH 068/141] [PR #12097/51d5dbac backport][3.13] Fix digest auth
dropping challenge fields with empty string values (#12107)
**This is a backport of PR #12097 as merged into master
(51d5dbac70b5321da631e224c384058948889608).**
Co-authored-by: Kadir Can Ozden <101993364+bysiber@users.noreply.github.com>
---
CHANGES/12097.bugfix.rst | 1 +
aiohttp/client_middleware_digest_auth.py | 2 +-
tests/test_client_middleware_digest_auth.py | 7 +++++++
3 files changed, 9 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12097.bugfix.rst
diff --git a/CHANGES/12097.bugfix.rst b/CHANGES/12097.bugfix.rst
new file mode 100644
index 00000000000..1ea88a8c087
--- /dev/null
+++ b/CHANGES/12097.bugfix.rst
@@ -0,0 +1 @@
+Fixed digest auth dropping challenge fields with empty string values -- by :user:`bysiber`.
diff --git a/aiohttp/client_middleware_digest_auth.py b/aiohttp/client_middleware_digest_auth.py
index 5aab5acb85a..d7f2f1eb9f8 100644
--- a/aiohttp/client_middleware_digest_auth.py
+++ b/aiohttp/client_middleware_digest_auth.py
@@ -425,7 +425,7 @@ def _authenticate(self, response: ClientResponse) -> bool:
# Extract challenge parameters
self._challenge = {}
for field in CHALLENGE_FIELDS:
- if value := header_pairs.get(field):
+ if (value := header_pairs.get(field)) is not None:
self._challenge[field] = value
# Update protection space based on domain parameter or default to origin
diff --git a/tests/test_client_middleware_digest_auth.py b/tests/test_client_middleware_digest_auth.py
index 453e78ab4c1..65e7d667e9d 100644
--- a/tests/test_client_middleware_digest_auth.py
+++ b/tests/test_client_middleware_digest_auth.py
@@ -113,6 +113,13 @@ def mock_md5_digest() -> Generator[mock.MagicMock, None, None]:
True,
{"realm": "test", "nonce": "abc", "qop": "auth"},
),
+ # Valid digest with empty realm (RFC 7616 Section 3.3 allows this)
+ (
+ 401,
+ {"www-authenticate": 'Digest realm="", nonce="abc", qop="auth"'},
+ True,
+ {"realm": "", "nonce": "abc", "qop": "auth"},
+ ),
# Non-401 status
(200, {}, False, {}), # No challenge should be set
],
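The fix in this pair of backports replaces a truthiness test with an explicit ``is not None`` check: ``header_pairs.get(field)`` returns ``""`` for ``realm=""`` (which RFC 7616 permits), and an empty string is falsy, so ``if value := ...`` silently dropped the field. A standalone reproduction of the pitfall, using a hypothetical pre-parsed ``header_pairs`` dict rather than aiohttp's actual parser:

```python
# Hypothetical parsed challenge; realm="" is legal per RFC 7616 s. 3.3.
header_pairs = {"realm": "", "nonce": "abc", "qop": "auth"}
fields = ("realm", "nonce", "qop")

# Buggy: a falsy-but-present value ("") fails the walrus truthiness test.
buggy = {}
for field in fields:
    if value := header_pairs.get(field):
        buggy[field] = value

# Fixed: distinguish "missing" (None) from "present but empty" ("").
fixed = {}
for field in fields:
    if (value := header_pairs.get(field)) is not None:
        fixed[field] = value

print(sorted(buggy))  # -> ['nonce', 'qop']  (realm silently lost)
print(sorted(fixed))  # -> ['nonce', 'qop', 'realm']
```

The general rule: ``dict.get`` conflates "absent" with "falsy" under a bare truthiness check; compare against ``None`` (or use ``field in mapping``) when empty values are meaningful.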
From 2712a3c5625ab0712995e4baa4041ab20b711c2e Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sat, 21 Feb 2026 00:10:53 +0000
Subject: [PATCH 069/141] Fix multipart injection (#12104) (#12109)
(cherry picked from commit dab9e879be5606682a39b9dd378900eba0afd1a4)
Co-authored-by: mingi jung
---
aiohttp/formdata.py | 5 +++++
tests/test_formdata.py | 16 +++++++++++-----
2 files changed, 16 insertions(+), 5 deletions(-)
diff --git a/aiohttp/formdata.py b/aiohttp/formdata.py
index 869a966a712..5fe37f62dce 100644
--- a/aiohttp/formdata.py
+++ b/aiohttp/formdata.py
@@ -79,6 +79,11 @@ def add_field(
raise TypeError(
"content_type must be an instance of str. Got: %s" % content_type
)
+ if "\r" in content_type or "\n" in content_type:
+ raise ValueError(
+ "Newline or carriage return detected in headers. "
+ "Potential header injection attack."
+ )
headers[hdrs.CONTENT_TYPE] = content_type
self._is_multipart = True
if content_transfer_encoding is not None:
diff --git a/tests/test_formdata.py b/tests/test_formdata.py
index 5fe8f92b097..d826c92e67d 100644
--- a/tests/test_formdata.py
+++ b/tests/test_formdata.py
@@ -67,12 +67,18 @@ async def test_formdata_textio_charset(buf: bytearray, writer) -> None:
assert b"\x93\xfa\x96{" in buf
-def test_invalid_formdata_content_type() -> None:
+@pytest.mark.parametrize("val", (0, 0.1, {}, [], b"foo"))
+def test_invalid_type_formdata_content_type(val: object) -> None:
form = FormData()
- invalid_vals = [0, 0.1, {}, [], b"foo"]
- for invalid_val in invalid_vals:
- with pytest.raises(TypeError):
- form.add_field("foo", "bar", content_type=invalid_val)
+ with pytest.raises(TypeError):
+ form.add_field("foo", "bar", content_type=val) # type: ignore[arg-type]
+
+
+@pytest.mark.parametrize("val", ("\r", "\n", "a\ra\n", "a\na\r"))
+def test_invalid_value_formdata_content_type(val: str) -> None:
+ form = FormData()
+ with pytest.raises(ValueError):
+ form.add_field("foo", "bar", content_type=val)
def test_invalid_formdata_filename() -> None:
From 9a6ada97e2c6cf1ce31727c6c9fcea17c21f6f06 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sat, 21 Feb 2026 00:17:11 +0000
Subject: [PATCH 070/141] Fix multipart injection (#12104) (#12110)
(cherry picked from commit dab9e879be5606682a39b9dd378900eba0afd1a4)
Co-authored-by: mingi jung
---
aiohttp/formdata.py | 5 +++++
tests/test_formdata.py | 16 +++++++++++-----
2 files changed, 16 insertions(+), 5 deletions(-)
diff --git a/aiohttp/formdata.py b/aiohttp/formdata.py
index a5a4f603e19..17d219a32c0 100644
--- a/aiohttp/formdata.py
+++ b/aiohttp/formdata.py
@@ -78,6 +78,11 @@ def add_field(
raise TypeError(
"content_type must be an instance of str. Got: %s" % content_type
)
+ if "\r" in content_type or "\n" in content_type:
+ raise ValueError(
+ "Newline or carriage return detected in headers. "
+ "Potential header injection attack."
+ )
headers[hdrs.CONTENT_TYPE] = content_type
self._is_multipart = True
if content_transfer_encoding is not None:
diff --git a/tests/test_formdata.py b/tests/test_formdata.py
index 5fe8f92b097..d826c92e67d 100644
--- a/tests/test_formdata.py
+++ b/tests/test_formdata.py
@@ -67,12 +67,18 @@ async def test_formdata_textio_charset(buf: bytearray, writer) -> None:
assert b"\x93\xfa\x96{" in buf
-def test_invalid_formdata_content_type() -> None:
+@pytest.mark.parametrize("val", (0, 0.1, {}, [], b"foo"))
+def test_invalid_type_formdata_content_type(val: object) -> None:
form = FormData()
- invalid_vals = [0, 0.1, {}, [], b"foo"]
- for invalid_val in invalid_vals:
- with pytest.raises(TypeError):
- form.add_field("foo", "bar", content_type=invalid_val)
+ with pytest.raises(TypeError):
+ form.add_field("foo", "bar", content_type=val) # type: ignore[arg-type]
+
+
+@pytest.mark.parametrize("val", ("\r", "\n", "a\ra\n", "a\na\r"))
+def test_invalid_value_formdata_content_type(val: str) -> None:
+ form = FormData()
+ with pytest.raises(ValueError):
+ form.add_field("foo", "bar", content_type=val)
def test_invalid_formdata_filename() -> None:
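The guard added above rejects ``\r`` and ``\n`` in ``content_type`` before the value is written into a multipart part header: a raw CRLF would let an attacker terminate the header line and inject additional headers or body content. A self-contained sketch of the same validation, with a hypothetical ``set_content_type`` helper standing in for ``FormData.add_field``:

```python
def set_content_type(headers: dict, content_type: str) -> None:
    """Store a Content-Type header value, rejecting CRLF injection."""
    if not isinstance(content_type, str):
        raise TypeError(
            f"content_type must be str, got {type(content_type).__name__}"
        )
    if "\r" in content_type or "\n" in content_type:
        raise ValueError(
            "Newline or carriage return detected in headers. "
            "Potential header injection attack."
        )
    headers["Content-Type"] = content_type


headers = {}
set_content_type(headers, "text/plain")  # accepted
try:
    # Without the check, this would smuggle an extra header line
    # into the serialized multipart body.
    set_content_type(headers, "text/plain\r\nX-Evil: 1")
except ValueError as exc:
    print("rejected:", exc)
```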
From 9b63f4c20cd7a0d881dff59f4fc7302d788d2ccc Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sat, 21 Feb 2026 22:31:05 +0000
Subject: [PATCH 071/141] [PR #12069/3a39006d backport][3.14] Upgrade to llhttp
3.9.1 (#12072)
**This is a backport of PR #12069 as merged into master
(3a39006da33970ac4b46256558ea2add7b4bd075).**
Co-authored-by: Sam Bull
---
CHANGES/12069.packaging.rst | 1 +
vendor/llhttp | 2 +-
2 files changed, 2 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12069.packaging.rst
diff --git a/CHANGES/12069.packaging.rst b/CHANGES/12069.packaging.rst
new file mode 100644
index 00000000000..a5c79a7cebe
--- /dev/null
+++ b/CHANGES/12069.packaging.rst
@@ -0,0 +1 @@
+Upgraded llhttp to 3.9.1 -- by :user:`Dreamsorcerer`.
diff --git a/vendor/llhttp b/vendor/llhttp
index 36151b9a7d6..06b12e87f20 160000
--- a/vendor/llhttp
+++ b/vendor/llhttp
@@ -1 +1 @@
-Subproject commit 36151b9a7d6320072e24e472a769a5e09f9e969d
+Subproject commit 06b12e87f209da43e3e9e0f958b7464a4a218896
From 7d590fe1f90a440cdca6642e78721be2ac05d1c0 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sat, 21 Feb 2026 22:31:16 +0000
Subject: [PATCH 072/141] [PR #12069/3a39006d backport][3.13] Upgrade to llhttp
3.9.1 (#12071)
**This is a backport of PR #12069 as merged into master
(3a39006da33970ac4b46256558ea2add7b4bd075).**
Co-authored-by: Sam Bull
---
CHANGES/12069.packaging.rst | 1 +
vendor/llhttp | 2 +-
2 files changed, 2 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12069.packaging.rst
diff --git a/CHANGES/12069.packaging.rst b/CHANGES/12069.packaging.rst
new file mode 100644
index 00000000000..a5c79a7cebe
--- /dev/null
+++ b/CHANGES/12069.packaging.rst
@@ -0,0 +1 @@
+Upgraded llhttp to 3.9.1 -- by :user:`Dreamsorcerer`.
diff --git a/vendor/llhttp b/vendor/llhttp
index 36151b9a7d6..06b12e87f20 160000
--- a/vendor/llhttp
+++ b/vendor/llhttp
@@ -1 +1 @@
-Subproject commit 36151b9a7d6320072e24e472a769a5e09f9e969d
+Subproject commit 06b12e87f209da43e3e9e0f958b7464a4a218896
From f82bc8aad2f1e5635381044f03ca8640e9727939 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sun, 22 Feb 2026 12:53:37 +0000
Subject: [PATCH 073/141] Allow `JSONEncoder` to return bytes directly (#11989)
(#12115)
(cherry picked from commit 67fa1f5e64a95a5e081f566c4718bc7062c7dbdb)
---------
Co-authored-by: Kevin Park
---
CHANGES/11989.feature.rst | 7 +++
aiohttp/client.py | 17 ++++++-
aiohttp/client_ws.py | 15 ++++++
aiohttp/payload.py | 26 ++++++++++-
aiohttp/typedefs.py | 1 +
aiohttp/web.py | 2 +
aiohttp/web_response.py | 39 +++++++++++++++-
aiohttp/web_ws.py | 16 ++++++-
docs/client_reference.rst | 22 +++++++++
docs/spelling_wordlist.txt | 1 +
docs/web_reference.rst | 33 +++++++++++++
tests/test_client_functional.py | 23 ++++++++-
tests/test_client_ws_functional.py | 75 ++++++++++++++++++++++++++++++
tests/test_payload.py | 34 ++++++++++++++
tests/test_web_response.py | 46 +++++++++++++++++-
tests/test_web_websocket.py | 49 ++++++++++++++++++-
16 files changed, 396 insertions(+), 10 deletions(-)
create mode 100644 CHANGES/11989.feature.rst
diff --git a/CHANGES/11989.feature.rst b/CHANGES/11989.feature.rst
new file mode 100644
index 00000000000..ced05b5e100
--- /dev/null
+++ b/CHANGES/11989.feature.rst
@@ -0,0 +1,7 @@
+Added explicit APIs for bytes-returning JSON serializer:
+``JSONBytesEncoder`` type, ``JsonBytesPayload``,
+:func:`~aiohttp.web.json_bytes_response`,
+:meth:`~aiohttp.web.WebSocketResponse.send_json_bytes` and
+:meth:`~aiohttp.ClientWebSocketResponse.send_json_bytes` methods, and
+``json_serialize_bytes`` parameter for :class:`~aiohttp.ClientSession`
+-- by :user:`kevinpark1217`.
diff --git a/aiohttp/client.py b/aiohttp/client.py
index 7a6df06c427..7c18e8c3318 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -104,7 +104,14 @@
from .http import WS_KEY, HttpVersion, WebSocketReader, WebSocketWriter
from .http_websocket import WSHandshakeError, ws_ext_gen, ws_ext_parse
from .tracing import Trace, TraceConfig
-from .typedefs import JSONEncoder, LooseCookies, LooseHeaders, Query, StrOrURL
+from .typedefs import (
+ JSONBytesEncoder,
+ JSONEncoder,
+ LooseCookies,
+ LooseHeaders,
+ Query,
+ StrOrURL,
+)
__all__ = (
# client_exceptions
@@ -271,6 +278,7 @@ class ClientSession:
"_default_auth",
"_version",
"_json_serialize",
+ "_json_serialize_bytes",
"_requote_redirect_url",
"_timeout",
"_raise_for_status",
@@ -311,6 +319,7 @@ def __init__(
skip_auto_headers: Iterable[str] | None = None,
auth: BasicAuth | None = None,
json_serialize: JSONEncoder = json.dumps,
+ json_serialize_bytes: JSONBytesEncoder | None = None,
request_class: type[ClientRequest] = ClientRequest,
response_class: type[ClientResponse] = ClientResponse,
ws_response_class: type[ClientWebSocketResponse] = ClientWebSocketResponse,
@@ -421,6 +430,7 @@ def __init__(
self._default_auth = auth
self._version = version
self._json_serialize = json_serialize
+ self._json_serialize_bytes = json_serialize_bytes
self._raise_for_status = raise_for_status
self._auto_decompress = auto_decompress
self._trust_env = trust_env
@@ -561,7 +571,10 @@ async def _request(
"data and json parameters can not be used at the same time"
)
elif json is not None:
- data = payload.JsonPayload(json, dumps=self._json_serialize)
+ if self._json_serialize_bytes is not None:
+ data = payload.JsonBytesPayload(json, dumps=self._json_serialize_bytes)
+ else:
+ data = payload.JsonPayload(json, dumps=self._json_serialize)
if not isinstance(chunked, bool) and chunked is not None:
warnings.warn("Chunk size is deprecated #1615", DeprecationWarning)
diff --git a/aiohttp/client_ws.py b/aiohttp/client_ws.py
index ee50f1df6eb..02f560f68d2 100644
--- a/aiohttp/client_ws.py
+++ b/aiohttp/client_ws.py
@@ -27,6 +27,7 @@
from .typedefs import (
DEFAULT_JSON_DECODER,
DEFAULT_JSON_ENCODER,
+ JSONBytesEncoder,
JSONDecoder,
JSONEncoder,
)
@@ -303,6 +304,20 @@ async def send_json(
) -> None:
await self.send_str(dumps(data), compress=compress)
+ async def send_json_bytes(
+ self,
+ data: Any,
+ compress: int | None = None,
+ *,
+ dumps: JSONBytesEncoder,
+ ) -> None:
+ """Send JSON data using a bytes-returning encoder as a binary frame.
+
+ Use this when your JSON encoder (like orjson) returns bytes
+ instead of str, avoiding the encode/decode overhead.
+ """
+ await self.send_bytes(dumps(data), compress=compress)
+
async def close(self, *, code: int = WSCloseCode.OK, message: bytes = b"") -> bool:
# we need to break `receive()` cycle first,
# `close()` may be called from different task
diff --git a/aiohttp/payload.py b/aiohttp/payload.py
index 98347d8a885..5ea00039be7 100644
--- a/aiohttp/payload.py
+++ b/aiohttp/payload.py
@@ -23,7 +23,7 @@
sentinel,
)
from .streams import StreamReader
-from .typedefs import JSONEncoder, _CIMultiDict
+from .typedefs import JSONBytesEncoder, JSONEncoder, _CIMultiDict
__all__ = (
"PAYLOAD_REGISTRY",
@@ -38,6 +38,7 @@
"TextIOPayload",
"StringIOPayload",
"JsonPayload",
+ "JsonBytesPayload",
"AsyncIterablePayload",
)
@@ -943,6 +944,29 @@ def __init__(
)
+class JsonBytesPayload(BytesPayload):
+ """JSON payload for encoders that return bytes directly.
+
+ Use this when your JSON encoder (like orjson) returns bytes
+ instead of str, avoiding the encode/decode overhead.
+ """
+
+ def __init__(
+ self,
+ value: Any,
+ dumps: JSONBytesEncoder,
+ content_type: str = "application/json",
+ *args: Any,
+ **kwargs: Any,
+ ) -> None:
+ super().__init__(
+ dumps(value),
+ content_type=content_type,
+ *args,
+ **kwargs,
+ )
+
+
if TYPE_CHECKING:
from collections.abc import AsyncIterable, AsyncIterator
diff --git a/aiohttp/typedefs.py b/aiohttp/typedefs.py
index dd7ad257460..ef27affdd3f 100644
--- a/aiohttp/typedefs.py
+++ b/aiohttp/typedefs.py
@@ -27,6 +27,7 @@
Byteish = Union[bytes, bytearray, memoryview]
JSONEncoder = Callable[[Any], str]
+JSONBytesEncoder = Callable[[Any], bytes]
JSONDecoder = Callable[[str], Any]
LooseHeaders = Union[
Mapping[str, str],
diff --git a/aiohttp/web.py b/aiohttp/web.py
index c1ab12e84ed..071f51d9fe2 100644
--- a/aiohttp/web.py
+++ b/aiohttp/web.py
@@ -96,6 +96,7 @@
ContentCoding as ContentCoding,
Response as Response,
StreamResponse as StreamResponse,
+ json_bytes_response as json_bytes_response,
json_response as json_response,
)
from .web_routedef import (
@@ -228,6 +229,7 @@
"ContentCoding",
"Response",
"StreamResponse",
+ "json_bytes_response",
"json_response",
"ResponseKey",
# web_routedef
diff --git a/aiohttp/web_response.py b/aiohttp/web_response.py
index 4bf6afc32a3..9706cade9b5 100644
--- a/aiohttp/web_response.py
+++ b/aiohttp/web_response.py
@@ -32,12 +32,18 @@
)
from .http import SERVER_SOFTWARE, HttpVersion10, HttpVersion11
from .payload import Payload
-from .typedefs import JSONEncoder, LooseHeaders
+from .typedefs import JSONBytesEncoder, JSONEncoder, LooseHeaders
REASON_PHRASES = {http_status.value: http_status.phrase for http_status in HTTPStatus}
LARGE_BODY_SIZE = 1024**2
-__all__ = ("ContentCoding", "StreamResponse", "Response", "json_response")
+__all__ = (
+ "ContentCoding",
+ "StreamResponse",
+ "Response",
+ "json_response",
+ "json_bytes_response",
+)
if TYPE_CHECKING:
@@ -878,3 +884,32 @@ def json_response(
headers=headers,
content_type=content_type,
)
+
+
+def json_bytes_response(
+ data: Any = sentinel,
+ *,
+ dumps: JSONBytesEncoder,
+ body: bytes | None = None,
+ status: int = 200,
+ reason: str | None = None,
+ headers: LooseHeaders | None = None,
+ content_type: str = "application/json",
+) -> Response:
+ """Create a JSON response using a bytes-returning encoder.
+
+ Use this when your JSON encoder (like orjson) returns bytes
+ instead of str, avoiding the encode/decode overhead.
+ """
+ if data is not sentinel:
+ if body is not None:
+ raise ValueError("only one of data or body should be specified")
+ else:
+ body = dumps(data)
+ return Response(
+ body=body,
+ status=status,
+ reason=reason,
+ headers=headers,
+ content_type=content_type,
+ )
diff --git a/aiohttp/web_ws.py b/aiohttp/web_ws.py
index 0be33c6cd8d..1f305c16193 100644
--- a/aiohttp/web_ws.py
+++ b/aiohttp/web_ws.py
@@ -34,7 +34,7 @@
from .http_websocket import _INTERNAL_RECEIVE_TYPES
from .log import ws_logger
from .streams import EofStream
-from .typedefs import JSONDecoder, JSONEncoder
+from .typedefs import JSONBytesEncoder, JSONDecoder, JSONEncoder
from .web_exceptions import HTTPBadRequest, HTTPException
from .web_request import BaseRequest
from .web_response import StreamResponse
@@ -481,6 +481,20 @@ async def send_json(
) -> None:
await self.send_str(dumps(data), compress=compress)
+ async def send_json_bytes(
+ self,
+ data: Any,
+ compress: int | None = None,
+ *,
+ dumps: JSONBytesEncoder,
+ ) -> None:
+ """Send JSON data as a binary frame using a bytes-returning encoder.
+
+ Use this when your JSON encoder (like orjson) returns bytes
+ instead of str, avoiding the encode/decode overhead.
+ """
+ await self.send_bytes(dumps(data), compress=compress)
+
async def write_eof(self) -> None: # type: ignore[override]
if self._eof_sent:
return
diff --git a/docs/client_reference.rst b/docs/client_reference.rst
index 0b1ae5dff19..e8db2ac602d 100644
--- a/docs/client_reference.rst
+++ b/docs/client_reference.rst
@@ -1850,6 +1850,28 @@ manually.
The method is converted into :term:`coroutine`,
*compress* parameter added.
+ .. method:: send_json_bytes(data, compress=None, *, dumps)
+ :async:
+
+ Send *data* to peer as a JSON binary frame using a bytes-returning encoder.
+
+ :param data: data to send.
+
+ :param int compress: sets specific level of compression for
+ single message,
+ ``None`` for not overriding per-socket setting.
+
+ :param collections.abc.Callable dumps: any :term:`callable` that accepts an object and
+ returns JSON as :class:`bytes`
+ (e.g. ``orjson.dumps``).
+
+ :raise RuntimeError: if the connection is not started or is closing
+
+ :raise ValueError: if *data* is not a serializable object
+
+ :raise TypeError: if the value returned by ``dumps(data)`` is not
+ :class:`bytes`
+
.. method:: send_frame(message, opcode, compress=None)
:async:
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 9b5eafcea4a..cc92a04c2f5 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -227,6 +227,7 @@ nowait
OAuth
Online
optimizations
+orjson
os
outcoming
Overridable
diff --git a/docs/web_reference.rst b/docs/web_reference.rst
index c351a447610..0870538b211 100644
--- a/docs/web_reference.rst
+++ b/docs/web_reference.rst
@@ -1232,6 +1232,27 @@ and :ref:`aiohttp-web-signals` handlers::
The method is converted into :term:`coroutine`,
*compress* parameter added.
+ .. method:: send_json_bytes(data, compress=None, *, dumps)
+ :async:
+
+ Send *data* to peer as a JSON binary frame using a bytes-returning encoder.
+
+ :param data: data to send.
+
+ :param int compress: sets specific level of compression for
+ single message,
+ ``None`` for not overriding per-socket setting.
+
+ :param collections.abc.Callable dumps: any :term:`callable` that accepts an object and
+ returns JSON as :class:`bytes`
+ (e.g. ``orjson.dumps``).
+
+ :raise RuntimeError: if the connection is not started
+
+ :raise ValueError: if *data* is not a serializable object
+
+ :raise TypeError: if the value returned by ``dumps(data)`` is not :class:`bytes`
+
.. method:: send_frame(message, opcode, compress=None)
:async:
@@ -1445,6 +1466,18 @@ the handler.
:class:`Response` initializer.
+.. function:: json_bytes_response([data], *, dumps, body=None, \
+ status=200, reason=None, headers=None, \
+ content_type='application/json')
+
+Return a :class:`Response` with the predefined ``'application/json'``
+content type and *data* encoded by the ``dumps`` parameter,
+which must return :class:`bytes` directly (e.g. ``orjson.dumps``).
+
+Use this when your JSON encoder returns :class:`bytes` instead of :class:`str`,
+avoiding the :class:`str`-to-:class:`bytes` encoding overhead.
+
+
.. class:: ResponseKey(name, t)
Keys for use in :class:`Response`.
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index 2d18a27d99a..cd83503d798 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -2244,7 +2244,28 @@ def dumps(obj):
await client.post("/", data="some data", json={"some": "data"})
-async def test_expect_continue(aiohttp_client) -> None:
+async def test_json_serialize_bytes(aiohttp_client: AiohttpClient) -> None:
+ """Test ClientSession.json_serialize_bytes with a bytes-returning encoder."""
+
+ async def handler(request: web.Request) -> web.Response:
+ assert request.content_type == "application/json"
+ data = await request.json()
+ return web.Response(body=aiohttp.JsonPayload(data))
+
+ json_bytes_encoder = mock.Mock(side_effect=lambda x: json.dumps(x).encode("utf-8"))
+
+ app = web.Application()
+ app.router.add_post("/", handler)
+ client = await aiohttp_client(app, json_serialize_bytes=json_bytes_encoder)
+
+ async with client.post("/", json={"some": "data"}) as resp:
+ assert resp.status == 200
+ assert json_bytes_encoder.called
+ content = await resp.json()
+ assert content == {"some": "data"}
+
+
+async def test_expect_continue(aiohttp_client: AiohttpClient) -> None:
expect_called = False
async def handler(request):
diff --git a/tests/test_client_ws_functional.py b/tests/test_client_ws_functional.py
index 13c03fe0563..6a4f2296fe8 100644
--- a/tests/test_client_ws_functional.py
+++ b/tests/test_client_ws_functional.py
@@ -188,6 +188,81 @@ async def handler(request: web.Request) -> web.WebSocketResponse:
await resp.close()
+async def test_send_recv_json_bytes(aiohttp_client: AiohttpClient) -> None:
+ async def handler(request: web.Request) -> web.WebSocketResponse:
+ ws = web.WebSocketResponse()
+ await ws.prepare(request)
+ await ws.send_bytes(json.dumps({"response": "x"}).encode())
+ await ws.close()
+ return ws
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ client = await aiohttp_client(app)
+ resp = await client.ws_connect("/")
+ data = await resp.receive()
+ assert data.json() == {"response": "x"}
+ await resp.close()
+
+
+async def test_send_json_bytes_client(aiohttp_client: AiohttpClient) -> None:
+ """Test ClientWebSocketResponse.send_json_bytes sends binary frame."""
+
+ async def handler(request: web.Request) -> web.WebSocketResponse:
+ ws = web.WebSocketResponse()
+ await ws.prepare(request)
+
+ msg = await ws.receive()
+ assert msg.type is WSMsgType.BINARY
+ data = json.loads(msg.data)
+ await ws.send_json_bytes(
+ {"response": data["request"]},
+ dumps=lambda x: json.dumps(x).encode("utf-8"),
+ )
+ await ws.close()
+ return ws
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ client = await aiohttp_client(app)
+ resp = await client.ws_connect("/")
+ test_payload = {"request": "test"}
+ await resp.send_json_bytes(
+ test_payload, dumps=lambda x: json.dumps(x).encode("utf-8")
+ )
+
+ msg = await resp.receive()
+ assert msg.type is WSMsgType.BINARY
+ data = json.loads(msg.data)
+ assert data["response"] == test_payload["request"]
+ await resp.close()
+
+
+async def test_send_json_bytes_custom_encoder(aiohttp_client: AiohttpClient) -> None:
+ """Test send_json_bytes with custom bytes-returning encoder."""
+
+ async def handler(request: web.Request) -> web.WebSocketResponse:
+ ws = web.WebSocketResponse()
+ await ws.prepare(request)
+
+ msg = await ws.receive()
+ assert msg.type is WSMsgType.BINARY
+ # Custom encoder uses compact separators
+ assert msg.data == b'{"test":"value"}'
+ await ws.close()
+ return ws
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ client = await aiohttp_client(app)
+ resp = await client.ws_connect("/")
+ await resp.send_json_bytes(
+ {"test": "value"},
+ dumps=lambda x: json.dumps(x, separators=(",", ":")).encode("utf-8"),
+ )
+ await resp.close()
+
+
async def test_send_recv_frame(aiohttp_client: AiohttpClient) -> None:
async def handler(request: web.Request) -> web.WebSocketResponse:
ws = web.WebSocketResponse()
diff --git a/tests/test_payload.py b/tests/test_payload.py
index 9f79440ef87..6fcc00449b8 100644
--- a/tests/test_payload.py
+++ b/tests/test_payload.py
@@ -1222,6 +1222,40 @@ def test_json_payload_size() -> None:
assert jp_custom.size == len(expected_custom.encode("utf-16"))
+def test_json_bytes_payload() -> None:
+ """Test JsonBytesPayload with a bytes-returning encoder."""
+ data = {"hello": "world"}
+
+ # Test with standard library encoder
+ jp = payload.JsonBytesPayload(data, dumps=lambda x: json.dumps(x).encode("utf-8"))
+ expected = json.dumps(data).encode("utf-8")
+ assert jp.size == len(expected)
+
+ # Test with custom bytes-returning encoder (compact separators)
+ jp_custom = payload.JsonBytesPayload(
+ data, dumps=lambda x: json.dumps(x, separators=(",", ":")).encode("utf-8")
+ )
+ expected_custom = json.dumps(data, separators=(",", ":")).encode("utf-8")
+ assert jp_custom.size == len(expected_custom)
+
+
+def test_json_bytes_payload_content_type() -> None:
+ """Test JsonBytesPayload content_type."""
+ data = {"test": "data"}
+
+ # Default content type
+ jp = payload.JsonBytesPayload(data, dumps=lambda x: json.dumps(x).encode("utf-8"))
+ assert jp.content_type == "application/json"
+
+ # Custom content type
+ jp_custom = payload.JsonBytesPayload(
+ data,
+ dumps=lambda x: json.dumps(x).encode("utf-8"),
+ content_type="application/vnd.api+json",
+ )
+ assert jp_custom.content_type == "application/vnd.api+json"
+
+
async def test_text_io_payload_size_matches_file_encoding(tmp_path: Path) -> None:
"""Test TextIOPayload.size when file encoding matches payload encoding."""
# Create UTF-8 file
diff --git a/tests/test_web_response.py b/tests/test_web_response.py
index 011e3ef1a0c..a6467367d61 100644
--- a/tests/test_web_response.py
+++ b/tests/test_web_response.py
@@ -13,7 +13,7 @@
from multidict import CIMultiDict, CIMultiDictProxy, MultiDict
from re_assert import Matches
-from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs
+from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs, web
from aiohttp.helpers import ETag
from aiohttp.http_writer import StreamWriter, _serialize_headers
from aiohttp.multipart import BodyPartReader, MultipartWriter
@@ -1620,6 +1620,50 @@ def test_content_type_is_overrideable(self) -> None:
assert "application/vnd.json+api" == resp.content_type
+class TestJSONBytesResponse:
+ def test_content_type_is_application_json_by_default(self) -> None:
+ resp = web.json_bytes_response(
+ "", dumps=lambda x: json.dumps(x).encode("utf-8")
+ )
+ assert "application/json" == resp.content_type
+
+ def test_passing_body_only(self) -> None:
+ resp = web.json_bytes_response(
+ dumps=lambda x: json.dumps(x).encode("utf-8"),
+ body=b'"jaysawn"',
+ )
+ assert resp.body == b'"jaysawn"'
+
+ def test_data_and_body_raises_value_error(self) -> None:
+ with pytest.raises(ValueError) as excinfo:
+ web.json_bytes_response(
+ data="foo", dumps=lambda x: json.dumps(x).encode("utf-8"), body=b"bar"
+ )
+ expected_message = "only one of data or body should be specified"
+ assert expected_message == excinfo.value.args[0]
+
+ def test_body_is_json_encoded_bytes(self) -> None:
+ resp = web.json_bytes_response(
+ {"foo": 42}, dumps=lambda x: json.dumps(x).encode("utf-8")
+ )
+ assert json.dumps({"foo": 42}).encode("utf-8") == resp.body
+
+ def test_content_type_is_overrideable(self) -> None:
+ resp = web.json_bytes_response(
+ {"foo": 42},
+ dumps=lambda x: json.dumps(x).encode("utf-8"),
+ content_type="application/vnd.json+api",
+ )
+ assert "application/vnd.json+api" == resp.content_type
+
+ def test_custom_dumps(self) -> None:
+ resp = web.json_bytes_response(
+ {"foo": 42},
+ dumps=lambda x: json.dumps(x, separators=(",", ":")).encode("utf-8"),
+ )
+ assert b'{"foo":42}' == resp.body
+
+
@pytest.mark.parametrize("loose_header_type", (MultiDict, CIMultiDict, dict))
async def test_passing_cimultidict_to_web_response_not_mutated(
loose_header_type: type,
diff --git a/tests/test_web_websocket.py b/tests/test_web_websocket.py
index 3593936e6eb..1678783d5da 100644
--- a/tests/test_web_websocket.py
+++ b/tests/test_web_websocket.py
@@ -1,6 +1,7 @@
import asyncio
+import json
import time
-from typing import Any
+from typing import Any, Protocol
from unittest import mock
import aiosignal
@@ -15,6 +16,16 @@
from aiohttp.web_ws import WebSocketReady
+class _RequestMaker(Protocol):
+ def __call__(
+ self,
+ method: str,
+ path: str,
+ headers: CIMultiDict[str] | None = None,
+ protocols: bool = False,
+ ) -> web.Request: ...
+
+
@pytest.fixture
def app(loop):
ret = mock.Mock()
@@ -204,6 +215,26 @@ async def test_send_json_nonjson(make_request) -> None:
await ws.send_json(set())
+async def test_nonstarted_send_json_bytes() -> None:
+ ws = web.WebSocketResponse()
+ with pytest.raises(RuntimeError):
+ await ws.send_json_bytes(
+ {"type": "json"}, dumps=lambda x: json.dumps(x).encode("utf-8")
+ )
+
+
+async def test_send_json_bytes_nonjson(make_request: _RequestMaker) -> None:
+ req = make_request("GET", "/")
+ ws = web.WebSocketResponse()
+ await ws.prepare(req)
+ with pytest.raises(TypeError):
+ await ws.send_json_bytes(set(), dumps=lambda x: json.dumps(x).encode("utf-8"))
+
+ assert ws._reader is not None
+ ws._reader.feed_data(WS_CLOSED_MESSAGE, 0)
+ await ws.close()
+
+
async def test_write_non_prepared() -> None:
ws = WebSocketResponse()
with pytest.raises(RuntimeError):
@@ -392,7 +423,21 @@ async def test_send_json_closed(make_request) -> None:
await ws.send_json({"type": "json"})
-async def test_send_frame_closed(make_request) -> None:
+async def test_send_json_bytes_closed(make_request: _RequestMaker) -> None:
+ req = make_request("GET", "/")
+ ws = web.WebSocketResponse()
+ await ws.prepare(req)
+ assert ws._reader is not None
+ ws._reader.feed_data(WS_CLOSED_MESSAGE, 0)
+ await ws.close()
+
+ with pytest.raises(ConnectionError):
+ await ws.send_json_bytes(
+ {"type": "json"}, dumps=lambda x: json.dumps(x).encode("utf-8")
+ )
+
+
+async def test_send_frame_closed(make_request: _RequestMaker) -> None:
req = make_request("GET", "/")
ws = WebSocketResponse()
await ws.prepare(req)
From 58e7f8fbd1dd6619d576f7a7aa5ac98ed480ec9b Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 22 Feb 2026 12:57:33 +0000
Subject: [PATCH 074/141] [PR #12106/8ab84c52 backport][3.14] Bound DNS cache
(#12116)
**This is a backport of PR #12106 as merged into master
(8ab84c52fe58ef34794fa9b12f00b06e626adcc0).**
---------
Co-authored-by: Sam Bull
Co-authored-by: gonas
---
CHANGES/12106.feature.rst | 1 +
aiohttp/connector.py | 24 +++++++++----
tests/test_connector.py | 76 +++++++++++++++++++++++++++++++++++++++
3 files changed, 95 insertions(+), 6 deletions(-)
create mode 100644 CHANGES/12106.feature.rst
diff --git a/CHANGES/12106.feature.rst b/CHANGES/12106.feature.rst
new file mode 100644
index 00000000000..daa9088eed6
--- /dev/null
+++ b/CHANGES/12106.feature.rst
@@ -0,0 +1 @@
+Added a ``dns_cache_max_size`` parameter to ``TCPConnector`` to limit the size of the cache -- by :user:`Dreamsorcerer`.
diff --git a/aiohttp/connector.py b/aiohttp/connector.py
index 6e1e0919393..0e9e3a5f22a 100644
--- a/aiohttp/connector.py
+++ b/aiohttp/connector.py
@@ -839,25 +839,33 @@ async def _create_connection(
class _DNSCacheTable:
- def __init__(self, ttl: float | None = None) -> None:
- self._addrs_rr: dict[tuple[str, int], tuple[Iterator[ResolveResult], int]] = {}
+ def __init__(self, ttl: float | None = None, max_size: int = 1000) -> None:
+ self._addrs_rr: OrderedDict[
+ tuple[str, int], tuple[Iterator[ResolveResult], int]
+ ] = OrderedDict()
self._timestamps: dict[tuple[str, int], float] = {}
self._ttl = ttl
+ self._max_size = max_size
def __contains__(self, host: object) -> bool:
return host in self._addrs_rr
def add(self, key: tuple[str, int], addrs: list[ResolveResult]) -> None:
+ if key in self._addrs_rr:
+ self._addrs_rr.move_to_end(key)
+
self._addrs_rr[key] = (cycle(addrs), len(addrs))
if self._ttl is not None:
self._timestamps[key] = monotonic()
+ if len(self._addrs_rr) > self._max_size:
+ oldest_key, _ = self._addrs_rr.popitem(last=False)
+ self._timestamps.pop(oldest_key, None)
+
def remove(self, key: tuple[str, int]) -> None:
self._addrs_rr.pop(key, None)
-
- if self._ttl is not None:
- self._timestamps.pop(key, None)
+ self._timestamps.pop(key, None)
def clear(self) -> None:
self._addrs_rr.clear()
@@ -868,6 +876,7 @@ def next_addrs(self, key: tuple[str, int]) -> list[ResolveResult]:
addrs = list(islice(loop, length))
# Consume one more element to shift internal state of `cycle`
next(loop)
+ self._addrs_rr.move_to_end(key)
return addrs
def expired(self, key: tuple[str, int]) -> bool:
@@ -956,6 +965,7 @@ def __init__(
fingerprint: bytes | None = None,
use_dns_cache: bool = True,
ttl_dns_cache: int | None = 10,
+ dns_cache_max_size: int = 1000,
family: socket.AddressFamily = socket.AddressFamily.AF_UNSPEC,
ssl_context: SSLContext | None = None,
ssl: bool | Fingerprint | SSLContext = True,
@@ -994,7 +1004,9 @@ def __init__(
self._resolver_owner = False
self._use_dns_cache = use_dns_cache
- self._cached_hosts = _DNSCacheTable(ttl=ttl_dns_cache)
+ self._cached_hosts = _DNSCacheTable(
+ ttl=ttl_dns_cache, max_size=dns_cache_max_size
+ )
self._throttle_dns_futures: dict[tuple[str, int], set[asyncio.Future[None]]] = (
{}
)
diff --git a/tests/test_connector.py b/tests/test_connector.py
index 7fd1bd0158b..15a2613f78b 100644
--- a/tests/test_connector.py
+++ b/tests/test_connector.py
@@ -4027,6 +4027,25 @@ async def handler(request):
class TestDNSCacheTable:
+ host1 = ("localhost", 80)
+ host2 = ("foo", 80)
+ result1: ResolveResult = {
+ "hostname": "localhost",
+ "host": "127.0.0.1",
+ "port": 80,
+ "family": socket.AF_INET,
+ "proto": 0,
+ "flags": socket.AI_NUMERICHOST,
+ }
+ result2: ResolveResult = {
+ "hostname": "foo",
+ "host": "127.0.0.2",
+ "port": 80,
+ "family": socket.AF_INET,
+ "proto": 0,
+ "flags": socket.AI_NUMERICHOST,
+ }
+
@pytest.fixture
def dns_cache_table(self):
return _DNSCacheTable()
@@ -4112,6 +4131,63 @@ def test_next_addrs_single(self, dns_cache_table) -> None:
addrs = dns_cache_table.next_addrs("foo")
assert addrs == ["127.0.0.1"]
+ def test_max_size_eviction(self) -> None:
+ table = _DNSCacheTable(max_size=2)
+
+ table.add(self.host1, [self.result1])
+ table.add(self.host2, [self.result2])
+
+ host3 = ("example.com", 80)
+ result3: ResolveResult = {
+ **self.result1,
+ "hostname": "example.com",
+ "host": "1.2.3.4",
+ }
+ table.add(host3, [result3])
+
+ assert len(table._addrs_rr) == 2
+ assert self.host1 not in table._addrs_rr
+ assert host3 in table._addrs_rr
+
+ def test_lru_eviction(self) -> None:
+ table = _DNSCacheTable(max_size=2)
+
+ table.add(self.host1, [self.result1])
+ table.add(self.host2, [self.result2])
+
+ table.next_addrs(self.host1)
+
+ host3 = ("example.com", 80)
+ result3: ResolveResult = {
+ **self.result1,
+ "hostname": "example.com",
+ "host": "1.2.3.4",
+ }
+ table.add(host3, [result3])
+
+ assert self.host1 in table._addrs_rr
+ assert self.host2 not in table._addrs_rr
+
+ def test_lru_eviction_add(self) -> None:
+ table = _DNSCacheTable(max_size=2)
+
+ table.add(self.host1, [self.result1])
+ table.add(self.host2, [self.result2])
+
+ # Re-add, thus making host1 the most recently used.
+ table.add(self.host1, [self.result1])
+
+ host3 = ("example.com", 80)
+ result3: ResolveResult = {
+ **self.result1,
+ "hostname": "example.com",
+ "host": "1.2.3.4",
+ }
+ table.add(host3, [result3])
+
+ assert self.host1 in table._addrs_rr
+ assert self.host2 not in table._addrs_rr
+
async def test_connector_cache_trace_race():
class DummyTracer:
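The eviction policy added to `_DNSCacheTable` above can be reproduced in isolation. A minimal sketch of the `OrderedDict`-based LRU bound (a simplified stand-in, not the actual aiohttp class):

```python
from collections import OrderedDict


class BoundedCache:
    """Sketch of the _DNSCacheTable LRU bound (simplified stand-in)."""

    def __init__(self, max_size: int = 1000) -> None:
        self._data: "OrderedDict[str, object]" = OrderedDict()
        self._max_size = max_size

    def add(self, key: str, value: object) -> None:
        if key in self._data:
            self._data.move_to_end(key)  # re-adding refreshes recency
        self._data[key] = value
        if len(self._data) > self._max_size:
            self._data.popitem(last=False)  # evict least recently used

    def get(self, key: str) -> object:
        self._data.move_to_end(key)  # reads also refresh recency
        return self._data[key]


cache = BoundedCache(max_size=2)
cache.add("a", 1)
cache.add("b", 2)
cache.get("a")     # "a" is now most recently used
cache.add("c", 3)  # exceeds max_size: evicts "b", not "a"
assert "a" in cache._data and "b" not in cache._data
```

This matches the three test cases in the diff: plain size eviction, recency refresh via `next_addrs` (here `get`), and recency refresh via re-`add`.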
From c4d77c3533122be353b8afca8e8675e3b4cbda98 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sun, 22 Feb 2026 12:58:22 +0000
Subject: [PATCH 075/141] Bound DNS cache (#12106) (#12117)
(cherry picked from commit 8ab84c52fe58ef34794fa9b12f00b06e626adcc0)
---------
Co-authored-by: gonas
---
CHANGES/12106.feature.rst | 1 +
aiohttp/connector.py | 24 +++++++++----
tests/test_connector.py | 76 +++++++++++++++++++++++++++++++++++++++
3 files changed, 95 insertions(+), 6 deletions(-)
create mode 100644 CHANGES/12106.feature.rst
diff --git a/CHANGES/12106.feature.rst b/CHANGES/12106.feature.rst
new file mode 100644
index 00000000000..daa9088eed6
--- /dev/null
+++ b/CHANGES/12106.feature.rst
@@ -0,0 +1 @@
+Added a ``dns_cache_max_size`` parameter to ``TCPConnector`` to limit the size of the cache -- by :user:`Dreamsorcerer`.
diff --git a/aiohttp/connector.py b/aiohttp/connector.py
index 348f1271a7d..ce28ce032c5 100644
--- a/aiohttp/connector.py
+++ b/aiohttp/connector.py
@@ -856,25 +856,33 @@ async def _create_connection(
class _DNSCacheTable:
- def __init__(self, ttl: Optional[float] = None) -> None:
- self._addrs_rr: Dict[Tuple[str, int], Tuple[Iterator[ResolveResult], int]] = {}
+ def __init__(self, ttl: Optional[float] = None, max_size: int = 1000) -> None:
+ self._addrs_rr: OrderedDict[
+ Tuple[str, int], Tuple[Iterator[ResolveResult], int]
+ ] = OrderedDict()
self._timestamps: Dict[Tuple[str, int], float] = {}
self._ttl = ttl
+ self._max_size = max_size
def __contains__(self, host: object) -> bool:
return host in self._addrs_rr
def add(self, key: Tuple[str, int], addrs: List[ResolveResult]) -> None:
+ if key in self._addrs_rr:
+ self._addrs_rr.move_to_end(key)
+
self._addrs_rr[key] = (cycle(addrs), len(addrs))
if self._ttl is not None:
self._timestamps[key] = monotonic()
+ if len(self._addrs_rr) > self._max_size:
+ oldest_key, _ = self._addrs_rr.popitem(last=False)
+ self._timestamps.pop(oldest_key, None)
+
def remove(self, key: Tuple[str, int]) -> None:
self._addrs_rr.pop(key, None)
-
- if self._ttl is not None:
- self._timestamps.pop(key, None)
+ self._timestamps.pop(key, None)
def clear(self) -> None:
self._addrs_rr.clear()
@@ -885,6 +893,7 @@ def next_addrs(self, key: Tuple[str, int]) -> List[ResolveResult]:
addrs = list(islice(loop, length))
# Consume one more element to shift internal state of `cycle`
next(loop)
+ self._addrs_rr.move_to_end(key)
return addrs
def expired(self, key: Tuple[str, int]) -> bool:
@@ -973,6 +982,7 @@ def __init__(
fingerprint: Optional[bytes] = None,
use_dns_cache: bool = True,
ttl_dns_cache: Optional[int] = 10,
+ dns_cache_max_size: int = 1000,
family: socket.AddressFamily = socket.AddressFamily.AF_UNSPEC,
ssl_context: Optional[SSLContext] = None,
ssl: Union[bool, Fingerprint, SSLContext] = True,
@@ -1011,7 +1021,9 @@ def __init__(
self._resolver_owner = False
self._use_dns_cache = use_dns_cache
- self._cached_hosts = _DNSCacheTable(ttl=ttl_dns_cache)
+ self._cached_hosts = _DNSCacheTable(
+ ttl=ttl_dns_cache, max_size=dns_cache_max_size
+ )
self._throttle_dns_futures: Dict[
Tuple[str, int], Set["asyncio.Future[None]"]
] = {}
diff --git a/tests/test_connector.py b/tests/test_connector.py
index 0755cff96ff..093ed98e7f1 100644
--- a/tests/test_connector.py
+++ b/tests/test_connector.py
@@ -4036,6 +4036,25 @@ async def handler(request):
class TestDNSCacheTable:
+ host1 = ("localhost", 80)
+ host2 = ("foo", 80)
+ result1: ResolveResult = {
+ "hostname": "localhost",
+ "host": "127.0.0.1",
+ "port": 80,
+ "family": socket.AF_INET,
+ "proto": 0,
+ "flags": socket.AI_NUMERICHOST,
+ }
+ result2: ResolveResult = {
+ "hostname": "foo",
+ "host": "127.0.0.2",
+ "port": 80,
+ "family": socket.AF_INET,
+ "proto": 0,
+ "flags": socket.AI_NUMERICHOST,
+ }
+
@pytest.fixture
def dns_cache_table(self):
return _DNSCacheTable()
@@ -4121,6 +4140,63 @@ def test_next_addrs_single(self, dns_cache_table) -> None:
addrs = dns_cache_table.next_addrs("foo")
assert addrs == ["127.0.0.1"]
+ def test_max_size_eviction(self) -> None:
+ table = _DNSCacheTable(max_size=2)
+
+ table.add(self.host1, [self.result1])
+ table.add(self.host2, [self.result2])
+
+ host3 = ("example.com", 80)
+ result3: ResolveResult = {
+ **self.result1,
+ "hostname": "example.com",
+ "host": "1.2.3.4",
+ }
+ table.add(host3, [result3])
+
+ assert len(table._addrs_rr) == 2
+ assert self.host1 not in table._addrs_rr
+ assert host3 in table._addrs_rr
+
+ def test_lru_eviction(self) -> None:
+ table = _DNSCacheTable(max_size=2)
+
+ table.add(self.host1, [self.result1])
+ table.add(self.host2, [self.result2])
+
+ table.next_addrs(self.host1)
+
+ host3 = ("example.com", 80)
+ result3: ResolveResult = {
+ **self.result1,
+ "hostname": "example.com",
+ "host": "1.2.3.4",
+ }
+ table.add(host3, [result3])
+
+ assert self.host1 in table._addrs_rr
+ assert self.host2 not in table._addrs_rr
+
+ def test_lru_eviction_add(self) -> None:
+ table = _DNSCacheTable(max_size=2)
+
+ table.add(self.host1, [self.result1])
+ table.add(self.host2, [self.result2])
+
+ # Re-add, thus making host1 the most recently used.
+ table.add(self.host1, [self.result1])
+
+ host3 = ("example.com", 80)
+ result3: ResolveResult = {
+ **self.result1,
+ "hostname": "example.com",
+ "host": "1.2.3.4",
+ }
+ table.add(host3, [result3])
+
+ assert self.host1 in table._addrs_rr
+ assert self.host2 not in table._addrs_rr
+
async def test_connector_cache_trace_race():
class DummyTracer:
From 6bebdca3003ce78523518360a837309b18ad7e04 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 22 Feb 2026 13:38:43 +0000
Subject: [PATCH 076/141] [PR #12113/4d15c331 backport][3.14] fix: widen
trace_request_ctx type to accept any object (#12118)
**This is a backport of PR #12113 as merged into master
(4d15c331137496127571e8a15f787020890a5abb).**
---------
Co-authored-by: nightcityblade
Co-authored-by: Sam Bull
---
CHANGES/10753.bugfix.rst | 1 +
aiohttp/client.py | 5 ++---
aiohttp/tracing.py | 7 ++-----
tests/test_tracing.py | 13 +++++++++++++
4 files changed, 18 insertions(+), 8 deletions(-)
create mode 100644 CHANGES/10753.bugfix.rst
diff --git a/CHANGES/10753.bugfix.rst b/CHANGES/10753.bugfix.rst
new file mode 100644
index 00000000000..e0f4cfd0dd1
--- /dev/null
+++ b/CHANGES/10753.bugfix.rst
@@ -0,0 +1 @@
+Widened ``trace_request_ctx`` parameter type from ``Mapping[str, Any] | None`` to ``object`` to allow passing instances of user-defined classes as trace context -- by :user:`nightcityblade`.
diff --git a/aiohttp/client.py b/aiohttp/client.py
index 7c18e8c3318..bc6f4622afc 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -14,7 +14,6 @@
Coroutine,
Generator,
Iterable,
- Mapping,
Sequence,
)
from contextlib import suppress
@@ -194,7 +193,7 @@ class _RequestOptions(TypedDict, total=False):
ssl: SSLContext | bool | Fingerprint
server_hostname: str | None
proxy_headers: LooseHeaders | None
- trace_request_ctx: Mapping[str, Any] | None
+ trace_request_ctx: object
read_bufsize: int | None
auto_decompress: bool | None
max_line_size: int | None
@@ -548,7 +547,7 @@ async def _request(
ssl: SSLContext | bool | Fingerprint = True,
server_hostname: str | None = None,
proxy_headers: LooseHeaders | None = None,
- trace_request_ctx: Mapping[str, Any] | None = None,
+ trace_request_ctx: object = None,
read_bufsize: int | None = None,
auto_decompress: bool | None = None,
max_line_size: int | None = None,
diff --git a/aiohttp/tracing.py b/aiohttp/tracing.py
index 2f66878d2ee..cfee054afe5 100644
--- a/aiohttp/tracing.py
+++ b/aiohttp/tracing.py
@@ -1,6 +1,5 @@
-from collections.abc import Mapping
from types import SimpleNamespace
-from typing import TYPE_CHECKING, TypeVar
+from typing import TYPE_CHECKING, Any, TypeVar
import attr
from aiosignal import Signal
@@ -86,9 +85,7 @@ def __init__(
self._trace_config_ctx_factory = trace_config_ctx_factory
- def trace_config_ctx(
- self, trace_request_ctx: Mapping[str, str] | None = None
- ) -> SimpleNamespace:
+ def trace_config_ctx(self, trace_request_ctx: Any = None) -> SimpleNamespace:
"""Return a new trace_config_ctx instance"""
return self._trace_config_ctx_factory(trace_request_ctx=trace_request_ctx)
diff --git a/tests/test_tracing.py b/tests/test_tracing.py
index 845c0ba6ab4..868c25da6cb 100644
--- a/tests/test_tracing.py
+++ b/tests/test_tracing.py
@@ -42,6 +42,19 @@ def test_trace_config_ctx_request_ctx(self) -> None:
)
assert trace_config_ctx.trace_request_ctx is trace_request_ctx
+ def test_trace_config_ctx_custom_class(self) -> None:
+ """Custom class instances should be accepted as trace_request_ctx (#10753)."""
+
+ class MyContext:
+ def __init__(self, request_id: int) -> None:
+ self.request_id = request_id
+
+ ctx = MyContext(request_id=42)
+ trace_config = TraceConfig()
+ trace_config_ctx = trace_config.trace_config_ctx(trace_request_ctx=ctx)
+ assert trace_config_ctx.trace_request_ctx is ctx
+ assert trace_config_ctx.trace_request_ctx.request_id == 42
+
def test_freeze(self) -> None:
trace_config = TraceConfig()
trace_config.freeze()
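The practical effect of widening `trace_request_ctx` to `object` is that any instance can travel through the trace context, not just mappings. A minimal sketch of the widened factory (assuming the default `SimpleNamespace`-based ctx factory, as in the diff):

```python
from types import SimpleNamespace


def trace_config_ctx(trace_request_ctx: object = None) -> SimpleNamespace:
    # Mirrors TraceConfig.trace_config_ctx after the type widening:
    # the context object is stored as-is, so any type is acceptable.
    return SimpleNamespace(trace_request_ctx=trace_request_ctx)


class RequestMeta:
    """Hypothetical user-defined trace context (illustration only)."""

    def __init__(self, request_id: int) -> None:
        self.request_id = request_id


ctx = trace_config_ctx(trace_request_ctx=RequestMeta(7))
assert ctx.trace_request_ctx.request_id == 7
```

Before this change, a type checker would reject `RequestMeta` here because the parameter was annotated `Mapping[str, Any] | None`, even though the runtime never inspected the value.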
From 4bcb9426be75e53601105542654c96e7e5c2726b Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 22 Feb 2026 15:06:25 +0000
Subject: [PATCH 077/141] [PR #12119/0e2d3ec4 backport][3.14] Fix server hang
on chunked transfer encoding size mismatch (#12122)
**This is a backport of PR #12119 as merged into master
(0e2d3ec48a950a2aacaf3f5d0d1f1597a1d5385b).**
Co-authored-by: Fridayai700
---
CHANGES/10596.bugfix.rst | 4 ++++
aiohttp/http_parser.py | 6 ++++++
tests/test_http_parser.py | 30 ++++++++++++++++++++++++++++++
3 files changed, 40 insertions(+)
create mode 100644 CHANGES/10596.bugfix.rst
diff --git a/CHANGES/10596.bugfix.rst b/CHANGES/10596.bugfix.rst
new file mode 100644
index 00000000000..f96a0215de3
--- /dev/null
+++ b/CHANGES/10596.bugfix.rst
@@ -0,0 +1,4 @@
+Fixed server hanging indefinitely when chunked transfer encoding chunk-size
+does not match actual data length. The server now raises
+``TransferEncodingError`` instead of waiting forever for data that will
+never arrive -- by :user:`Fridayai700`.
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index 9e7fd84bd75..adf73f65e16 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -917,6 +917,12 @@ def feed_data(
if chunk[: len(SEP)] == SEP:
chunk = chunk[len(SEP) :]
self._chunk = ChunkState.PARSE_CHUNKED_SIZE
+ elif len(chunk) >= len(SEP) or chunk != SEP[: len(chunk)]:
+ exc = TransferEncodingError(
+ "Chunk size mismatch: expected CRLF after chunk data"
+ )
+ set_exception(self.payload, exc)
+ raise exc
else:
self._chunk_tail = chunk
return False, b""
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index ffb12b51195..8ab991b6ee2 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -1698,6 +1698,36 @@ async def test_parse_chunked_payload_size_error(
p.feed_data(b"blah\r\n")
assert isinstance(out.exception(), http_exceptions.TransferEncodingError)
+ async def test_parse_chunked_payload_size_data_mismatch(
+ self, protocol: BaseProtocol
+ ) -> None:
+ """Chunk-size does not match actual data: should raise, not hang.
+
+ Regression test for #10596.
+ """
+ out = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ p = HttpPayloadParser(out, chunked=True, headers_parser=HeadersParser())
+ # Declared chunk-size is 4 but actual data is "Hello" (5 bytes).
+ # After consuming 4 bytes, remaining starts with "o" not "\r\n".
+ with pytest.raises(http_exceptions.TransferEncodingError):
+ p.feed_data(b"4\r\nHello\r\n0\r\n\r\n")
+ assert isinstance(out.exception(), http_exceptions.TransferEncodingError)
+
+ async def test_parse_chunked_payload_size_data_mismatch_too_short(
+ self, protocol: BaseProtocol
+ ) -> None:
+ """Chunk-size larger than data: declared 6 but only 5 bytes before CRLF.
+
+ Regression test for #10596.
+ """
+ out = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ p = HttpPayloadParser(out, chunked=True, headers_parser=HeadersParser())
+ # Declared chunk-size is 6 but actual data before CRLF is "Hello" (5 bytes).
+ # Parser reads 6 bytes: "Hello\r", then expects \r\n but sees "\n0\r\n..."
+ with pytest.raises(http_exceptions.TransferEncodingError):
+ p.feed_data(b"6\r\nHello\r\n0\r\n\r\n")
+ assert isinstance(out.exception(), http_exceptions.TransferEncodingError)
+
async def test_parse_chunked_payload_split_end(
self, protocol: BaseProtocol
) -> None:
From 36573aad2bb56d99bd4dd60a53c5588a14be16f8 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 22 Feb 2026 15:06:46 +0000
Subject: [PATCH 078/141] [PR #12119/0e2d3ec4 backport][3.13] Fix server hang
on chunked transfer encoding size mismatch (#12121)
**This is a backport of PR #12119 as merged into master
(0e2d3ec48a950a2aacaf3f5d0d1f1597a1d5385b).**
Co-authored-by: Fridayai700
---
CHANGES/10596.bugfix.rst | 4 ++++
aiohttp/http_parser.py | 6 ++++++
tests/test_http_parser.py | 30 ++++++++++++++++++++++++++++++
3 files changed, 40 insertions(+)
create mode 100644 CHANGES/10596.bugfix.rst
diff --git a/CHANGES/10596.bugfix.rst b/CHANGES/10596.bugfix.rst
new file mode 100644
index 00000000000..f96a0215de3
--- /dev/null
+++ b/CHANGES/10596.bugfix.rst
@@ -0,0 +1,4 @@
+Fixed server hanging indefinitely when chunked transfer encoding chunk-size
+does not match actual data length. The server now raises
+``TransferEncodingError`` instead of waiting forever for data that will
+never arrive -- by :user:`Fridayai700`.
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index d7f127896a8..ec7d1ee6347 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -931,6 +931,12 @@ def feed_data(
if chunk[: len(SEP)] == SEP:
chunk = chunk[len(SEP) :]
self._chunk = ChunkState.PARSE_CHUNKED_SIZE
+ elif len(chunk) >= len(SEP) or chunk != SEP[: len(chunk)]:
+ exc = TransferEncodingError(
+ "Chunk size mismatch: expected CRLF after chunk data"
+ )
+ set_exception(self.payload, exc)
+ raise exc
else:
self._chunk_tail = chunk
return False, b""
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 84cdeb729e5..a92a370e0ba 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -1698,6 +1698,36 @@ async def test_parse_chunked_payload_size_error(
p.feed_data(b"blah\r\n")
assert isinstance(out.exception(), http_exceptions.TransferEncodingError)
+ async def test_parse_chunked_payload_size_data_mismatch(
+ self, protocol: BaseProtocol
+ ) -> None:
+ """Chunk-size does not match actual data: should raise, not hang.
+
+ Regression test for #10596.
+ """
+ out = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ p = HttpPayloadParser(out, chunked=True, headers_parser=HeadersParser())
+ # Declared chunk-size is 4 but actual data is "Hello" (5 bytes).
+ # After consuming 4 bytes, remaining starts with "o" not "\r\n".
+ with pytest.raises(http_exceptions.TransferEncodingError):
+ p.feed_data(b"4\r\nHello\r\n0\r\n\r\n")
+ assert isinstance(out.exception(), http_exceptions.TransferEncodingError)
+
+ async def test_parse_chunked_payload_size_data_mismatch_too_short(
+ self, protocol: BaseProtocol
+ ) -> None:
+ """Chunk-size larger than data: declared 6 but only 5 bytes before CRLF.
+
+ Regression test for #10596.
+ """
+ out = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ p = HttpPayloadParser(out, chunked=True, headers_parser=HeadersParser())
+ # Declared chunk-size is 6 but actual data before CRLF is "Hello" (5 bytes).
+ # Parser reads 6 bytes: "Hello\r", then expects \r\n but sees "\n0\r\n..."
+ with pytest.raises(http_exceptions.TransferEncodingError):
+ p.feed_data(b"6\r\nHello\r\n0\r\n\r\n")
+ assert isinstance(out.exception(), http_exceptions.TransferEncodingError)
+
async def test_parse_chunked_payload_split_end(
self, protocol: BaseProtocol
) -> None:
From 4f12a6641e3db60aa35c60c9db61b45c89176c00 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sun, 22 Feb 2026 15:17:03 +0000
Subject: [PATCH 079/141] Fix _sendfile_fallback over-reading beyond requested
count (#12096) (#12123)
(cherry picked from commit 291d969ec2806b9b920ee48c761a4c9310b13300)
Co-authored-by: Kadir Can Ozden <101993364+bysiber@users.noreply.github.com>
---
CHANGES/12096.bugfix.rst | 1 +
aiohttp/web_fileresponse.py | 4 ++--
tests/test_web_sendfile.py | 30 ++++++++++++++++++++++++++++++
3 files changed, 33 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/12096.bugfix.rst
diff --git a/CHANGES/12096.bugfix.rst b/CHANGES/12096.bugfix.rst
new file mode 100644
index 00000000000..945b452309a
--- /dev/null
+++ b/CHANGES/12096.bugfix.rst
@@ -0,0 +1 @@
+Fixed _sendfile_fallback over-reading beyond requested count -- by :user:`bysiber`.
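The two-line fix below caps the first read at the requested count and decrements by the bytes actually read rather than by `chunk_size`. A standalone sketch of the corrected loop (illustrative only; `read_exactly` is a made-up helper, not aiohttp API):

```python
import io


# Hedged sketch of the fixed read loop (not the aiohttp implementation):
# the first read is capped at min(chunk_size, count), and the remaining
# count is decremented by len(chunk), the bytes actually read.
def read_exactly(fobj: io.BufferedIOBase, offset: int, count: int,
                 chunk_size: int = 64) -> bytes:
    fobj.seek(offset)
    out = bytearray()
    chunk = fobj.read(min(chunk_size, count))  # never over-read the first chunk
    while chunk:
        out.extend(chunk)
        count -= len(chunk)  # subtract actual bytes read, not chunk_size
        if count <= 0:
            break
        chunk = fobj.read(min(chunk_size, count))
    return bytes(out)


# Requesting 100 bytes of a 150-byte file yields exactly 100 bytes,
# which is what a Range request depends on.
data = io.BytesIO(b"A" * 100 + b"B" * 50)
assert read_exactly(data, offset=0, count=100) == b"A" * 100
```

With the old code, the first read could return up to `chunk_size` bytes past the range boundary, and subtracting `chunk_size` on a short read could terminate the loop too late or too early.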
diff --git a/aiohttp/web_fileresponse.py b/aiohttp/web_fileresponse.py
index 15672f39df7..73be9b3f090 100644
--- a/aiohttp/web_fileresponse.py
+++ b/aiohttp/web_fileresponse.py
@@ -118,11 +118,11 @@ async def _sendfile_fallback(
chunk_size = self._chunk_size
loop = asyncio.get_event_loop()
chunk = await loop.run_in_executor(
- None, self._seek_and_read, fobj, offset, chunk_size
+ None, self._seek_and_read, fobj, offset, min(chunk_size, count)
)
while chunk:
await writer.write(chunk)
- count = count - chunk_size
+ count = count - len(chunk)
if count <= 0:
break
chunk = await loop.run_in_executor(None, fobj.read, min(chunk_size, count))
diff --git a/tests/test_web_sendfile.py b/tests/test_web_sendfile.py
index 61c3b49834f..dbb2b902897 100644
--- a/tests/test_web_sendfile.py
+++ b/tests/test_web_sendfile.py
@@ -1,3 +1,4 @@
+import io
from pathlib import Path
from stat import S_IFREG, S_IRUSR, S_IWUSR
from unittest import mock
@@ -155,3 +156,32 @@ async def test_file_response_sends_headers_immediately() -> None:
# Headers should be sent immediately
writer.send_headers.assert_called_once()
+
+
+async def test_sendfile_fallback_respects_count_boundary() -> None:
+ """Regression test: _sendfile_fallback should not read beyond the requested count.
+
+ Previously the first chunk used the full chunk_size even when count was smaller,
+ and the loop subtracted chunk_size instead of the actual bytes read. Both bugs
+ could cause extra data to be sent when serving range requests.
+ """
+ file_data = b"A" * 100 + b"B" * 50 # 150 bytes total
+ fobj = io.BytesIO(file_data)
+
+ writer = mock.AsyncMock()
+ written = bytearray()
+
+ async def capture_write(data: bytes) -> None:
+ written.extend(data)
+
+ writer.write = capture_write
+ writer.drain = mock.AsyncMock()
+
+ file_sender = FileResponse("dummy.bin")
+ file_sender._chunk_size = 64 # smaller than count to test multi-chunk
+
+ # Request only the first 100 bytes (offset=0, count=100)
+ await file_sender._sendfile_fallback(writer, fobj, offset=0, count=100)
+
+ assert bytes(written) == b"A" * 100
+ assert len(written) == 100
From ceb098bae7498a7a7cc9a116918dbde06e03eb28 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sun, 22 Feb 2026 15:17:16 +0000
Subject: [PATCH 080/141] Fix _sendfile_fallback over-reading beyond requested
count (#12096) (#12124)
(cherry picked from commit 291d969ec2806b9b920ee48c761a4c9310b13300)
Co-authored-by: Kadir Can Ozden <101993364+bysiber@users.noreply.github.com>
---
CHANGES/12096.bugfix.rst | 1 +
aiohttp/web_fileresponse.py | 4 ++--
tests/test_web_sendfile.py | 30 ++++++++++++++++++++++++++++++
3 files changed, 33 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/12096.bugfix.rst
diff --git a/CHANGES/12096.bugfix.rst b/CHANGES/12096.bugfix.rst
new file mode 100644
index 00000000000..945b452309a
--- /dev/null
+++ b/CHANGES/12096.bugfix.rst
@@ -0,0 +1 @@
+Fixed _sendfile_fallback over-reading beyond requested count -- by :user:`bysiber`.
diff --git a/aiohttp/web_fileresponse.py b/aiohttp/web_fileresponse.py
index 26484b9483a..bba2c2c0d00 100644
--- a/aiohttp/web_fileresponse.py
+++ b/aiohttp/web_fileresponse.py
@@ -118,11 +118,11 @@ async def _sendfile_fallback(
chunk_size = self._chunk_size
loop = asyncio.get_event_loop()
chunk = await loop.run_in_executor(
- None, self._seek_and_read, fobj, offset, chunk_size
+ None, self._seek_and_read, fobj, offset, min(chunk_size, count)
)
while chunk:
await writer.write(chunk)
- count = count - chunk_size
+ count = count - len(chunk)
if count <= 0:
break
chunk = await loop.run_in_executor(None, fobj.read, min(chunk_size, count))
diff --git a/tests/test_web_sendfile.py b/tests/test_web_sendfile.py
index 61c3b49834f..dbb2b902897 100644
--- a/tests/test_web_sendfile.py
+++ b/tests/test_web_sendfile.py
@@ -1,3 +1,4 @@
+import io
from pathlib import Path
from stat import S_IFREG, S_IRUSR, S_IWUSR
from unittest import mock
@@ -155,3 +156,32 @@ async def test_file_response_sends_headers_immediately() -> None:
# Headers should be sent immediately
writer.send_headers.assert_called_once()
+
+
+async def test_sendfile_fallback_respects_count_boundary() -> None:
+ """Regression test: _sendfile_fallback should not read beyond the requested count.
+
+ Previously the first chunk used the full chunk_size even when count was smaller,
+ and the loop subtracted chunk_size instead of the actual bytes read. Both bugs
+ could cause extra data to be sent when serving range requests.
+ """
+ file_data = b"A" * 100 + b"B" * 50 # 150 bytes total
+ fobj = io.BytesIO(file_data)
+
+ writer = mock.AsyncMock()
+ written = bytearray()
+
+ async def capture_write(data: bytes) -> None:
+ written.extend(data)
+
+ writer.write = capture_write
+ writer.drain = mock.AsyncMock()
+
+ file_sender = FileResponse("dummy.bin")
+ file_sender._chunk_size = 64 # smaller than count to test multi-chunk
+
+ # Request only the first 100 bytes (offset=0, count=100)
+ await file_sender._sendfile_fallback(writer, fobj, offset=0, count=100)
+
+ assert bytes(written) == b"A" * 100
+ assert len(written) == 100
From dcf40f30637e8752c76781cf6703b5a236749a00 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 22 Feb 2026 15:53:43 +0000
Subject: [PATCH 081/141] [PR #12091/8a631e74 backport][3.14] Restrict pickle
deserialization in CookieJar.load() (#12105)
**This is a backport of PR #12091 as merged into master
(8a631e74c1d266499dbc6bcdbc83c60f4ea3ee3c).**
---------
Co-authored-by: Yuval Elbar <41901908+YuvalElbar6@users.noreply.github.com>
Co-authored-by: Sam Bull
---
CHANGES/12091.bugfix.rst | 6 ++
aiohttp/cookiejar.py | 114 +++++++++++++++++++++-
docs/client_reference.rst | 18 +++-
docs/spelling_wordlist.txt | 7 +-
tests/conftest.py | 5 +
tests/test_cookiejar.py | 189 +++++++++++++++++++++++++++++++++++++
6 files changed, 331 insertions(+), 8 deletions(-)
create mode 100644 CHANGES/12091.bugfix.rst
diff --git a/CHANGES/12091.bugfix.rst b/CHANGES/12091.bugfix.rst
new file mode 100644
index 00000000000..45ffbc3557f
--- /dev/null
+++ b/CHANGES/12091.bugfix.rst
@@ -0,0 +1,6 @@
+Switched :py:meth:`~aiohttp.CookieJar.save` to use JSON format and
+:py:meth:`~aiohttp.CookieJar.load` to try JSON first with a fallback to
+a restricted pickle unpickler that only allows cookie-related types
+(``SimpleCookie``, ``Morsel``, ``defaultdict``, etc.), preventing
+arbitrary code execution via malicious pickle payloads
+(CWE-502) -- by :user:`YuvalElbar6`.
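The restricted-unpickler pattern this changelog refers to follows the "Restricting Globals" recipe from the `pickle` documentation. A minimal standalone sketch (the allow-list here is abbreviated and illustrative, not aiohttp's exact set):

```python
import io
import os
import pickle

# Hedged sketch of a restricted unpickler (see the pickle docs on
# "Restricting Globals"). Only an explicit allow-list of (module, name)
# pairs may be resolved during unpickling.
_ALLOWED = {
    ("http.cookies", "SimpleCookie"),
    ("http.cookies", "Morsel"),
    ("collections", "defaultdict"),
    ("builtins", "dict"),
}


class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module: str, name: str) -> type:
        if (module, name) not in _ALLOWED:
            raise pickle.UnpicklingError(f"Forbidden class: {module}.{name}")
        return super().find_class(module, name)


# A classic RCE payload: unpickling would normally call os.system.
class Payload:
    def __reduce__(self):
        return (os.system, ("echo PWNED",))


blob = pickle.dumps(Payload())
try:
    RestrictedUnpickler(io.BytesIO(blob)).load()
except pickle.UnpicklingError as exc:
    print(exc)  # rejected before any code runs
```

`find_class` is the single choke point: every global referenced by the pickle stream passes through it, so the deny happens before the dangerous callable is even resolved.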
diff --git a/aiohttp/cookiejar.py b/aiohttp/cookiejar.py
index 016fae94d20..8a11bdd53ec 100644
--- a/aiohttp/cookiejar.py
+++ b/aiohttp/cookiejar.py
@@ -4,6 +4,7 @@
import datetime
import heapq
import itertools
+import json
import os # noqa
import pathlib
import pickle
@@ -38,6 +39,41 @@
_SIMPLE_COOKIE = SimpleCookie()
+class _RestrictedCookieUnpickler(pickle.Unpickler):
+ """A restricted unpickler that only allows cookie-related types.
+
+ This prevents arbitrary code execution when loading pickled cookie data
+ from untrusted sources. Only types that are expected in a serialized
+ CookieJar are permitted.
+
+ See: https://docs.python.org/3/library/pickle.html#restricting-globals
+ """
+
+ _ALLOWED_CLASSES: frozenset[tuple[str, str]] = frozenset(
+ {
+ # Core cookie types
+ ("http.cookies", "SimpleCookie"),
+ ("http.cookies", "Morsel"),
+ # Container types used by CookieJar._cookies
+ ("collections", "defaultdict"),
+ # builtins that pickle uses for reconstruction
+ ("builtins", "tuple"),
+ ("builtins", "set"),
+ ("builtins", "frozenset"),
+ ("builtins", "dict"),
+ }
+ )
+
+ def find_class(self, module: str, name: str) -> type:
+ if (module, name) not in self._ALLOWED_CLASSES:
+ raise pickle.UnpicklingError(
+ f"Forbidden class: {module}.{name}. "
+ "CookieJar.load() only allows cookie-related types for security. "
+ "See https://docs.python.org/3/library/pickle.html#restricting-globals"
+ )
+ return super().find_class(module, name) # type: ignore[no-any-return]
+
+
class CookieJar(AbstractCookieJar):
"""Implements cookie storage adhering to RFC 6265."""
@@ -112,14 +148,84 @@ def quote_cookie(self) -> bool:
return self._quote_cookie
def save(self, file_path: PathLike) -> None:
+ """Save cookies to a file using JSON format.
+
+ :param file_path: Path to file where cookies will be serialized,
+ :class:`str` or :class:`pathlib.Path` instance.
+ """
file_path = pathlib.Path(file_path)
- with file_path.open(mode="wb") as f:
- pickle.dump(self._cookies, f, pickle.HIGHEST_PROTOCOL)
+ data: dict[str, dict[str, dict[str, str | bool]]] = {}
+ for (domain, path), cookie in self._cookies.items():
+ key = f"{domain}|{path}"
+ data[key] = {}
+ for name, morsel in cookie.items():
+ morsel_data: dict[str, str | bool] = {
+ "key": morsel.key,
+ "value": morsel.value,
+ "coded_value": morsel.coded_value,
+ }
+ # Save all morsel attributes that have values
+ for attr in morsel._reserved: # type: ignore[attr-defined]
+ attr_val = morsel[attr]
+ if attr_val:
+ morsel_data[attr] = attr_val
+ data[key][name] = morsel_data
+ with file_path.open(mode="w", encoding="utf-8") as f:
+ json.dump(data, f, indent=2)
def load(self, file_path: PathLike) -> None:
+ """Load cookies from a file.
+
+ Tries to load JSON format first. Falls back to loading legacy
+ pickle format (using a restricted unpickler) for backward
+ compatibility with existing cookie files.
+
+ :param file_path: Path to file from where cookies will be
+ imported, :class:`str` or :class:`pathlib.Path` instance.
+ """
file_path = pathlib.Path(file_path)
- with file_path.open(mode="rb") as f:
- self._cookies = pickle.load(f)
+ # Try JSON format first
+ try:
+ with file_path.open(mode="r", encoding="utf-8") as f:
+ data = json.load(f)
+ self._cookies = self._load_json_data(data)
+ except (json.JSONDecodeError, UnicodeDecodeError, ValueError):
+ # Fall back to legacy pickle format with restricted unpickler
+ with file_path.open(mode="rb") as f:
+ self._cookies = _RestrictedCookieUnpickler(f).load()
+
+ def _load_json_data(
+ self, data: dict[str, dict[str, dict[str, str | bool]]]
+ ) -> defaultdict[tuple[str, str], SimpleCookie]:
+ """Load cookies from parsed JSON data."""
+ cookies: defaultdict[tuple[str, str], SimpleCookie] = defaultdict(SimpleCookie)
+ for compound_key, cookie_data in data.items():
+ domain, path = compound_key.split("|", 1)
+ key = (domain, path)
+ for name, morsel_data in cookie_data.items():
+ morsel: Morsel[str] = Morsel()
+ morsel_key = morsel_data["key"]
+ morsel_value = morsel_data["value"]
+ morsel_coded_value = morsel_data["coded_value"]
+ # Use __setstate__ to bypass validation, same pattern
+ # used in _build_morsel and _cookie_helpers.
+ morsel.__setstate__( # type: ignore[attr-defined]
+ {
+ "key": morsel_key,
+ "value": morsel_value,
+ "coded_value": morsel_coded_value,
+ }
+ )
+ # Restore morsel attributes
+ for attr in morsel._reserved: # type: ignore[attr-defined]
+ if attr in morsel_data and attr not in (
+ "key",
+ "value",
+ "coded_value",
+ ):
+ morsel[attr] = morsel_data[attr]
+ cookies[key][name] = morsel
+ return cookies
def clear(self, predicate: ClearCookiePredicate | None = None) -> None:
if predicate is None:
diff --git a/docs/client_reference.rst b/docs/client_reference.rst
index e8db2ac602d..651190c3cdf 100644
--- a/docs/client_reference.rst
+++ b/docs/client_reference.rst
@@ -2496,16 +2496,28 @@ Utilities
.. method:: save(file_path)
- Write a pickled representation of cookies into the file
+ Write a JSON representation of cookies into the file
at provided path.
+ .. versionchanged:: 3.14
+
+ Previously used pickle format. Now uses JSON for safe
+ serialization.
+
:param file_path: Path to file where cookies will be serialized,
:class:`str` or :class:`pathlib.Path` instance.
.. method:: load(file_path)
- Load a pickled representation of cookies from the file
- at provided path.
+ Load cookies from the file at provided path. Tries JSON format
+ first, then falls back to legacy pickle format (using a restricted
+ unpickler that only allows cookie-related types) for backward
+ compatibility with existing cookie files.
+
+ .. versionchanged:: 3.14
+
+ Now loads JSON format by default. Falls back to restricted
+ pickle for files saved by older versions.
:param file_path: Path to file from where cookies will be
imported, :class:`str` or :class:`pathlib.Path` instance.
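The JSON format described above stores cookies as plain data keyed by `domain|path`. A hedged sketch of what such a file contains (the schema shown is inferred from the diff in this commit and is illustrative, not a normative spec):

```python
import json

# Illustrative cookie-file shape: one entry per (domain, path) pair,
# joined with "|", each morsel stored as plain string/bool fields.
data = {
    "example.com|/": {
        "token": {
            "key": "token",
            "value": "abc123",
            "coded_value": "abc123",
            "secure": True,
        },
    }
}
blob = json.dumps(data, indent=2)

# Loading is pure data: json.loads never resolves classes or runs code,
# which is the safety property motivating the switch away from pickle.
restored = json.loads(blob)
domain, path = next(iter(restored)).split("|", 1)
assert (domain, path) == ("example.com", "/")
assert restored["example.com|/"]["token"]["value"] == "abc123"
```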
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index cc92a04c2f5..8eeb37aa83b 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -99,6 +99,7 @@ deduplicate
defs
Dependabot
deprecations
+deserialization
DER
dev
Dev
@@ -213,7 +214,8 @@ Multipart
musllinux
mypy
Nagle
-Nagle’s
+Nagle's
+NFS
namedtuple
nameservers
namespace
@@ -236,6 +238,7 @@ param
params
parsers
pathlib
+payloads
peername
performant
pickleable
@@ -356,6 +359,8 @@ unhandled
unicode
unittest
Unittest
+unpickler
+untrusted
unix
unobvious
unsets
diff --git a/tests/conftest.py b/tests/conftest.py
index 661f539a632..71e773ddca8 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -95,6 +95,11 @@ def blockbuster(request: pytest.FixtureRequest) -> Iterator[None]:
bb.functions[func].can_block_in(
"aiohttp/web_urldispatcher.py", "add_static"
)
+ # save/load is not async, so we must allow this:
+ for func in ("io.TextIOWrapper.read", "io.BufferedReader.read"):
+ bb.functions[func].can_block_in("aiohttp/cookiejar.py", "load")
+ for func in ("io.TextIOWrapper.write", "io.BufferedWriter.write"):
+ bb.functions[func].can_block_in("aiohttp/cookiejar.py", "save")
# Note: coverage.py uses locking internally which can cause false positives
# in blockbuster when it instruments code. This is particularly problematic
# on Windows where it can lead to flaky test failures.
diff --git a/tests/test_cookiejar.py b/tests/test_cookiejar.py
index dc2c9f4cc98..9e4f7500afc 100644
--- a/tests/test_cookiejar.py
+++ b/tests/test_cookiejar.py
@@ -9,6 +9,7 @@
import unittest
from http.cookies import BaseCookie, Morsel, SimpleCookie
from operator import not_
+from pathlib import Path
from unittest import mock
import pytest
@@ -1620,3 +1621,191 @@ async def test_shared_cookie_with_multiple_domains() -> None:
# Verify cache is reused efficiently
assert ("", "") in jar._morsel_cache
assert "universal" in jar._morsel_cache[("", "")]
+
+
+# === Security tests for restricted unpickler and JSON save/load ===
+
+
+async def test_load_rejects_malicious_pickle(tmp_path: Path) -> None:
+ """Verify CookieJar.load() blocks arbitrary code execution via pickle.
+
+ A crafted pickle payload using os.system (or any non-cookie class)
+ must be rejected by the restricted unpickler.
+ """
+ import os
+
+ file_path = tmp_path / "malicious.pkl"
+
+ class RCEPayload:
+ def __reduce__(self) -> tuple[object, ...]:
+ return (os.system, ("echo PWNED",))
+
+ with open(file_path, "wb") as f:
+ pickle.dump(RCEPayload(), f, pickle.HIGHEST_PROTOCOL)
+
+ jar = CookieJar()
+ with pytest.raises(pickle.UnpicklingError, match="Forbidden class"):
+ jar.load(file_path)
+
+
+async def test_load_rejects_eval_payload(tmp_path: Path) -> None:
+ """Verify CookieJar.load() blocks eval-based pickle payloads."""
+ file_path = tmp_path / "eval_payload.pkl"
+
+ class EvalPayload:
+ def __reduce__(self) -> tuple[object, ...]:
+ return (eval, ("__import__('os').system('echo PWNED')",))
+
+ with open(file_path, "wb") as f:
+ pickle.dump(EvalPayload(), f, pickle.HIGHEST_PROTOCOL)
+
+ jar = CookieJar()
+ with pytest.raises(pickle.UnpicklingError, match="Forbidden class"):
+ jar.load(file_path)
+
+
+async def test_load_rejects_subprocess_payload(tmp_path: Path) -> None:
+ """Verify CookieJar.load() blocks subprocess-based pickle payloads."""
+ import subprocess
+
+ file_path = tmp_path / "subprocess_payload.pkl"
+
+ class SubprocessPayload:
+ def __reduce__(self) -> tuple[object, ...]:
+ return (subprocess.call, (["echo", "PWNED"],))
+
+ with open(file_path, "wb") as f:
+ pickle.dump(SubprocessPayload(), f, pickle.HIGHEST_PROTOCOL)
+
+ jar = CookieJar()
+ with pytest.raises(pickle.UnpicklingError, match="Forbidden class"):
+ jar.load(file_path)
+
+
+async def test_load_falls_back_to_pickle(
+ tmp_path: Path,
+ cookies_to_receive: SimpleCookie,
+) -> None:
+ """Verify load() falls back to restricted pickle for legacy cookie files.
+
+ Existing cookie files saved with older versions of aiohttp used pickle.
+ load() should detect that the file is not JSON and fall back to the
+ restricted pickle unpickler for backward compatibility.
+ """
+ file_path = tmp_path / "legit.pkl"
+
+ # Write a legacy pickle file directly (as old aiohttp save() would)
+ jar_save = CookieJar()
+ jar_save.update_cookies(cookies_to_receive)
+ with file_path.open(mode="wb") as f:
+ pickle.dump(jar_save._cookies, f, pickle.HIGHEST_PROTOCOL)
+
+ jar_load = CookieJar()
+ jar_load.load(file_path=file_path)
+
+ jar_test = SimpleCookie()
+ for cookie in jar_load:
+ jar_test[cookie.key] = cookie
+
+ assert jar_test == cookies_to_receive
+
+
+async def test_save_load_json_roundtrip(
+ tmp_path: Path,
+ cookies_to_receive: SimpleCookie,
+) -> None:
+ """Verify save/load roundtrip preserves cookies via JSON format."""
+ file_path = tmp_path / "cookies.json"
+
+ jar_save = CookieJar()
+ jar_save.update_cookies(cookies_to_receive)
+ jar_save.save(file_path=file_path)
+
+ jar_load = CookieJar()
+ jar_load.load(file_path=file_path)
+
+ saved_cookies = SimpleCookie()
+ for cookie in jar_save:
+ saved_cookies[cookie.key] = cookie
+
+ loaded_cookies = SimpleCookie()
+ for cookie in jar_load:
+ loaded_cookies[cookie.key] = cookie
+
+ assert saved_cookies == loaded_cookies
+
+
+async def test_save_load_json_partitioned_cookies(tmp_path: Path) -> None:
+ """Verify save/load roundtrip works with partitioned cookies."""
+ file_path = tmp_path / "partitioned.json"
+
+ jar_save = CookieJar()
+ jar_save.update_cookies_from_headers(
+ ["session=cookie; Partitioned"], URL("https://example.com/")
+ )
+ jar_save.save(file_path=file_path)
+
+ jar_load = CookieJar()
+ jar_load.load(file_path=file_path)
+
+ # Compare individual cookie values (same approach as test_save_load_partitioned_cookies)
+ saved = list(jar_save)
+ loaded = list(jar_load)
+ assert len(saved) == len(loaded)
+ for s, lo in zip(saved, loaded):
+ assert s.key == lo.key
+ assert s.value == lo.value
+ assert s["domain"] == lo["domain"]
+ assert s["path"] == lo["path"]
+
+
+async def test_json_format_is_safe(tmp_path: Path) -> None:
+ """Verify the JSON file format cannot execute code on load."""
+ import json
+
+ file_path = tmp_path / "safe.json"
+
+ # Write something that might look dangerous but is just data
+ malicious_data = {
+ "evil.com|/": {
+ "session": {
+ "key": "session",
+ "value": "__import__('os').system('echo PWNED')",
+ "coded_value": "__import__('os').system('echo PWNED')",
+ }
+ }
+ }
+ with open(file_path, "w") as f:
+ json.dump(malicious_data, f)
+
+ jar = CookieJar()
+ jar.load(file_path=file_path)
+
+ # The "malicious" string is just a cookie value, not executed code
+ cookies = list(jar)
+ assert len(cookies) == 1
+ assert cookies[0].value == "__import__('os').system('echo PWNED')"
+
+
+async def test_save_load_json_secure_cookies(tmp_path: Path) -> None:
+ """Verify save/load preserves Secure and HttpOnly flags."""
+ file_path = tmp_path / "secure.json"
+
+ jar_save = CookieJar()
+ jar_save.update_cookies_from_headers(
+ ["token=abc123; Secure; HttpOnly; Path=/; Domain=example.com"],
+ URL("https://example.com/"),
+ )
+ jar_save.save(file_path=file_path)
+
+ jar_load = CookieJar()
+ jar_load.load(file_path=file_path)
+
+ loaded_cookies = list(jar_load)
+ assert len(loaded_cookies) == 1
+ cookie = loaded_cookies[0]
+ assert cookie.key == "token"
+ assert cookie.value == "abc123"
+ assert cookie["secure"] is True
+ assert cookie["httponly"] is True
+ assert cookie["domain"] == "example.com"
From fe6974c16aaa37f2b9538c5646f4c7b5213813c9 Mon Sep 17 00:00:00 2001
From: nightcityblade
Date: Mon, 23 Feb 2026 00:00:02 +0800
Subject: [PATCH 082/141] [3.14] Backport: Fix access log timestamps ignoring
daylight saving time (#12085) (#12127)
Co-authored-by: nightcityblade
Co-authored-by: Sam Bull
---
CHANGES/11283.bugfix.rst | 3 ++
aiohttp/web_log.py | 24 +++++++--
tests/test_web_log.py | 104 +++++++++++++++++++++++++++++++++++++--
3 files changed, 124 insertions(+), 7 deletions(-)
create mode 100644 CHANGES/11283.bugfix.rst
diff --git a/CHANGES/11283.bugfix.rst b/CHANGES/11283.bugfix.rst
new file mode 100644
index 00000000000..966b9afbd00
--- /dev/null
+++ b/CHANGES/11283.bugfix.rst
@@ -0,0 +1,3 @@
+Fixed access log timestamps ignoring daylight saving time (DST) changes. The
+previous implementation used :py:data:`time.timezone` which is a constant and
+does not reflect DST transitions -- by :user:`nightcityblade`.
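The distinction the changelog draws can be demonstrated with the stdlib alone: `time.timezone` is fixed at interpreter start (the non-DST offset), while `tm_gmtoff` reflects the offset currently in effect. A sketch, not the aiohttp implementation:

```python
import datetime
import time

# time.timezone is a process-wide constant and never tracks DST;
# time.localtime().tm_gmtoff is the UTC offset in effect right now.
static_off = -time.timezone            # old, DST-blind source of the offset
current_off = time.localtime().tm_gmtoff  # DST-aware offset in seconds

# Build an access-log style timestamp from the current offset.
tz = datetime.timezone(datetime.timedelta(seconds=current_off))
now = datetime.datetime.now(tz)
print(now.strftime("[%d/%b/%Y:%H:%M:%S %z]"))
```

During a DST transition `current_off` changes while `static_off` does not, which is why timestamps built from `time.timezone` drifted by an hour for part of the year.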
diff --git a/aiohttp/web_log.py b/aiohttp/web_log.py
index effdf53c3ce..27b50adf6d6 100644
--- a/aiohttp/web_log.py
+++ b/aiohttp/web_log.py
@@ -5,7 +5,8 @@
import re
import time as time_mod
from collections import namedtuple
-from typing import Any, Callable, Dict, Iterable, List, Tuple # noqa
+from collections.abc import Iterable
+from typing import Callable, ClassVar
from .abc import AbstractAccessLogger
from .web_request import BaseRequest
@@ -60,6 +61,9 @@ class AccessLogger(AbstractAccessLogger):
CLEANUP_RE = re.compile(r"(%[^s])")
_FORMAT_CACHE: dict[str, tuple[str, list[KeyMethod]]] = {}
+ _cached_tz: ClassVar[datetime.timezone | None] = None
+ _cached_tz_expires: ClassVar[float] = 0.0
+
def __init__(self, logger: logging.Logger, log_format: str = LOG_FORMAT) -> None:
"""Initialise the logger.
@@ -141,10 +145,24 @@ def _format_a(request: BaseRequest, response: StreamResponse, time: float) -> st
ip = request.remote
return ip if ip is not None else "-"
+ @classmethod
+ def _get_local_time(cls) -> datetime.datetime:
+ if cls._cached_tz is None or time_mod.time() >= cls._cached_tz_expires:
+ gmtoff = time_mod.localtime().tm_gmtoff
+ cls._cached_tz = tz = datetime.timezone(datetime.timedelta(seconds=gmtoff))
+
+ now = datetime.datetime.now(tz)
+ # Expire every 30 minutes, since any DST transition occurs on a 0/30-minute boundary.
+ d = now + datetime.timedelta(minutes=30)
+ d = d.replace(minute=30 if d.minute >= 30 else 0, second=0, microsecond=0)
+ cls._cached_tz_expires = d.timestamp()
+ return now
+
+ return datetime.datetime.now(cls._cached_tz)
+
@staticmethod
def _format_t(request: BaseRequest, response: StreamResponse, time: float) -> str:
- tz = datetime.timezone(datetime.timedelta(seconds=-time_mod.timezone))
- now = datetime.datetime.now(tz)
+ now = AccessLogger._get_local_time()
start_time = now - datetime.timedelta(seconds=time)
return start_time.strftime("[%d/%b/%Y:%H:%M:%S %z]")
diff --git a/tests/test_web_log.py b/tests/test_web_log.py
index 82b74a6e713..d8db6f621a1 100644
--- a/tests/test_web_log.py
+++ b/tests/test_web_log.py
@@ -86,12 +86,30 @@ def test_access_logger_atoms(
monkeypatch: Any, log_format: Any, expected: Any, extra: Any
) -> None:
class PatchedDatetime(datetime.datetime):
- @staticmethod
- def now(tz):
- return datetime.datetime(1843, 1, 1, 0, 30, tzinfo=tz)
+ @classmethod
+ def now(cls, tz: datetime.tzinfo | None = None) -> "PatchedDatetime":
+ assert tz is not None
+ # Simulate: real UTC time is 1842-12-31 16:30, convert to local tz
+ utc = datetime.datetime(1842, 12, 31, 16, 30, tzinfo=datetime.timezone.utc)
+ local = utc.astimezone(tz)
+ return cls(
+ local.year,
+ local.month,
+ local.day,
+ local.hour,
+ local.minute,
+ local.second,
+ tzinfo=tz,
+ )
monkeypatch.setattr("datetime.datetime", PatchedDatetime)
- monkeypatch.setattr("time.timezone", -28800)
+ # Mock localtime to return China Standard Time (CST, +0800 = 28800 seconds)
+ mock_localtime = mock.Mock()
+ mock_localtime.return_value.tm_gmtoff = 28800
+ monkeypatch.setattr("aiohttp.web_log.time_mod.localtime", mock_localtime)
+ # Clear cached timezone so it gets rebuilt with our mock
+ AccessLogger._cached_tz = None
+ AccessLogger._cached_tz_expires = 0.0
monkeypatch.setattr("os.getpid", lambda: 42)
mock_logger = mock.Mock()
access_logger = AccessLogger(mock_logger, log_format)
@@ -109,6 +127,84 @@ def now(tz):
mock_logger.info.assert_called_with(expected, extra=extra)
+@pytest.mark.skipif(
+ IS_PYPY,
+ reason="PyPy has issues with patching datetime.datetime",
+)
+def test_access_logger_dst_timezone(monkeypatch: pytest.MonkeyPatch) -> None:
+ """Test that _format_t uses the current local UTC offset, not a cached one.
+
+ This ensures timestamps are correct during DST transitions. The old
+ implementation used time.timezone which is a constant and doesn't
+ reflect DST changes.
+ """
+ # Simulate a timezone that observes DST (e.g., US Eastern)
+ # During EST: UTC-5 (-18000s), during EDT: UTC-4 (-14400s)
+ gmtoff_est = -18000 # UTC-5
+ gmtoff_edt = -14400 # UTC-4
+
+ class PatchedDatetime(datetime.datetime):
+ @classmethod
+ def now(cls, tz: datetime.tzinfo | None = None) -> "PatchedDatetime":
+ assert tz is not None
+ # Simulate: real UTC time is 07:00, convert to local tz
+ utc = datetime.datetime(2024, 3, 10, 7, 0, 0, tzinfo=datetime.timezone.utc)
+ local = utc.astimezone(tz)
+ return cls(
+ local.year,
+ local.month,
+ local.day,
+ local.hour,
+ local.minute,
+ local.second,
+ tzinfo=tz,
+ )
+
+ monkeypatch.setattr("datetime.datetime", PatchedDatetime)
+ mock_localtime = mock.Mock()
+ mock_localtime.return_value.tm_gmtoff = gmtoff_est
+ monkeypatch.setattr("aiohttp.web_log.time_mod.localtime", mock_localtime)
+ # Force cache refresh
+ AccessLogger._cached_tz = None
+ AccessLogger._cached_tz_expires = 0.0
+
+ mock_logger = mock.Mock()
+ access_logger = AccessLogger(mock_logger, "%t")
+ request = mock.Mock(
+ headers={}, method="GET", path_qs="/", version=(1, 1), remote="127.0.0.1"
+ )
+ response = mock.Mock(headers={}, body_length=0, status=200)
+
+ # During EST (UTC-5): time is 07:00-05:00 = 02:00 EST
+ access_logger.log(request, response, 0.0)
+ call1 = mock_logger.info.call_args[0][0]
+ assert "-0500" in call1, f"Expected EST offset in {call1}"
+
+ mock_logger.reset_mock()
+
+ # Switch to EDT (UTC-4): force cache invalidation
+ mock_localtime.return_value.tm_gmtoff = gmtoff_edt
+ AccessLogger._cached_tz = None
+ AccessLogger._cached_tz_expires = 0.0
+ access_logger.log(request, response, 0.0)
+ call2 = mock_logger.info.call_args[0][0]
+ assert "-0400" in call2, f"Expected EDT offset in {call2}"
+
+ # Verify the hour changed too (02:00 -> 03:00)
+ assert "02:00:00 -0500" in call1
+ assert "03:00:00 -0400" in call2
+
+ # Verify cached tz works too
+ assert access_logger._cached_tz is not None
+ with mock.patch(
+ "aiohttp.web_log.time_mod.time",
+ return_value=access_logger._cached_tz_expires - 1,
+ ):
+ access_logger.log(request, response, 0.0)
+ call3 = mock_logger.info.call_args[0][0]
+ assert "-0400" in call3, f"Expected EDT offset in {call3}"
+
+
def test_access_logger_dicts() -> None:
log_format = "%{User-Agent}i %{Content-Length}o %{None}i"
mock_logger = mock.Mock()
From 97e11d2a53c15a11ab485b37750a0ce8f34626f3 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 22 Feb 2026 16:36:43 +0000
Subject: [PATCH 083/141] [PR #12125/f049588a backport][3.14] Block absolute
paths in static files (#12129)
**This is a backport of PR #12125 as merged into master
(f049588a8ab5b6b6757677b539845c63698d61f7).**
Co-authored-by: Sam Bull
---
aiohttp/web_urldispatcher.py | 4 ++++
1 file changed, 4 insertions(+)
diff --git a/aiohttp/web_urldispatcher.py b/aiohttp/web_urldispatcher.py
index 01ee8a7808b..a725f7d792c 100644
--- a/aiohttp/web_urldispatcher.py
+++ b/aiohttp/web_urldispatcher.py
@@ -654,6 +654,10 @@ def __iter__(self) -> Iterator[AbstractRoute]:
async def _handle(self, request: Request) -> StreamResponse:
filename = request.match_info["filename"]
+ if Path(filename).is_absolute():
+ # filename is an absolute path e.g. //network/share or D:\path
+ # which could be a UNC path leading to NTLM credential theft
+ raise HTTPNotFound()
unresolved_path = self._directory.joinpath(filename)
loop = asyncio.get_running_loop()
return await loop.run_in_executor(
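The guard works because `pathlib` treats both POSIX double-slash roots and Windows drive/UNC paths as absolute. A minimal sketch of the same check outside aiohttp (the `is_blocked` helper is illustrative, not aiohttp API):

```python
from pathlib import Path, PurePosixPath, PureWindowsPath

def is_blocked(filename: str) -> bool:
    """Mirror the dispatcher guard: reject filenames that parse as absolute paths."""
    return Path(filename).is_absolute()

# Platform-independent illustration with the pure path flavours:
assert PureWindowsPath(r"D:\path").is_absolute()          # drive-rooted
assert PureWindowsPath(r"\\host\share").is_absolute()     # UNC path
assert PurePosixPath("//network/share").is_absolute()     # double-slash root
assert not PurePosixPath("static/app.css").is_absolute()  # ordinary relative name
```

Note that a plain `Path` uses the host flavour, so on POSIX `D:\path` is just an odd relative filename and is not flagged; the runtime check matters on Windows deployments, while `//network/share` is caught on every platform.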
From 7438e8d4eaf94c95615695f527cc029e3e7987b0 Mon Sep 17 00:00:00 2001
From: nightcityblade
Date: Mon, 23 Feb 2026 01:11:21 +0800
Subject: [PATCH 084/141] [3.13] Backport: Fix access log timestamps ignoring
daylight savings time (#12085) (#12126)
Co-authored-by: nightcityblade
---
CHANGES/11283.bugfix.rst | 3 ++
aiohttp/web_log.py | 24 +++++++--
tests/test_web_log.py | 106 +++++++++++++++++++++++++++++++++++++--
3 files changed, 125 insertions(+), 8 deletions(-)
create mode 100644 CHANGES/11283.bugfix.rst
diff --git a/CHANGES/11283.bugfix.rst b/CHANGES/11283.bugfix.rst
new file mode 100644
index 00000000000..966b9afbd00
--- /dev/null
+++ b/CHANGES/11283.bugfix.rst
@@ -0,0 +1,3 @@
+Fixed access log timestamps ignoring daylight saving time (DST) changes. The
+previous implementation used :py:data:`time.timezone` which is a constant and
+does not reflect DST transitions -- by :user:`nightcityblade`.
diff --git a/aiohttp/web_log.py b/aiohttp/web_log.py
index d5ea2beeb15..02a545ae855 100644
--- a/aiohttp/web_log.py
+++ b/aiohttp/web_log.py
@@ -5,7 +5,8 @@
import re
import time as time_mod
from collections import namedtuple
-from typing import Any, Callable, Dict, Iterable, List, Tuple # noqa
+from collections.abc import Iterable
+from typing import Callable, ClassVar, Dict, List, Optional, Tuple
from .abc import AbstractAccessLogger
from .web_request import BaseRequest
@@ -60,6 +61,9 @@ class AccessLogger(AbstractAccessLogger):
CLEANUP_RE = re.compile(r"(%[^s])")
_FORMAT_CACHE: Dict[str, Tuple[str, List[KeyMethod]]] = {}
+ _cached_tz: ClassVar[Optional[datetime.timezone]] = None
+ _cached_tz_expires: ClassVar[float] = 0.0
+
def __init__(self, logger: logging.Logger, log_format: str = LOG_FORMAT) -> None:
"""Initialise the logger.
@@ -141,10 +145,24 @@ def _format_a(request: BaseRequest, response: StreamResponse, time: float) -> st
ip = request.remote
return ip if ip is not None else "-"
+ @classmethod
+ def _get_local_time(cls) -> datetime.datetime:
+ if cls._cached_tz is None or time_mod.time() >= cls._cached_tz_expires:
+ gmtoff = time_mod.localtime().tm_gmtoff
+ cls._cached_tz = tz = datetime.timezone(datetime.timedelta(seconds=gmtoff))
+
+ now = datetime.datetime.now(tz)
+ # Expire every 30 mins, as any DST change should occur at 0/30 mins past.
+ d = now + datetime.timedelta(minutes=30)
+ d = d.replace(minute=30 if d.minute >= 30 else 0, second=0, microsecond=0)
+ cls._cached_tz_expires = d.timestamp()
+ return now
+
+ return datetime.datetime.now(cls._cached_tz)
+
@staticmethod
def _format_t(request: BaseRequest, response: StreamResponse, time: float) -> str:
- tz = datetime.timezone(datetime.timedelta(seconds=-time_mod.timezone))
- now = datetime.datetime.now(tz)
+ now = AccessLogger._get_local_time()
start_time = now - datetime.timedelta(seconds=time)
return start_time.strftime("[%d/%b/%Y:%H:%M:%S %z]")
diff --git a/tests/test_web_log.py b/tests/test_web_log.py
index 82b74a6e713..4aa5cacc29e 100644
--- a/tests/test_web_log.py
+++ b/tests/test_web_log.py
@@ -1,7 +1,7 @@
import datetime
import logging
import platform
-from typing import Any
+from typing import Any, Optional
from unittest import mock
import pytest
@@ -86,12 +86,30 @@ def test_access_logger_atoms(
monkeypatch: Any, log_format: Any, expected: Any, extra: Any
) -> None:
class PatchedDatetime(datetime.datetime):
- @staticmethod
- def now(tz):
- return datetime.datetime(1843, 1, 1, 0, 30, tzinfo=tz)
+ @classmethod
+ def now(cls, tz: Optional[datetime.tzinfo] = None) -> "PatchedDatetime":
+ assert tz is not None
+ # Simulate: real UTC time is 1842-12-31 16:30, convert to local tz
+ utc = datetime.datetime(1842, 12, 31, 16, 30, tzinfo=datetime.timezone.utc)
+ local = utc.astimezone(tz)
+ return cls(
+ local.year,
+ local.month,
+ local.day,
+ local.hour,
+ local.minute,
+ local.second,
+ tzinfo=tz,
+ )
monkeypatch.setattr("datetime.datetime", PatchedDatetime)
- monkeypatch.setattr("time.timezone", -28800)
+ # Mock localtime to return CST (+0800 = 28800 seconds)
+ mock_localtime = mock.Mock()
+ mock_localtime.return_value.tm_gmtoff = 28800
+ monkeypatch.setattr("aiohttp.web_log.time_mod.localtime", mock_localtime)
+ # Clear cached timezone so it gets rebuilt with our mock
+ AccessLogger._cached_tz = None
+ AccessLogger._cached_tz_expires = 0.0
monkeypatch.setattr("os.getpid", lambda: 42)
mock_logger = mock.Mock()
access_logger = AccessLogger(mock_logger, log_format)
@@ -109,6 +127,84 @@ def now(tz):
mock_logger.info.assert_called_with(expected, extra=extra)
+@pytest.mark.skipif(
+ IS_PYPY,
+ reason="PyPy has issues with patching datetime.datetime",
+)
+def test_access_logger_dst_timezone(monkeypatch: pytest.MonkeyPatch) -> None:
+ """Test that _format_t uses the current local UTC offset, not a cached one.
+
+ This ensures timestamps are correct during DST transitions. The old
+ implementation used time.timezone which is a constant and doesn't
+ reflect DST changes.
+ """
+ # Simulate a timezone that observes DST (e.g., US Eastern)
+ # During EST: UTC-5 (-18000s), during EDT: UTC-4 (-14400s)
+ gmtoff_est = -18000 # UTC-5
+ gmtoff_edt = -14400 # UTC-4
+
+ class PatchedDatetime(datetime.datetime):
+ @classmethod
+ def now(cls, tz: Optional[datetime.tzinfo] = None) -> "PatchedDatetime":
+ assert tz is not None
+ # Simulate: real UTC time is 07:00, convert to local tz
+ utc = datetime.datetime(2024, 3, 10, 7, 0, 0, tzinfo=datetime.timezone.utc)
+ local = utc.astimezone(tz)
+ return cls(
+ local.year,
+ local.month,
+ local.day,
+ local.hour,
+ local.minute,
+ local.second,
+ tzinfo=tz,
+ )
+
+ monkeypatch.setattr("datetime.datetime", PatchedDatetime)
+ mock_localtime = mock.Mock()
+ mock_localtime.return_value.tm_gmtoff = gmtoff_est
+ monkeypatch.setattr("aiohttp.web_log.time_mod.localtime", mock_localtime)
+ # Force cache refresh
+ AccessLogger._cached_tz = None
+ AccessLogger._cached_tz_expires = 0.0
+
+ mock_logger = mock.Mock()
+ access_logger = AccessLogger(mock_logger, "%t")
+ request = mock.Mock(
+ headers={}, method="GET", path_qs="/", version=(1, 1), remote="127.0.0.1"
+ )
+ response = mock.Mock(headers={}, body_length=0, status=200)
+
+ # During EST (UTC-5): time is 07:00-05:00 = 02:00 EST
+ access_logger.log(request, response, 0.0)
+ call1 = mock_logger.info.call_args[0][0]
+ assert "-0500" in call1, f"Expected EST offset in {call1}"
+
+ mock_logger.reset_mock()
+
+ # Switch to EDT (UTC-4): force cache invalidation
+ mock_localtime.return_value.tm_gmtoff = gmtoff_edt
+ AccessLogger._cached_tz = None
+ AccessLogger._cached_tz_expires = 0.0
+ access_logger.log(request, response, 0.0)
+ call2 = mock_logger.info.call_args[0][0]
+ assert "-0400" in call2, f"Expected EDT offset in {call2}"
+
+ # Verify the hour changed too (02:00 -> 03:00)
+ assert "02:00:00 -0500" in call1
+ assert "03:00:00 -0400" in call2
+
+ # Verify cached tz works too
+ assert access_logger._cached_tz is not None
+ with mock.patch(
+ "aiohttp.web_log.time_mod.time",
+ return_value=access_logger._cached_tz_expires - 1,
+ ):
+ access_logger.log(request, response, 0.0)
+ call3 = mock_logger.info.call_args[0][0]
+ assert "-0400" in call3, f"Expected EDT offset in {call3}"
+
+
def test_access_logger_dicts() -> None:
log_format = "%{User-Agent}i %{Content-Length}o %{None}i"
mock_logger = mock.Mock()
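The expiry arithmetic in `_get_local_time` (round the current time up to the next :00 or :30 boundary, since DST shifts land on half-hour marks) can be exercised in isolation. This is a sketch with an illustrative `next_tz_refresh` helper, not aiohttp API:

```python
import datetime

def next_tz_refresh(now: datetime.datetime) -> datetime.datetime:
    """Next :00 or :30 boundary strictly after `now`, matching the cache expiry."""
    d = now + datetime.timedelta(minutes=30)
    return d.replace(minute=30 if d.minute >= 30 else 0, second=0, microsecond=0)

# Just before the US spring-forward instant (2 AM, 10 March 2024, UTC-5):
tz = datetime.timezone(datetime.timedelta(hours=-5))
now = datetime.datetime(2024, 3, 10, 1, 59, 42, tzinfo=tz)
print(next_tz_refresh(now))  # 2024-03-10 02:00:00-05:00
```

The cache thus refreshes at 02:00, exactly the boundary where the new offset takes effect, so at most one log line in each half-hour pays the cost of calling `time.localtime()`.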
From 0ae2aa076c84573df83fc1fdc39eec0f5862fe3d Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 22 Feb 2026 17:11:50 +0000
Subject: [PATCH 085/141] [PR #12125/f049588a backport][3.13] Block absolute
paths in static files (#12128)
**This is a backport of PR #12125 as merged into master
(f049588a8ab5b6b6757677b539845c63698d61f7).**
Co-authored-by: Sam Bull
---
aiohttp/web_urldispatcher.py | 4 ++++
1 file changed, 4 insertions(+)
diff --git a/aiohttp/web_urldispatcher.py b/aiohttp/web_urldispatcher.py
index cfa57a31004..48465b5ee21 100644
--- a/aiohttp/web_urldispatcher.py
+++ b/aiohttp/web_urldispatcher.py
@@ -676,6 +676,10 @@ def __iter__(self) -> Iterator[AbstractRoute]:
async def _handle(self, request: Request) -> StreamResponse:
filename = request.match_info["filename"]
+ if Path(filename).is_absolute():
+ # filename is an absolute path e.g. //network/share or D:\path
+ # which could be a UNC path leading to NTLM credential theft
+ raise HTTPNotFound()
unresolved_path = self._directory.joinpath(filename)
loop = asyncio.get_running_loop()
return await loop.run_in_executor(
From 8d34ee0ed1c034abffb4d3835871aeb463786d02 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 23 Feb 2026 11:46:52 +0000
Subject: [PATCH 086/141] Bump rich from 14.3.2 to 14.3.3 (#12131)
Bumps [rich](https://github.com/Textualize/rich) from 14.3.2 to 14.3.3.
Release notes, sourced from rich's changelog ([14.3.3] - 2026-02-19, "The infinite Release"):
Fixed an infinite loop in split_graphemes.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
6 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 114fa5b8bbd..acc62b89c3c 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -207,7 +207,7 @@ requests==2.32.5
# via
# sphinx
# sphinxcontrib-spelling
-rich==14.3.2
+rich==14.3.3
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 18699d44ca4..7beae0cf82f 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -200,7 +200,7 @@ regex==2026.2.19
# via re-assert
requests==2.32.5
# via sphinx
-rich==14.3.2
+rich==14.3.3
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index ecd2c2e3ee4..0a90a5aa393 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -92,7 +92,7 @@ python-on-whales==0.80.0
# via -r requirements/lint.in
pyyaml==6.0.3
# via pre-commit
-rich==14.3.2
+rich==14.3.3
# via pytest-codspeed
six==1.17.0
# via python-dateutil
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index e49053e0b67..8f19cb942e7 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -89,7 +89,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.2.19
# via re-assert
-rich==14.3.2
+rich==14.3.3
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index fcf77a79177..9f336dcff43 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -124,7 +124,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.2.19
# via re-assert
-rich==14.3.2
+rich==14.3.3
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
diff --git a/requirements/test.txt b/requirements/test.txt
index cf90673850d..c794c3d6e14 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -124,7 +124,7 @@ re-assert==1.1.0
# via -r requirements/test-common.in
regex==2026.2.19
# via re-assert
-rich==14.3.2
+rich==14.3.3
# via pytest-codspeed
setuptools-git==1.2
# via -r requirements/test-common.in
From 8896cfc1bd49fa469e48eabe9590b03a91f1cb05 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Mon, 23 Feb 2026 13:31:19 +0000
Subject: [PATCH 087/141] [PR #12130/63a411d0 backport][3.14] Replace
unintentional except BaseException with except Exception (#12132)
**This is a backport of PR #12130 as merged into master
(63a411d09af23a9c9945c03f64285c0310f088f7).**
Co-authored-by: Dhruvil Darji
---
aiohttp/client_proto.py | 2 +-
aiohttp/http_parser.py | 4 ++--
aiohttp/worker.py | 2 +-
3 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/aiohttp/client_proto.py b/aiohttp/client_proto.py
index 08c03c192fe..dedde857704 100644
--- a/aiohttp/client_proto.py
+++ b/aiohttp/client_proto.py
@@ -324,7 +324,7 @@ def data_received(self, data: bytes) -> None:
# parse http messages
try:
messages, upgraded, tail = self._parser.feed_data(data)
- except BaseException as underlying_exc:
+ except Exception as underlying_exc:
if self.transport is not None:
# connection.release() could be called BEFORE
# data_received(), the transport is already
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index adf73f65e16..8588e9b4906 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -461,8 +461,8 @@ def get_content_length() -> int | None:
assert self._payload_parser is not None
try:
eof, data = self._payload_parser.feed_data(data[start_pos:], SEP)
- except BaseException as underlying_exc:
- reraised_exc = underlying_exc
+ except Exception as underlying_exc:
+ reraised_exc: BaseException = underlying_exc
if self.payload_exception is not None:
reraised_exc = self.payload_exception(str(underlying_exc))
diff --git a/aiohttp/worker.py b/aiohttp/worker.py
index 090a7b3b13d..27e8a8a002c 100644
--- a/aiohttp/worker.py
+++ b/aiohttp/worker.py
@@ -134,7 +134,7 @@ async def _run(self) -> None:
self.log.info("Parent changed, shutting down: %s", self)
else:
await self._wait_next_notify()
- except BaseException:
+ except Exception:
pass
await runner.cleanup()
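The reason `except BaseException` is too broad in async code is that `asyncio.CancelledError` (and `KeyboardInterrupt`/`SystemExit`) derive from `BaseException` but, since Python 3.8, not from `Exception`; narrowing to `except Exception` therefore lets cancellation propagate instead of being swallowed. A quick standalone check:

```python
import asyncio

# CancelledError must escape ordinary error handling for task cancellation to work.
print(issubclass(asyncio.CancelledError, BaseException))  # True
print(issubclass(asyncio.CancelledError, Exception))      # False on Python >= 3.8

try:
    raise asyncio.CancelledError()
except Exception:
    handled = True   # never reached on modern Python
except BaseException:
    handled = False  # cancellation falls through to here
print(handled)  # False
```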
From 26917a2bff43570ef35c70194723c84468a2d994 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Mon, 23 Feb 2026 17:21:27 +0000
Subject: [PATCH 088/141] [PR #12133/4ec897bd backport][3.14] Update Gunicorn
documentation link (#12135)
**This is a backport of PR #12133 as merged into master
(4ec897bddfa84c23389d6bfe6764e549ba50b696).**
Co-authored-by: laurent b
---
docs/logging.rst | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/logging.rst b/docs/logging.rst
index c415fa224ee..3f04c345112 100644
--- a/docs/logging.rst
+++ b/docs/logging.rst
@@ -130,7 +130,7 @@ Example of a drop-in replacement for the default access logger::
Gunicorn access logs
^^^^^^^^^^^^^^^^^^^^
-When `Gunicorn `_ is used for
+When `Gunicorn `_ is used for
:ref:`deployment `, its default access log format
will be automatically replaced with the default aiohttp's access log format.
From 57ac25bf85adbc9249ecc3408bbe7202606351f4 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Mon, 23 Feb 2026 17:21:44 +0000
Subject: [PATCH 089/141] [PR #12133/4ec897bd backport][3.13] Update Gunicorn
documentation link (#12134)
**This is a backport of PR #12133 as merged into master
(4ec897bddfa84c23389d6bfe6764e549ba50b696).**
Co-authored-by: laurent b
---
docs/logging.rst | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/logging.rst b/docs/logging.rst
index c415fa224ee..3f04c345112 100644
--- a/docs/logging.rst
+++ b/docs/logging.rst
@@ -130,7 +130,7 @@ Example of a drop-in replacement for the default access logger::
Gunicorn access logs
^^^^^^^^^^^^^^^^^^^^
-When `Gunicorn `_ is used for
+When `Gunicorn `_ is used for
:ref:`deployment `, its default access log format
will be automatically replaced with the default aiohttp's access log format.
From b68403d4f79a7d69bcea851d26b7ed7e968e16d0 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 24 Feb 2026 10:57:49 +0000
Subject: [PATCH 090/141] Bump virtualenv from 20.36.1 to 20.39.0 (#12140)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [virtualenv](https://github.com/pypa/virtualenv) from 20.36.1 to
20.39.0.
Release notes
Sourced from virtualenv's
releases.
20.39.0
What's Changed
Full Changelog: https://github.com/pypa/virtualenv/compare/20.38.0...20.39.0
20.38.0
What's Changed
New Contributors
Full Changelog: https://github.com/pypa/virtualenv/compare/20.37.0...20.38.0
Changelog
Sourced from virtualenv's
changelog.
Features - 20.39.0
- Automatically resolve version manager shims (pyenv, mise, asdf) to
the real Python binary during discovery, preventing
incorrect interpreter selection when shims are on
PATH - by
:user:gaborbernat. (:issue:3049)
- Add architecture (ISA) awareness to Python discovery — users can now
specify a CPU architecture suffix in the
--python spec string (e.g.
cpython3.12-64-arm64) to distinguish between interpreters
that share the same
version and bitness but target different architectures. Uses
sysconfig.get_platform() as the data source, with
cross-platform normalization (amd64 ↔ x86_64,
aarch64 ↔ arm64). Omitting the suffix
preserves existing
behavior - by :user:rahuldevikar.
(:issue:3059)
v20.38.0 (2026-02-19)
Features - 20.38.0
- Store app data (pip/setuptools/wheel caches) under the OS cache
directory (
platformdirs.user_cache_dir) instead of
the data directory (platformdirs.user_data_dir). Existing
app data at the old location is automatically migrated
on first use. This ensures cached files that can be redownloaded are
placed in the standard cache location (e.g.
~/.cache on Linux, ~/Library/Caches on macOS)
where they are excluded from backups and can be cleaned by
system tools - by :user:rahuldevikar.
(:issue:1884) (:issue:1884)
- Add
PKG_CONFIG_PATH environment variable support to all
activation scripts (Bash, Batch, PowerShell, Fish, C
Shell, Nushell, and Python). The virtualenv's lib/pkgconfig
directory is now automatically prepended to
PKG_CONFIG_PATH on activation and restored on deactivation,
enabling packages that use pkg-config during
build/install to find their configuration files - by
:user:rahuldevikar. (:issue:2637)
- Upgrade embedded pip to 26.0.1 from 25.3 and setuptools to 82.0.0, 75.3.4 from 75.3.2, 80.9.0 - by :user:rahuldevikar. (:issue:3027)
- Replace
ty: ignore comments with proper type narrowing
using assertions and explicit None checks - by
:user:rahuldevikar. (:issue:3029)
Bugfixes - 20.38.0
- Exclude pywin32 DLLs (
pywintypes*.dll,
pythoncom*.dll) from being copied to the Scripts directory
during
virtualenv creation on Windows. This fixes compatibility issues with
pywin32, which expects its DLLs to be installed
in site-packages/pywin32_system32 by its own post-install
script - by :user:rahuldevikar.
(:issue:2662)
- Preserve symlinks in
pyvenv.cfg paths to match
venv behavior. Use os.path.abspath() instead
of
os.path.realpath() to normalize paths without resolving
symlinks, fixing issues with Python installations accessed
via symlinked directories (common in network-mounted filesystems) - by
:user:rahuldevikar. Fixes :issue:2770.
(:issue:2770)
- Fix Windows activation scripts to properly quote
python.exe path, preventing failures when Python is
installed in
a path with spaces (e.g., C:\Program Files) and a file
named C:\Program exists on the filesystem - by
:user:rahuldevikar. (:issue:2985)
- Fix
bash -u (set -o nounset) compatibility
in bash activation script by using ${PKG_CONFIG_PATH:-} and
${PKG_CONFIG_PATH:+:${PKG_CONFIG_PATH}} to handle unset
PKG_CONFIG_PATH - by :user:Fridayai700.
(:issue:3044)
- Gracefully handle corrupted on-disk cache and invalid JSON from
Python interrogation subprocess instead of crashing
with unhandled
JSONDecodeError or KeyError -
by :user:gaborbernat. (:issue:3054)
... (truncated)
Commits
ad56e78
release 20.39.0
ae90556
Add auto-upgrade workflow for embedded dependencies (#3057)
4582e41
🐛 fix(discovery): resolve version-manager shims to real binaries (#3067)
e32d82d
Add architecture (ISA) awareness to Python discovery (#3058)
f8f7f48
🧪 test(discovery): fix test_py_info_cache_clear outside venv (#3065)
c22b548
👷 ci(brew): add missing Homebrew Python versions and fix discovery (#3066)
a852cbb
🐛 fix(seed): add --ignore-installed to pip invoke seeder (#3064)
6d22c9e
🐛 fix(sdist): include tox.toml in sdist (#3063)
01e8f51
Move from extras to dependency-groups (#3056)
fbbb97d
release 20.38.0
- Additional commits viewable in compare
view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index acc62b89c3c..a4241b8dfec 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -279,7 +279,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.36.1
+virtualenv==20.39.0
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 7beae0cf82f..6dd4fa553de 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -269,7 +269,7 @@ uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpytho
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.36.1
+virtualenv==20.39.0
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 0a90a5aa393..5b81802fc5b 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -121,7 +121,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# via -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.36.1
+virtualenv==20.39.0
# via pre-commit
zlib-ng==1.0.0
# via -r requirements/lint.in
From a81ec8ccd5282466a59a8477b4fe44d95fa31180 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed, 25 Feb 2026 10:50:05 +0000
Subject: [PATCH 091/141] Bump certifi from 2026.1.4 to 2026.2.25 (#12142)
Bumps [certifi](https://github.com/certifi/python-certifi) from 2026.1.4
to 2026.2.25.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
4 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index a4241b8dfec..1c6c58e4492 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -38,7 +38,7 @@ brotli==1.2.0 ; platform_python_implementation == "CPython"
# via -r requirements/runtime-deps.in
build==1.3.0
# via pip-tools
-certifi==2026.1.4
+certifi==2026.2.25
# via requests
cffi==2.0.0
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 6dd4fa553de..d7ea384024a 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -38,7 +38,7 @@ brotli==1.2.0 ; platform_python_implementation == "CPython"
# via -r requirements/runtime-deps.in
build==1.3.0
# via pip-tools
-certifi==2026.1.4
+certifi==2026.2.25
# via requests
cffi==2.0.0
# via
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index fa87bdec99e..11f3f9c1be8 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -10,7 +10,7 @@ alabaster==1.0.0
# via sphinx
babel==2.18.0
# via sphinx
-certifi==2026.1.4
+certifi==2026.2.25
# via requests
charset-normalizer==3.4.4
# via requests
diff --git a/requirements/doc.txt b/requirements/doc.txt
index 973e97b2719..45943942c75 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -10,7 +10,7 @@ alabaster==1.0.0
# via sphinx
babel==2.18.0
# via sphinx
-certifi==2026.1.4
+certifi==2026.2.25
# via requests
charset-normalizer==3.4.4
# via requests
From 177e6a22fa0f30052e46f2366e1d2369462044b4 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 26 Feb 2026 10:59:28 +0000
Subject: [PATCH 092/141] Bump virtualenv from 20.39.0 to 21.0.0 (#12145)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [virtualenv](https://github.com/pypa/virtualenv) from 20.39.0 to
21.0.0.
Release notes
Sourced from virtualenv's
releases.
21.0.0
What's Changed
Full Changelog: https://github.com/pypa/virtualenv/compare/20.39.1...21.0.0
20.39.1
What's Changed
Full Changelog: https://github.com/pypa/virtualenv/compare/20.39.0...20.39.1
Changelog
Sourced from virtualenv's
changelog.
Deprecations and Removals - 21.0.0
- The Python discovery logic has been extracted into a standalone
python-discovery package on PyPI (documentation
<https://python-discovery.readthedocs.io/>_) and is now
consumed as a dependency. If you previously imported
discovery internals directly (e.g. from
virtualenv.discovery.py_info import PythonInfo), switch to
from python_discovery import PythonInfo.
Backward-compatibility re-export shims are provided at
virtualenv.discovery.py_info,
virtualenv.discovery.py_spec, and
virtualenv.discovery.cached_py_info,
however these are considered unsupported and may be removed in a future
release - by :user:gaborbernat.
(:issue:3070)
v20.39.1 (2026-02-25)
Features - 20.39.1
- Add support for creating virtual environments with RustPython - by :user:elmjag. (:issue:3010)
v20.39.0 (2026-02-23)
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 12 +++++++++---
requirements/dev.txt | 12 +++++++++---
requirements/lint.txt | 12 +++++++++---
3 files changed, 27 insertions(+), 9 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 1c6c58e4492..305e41f7355 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -72,7 +72,9 @@ exceptiongroup==1.3.1
execnet==2.1.2
# via pytest-xdist
filelock==3.24.2
- # via virtualenv
+ # via
+ # python-discovery
+ # virtualenv
forbiddenfruit==0.1.4
# via blockbuster
freezegun==1.5.5
@@ -139,7 +141,9 @@ pip-tools==7.5.3
pkgconfig==1.5.5
# via -r requirements/test-common.in
platformdirs==4.9.2
- # via virtualenv
+ # via
+ # python-discovery
+ # virtualenv
pluggy==1.6.0
# via
# pytest
@@ -193,6 +197,8 @@ pytest-xdist==3.8.0
# via -r requirements/test-common.in
python-dateutil==2.9.0.post0
# via freezegun
+python-discovery==1.1.0
+ # via virtualenv
python-on-whales==0.80.0
# via
# -r requirements/lint.in
@@ -279,7 +285,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.39.0
+virtualenv==21.0.0
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index d7ea384024a..cacb32f727b 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -70,7 +70,9 @@ exceptiongroup==1.3.1
execnet==2.1.2
# via pytest-xdist
filelock==3.24.2
- # via virtualenv
+ # via
+ # python-discovery
+ # virtualenv
forbiddenfruit==0.1.4
# via blockbuster
freezegun==1.5.5
@@ -136,7 +138,9 @@ pip-tools==7.5.3
pkgconfig==1.5.5
# via -r requirements/test-common.in
platformdirs==4.9.2
- # via virtualenv
+ # via
+ # python-discovery
+ # virtualenv
pluggy==1.6.0
# via
# pytest
@@ -188,6 +192,8 @@ pytest-xdist==3.8.0
# via -r requirements/test-common.in
python-dateutil==2.9.0.post0
# via freezegun
+python-discovery==1.1.0
+ # via virtualenv
python-on-whales==0.80.0
# via
# -r requirements/lint.in
@@ -269,7 +275,7 @@ uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpytho
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.39.0
+virtualenv==21.0.0
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 5b81802fc5b..09cf010489a 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -30,7 +30,9 @@ distlib==0.4.0
exceptiongroup==1.3.1
# via pytest
filelock==3.24.2
- # via virtualenv
+ # via
+ # python-discovery
+ # virtualenv
forbiddenfruit==0.1.4
# via blockbuster
freezegun==1.5.5
@@ -60,7 +62,9 @@ packaging==26.0
pathspec==1.0.4
# via mypy
platformdirs==4.9.2
- # via virtualenv
+ # via
+ # python-discovery
+ # virtualenv
pluggy==1.6.0
# via pytest
pre-commit==4.5.1
@@ -88,6 +92,8 @@ pytest-mock==3.15.1
# via -r requirements/lint.in
python-dateutil==2.9.0.post0
# via freezegun
+python-discovery==1.1.0
+ # via virtualenv
python-on-whales==0.80.0
# via -r requirements/lint.in
pyyaml==6.0.3
@@ -121,7 +127,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# via -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==20.39.0
+virtualenv==21.0.0
# via pre-commit
zlib-ng==1.0.0
# via -r requirements/lint.in
From 08c62707d8214a292608fc7d7f82f1d4066dc264 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 26 Feb 2026 11:15:21 +0000
Subject: [PATCH 093/141] Bump filelock from 3.24.2 to 3.24.3 (#12095)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [filelock](https://github.com/tox-dev/py-filelock) from 3.24.2 to
3.24.3.
Release notes
Sourced from filelock's
releases.
3.24.3
What's Changed
Full Changelog: https://github.com/tox-dev/filelock/compare/3.24.2...3.24.3
Changelog
Sourced from filelock's
changelog.
###########
Changelog
###########
3.24.3 (2026-02-19)
- 🐛 fix(unix): handle ENOENT race on FUSE/NFS during acquire :pr:`495`
- 🐛 fix(ci): add trailing blank line after changelog entries :pr:`492`
3.24.2 (2026-02-16)
- 🐛 fix(rw): close sqlite3 cursors and skip SoftFileLock Windows race :pr:`491`
- 🐛 fix(test): resolve flaky write non-starvation test :pr:`490`
- 📝 docs: restructure using Diataxis framework :pr:`489`
3.24.1 (2026-02-15)
- 🐛 fix(soft): resolve Windows deadlock and test race condition :pr:`488`
3.24.0 (2026-02-14)
- ✨ feat(lock): add lifetime parameter for lock expiration (#68) :pr:`486`
- ✨ feat(lock): add cancel_check to acquire (#309) :pr:`487`
- 🐛 fix(api): detect same-thread self-deadlock :pr:`481`
- ✨ feat(mode): respect POSIX default ACLs (#378) :pr:`483`
- 🐛 fix(win): eliminate lock file race in threaded usage :pr:`484`
- ✨ feat(lock): add poll_interval to constructor :pr:`482`
- 🐛 fix(unix): auto-fallback to SoftFileLock on ENOSYS :pr:`480`
3.23.0 (2026-02-14)
- 📝 docs: move from Unlicense to MIT :pr:`479`
- 📝 docs: add fasteners to similar libraries :pr:`478`
3.22.0 (2026-02-14)
- 🐛 fix(soft): skip stale detection on Windows :pr:`477`
- ✨ feat(soft): detect and break stale locks :pr:`476`
... (truncated)
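The filelock entries above repeatedly touch SoftFileLock semantics (stale-lock detection, Windows races). A minimal sketch of the underlying idea, where the mere existence of the lock file is the lock and atomic O_CREAT | O_EXCL creation arbitrates between processes, may help readers reviewing these bumps; the class name and structure here are illustrative, not filelock's actual implementation:

```python
import os
import tempfile
import time

class SoftLockSketch:
    """Minimal soft file lock: the lock file's existence is the lock.

    Mirrors the idea behind filelock's SoftFileLock (atomic
    O_CREAT | O_EXCL creation), not its real implementation.
    """

    def __init__(self, path: str, timeout: float = 5.0) -> None:
        self.path = path
        self.timeout = timeout
        self._fd = None

    def acquire(self) -> None:
        deadline = time.monotonic() + self.timeout
        while True:
            try:
                # O_EXCL makes creation fail if the file already exists,
                # so only one holder can create it at a time.
                self._fd = os.open(
                    self.path, os.O_CREAT | os.O_EXCL | os.O_WRONLY
                )
                return
            except FileExistsError:
                if time.monotonic() >= deadline:
                    raise TimeoutError(f"could not acquire {self.path}")
                time.sleep(0.05)

    def release(self) -> None:
        if self._fd is not None:
            os.close(self._fd)
            os.unlink(self.path)
            self._fd = None

lock_path = os.path.join(tempfile.gettempdir(), "demo.softlock")
lock = SoftLockSketch(lock_path)
lock.acquire()
held = os.path.exists(lock_path)
lock.release()
released = not os.path.exists(lock_path)
```

The "stale lock" problem the 3.22.0 entries address follows directly from this design: if the holder crashes before `release()`, the file lingers and blocks everyone else until something detects and breaks it.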
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 305e41f7355..9141a7fb70c 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -71,7 +71,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.24.2
+filelock==3.24.3
# via
# python-discovery
# virtualenv
diff --git a/requirements/dev.txt b/requirements/dev.txt
index cacb32f727b..80436ba0915 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -69,7 +69,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.24.2
+filelock==3.24.3
# via
# python-discovery
# virtualenv
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 09cf010489a..8ee30784d17 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -29,7 +29,7 @@ distlib==0.4.0
# via virtualenv
exceptiongroup==1.3.1
# via pytest
-filelock==3.24.2
+filelock==3.24.3
# via
# python-discovery
# virtualenv
From 5d1cd8c894d393ae803875cb7e5cd31c0ca179e1 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 00:27:45 +0000
Subject: [PATCH 094/141] [PR #12136/c0f1513c backport][3.14] Fix
AttributeError: 'ClientConnectorCertificateError' object has no attribute
'_os_error'. (#12148)
**This is a backport of PR #12136 as merged into master
(c0f1513c37cb145b2618aeffd8f58d2f0cf230ab).**
Co-authored-by: themylogin
---
CHANGES/12136.bugfix.rst | 2 ++
CONTRIBUTORS.txt | 1 +
aiohttp/client_exceptions.py | 15 +++++++++++++--
tests/test_client_exceptions.py | 9 +++++++++
4 files changed, 25 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/12136.bugfix.rst
diff --git a/CHANGES/12136.bugfix.rst b/CHANGES/12136.bugfix.rst
new file mode 100644
index 00000000000..14ad7edf326
--- /dev/null
+++ b/CHANGES/12136.bugfix.rst
@@ -0,0 +1,2 @@
+``ClientConnectorCertificateError.os_error`` no longer raises :exc:`AttributeError`
+-- by :user:`themylogin`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index cd47a312eeb..845b24c7f29 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -382,6 +382,7 @@ Vladimir Kamarzin
Vladimir Kozlovski
Vladimir Rutsky
Vladimir Shulyak
+Vladimir Vinogradenko
Vladimir Zakharov
Vladyslav Bohaichuk
Vladyslav Bondar
diff --git a/aiohttp/client_exceptions.py b/aiohttp/client_exceptions.py
index a371f736ecb..e3e503b7ca2 100644
--- a/aiohttp/client_exceptions.py
+++ b/aiohttp/client_exceptions.py
@@ -380,10 +380,21 @@ class ClientConnectorSSLError(*ssl_error_bases): # type: ignore[misc]
class ClientConnectorCertificateError(*cert_errors_bases): # type: ignore[misc]
"""Response certificate error."""
+ _conn_key: ConnectionKey
+
def __init__(
- self, connection_key: ConnectionKey, certificate_error: Exception
+ # TODO: If we require ssl in future, this can become ssl.CertificateError
+ self,
+ connection_key: ConnectionKey,
+ certificate_error: Exception,
) -> None:
- self._conn_key = connection_key
+ if isinstance(certificate_error, cert_errors + (OSError,)):
+ # ssl.CertificateError has errno and strerror, so we should be fine
+ os_error = certificate_error
+ else:
+ os_error = OSError()
+
+ super().__init__(connection_key, os_error)
self._certificate_error = certificate_error
self.args = (connection_key, certificate_error)
diff --git a/tests/test_client_exceptions.py b/tests/test_client_exceptions.py
index f14cfefde51..e5a8d73ca10 100644
--- a/tests/test_client_exceptions.py
+++ b/tests/test_client_exceptions.py
@@ -232,6 +232,15 @@ def test_str(self) -> None:
" [Exception: ('Bad certificate',)]"
)
+ def test_oserror(self) -> None:
+ certificate_error = OSError(1, "Bad certificate")
+ err = client.ClientConnectorCertificateError(
+ connection_key=self.connection_key, certificate_error=certificate_error
+ )
+ assert err.os_error == certificate_error
+ assert err.errno == 1
+ assert err.strerror == "Bad certificate"
+
class TestServerDisconnectedError:
def test_ctor(self) -> None:
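The patch above works because it only passes the original error through to the OSError base class when that error actually carries `errno`/`strerror`; anything else is replaced by an empty `OSError`. The dispatch can be sketched standalone (the function name is hypothetical; aiohttp does this inside `ClientConnectorCertificateError.__init__`, and its `cert_errors` tuple is `(ssl.CertificateError,)`, to which the fix adds `OSError`):

```python
import ssl

def coerce_os_error(certificate_error: Exception) -> OSError:
    """Reuse the error if it already provides errno/strerror
    (OSError does, and ssl.CertificateError is an OSError subclass);
    otherwise substitute an empty OSError so attribute access on
    os_error/errno/strerror never raises AttributeError."""
    if isinstance(certificate_error, (ssl.CertificateError, OSError)):
        return certificate_error
    return OSError()

assert coerce_os_error(OSError(1, "Bad certificate")).errno == 1
assert coerce_os_error(Exception("boom")).errno is None
```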
From 5d922ee574df62e2d55aa975cd245e903ab304cb Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 00:28:05 +0000
Subject: [PATCH 095/141] [PR #12136/c0f1513c backport][3.13] Fix
AttributeError: 'ClientConnectorCertificateError' object has no attribute
'_os_error'. (#12147)
**This is a backport of PR #12136 as merged into master
(c0f1513c37cb145b2618aeffd8f58d2f0cf230ab).**
Co-authored-by: themylogin
---
CHANGES/12136.bugfix.rst | 2 ++
CONTRIBUTORS.txt | 1 +
aiohttp/client_exceptions.py | 15 +++++++++++++--
tests/test_client_exceptions.py | 9 +++++++++
4 files changed, 25 insertions(+), 2 deletions(-)
create mode 100644 CHANGES/12136.bugfix.rst
diff --git a/CHANGES/12136.bugfix.rst b/CHANGES/12136.bugfix.rst
new file mode 100644
index 00000000000..14ad7edf326
--- /dev/null
+++ b/CHANGES/12136.bugfix.rst
@@ -0,0 +1,2 @@
+``ClientConnectorCertificateError.os_error`` no longer raises :exc:`AttributeError`
+-- by :user:`themylogin`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index f310e2e5cca..63ca45cf04c 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -378,6 +378,7 @@ Vladimir Kamarzin
Vladimir Kozlovski
Vladimir Rutsky
Vladimir Shulyak
+Vladimir Vinogradenko
Vladimir Zakharov
Vladyslav Bohaichuk
Vladyslav Bondar
diff --git a/aiohttp/client_exceptions.py b/aiohttp/client_exceptions.py
index 1d298e9a8cf..4702da8d0ae 100644
--- a/aiohttp/client_exceptions.py
+++ b/aiohttp/client_exceptions.py
@@ -386,10 +386,21 @@ class ClientConnectorSSLError(*ssl_error_bases): # type: ignore[misc]
class ClientConnectorCertificateError(*cert_errors_bases): # type: ignore[misc]
"""Response certificate error."""
+ _conn_key: ConnectionKey
+
def __init__(
- self, connection_key: ConnectionKey, certificate_error: Exception
+ # TODO: If we require ssl in future, this can become ssl.CertificateError
+ self,
+ connection_key: ConnectionKey,
+ certificate_error: Exception,
) -> None:
- self._conn_key = connection_key
+ if isinstance(certificate_error, cert_errors + (OSError,)):
+ # ssl.CertificateError has errno and strerror, so we should be fine
+ os_error = certificate_error
+ else:
+ os_error = OSError()
+
+ super().__init__(connection_key, os_error)
self._certificate_error = certificate_error
self.args = (connection_key, certificate_error)
diff --git a/tests/test_client_exceptions.py b/tests/test_client_exceptions.py
index f14cfefde51..e5a8d73ca10 100644
--- a/tests/test_client_exceptions.py
+++ b/tests/test_client_exceptions.py
@@ -232,6 +232,15 @@ def test_str(self) -> None:
" [Exception: ('Bad certificate',)]"
)
+ def test_oserror(self) -> None:
+ certificate_error = OSError(1, "Bad certificate")
+ err = client.ClientConnectorCertificateError(
+ connection_key=self.connection_key, certificate_error=certificate_error
+ )
+ assert err.os_error == certificate_error
+ assert err.errno == 1
+ assert err.strerror == "Bad certificate"
+
class TestServerDisconnectedError:
def test_ctor(self) -> None:
From f43879178734822bfb3b4247e022cd0103a357e9 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Fri, 27 Feb 2026 01:30:01 +0000
Subject: [PATCH 096/141] Drop additional headers on redirect (#12146) (#12149)
(cherry picked from commit 6e8f393330f9bd6d7b24a146124ebc42eaa727b9)
---
aiohttp/client.py | 2 ++
tests/test_client_functional.py | 9 ++++++++-
2 files changed, 10 insertions(+), 1 deletion(-)
diff --git a/aiohttp/client.py b/aiohttp/client.py
index bc6f4622afc..e3e437d3088 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -926,6 +926,8 @@ async def _connect_and_send_request(
if url.origin() != redirect_origin:
auth = None
headers.pop(hdrs.AUTHORIZATION, None)
+ headers.pop(hdrs.COOKIE, None)
+ headers.pop(hdrs.PROXY_AUTHORIZATION, None)
url = parsed_redirect_url
params = {}
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index cd83503d798..96ac9733203 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -3458,6 +3458,8 @@ async def srv_from(request):
async def srv_to(request):
assert request.host == url_to.host
assert "Authorization" not in request.headers, "Header wasn't dropped"
+ assert "Proxy-Authorization" not in request.headers
+ assert "Cookie" not in request.headers
return web.Response()
server_from = await create_server_for_url_and_handler(url_from, srv_from)
@@ -3500,11 +3502,16 @@ async def close(self):
resp = await client.get(
url_from,
auth=aiohttp.BasicAuth("user", "pass"),
+ headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz", "Cookie": "a=b"},
)
assert resp.status == 200
resp = await client.get(
url_from,
- headers={"Authorization": "Basic dXNlcjpwYXNz"},
+ headers={
+ "Authorization": "Basic dXNlcjpwYXNz",
+ "Proxy-Authorization": "Basic dXNlcjpwYXNz",
+ "Cookie": "a=b",
+ },
)
assert resp.status == 200
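The redirect patch above extends the existing cross-origin sanitization: when the redirect target has a different origin, aiohttp now drops Cookie and Proxy-Authorization alongside Authorization. The step can be sketched in isolation (the header names match the patch; the function itself is illustrative, not aiohttp API):

```python
# Headers the patch strips when a redirect crosses origins.
SENSITIVE_ON_CROSS_ORIGIN = ("Authorization", "Cookie", "Proxy-Authorization")

def sanitize_redirect_headers(headers: dict, old_origin: str,
                              new_origin: str) -> dict:
    headers = dict(headers)  # don't mutate the caller's mapping
    if old_origin != new_origin:
        for name in SENSITIVE_ON_CROSS_ORIGIN:
            headers.pop(name, None)
    return headers

h = {"Authorization": "Basic dXNlcjpwYXNz", "Cookie": "a=b", "Accept": "*/*"}
same = sanitize_redirect_headers(h, "https://a.example", "https://a.example")
cross = sanitize_redirect_headers(h, "https://a.example", "https://b.example")
```

Same-origin redirects keep all headers; cross-origin redirects keep only the non-sensitive ones, which is exactly what the functional test in the diff asserts on the receiving server.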
From 5351c980dcec7ad385730efdf4e1f4338b24fdb6 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Fri, 27 Feb 2026 01:31:07 +0000
Subject: [PATCH 097/141] Drop additional headers on redirect (#12146) (#12150)
(cherry picked from commit 6e8f393330f9bd6d7b24a146124ebc42eaa727b9)
---
aiohttp/client.py | 2 ++
tests/test_client_functional.py | 9 ++++++++-
2 files changed, 10 insertions(+), 1 deletion(-)
diff --git a/aiohttp/client.py b/aiohttp/client.py
index e1ea692be71..5d10d3e43da 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -893,6 +893,8 @@ async def _connect_and_send_request(
if url.origin() != redirect_origin:
auth = None
headers.pop(hdrs.AUTHORIZATION, None)
+ headers.pop(hdrs.COOKIE, None)
+ headers.pop(hdrs.PROXY_AUTHORIZATION, None)
url = parsed_redirect_url
params = {}
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index 1ae7e1b4cb3..22e89c4ea4d 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -3445,6 +3445,8 @@ async def srv_from(request):
async def srv_to(request):
assert request.host == url_to.host
assert "Authorization" not in request.headers, "Header wasn't dropped"
+ assert "Proxy-Authorization" not in request.headers
+ assert "Cookie" not in request.headers
return web.Response()
server_from = await create_server_for_url_and_handler(url_from, srv_from)
@@ -3487,11 +3489,16 @@ async def close(self):
resp = await client.get(
url_from,
auth=aiohttp.BasicAuth("user", "pass"),
+ headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz", "Cookie": "a=b"},
)
assert resp.status == 200
resp = await client.get(
url_from,
- headers={"Authorization": "Basic dXNlcjpwYXNz"},
+ headers={
+ "Authorization": "Basic dXNlcjpwYXNz",
+ "Proxy-Authorization": "Basic dXNlcjpwYXNz",
+ "Cookie": "a=b",
+ },
)
assert resp.status == 200
From 8d6b7c96f4d3837fe8f503875aa63524fa3c1e61 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 11:14:14 +0000
Subject: [PATCH 098/141] Bump actions/download-artifact from 7 to 8 (#12155)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps
[actions/download-artifact](https://github.com/actions/download-artifact)
from 7 to 8.
Release notes
Sourced from actions/download-artifact's
releases.
v8.0.0
v8 - What's new
Direct downloads
To support direct uploads in actions/upload-artifact,
the action will no longer attempt to unzip all downloaded files.
Instead, the action checks the Content-Type header ahead of
unzipping and skips non-zipped files. Callers wishing to download a
zipped file as-is can also set the new skip-decompress
parameter to false.
Enforced checks (breaking)
A previous release introduced digest checks on the download. If a
download hash didn't match the expected hash from the server, the action
would log a warning. Callers can now configure the behavior on mismatch
with the digest-mismatch parameter. To be secure by
default, we are now defaulting the behavior to error which
will fail the workflow run.
ESM
To support new versions of the @actions/* packages, we've upgraded
the package to ESM.
What's Changed
Full Changelog: https://github.com/actions/download-artifact/compare/v7...v8.0.0
Commits
70fc10c Merge pull request #461 from actions/danwkennedy/digest-mismatch-behavior
f258da9 Add change docs
ccc058e Fix linting issues
bd7976b Add a setting to specify what to do on hash mismatch and default it to error
ac21fcf Merge pull request #460 from actions/danwkennedy/download-no-unzip
15999bf Add note about package bumps
974686e Bump the version to v8 and add release notes
fbe48b1 Update test names to make it clearer what they do
96bf374 One more test fix
b8c4819 Fix skip decompress test
- Additional commits viewable in compare view
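The breaking change in v8 is the digest-mismatch default: a hash that disagrees with the server's expected digest now fails the run instead of logging a warning. The check itself amounts to hashing the downloaded bytes and raising on mismatch, sketched here with hypothetical names (the action is TypeScript; this is only the concept):

```python
import hashlib

def verify_digest(data: bytes, expected_hex: str) -> None:
    """Secure-by-default digest check: raise on mismatch, the behavior
    v8 now defaults to ("error") instead of a logged warning."""
    actual = hashlib.sha256(data).hexdigest()
    if actual != expected_hex:
        raise ValueError(f"digest mismatch: {actual} != {expected_hex}")

blob = b"artifact-bytes"
good = hashlib.sha256(blob).hexdigest()
verify_digest(blob, good)  # matching digest: no exception
```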
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 10 +++++-----
1 file changed, 5 insertions(+), 5 deletions(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index f669369990e..235e792d54a 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -203,7 +203,7 @@ jobs:
run: echo "PYTHON_GIL=0" >> $GITHUB_ENV
- name: Restore llhttp generated files
if: ${{ matrix.no-extensions == '' }}
- uses: actions/download-artifact@v7
+ uses: actions/download-artifact@v8
with:
name: llhttp
path: vendor/llhttp/build/
@@ -293,7 +293,7 @@ jobs:
run: |
python -m pip install -r requirements/test.in -c requirements/test.txt
- name: Restore llhttp generated files
- uses: actions/download-artifact@v7
+ uses: actions/download-artifact@v8
with:
name: llhttp
path: vendor/llhttp/build/
@@ -355,7 +355,7 @@ jobs:
python -m
pip install -r requirements/cython.in -c requirements/cython.txt
- name: Restore llhttp generated files
- uses: actions/download-artifact@v7
+ uses: actions/download-artifact@v8
with:
name: llhttp
path: vendor/llhttp/build/
@@ -445,7 +445,7 @@ jobs:
python -m
pip install -r requirements/cython.in -c requirements/cython.txt
- name: Restore llhttp generated files
- uses: actions/download-artifact@v7
+ uses: actions/download-artifact@v8
with:
name: llhttp
path: vendor/llhttp/build/
@@ -495,7 +495,7 @@ jobs:
run: |
echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
- name: Download distributions
- uses: actions/download-artifact@v7
+ uses: actions/download-artifact@v8
with:
path: dist
pattern: dist-*
From 3c7f1d67a9acc7b50c96b63f111ba6090f7fe204 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 11:14:40 +0000
Subject: [PATCH 099/141] Bump virtualenv from 21.0.0 to 21.1.0 (#12158)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [virtualenv](https://github.com/pypa/virtualenv) from 21.0.0 to
21.1.0.
Release notes
Sourced from virtualenv's
releases.
21.1.0
What's Changed
Full Changelog: https://github.com/pypa/virtualenv/compare/21.0.0...21.1.0
Changelog
Sourced from virtualenv's
changelog.
Features - 21.1.0
- Add comprehensive type annotations across the entire codebase and ship a
  PEP 561 py.typed marker so downstream consumers and type checkers
  recognize virtualenv as an inline-typed package - by :user:`rahuldevikar`.
  (:issue:`3075`)
v21.0.0 (2026-02-25)
Commits
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 9141a7fb70c..95300df0356 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -285,7 +285,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==21.0.0
+virtualenv==21.1.0
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 80436ba0915..b569b9b80b8 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -275,7 +275,7 @@ uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpytho
# -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==21.0.0
+virtualenv==21.1.0
# via pre-commit
wait-for-it==2.3.0
# via -r requirements/test-common.in
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 8ee30784d17..f1b3b43bbe0 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -127,7 +127,7 @@ uvloop==0.21.0 ; platform_system != "Windows"
# via -r requirements/lint.in
valkey==6.1.1
# via -r requirements/lint.in
-virtualenv==21.0.0
+virtualenv==21.1.0
# via pre-commit
zlib-ng==1.0.0
# via -r requirements/lint.in
From 3ec83f30925feb25dad96b32583a809561b25531 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 14:22:14 +0000
Subject: [PATCH 100/141] [PR #12151/ca2012e9 backport][3.14] Update codecov
config for v5 (#12160)
**This is a backport of PR #12151 as merged into master
(ca2012e9e6c8a98de4a2c750db2a762a953c0000).**
Co-authored-by: Sam Bull
---
.codecov.yml | 15 ++++++++++++++-
.github/workflows/ci-cd.yml | 8 +++++++-
2 files changed, 21 insertions(+), 2 deletions(-)
diff --git a/.codecov.yml b/.codecov.yml
index e21d45ac7b2..e3e81ac574d 100644
--- a/.codecov.yml
+++ b/.codecov.yml
@@ -1,7 +1,11 @@
codecov:
branch: master
notify:
- after_n_builds: 13
+ manual_trigger: true
+
+comment:
+ require_head: false
+ require_base: false
coverage:
range: "95..100"
@@ -9,6 +13,15 @@ coverage:
status:
project: no
+component_management:
+ individual_components:
+ - component_id: project
+ paths:
+ - aiohttp/**
+ - component_id: tests
+ paths:
+ - tests/**
+
flags:
library:
paths:
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 235e792d54a..815763e68aa 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -249,7 +249,7 @@ jobs:
- name: Upload coverage
uses: codecov/codecov-action@v5
with:
- file: ./coverage.xml
+ files: ./coverage.xml
flags: >-
CI-GHA,OS-${{
runner.os
@@ -319,6 +319,12 @@ jobs:
runs-on: ubuntu-latest
steps:
+ - name: Trigger codecov notification
+ uses: codecov/codecov-action@v5
+ with:
+ token: ${{ secrets.CODECOV_TOKEN }}
+ fail_ci_if_error: true
+ run_command: send-notifications
- name: Decide whether the needed jobs succeeded or failed
uses: re-actors/alls-green@release/v1
with:
From c3fcc82dc16b26acbc9031246a4ea5add6f2e872 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 14:22:27 +0000
Subject: [PATCH 101/141] [PR #12151/ca2012e9 backport][3.13] Update codecov
config for v5 (#12159)
**This is a backport of PR #12151 as merged into master
(ca2012e9e6c8a98de4a2c750db2a762a953c0000).**
Co-authored-by: Sam Bull
---
.codecov.yml | 15 ++++++++++++++-
.github/workflows/ci-cd.yml | 8 +++++++-
2 files changed, 21 insertions(+), 2 deletions(-)
diff --git a/.codecov.yml b/.codecov.yml
index e21d45ac7b2..e3e81ac574d 100644
--- a/.codecov.yml
+++ b/.codecov.yml
@@ -1,7 +1,11 @@
codecov:
branch: master
notify:
- after_n_builds: 13
+ manual_trigger: true
+
+comment:
+ require_head: false
+ require_base: false
coverage:
range: "95..100"
@@ -9,6 +13,15 @@ coverage:
status:
project: no
+component_management:
+ individual_components:
+ - component_id: project
+ paths:
+ - aiohttp/**
+ - component_id: tests
+ paths:
+ - tests/**
+
flags:
library:
paths:
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index df1b0cf2719..5d1a2487c2f 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -249,7 +249,7 @@ jobs:
- name: Upload coverage
uses: codecov/codecov-action@v5
with:
- file: ./coverage.xml
+ files: ./coverage.xml
flags: >-
CI-GHA,OS-${{
runner.os
@@ -319,6 +319,12 @@ jobs:
runs-on: ubuntu-latest
steps:
+ - name: Trigger codecov notification
+ uses: codecov/codecov-action@v5
+ with:
+ token: ${{ secrets.CODECOV_TOKEN }}
+ fail_ci_if_error: true
+ run_command: send-notifications
- name: Decide whether the needed jobs succeeded or failed
uses: re-actors/alls-green@release/v1
with:
From a9a52f2f155d984741529af06299ea4ecc194f4b Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 15:42:47 +0000
Subject: [PATCH 102/141] [PR #12152/c1981efd backport][3.14] docs: replace
deprecated ujson with orjson in client quickstart (#12162)
**This is a backport of PR #12152 as merged into master
(c1981efd5f751b95f3569e956b6ee91f314aa7cb).**
Co-authored-by: indoor47
---
CHANGES/10795.doc.rst | 4 ++++
docs/client_quickstart.rst | 16 +++++++++-------
docs/spelling_wordlist.txt | 1 +
3 files changed, 14 insertions(+), 7 deletions(-)
create mode 100644 CHANGES/10795.doc.rst
diff --git a/CHANGES/10795.doc.rst b/CHANGES/10795.doc.rst
new file mode 100644
index 00000000000..91094f70c13
--- /dev/null
+++ b/CHANGES/10795.doc.rst
@@ -0,0 +1,4 @@
+Replaced the deprecated ``ujson`` library with ``orjson`` in the
+client quickstart documentation. ``ujson`` has been put into
+maintenance-only mode; ``orjson`` is the recommended alternative.
+-- by :user:`indoor47`
diff --git a/docs/client_quickstart.rst b/docs/client_quickstart.rst
index b74c2500065..bd7b0c854c5 100644
--- a/docs/client_quickstart.rst
+++ b/docs/client_quickstart.rst
@@ -203,20 +203,22 @@ Any of session's request methods like :func:`request`,
By default session uses python's standard :mod:`json` module for
-serialization. But it is possible to use different
-``serializer``. :class:`ClientSession` accepts ``json_serialize``
-parameter::
+serialization. But it is possible to use a different
+``serializer``. :class:`ClientSession` accepts ``json_serialize`` and
+``json_serialize_bytes`` parameters::
- import ujson
+ import orjson
async with aiohttp.ClientSession(
- json_serialize=ujson.dumps) as session:
+ json_serialize_bytes=orjson.dumps) as session:
await session.post(url, json={'test': 'object'})
.. note::
- ``ujson`` library is faster than standard :mod:`json` but slightly
- incompatible.
+ ``orjson`` library is faster than standard :mod:`json` and is actively
+ maintained. Since ``orjson.dumps`` returns :class:`bytes`, pass it via
+ the ``json_serialize_bytes`` parameter to avoid unnecessary
+ encoding/decoding overhead.
JSON Response Content
=====================
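The hunk above switches the quickstart example from ``ujson``/``json_serialize`` to ``orjson``/``json_serialize_bytes``. The difference between the two parameters is only the serializer's return type. A minimal stdlib-only sketch of that distinction (using ``json`` as a stand-in for ``orjson``, whose ``dumps`` returns ``bytes`` natively; ``serialize_str`` and ``serialize_bytes`` are illustrative names, not aiohttp API):

```python
import json


def serialize_str(obj: object) -> str:
    """Shape expected by json_serialize: returns str (like json.dumps)."""
    return json.dumps(obj)


def serialize_bytes(obj: object) -> bytes:
    """Shape expected by json_serialize_bytes: returns bytes, so aiohttp
    can skip a str -> bytes encode step. orjson.dumps has this shape
    natively; here we emulate it with the stdlib."""
    return json.dumps(obj).encode("utf-8")


payload = {"test": "object"}
assert isinstance(serialize_str(payload), str)
assert isinstance(serialize_bytes(payload), bytes)
# Both serializers produce the same wire bytes for the same input.
assert serialize_bytes(payload) == serialize_str(payload).encode("utf-8")
```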
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 8eeb37aa83b..50d2a5f8a88 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -266,6 +266,7 @@ pytest
Pytest
qop
Quickstart
+quickstart
quote’s
rc
readonly
From 870199d4f33e12a3802fe05c4b47736dd658b681 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Fri, 27 Feb 2026 15:43:01 +0000
Subject: [PATCH 103/141] [PR #12152/c1981efd backport][3.13] docs: replace
deprecated ujson with orjson in client quickstart (#12161)
**This is a backport of PR #12152 as merged into master
(c1981efd5f751b95f3569e956b6ee91f314aa7cb).**
Co-authored-by: indoor47
---
CHANGES/10795.doc.rst | 4 ++++
docs/client_quickstart.rst | 16 +++++++++-------
docs/spelling_wordlist.txt | 1 +
3 files changed, 14 insertions(+), 7 deletions(-)
create mode 100644 CHANGES/10795.doc.rst
diff --git a/CHANGES/10795.doc.rst b/CHANGES/10795.doc.rst
new file mode 100644
index 00000000000..91094f70c13
--- /dev/null
+++ b/CHANGES/10795.doc.rst
@@ -0,0 +1,4 @@
+Replaced the deprecated ``ujson`` library with ``orjson`` in the
+client quickstart documentation. ``ujson`` has been put into
+maintenance-only mode; ``orjson`` is the recommended alternative.
+-- by :user:`indoor47`
diff --git a/docs/client_quickstart.rst b/docs/client_quickstart.rst
index b74c2500065..bd7b0c854c5 100644
--- a/docs/client_quickstart.rst
+++ b/docs/client_quickstart.rst
@@ -203,20 +203,22 @@ Any of session's request methods like :func:`request`,
By default session uses python's standard :mod:`json` module for
-serialization. But it is possible to use different
-``serializer``. :class:`ClientSession` accepts ``json_serialize``
-parameter::
+serialization. But it is possible to use a different
+``serializer``. :class:`ClientSession` accepts ``json_serialize`` and
+``json_serialize_bytes`` parameters::
- import ujson
+ import orjson
async with aiohttp.ClientSession(
- json_serialize=ujson.dumps) as session:
+ json_serialize_bytes=orjson.dumps) as session:
await session.post(url, json={'test': 'object'})
.. note::
- ``ujson`` library is faster than standard :mod:`json` but slightly
- incompatible.
+ ``orjson`` library is faster than standard :mod:`json` and is actively
+ maintained. Since ``orjson.dumps`` returns :class:`bytes`, pass it via
+ the ``json_serialize_bytes`` parameter to avoid unnecessary
+ encoding/decoding overhead.
JSON Response Content
=====================
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index d6e1acb15db..7dc09edb127 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -259,6 +259,7 @@ pytest
Pytest
qop
Quickstart
+quickstart
quote’s
rc
readonly
From 31d45fd7430005efddf9dbd4c0884214d283fe4a Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Fri, 27 Feb 2026 19:42:44 -0300
Subject: [PATCH 104/141] [3.14] Fix test_invalid_idna for idna 3.11
compatibility (#12036) (#12164)
Backport of #12036 to `3.14`.
---
CHANGES/12027.misc.rst | 1 +
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/runtime-deps.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
tests/test_client_functional.py | 2 +-
13 files changed, 13 insertions(+), 12 deletions(-)
create mode 100644 CHANGES/12027.misc.rst
diff --git a/CHANGES/12027.misc.rst b/CHANGES/12027.misc.rst
new file mode 100644
index 00000000000..0b14de408a9
--- /dev/null
+++ b/CHANGES/12027.misc.rst
@@ -0,0 +1 @@
+Fixed ``test_invalid_idna`` to work with ``idna`` 3.11 by using an invalid character (``\u0080``) that is rejected by ``yarl`` during URL construction -- by :user:`rodrigobnogueira`.
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index dc6e4b907aa..1048f618136 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -26,7 +26,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==25.1.0
# via -r requirements/base-ft.in
-idna==3.10
+idna==3.11
# via yarl
multidict==6.7.0
# via
diff --git a/requirements/base.txt b/requirements/base.txt
index f45e21291be..4c8df4dd20a 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -26,7 +26,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==25.1.0
# via -r requirements/base.in
-idna==3.10
+idna==3.11
# via yarl
multidict==6.7.0
# via
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 95300df0356..c4a416a45fa 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -89,7 +89,7 @@ gunicorn==25.1.0
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
-idna==3.10
+idna==3.11
# via
# requests
# trustme
diff --git a/requirements/dev.txt b/requirements/dev.txt
index b569b9b80b8..99f6d30d286 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -87,7 +87,7 @@ gunicorn==25.1.0
# via -r requirements/base.in
identify==2.6.16
# via pre-commit
-idna==3.10
+idna==3.11
# via
# requests
# trustme
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index 11f3f9c1be8..de53158c046 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -18,7 +18,7 @@ click==8.3.1
# via towncrier
docutils==0.21.2
# via sphinx
-idna==3.10
+idna==3.11
# via requests
imagesize==1.4.1
# via sphinx
diff --git a/requirements/doc.txt b/requirements/doc.txt
index 45943942c75..b2b5697d659 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -18,7 +18,7 @@ click==8.3.1
# via towncrier
docutils==0.21.2
# via sphinx
-idna==3.10
+idna==3.11
# via requests
imagesize==1.4.1
# via sphinx
diff --git a/requirements/lint.txt b/requirements/lint.txt
index f1b3b43bbe0..5e5552fd115 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -39,7 +39,7 @@ freezegun==1.5.5
# via -r requirements/lint.in
identify==2.6.16
# via pre-commit
-idna==3.10
+idna==3.11
# via trustme
iniconfig==2.3.0
# via pytest
diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt
index 8ff95aee594..6d38a801a10 100644
--- a/requirements/runtime-deps.txt
+++ b/requirements/runtime-deps.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-idna==3.10
+idna==3.11
# via yarl
multidict==6.7.0
# via
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 8f19cb942e7..93ee412a37c 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -28,7 +28,7 @@ forbiddenfruit==0.1.4
# via blockbuster
freezegun==1.5.5
# via -r requirements/test-common.in
-idna==3.10
+idna==3.11
# via trustme
iniconfig==2.3.0
# via pytest
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index 9f336dcff43..115d8891785 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -49,7 +49,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==25.1.0
# via -r requirements/base-ft.in
-idna==3.10
+idna==3.11
# via
# trustme
# yarl
diff --git a/requirements/test.txt b/requirements/test.txt
index c794c3d6e14..f9603ae4a09 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -49,7 +49,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==25.1.0
# via -r requirements/base.in
-idna==3.10
+idna==3.11
# via
# trustme
# yarl
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index 96ac9733203..545d8075c5e 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -3332,7 +3332,7 @@ async def test_invalid_idna() -> None:
session = aiohttp.ClientSession()
try:
with pytest.raises(aiohttp.InvalidURL):
- await session.get("http://\u2061owhefopw.com")
+ await session.get("http://\u0080owhefopw.com")
finally:
await session.close()
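Why the replacement character works across ``idna`` versions can be sanity-checked with the stdlib: U+2061 (FUNCTION APPLICATION) is a Unicode *format* character, whose handling is what changed around ``idna`` 3.11, while U+0080 is a C1 *control* character, invalid in a hostname regardless of the installed ``idna`` version. A quick check of the categories:

```python
import unicodedata

old_char = "\u2061"  # FUNCTION APPLICATION, used by the test before this patch
new_char = "\u0080"  # C1 control character, the replacement

# U+2061 is a "format" character (category Cf); its treatment in IDNA
# processing is version-dependent, so the old test became flaky.
assert unicodedata.category(old_char) == "Cf"

# U+0080 is a control character (category Cc); per the change note above,
# yarl rejects it during URL construction, independent of the idna version.
assert unicodedata.category(new_char) == "Cc"
```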
From ed00ce2d6d3c3702084a90e2231bf8c78e6d98d7 Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Fri, 27 Feb 2026 19:43:03 -0300
Subject: [PATCH 105/141] [3.13] Fix test_invalid_idna for idna 3.11
compatibility (#12036) (#12165)
Backport of #12036 to `3.13`.
---
CHANGES/12027.misc.rst | 1 +
requirements/base-ft.txt | 2 +-
requirements/base.txt | 2 +-
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/doc-spelling.txt | 2 +-
requirements/doc.txt | 2 +-
requirements/lint.txt | 2 +-
requirements/runtime-deps.txt | 2 +-
requirements/test-common.txt | 2 +-
requirements/test-ft.txt | 2 +-
requirements/test.txt | 2 +-
tests/test_client_functional.py | 2 +-
13 files changed, 13 insertions(+), 12 deletions(-)
create mode 100644 CHANGES/12027.misc.rst
diff --git a/CHANGES/12027.misc.rst b/CHANGES/12027.misc.rst
new file mode 100644
index 00000000000..0b14de408a9
--- /dev/null
+++ b/CHANGES/12027.misc.rst
@@ -0,0 +1 @@
+Fixed ``test_invalid_idna`` to work with ``idna`` 3.11 by using an invalid character (``\u0080``) that is rejected by ``yarl`` during URL construction -- by :user:`rodrigobnogueira`.
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
index cce3b614e26..cccf78c5995 100644
--- a/requirements/base-ft.txt
+++ b/requirements/base-ft.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==23.0.0
# via -r requirements/base-ft.in
-idna==3.10
+idna==3.11
# via yarl
multidict==6.6.4
# via
diff --git a/requirements/base.txt b/requirements/base.txt
index f1f72f3ed05..fe242c5d019 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -24,7 +24,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==23.0.0
# via -r requirements/base.in
-idna==3.4
+idna==3.11
# via yarl
multidict==6.6.4
# via
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 8cc65554bbd..495ab431cc8 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -90,7 +90,7 @@ gunicorn==23.0.0
# via -r requirements/base.in
identify==2.6.15
# via pre-commit
-idna==3.3
+idna==3.11
# via
# requests
# trustme
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 86a0843a7c5..b3f9fc2981b 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -88,7 +88,7 @@ gunicorn==23.0.0
# via -r requirements/base.in
identify==2.6.15
# via pre-commit
-idna==3.4
+idna==3.11
# via
# requests
# trustme
diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt
index 55c618df961..cd065a5936b 100644
--- a/requirements/doc-spelling.txt
+++ b/requirements/doc-spelling.txt
@@ -18,7 +18,7 @@ click==8.1.8
# via towncrier
docutils==0.21.2
# via sphinx
-idna==3.4
+idna==3.11
# via requests
imagesize==1.4.1
# via sphinx
diff --git a/requirements/doc.txt b/requirements/doc.txt
index 7aa95aee161..c16b79e47f5 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -18,7 +18,7 @@ click==8.1.8
# via towncrier
docutils==0.21.2
# via sphinx
-idna==3.10
+idna==3.11
# via requests
imagesize==1.4.1
# via sphinx
diff --git a/requirements/lint.txt b/requirements/lint.txt
index a9b088d96b7..d5950d3bcdd 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -35,7 +35,7 @@ freezegun==1.5.5
# via -r requirements/lint.in
identify==2.6.15
# via pre-commit
-idna==3.7
+idna==3.11
# via trustme
iniconfig==2.1.0
# via pytest
diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt
index 86d8f58e491..4202c42fefc 100644
--- a/requirements/runtime-deps.txt
+++ b/requirements/runtime-deps.txt
@@ -22,7 +22,7 @@ frozenlist==1.8.0
# via
# -r requirements/runtime-deps.in
# aiosignal
-idna==3.10
+idna==3.11
# via yarl
multidict==6.6.4
# via
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
index 1ab1c6b3550..be1987311af 100644
--- a/requirements/test-common.txt
+++ b/requirements/test-common.txt
@@ -28,7 +28,7 @@ forbiddenfruit==0.1.4
# via blockbuster
freezegun==1.5.5
# via -r requirements/test-common.in
-idna==3.10
+idna==3.11
# via trustme
iniconfig==2.1.0
# via pytest
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
index dd791bc78e7..0d9df55b375 100644
--- a/requirements/test-ft.txt
+++ b/requirements/test-ft.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==23.0.0
# via -r requirements/base-ft.in
-idna==3.10
+idna==3.11
# via
# trustme
# yarl
diff --git a/requirements/test.txt b/requirements/test.txt
index 87faf489087..95b278f12fb 100644
--- a/requirements/test.txt
+++ b/requirements/test.txt
@@ -47,7 +47,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==23.0.0
# via -r requirements/base.in
-idna==3.4
+idna==3.11
# via
# trustme
# yarl
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index 22e89c4ea4d..118ce19ca64 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -3319,7 +3319,7 @@ async def test_invalid_idna() -> None:
session = aiohttp.ClientSession()
try:
with pytest.raises(aiohttp.InvalidURL):
- await session.get("http://\u2061owhefopw.com")
+ await session.get("http://\u0080owhefopw.com")
finally:
await session.close()
From 0ec5ba44086fff232f1dba89e343f5c2da2c75df Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Sat, 28 Feb 2026 20:12:46 -0300
Subject: [PATCH 106/141] [3.14 backport] Fix test_data_file race condition on
Python 3.14 free-threaded (#12172)
---
CHANGES/12170.misc.rst | 1 +
tests/test_client_request.py | 13 ++++++++++++-
2 files changed, 13 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12170.misc.rst
diff --git a/CHANGES/12170.misc.rst b/CHANGES/12170.misc.rst
new file mode 100644
index 00000000000..23c63db50e9
--- /dev/null
+++ b/CHANGES/12170.misc.rst
@@ -0,0 +1 @@
+Fixed race condition in ``test_data_file`` on Python 3.14 free-threaded builds -- by :user:`rodrigobnogueira`.
diff --git a/tests/test_client_request.py b/tests/test_client_request.py
index 08bd698b84c..3d5898abc0d 100644
--- a/tests/test_client_request.py
+++ b/tests/test_client_request.py
@@ -1173,7 +1173,18 @@ async def test_data_file(loop, buf, conn) -> None:
assert isinstance(req.body, payload.BufferedReaderPayload)
assert req.headers["TRANSFER-ENCODING"] == "chunked"
- resp = await req.send(conn)
+ original_write_bytes = req.write_bytes
+
+ async def _mock_write_bytes(
+ writer: AbstractStreamWriter, conn: mock.Mock, content_length: int | None
+ ) -> None:
+ # Ensure the task is scheduled so _writer isn't None
+ await asyncio.sleep(0)
+ await original_write_bytes(writer, conn, content_length)
+
+ with mock.patch.object(req, "write_bytes", _mock_write_bytes):
+ resp = await req.send(conn)
+
assert asyncio.isfuture(req._writer)
await resp.wait_for_close()
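The pattern used in the patched test (keep a reference to the bound method, wrap it in a coroutine that yields to the event loop once via ``asyncio.sleep(0)``, and install the wrapper with ``mock.patch.object``) can be demonstrated in isolation. ``Sender`` and ``delayed_write_bytes`` below are illustrative names, not aiohttp API:

```python
import asyncio
from unittest import mock


class Sender:
    async def write_bytes(self, data: bytes) -> bytes:
        return data.upper()


async def main() -> bytes:
    sender = Sender()
    original = sender.write_bytes  # capture the bound method before patching

    async def delayed_write_bytes(data: bytes) -> bytes:
        # Yield to the event loop once so any task created around this call
        # is actually scheduled before the real work happens; this is the
        # same trick the test uses to ensure req._writer is not None.
        await asyncio.sleep(0)
        return await original(data)

    with mock.patch.object(sender, "write_bytes", delayed_write_bytes):
        return await sender.write_bytes(b"payload")


result = asyncio.run(main())
assert result == b"PAYLOAD"
```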
From d5cd8727f90d65f74f5ba574eec1f8dc100fd96a Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Sat, 28 Feb 2026 20:13:00 -0300
Subject: [PATCH 107/141] [3.13 backport] Fix test_data_file race condition on
Python 3.14 free-threaded (#12171)
---
CHANGES/12170.misc.rst | 1 +
tests/test_client_request.py | 13 ++++++++++++-
2 files changed, 13 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12170.misc.rst
diff --git a/CHANGES/12170.misc.rst b/CHANGES/12170.misc.rst
new file mode 100644
index 00000000000..23c63db50e9
--- /dev/null
+++ b/CHANGES/12170.misc.rst
@@ -0,0 +1 @@
+Fixed race condition in ``test_data_file`` on Python 3.14 free-threaded builds -- by :user:`rodrigobnogueira`.
diff --git a/tests/test_client_request.py b/tests/test_client_request.py
index db25f6ff910..e3cdc1c62d3 100644
--- a/tests/test_client_request.py
+++ b/tests/test_client_request.py
@@ -1173,7 +1173,18 @@ async def test_data_file(loop, buf, conn) -> None:
assert isinstance(req.body, payload.BufferedReaderPayload)
assert req.headers["TRANSFER-ENCODING"] == "chunked"
- resp = await req.send(conn)
+ original_write_bytes = req.write_bytes
+
+ async def _mock_write_bytes(
+ writer: AbstractStreamWriter, conn: mock.Mock, content_length: Optional[int]
+ ) -> None:
+ # Ensure the task is scheduled so _writer isn't None
+ await asyncio.sleep(0)
+ await original_write_bytes(writer, conn, content_length)
+
+ with mock.patch.object(req, "write_bytes", _mock_write_bytes):
+ resp = await req.send(conn)
+
assert asyncio.isfuture(req._writer)
await resp.wait_for_close()
From 70847a4826ccd749c33003e4a0ddd79886b49f3a Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Sun, 1 Mar 2026 15:54:33 +0000
Subject: [PATCH 108/141] Bump filelock from 3.24.3 to 3.25.0 (#12179)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [filelock](https://github.com/tox-dev/py-filelock) from 3.24.3 to
3.25.0.
Release notes (sourced from filelock's releases):

3.25.0
Full Changelog: https://github.com/tox-dev/filelock/compare/3.24.4...3.25.0

3.24.4
Full Changelog: https://github.com/tox-dev/filelock/compare/3.24.3...3.24.4
Changelog (sourced from filelock's changelog):

3.25.0 (2026-03-01)
- ✨ feat(async): add AsyncReadWriteLock (#506)
- Standardize .github files to .yaml suffix
- build(deps): bump actions/download-artifact from 7 to 8 (#503), by @dependabot[bot]
- build(deps): bump actions/upload-artifact from 6 to 7 (#502), by @dependabot[bot]
- Move SECURITY.md to .github/SECURITY.md
- Add security policy
- Add permissions to check workflow (#500)
- [pre-commit.ci] pre-commit autoupdate (#499), by @pre-commit-ci[bot]

3.24.3 (2026-02-19)
- 🐛 fix(unix): handle ENOENT race on FUSE/NFS during acquire (#495)
- 🐛 fix(ci): add trailing blank line after changelog entries (#492)

3.24.2 (2026-02-16)
- 🐛 fix(rw): close sqlite3 cursors and skip SoftFileLock Windows race (#491)
- 🐛 fix(test): resolve flaky write non-starvation test (#490)
- 📝 docs: restructure using Diataxis framework (#489)

3.24.1 (2026-02-15)
- 🐛 fix(soft): resolve Windows deadlock and test race condition (#488)

3.24.0 (2026-02-14)
- ✨ feat(lock): add lifetime parameter for lock expiration (#68) (#486)
- ✨ feat(lock): add cancel_check to acquire (#309) (#487)
- 🐛 fix(api): detect same-thread self-deadlock (#481)
- ✨ feat(mode): respect POSIX default ACLs (#378) (#483)
- 🐛 fix(win): eliminate lock file race in threaded usage (#484)
- ✨ feat(lock): add poll_interval to constructor (#482)
- 🐛 fix(unix): auto-fallback to SoftFileLock on ENOSYS (#480)

... (truncated)
Commits
- 7f195d9 Release 3.25.0
- df2754e ✨ feat(async): add AsyncReadWriteLock (#506)
- 8a359c5 Standardize .github files to .yaml suffix
- 9e7b33d build(deps): bump actions/download-artifact from 7 to 8 (#503)
- 5fe6836 build(deps): bump actions/upload-artifact from 6 to 7 (#502)
- af265f9 Move SECURITY.md to .github/SECURITY.md
- 67a5569 Add security policy
- 4b8c261 Add permissions to check workflow (#500)
- e749d66 [pre-commit.ci] pre-commit autoupdate (#499)
- 721b37b Fix ValueError in _acquire_transaction_lock when blocking=False with timeout ...
- Additional commits viewable in the compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index c4a416a45fa..633f3d4ea47 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -71,7 +71,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.24.3
+filelock==3.25.0
# via
# python-discovery
# virtualenv
diff --git a/requirements/dev.txt b/requirements/dev.txt
index 99f6d30d286..b642cd1ee70 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -69,7 +69,7 @@ exceptiongroup==1.3.1
# via pytest
execnet==2.1.2
# via pytest-xdist
-filelock==3.24.3
+filelock==3.25.0
# via
# python-discovery
# virtualenv
diff --git a/requirements/lint.txt b/requirements/lint.txt
index 5e5552fd115..e3e5655eb3f 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -29,7 +29,7 @@ distlib==0.4.0
# via virtualenv
exceptiongroup==1.3.1
# via pytest
-filelock==3.24.3
+filelock==3.25.0
# via
# python-discovery
# virtualenv
From 15122829ee5502431f2d3e9fc6ef1578fe1507d0 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 1 Mar 2026 16:03:16 +0000
Subject: [PATCH 109/141] [PR #12175/7c33f51a backport][3.14] Dependabot:
update docker (#12176)
**This is a backport of PR #12175 as merged into master
(7c33f51a808bbdc3f1efdcd47d20e9593fb72349).**
Co-authored-by: Sam Bull
---
.github/dependabot.yml | 15 +++++++++++++++
1 file changed, 15 insertions(+)
diff --git a/.github/dependabot.yml b/.github/dependabot.yml
index 8b56354f345..8b7e6a28873 100644
--- a/.github/dependabot.yml
+++ b/.github/dependabot.yml
@@ -30,6 +30,21 @@ updates:
interval: "daily"
open-pull-requests-limit: 10
+ - package-ecosystem: "docker"
+ directory: "/"
+ labels:
+ - dependencies
+ schedule:
+ interval: "monthly"
+
+ - package-ecosystem: "docker"
+ directory: "/"
+ labels:
+ - dependencies
+ target-branch: "3.14"
+ schedule:
+ interval: "monthly"
+
# Maintain dependencies for Python aiohttp backport
- package-ecosystem: "pip"
directory: "/"
From 5d7f860e1801bc515878f6b95e21afe04945ae08 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Sun, 1 Mar 2026 17:17:18 +0000
Subject: [PATCH 110/141] Bump crossbario/autobahn-testsuite from 0.8.2 to
25.10.1 in /tests/autobahn (#12182)
Bumps crossbario/autobahn-testsuite from 0.8.2 to 25.10.1.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
tests/autobahn/Dockerfile.autobahn | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/tests/autobahn/Dockerfile.autobahn b/tests/autobahn/Dockerfile.autobahn
index 45f18182804..ffd90ba282e 100644
--- a/tests/autobahn/Dockerfile.autobahn
+++ b/tests/autobahn/Dockerfile.autobahn
@@ -1,4 +1,4 @@
-FROM crossbario/autobahn-testsuite:0.8.2
+FROM crossbario/autobahn-testsuite:25.10.1
RUN apt-get update && apt-get install python3 python3-pip -y
RUN pip3 install wait-for-it
From b93d41126fd924baa0ff9ea4aa10169d3bedd41c Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 1 Mar 2026 17:27:22 +0000
Subject: [PATCH 111/141] [PR #12180/76d63cf9 backport][3.14] Fix docker update
for autobahn (#12181)
**This is a backport of PR #12180 as merged into master
(76d63cf960b588d117a56d0edb1ed463bc61bc1f).**
Co-authored-by: Sam Bull
---
.github/dependabot.yml | 22 +++++++++++-----------
1 file changed, 11 insertions(+), 11 deletions(-)
diff --git a/.github/dependabot.yml b/.github/dependabot.yml
index 8b7e6a28873..6ac64362454 100644
--- a/.github/dependabot.yml
+++ b/.github/dependabot.yml
@@ -30,29 +30,29 @@ updates:
interval: "daily"
open-pull-requests-limit: 10
- - package-ecosystem: "docker"
+ # Maintain dependencies for Python aiohttp backport
+ - package-ecosystem: "pip"
directory: "/"
+ allow:
+ - dependency-type: "all"
labels:
- dependencies
+ target-branch: "3.14"
schedule:
- interval: "monthly"
+ interval: "daily"
+ open-pull-requests-limit: 10
- package-ecosystem: "docker"
- directory: "/"
+ directory: "/tests/autobahn/"
labels:
- dependencies
- target-branch: "3.14"
schedule:
interval: "monthly"
- # Maintain dependencies for Python aiohttp backport
- - package-ecosystem: "pip"
- directory: "/"
- allow:
- - dependency-type: "all"
+ - package-ecosystem: "docker"
+ directory: "/tests/autobahn/"
labels:
- dependencies
target-branch: "3.14"
schedule:
- interval: "daily"
- open-pull-requests-limit: 10
+ interval: "monthly"
From 9029e60eef3850dfc0e70c565e3d9213010e74b8 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon, 2 Mar 2026 11:55:43 +0000
Subject: [PATCH 112/141] Bump identify from 2.6.16 to 2.6.17 (#12189)
Bumps [identify](https://github.com/pre-commit/identify) from 2.6.16 to
2.6.17.
Commits
- bc5fa61 v2.6.17
- c866be7 Merge pull request #563 from seanbudd/patch-2
- c20eeb5 Add support for sconstruct and sconscript extensions
- 8f02442 Merge pull request #577 from andykernahan/add-slnx
- a10759d Merge pull request #571 from petamas/add-entitlements
- 749d185 Add support for 'slnx' file extension
- a4ed2ca Merge pull request #560 from sebastiw/patch-1
- 53e33c6 Add 'escript' file type for Erlang
- 50e032f Merge pull request #574 from petamas/add-diff
- 93f1aa6 Merge pull request #569 from petamas/add-uv-lock
- Additional commits viewable in the compare view
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
requirements/constraints.txt | 2 +-
requirements/dev.txt | 2 +-
requirements/lint.txt | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 633f3d4ea47..0bc7d27d78f 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -87,7 +87,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==25.1.0
# via -r requirements/base.in
-identify==2.6.16
+identify==2.6.17
# via pre-commit
idna==3.11
# via
diff --git a/requirements/dev.txt b/requirements/dev.txt
index b642cd1ee70..7d67f91b6f2 100644
--- a/requirements/dev.txt
+++ b/requirements/dev.txt
@@ -85,7 +85,7 @@ frozenlist==1.8.0
# aiosignal
gunicorn==25.1.0
# via -r requirements/base.in
-identify==2.6.16
+identify==2.6.17
# via pre-commit
idna==3.11
# via
diff --git a/requirements/lint.txt b/requirements/lint.txt
index e3e5655eb3f..9cea924d820 100644
--- a/requirements/lint.txt
+++ b/requirements/lint.txt
@@ -37,7 +37,7 @@ forbiddenfruit==0.1.4
# via blockbuster
freezegun==1.5.5
# via -r requirements/lint.in
-identify==2.6.16
+identify==2.6.17
# via pre-commit
idna==3.11
# via trustme
From 7dbaece69a347090958da094e970e6619e79ee9d Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Mon, 2 Mar 2026 10:05:34 -0300
Subject: [PATCH 113/141] [3.14 backport] feat: Add dynamic port binding to
TCPSite (#12167) (#12184)
Co-authored-by: Tom Whittock
Co-authored-by: Sam Bull
Co-authored-by: Tom Whittock <136440158+twhittock-disguise@users.noreply.github.com>
Co-authored-by: J. Nick Koston
Co-authored-by: rodrigo.nogueira
---
CHANGES/10665.feature.rst | 1 +
CONTRIBUTORS.txt | 1 +
aiohttp/web_runner.py | 33 ++++++++++++++++++++++++++++++---
docs/web_reference.rst | 10 +++++++++-
examples/fake_server.py | 11 +++++------
tests/test_run_app.py | 2 ++
tests/test_web_runner.py | 34 +++++++++++++++++++++-------------
7 files changed, 69 insertions(+), 23 deletions(-)
create mode 100644 CHANGES/10665.feature.rst
diff --git a/CHANGES/10665.feature.rst b/CHANGES/10665.feature.rst
new file mode 100644
index 00000000000..afb4768c7cf
--- /dev/null
+++ b/CHANGES/10665.feature.rst
@@ -0,0 +1 @@
+Added :py:attr:`~aiohttp.web.TCPSite.port` accessor for dynamic port allocations in :class:`~aiohttp.web.TCPSite` -- by :user:`twhittock-disguise` and :user:`rodrigobnogueira`.
diff --git a/CONTRIBUTORS.txt b/CONTRIBUTORS.txt
index 845b24c7f29..755bcb7d1aa 100644
--- a/CONTRIBUTORS.txt
+++ b/CONTRIBUTORS.txt
@@ -357,6 +357,7 @@ Thomas Forbes
Thomas Grainger
Tim Menninger
Tolga Tezel
+Tom Whittock
Tomasz Trebski
Toshiaki Tanaka
Trevor Gamblin
diff --git a/aiohttp/web_runner.py b/aiohttp/web_runner.py
index 6ea43c6237c..b2128fa3cb1 100644
--- a/aiohttp/web_runner.py
+++ b/aiohttp/web_runner.py
@@ -7,8 +7,10 @@
from yarl import URL
+from .abc import AbstractAccessLogger
from .typedefs import PathLike
from .web_app import Application
+from .web_log import AccessLogger
from .web_server import Server
if TYPE_CHECKING:
@@ -80,7 +82,7 @@ async def stop(self) -> None:
class TCPSite(BaseSite):
- __slots__ = ("_host", "_port", "_reuse_address", "_reuse_port")
+ __slots__ = ("_host", "_port", "_bound_port", "_reuse_address", "_reuse_port")
def __init__(
self,
@@ -104,14 +106,29 @@ def __init__(
if port is None:
port = 8443 if self._ssl_context else 8080
self._port = port
+ self._bound_port: int | None = None
self._reuse_address = reuse_address
self._reuse_port = reuse_port
+ @property
+ def port(self) -> int:
+ """The port the server is listening on.
+
+ If the server hasn't been started yet, this returns the requested port
+ (which might be 0 for a dynamic port).
+ After the server starts, it returns the actual bound port. This is
+ especially useful when port=0 was requested, as it allows retrieving the
+ dynamically assigned port after the site has started.
+ """
+ if self._bound_port is not None:
+ return self._bound_port
+ return self._port
+
@property
def name(self) -> str:
scheme = "https" if self._ssl_context else "http"
host = "0.0.0.0" if not self._host else self._host
- return str(URL.build(scheme=scheme, host=host, port=self._port))
+ return str(URL.build(scheme=scheme, host=host, port=self.port))
async def start(self) -> None:
await super().start()
@@ -127,6 +144,10 @@ async def start(self) -> None:
reuse_address=self._reuse_address,
reuse_port=self._reuse_port,
)
+ if self._server.sockets:
+ self._bound_port = self._server.sockets[0].getsockname()[1]
+ else:
+ self._bound_port = self._port
class UnixSite(BaseSite):
@@ -369,13 +390,19 @@ class AppRunner(BaseRunner):
__slots__ = ("_app",)
def __init__(
- self, app: Application, *, handle_signals: bool = False, **kwargs: Any
+ self,
+ app: Application,
+ *,
+ handle_signals: bool = False,
+ access_log_class: type[AbstractAccessLogger] = AccessLogger,
+ **kwargs: Any,
) -> None:
super().__init__(handle_signals=handle_signals, **kwargs)
if not isinstance(app, Application):
raise TypeError(
f"The first argument should be web.Application instance, got {app!r}"
)
+ self._kwargs["access_log_class"] = access_log_class
self._app = app
@property
diff --git a/docs/web_reference.rst b/docs/web_reference.rst
index 0870538b211..a3d9e1b284d 100644
--- a/docs/web_reference.rst
+++ b/docs/web_reference.rst
@@ -2998,7 +2998,9 @@ application on specific TCP or Unix socket, e.g.::
:param str host: HOST to listen on, all interfaces if ``None`` (default).
- :param int port: PORT to listed on, ``8080`` if ``None`` (default).
+ :param int port: PORT to listen on, ``8080`` if ``None`` (default).
+ Use ``0`` to let the OS assign a free ephemeral port
+ (see :attr:`port`).
:param float shutdown_timeout: a timeout used for both waiting on pending
tasks before application shutdown and for
@@ -3026,6 +3028,12 @@ application on specific TCP or Unix socket, e.g.::
this flag when being created. This option is not
supported on Windows.
+ .. attribute:: port
+
+ Read-only. The actual port number the server is bound to, only
+ guaranteed to be correct after the site has been started.
+
+
.. class:: UnixSite(runner, path, *, \
shutdown_timeout=60.0, ssl_context=None, \
backlog=128)
diff --git a/examples/fake_server.py b/examples/fake_server.py
index bdfa671036c..af6de0903e9 100755
--- a/examples/fake_server.py
+++ b/examples/fake_server.py
@@ -8,7 +8,6 @@
from aiohttp import web
from aiohttp.abc import AbstractResolver, ResolveResult
from aiohttp.resolver import DefaultResolver
-from aiohttp.test_utils import unused_port
class FakeResolver(AbstractResolver):
@@ -61,13 +60,13 @@ def __init__(self, *, loop):
self.ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
self.ssl_context.load_cert_chain(str(ssl_cert), str(ssl_key))
- async def start(self):
- port = unused_port()
- self.runner = web.AppRunner(self.app)
+ async def start(self) -> dict[str, int]:
await self.runner.setup()
- site = web.TCPSite(self.runner, "127.0.0.1", port, ssl_context=self.ssl_context)
+ site = web.TCPSite(
+ self.runner, "127.0.0.1", port=0, ssl_context=self.ssl_context
+ )
await site.start()
- return {"graph.facebook.com": port}
+ return {"graph.facebook.com": site.port}
async def stop(self):
await self.runner.cleanup()
diff --git a/tests/test_run_app.py b/tests/test_run_app.py
index c2ec32b2390..899064aa165 100644
--- a/tests/test_run_app.py
+++ b/tests/test_run_app.py
@@ -61,8 +61,10 @@ def patched_loop(
) -> Iterator[asyncio.AbstractEventLoop]:
server = mock.create_autospec(asyncio.Server, spec_set=True, instance=True)
server.wait_closed.return_value = None
+ server.sockets = []
unix_server = mock.create_autospec(asyncio.Server, spec_set=True, instance=True)
unix_server.wait_closed.return_value = None
+ unix_server.sockets = []
with mock.patch.object(
loop, "create_server", autospec=True, spec_set=True, return_value=server
):
diff --git a/tests/test_web_runner.py b/tests/test_web_runner.py
index 22ce3d00650..d9e16a82f39 100644
--- a/tests/test_web_runner.py
+++ b/tests/test_web_runner.py
@@ -2,7 +2,7 @@
import platform
import signal
from typing import Any
-from unittest.mock import patch
+from unittest import mock
import pytest
@@ -167,20 +167,16 @@ async def test_tcpsite_default_host(make_runner: Any) -> None:
site = web.TCPSite(runner)
assert site.name == "http://0.0.0.0:8080"
- calls = []
-
- async def mock_create_server(*args, **kwargs):
- calls.append((args, kwargs))
-
- with patch("asyncio.get_event_loop") as mock_get_loop:
- mock_get_loop.return_value.create_server = mock_create_server
+ m = mock.create_autospec(asyncio.AbstractEventLoop, spec_set=True, instance=True)
+ m.create_server.return_value = mock.create_autospec(asyncio.Server, spec_set=True)
+ with mock.patch(
+ "asyncio.get_event_loop", autospec=True, spec_set=True, return_value=m
+ ):
await site.start()
- assert len(calls) == 1
- server, host, port = calls[0][0]
- assert server is runner.server
- assert host is None
- assert port == 8080
+ m.create_server.assert_called_once()
+ args, kwargs = m.create_server.call_args
+ assert args == (runner.server, None, 8080)
async def test_tcpsite_empty_str_host(make_runner: Any) -> None:
@@ -242,3 +238,15 @@ async def test_app_handler_args_ceil_threshold(value: Any, expected: Any) -> Non
assert rh._timeout_ceil_threshold == expected
await runner.cleanup()
assert app
+
+
+async def test_tcpsite_ephemeral_port(make_runner: Any) -> None:
+ runner = make_runner()
+ await runner.setup()
+ site = web.TCPSite(runner, port=0)
+ assert site.port == 0
+
+ await site.start()
+ assert site.port != 0
+ assert site.name.startswith("http://0.0.0.0:")
+ await site.stop()
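The ephemeral-port behaviour this backport adds to ``TCPSite`` builds on a standard asyncio mechanism: bind to port ``0`` and read the kernel-assigned port back via ``getsockname()``. A minimal stdlib-only sketch of that mechanism (illustrative only, not using aiohttp itself):

```python
import asyncio


async def main() -> int:
    async def handle(
        reader: asyncio.StreamReader, writer: asyncio.StreamWriter
    ) -> None:
        writer.close()

    # Binding to port 0 asks the OS for a free ephemeral port,
    # the same trick TCPSite.start() relies on.
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    # The actual bound port is recovered from the listening socket,
    # mirroring TCPSite's self._server.sockets[0].getsockname()[1].
    port = server.sockets[0].getsockname()[1]
    server.close()
    await server.wait_closed()
    return port


port = asyncio.run(main())
assert 0 < port < 65536
print(port)
```

With the backported accessor, the equivalent aiohttp flow is ``site = web.TCPSite(runner, port=0); await site.start(); site.port`` — before ``start()`` the property still reports the requested ``0``.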
From 8d539f791cf9897ca08ed2d96a37d20eb7745898 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed, 4 Mar 2026 10:53:20 +0000
Subject: [PATCH 114/141] Bump docker/setup-qemu-action from 3 to 4 (#12193)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps
[docker/setup-qemu-action](https://github.com/docker/setup-qemu-action)
from 3 to 4.
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 815763e68aa..686c8c026b8 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -423,7 +423,7 @@ jobs:
submodules: true
- name: Set up QEMU
if: ${{ matrix.qemu }}
- uses: docker/setup-qemu-action@v3
+ uses: docker/setup-qemu-action@v4
with:
platforms: all
# This should be temporary
From cd01de26bf303737b8972fde9701c3bfebe19066 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Wed, 4 Mar 2026 18:55:04 +0000
Subject: [PATCH 115/141] Rework autobahn tests (#12173) (#12196)
(cherry picked from commit 16ffed73d809790fafebd6d3ce52b1b7dce8e2f5)
---
.github/workflows/ci-cd.yml | 103 +++++++++++-
CHANGES/12173.contrib.rst | 1 +
setup.cfg | 4 +-
tests/autobahn/client/client.py | 39 ++---
tests/autobahn/client/fuzzingserver.json | 5 +-
tests/autobahn/server/fuzzingclient.json | 6 +-
tests/autobahn/server/server.py | 26 +--
tests/autobahn/test_autobahn.py | 205 +++++++++++++++--------
8 files changed, 258 insertions(+), 131 deletions(-)
create mode 100644 CHANGES/12173.contrib.rst
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 686c8c026b8..0dc5293b834 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -222,7 +222,7 @@ jobs:
PIP_USER: 1
run: >-
PATH="${HOME}/Library/Python/3.11/bin:${HOME}/.local/bin:${PATH}"
- pytest --junitxml=junit.xml
+ pytest --junitxml=junit.xml -m 'not dev_mode and not autobahn'
shell: bash
- name: Re-run the failing tests with maximum verbosity
if: failure()
@@ -265,6 +265,98 @@ jobs:
with:
token: ${{ secrets.CODECOV_TOKEN }}
+ autobahn:
+ permissions:
+ contents: read # to fetch code (actions/checkout)
+
+ name: Autobahn testsuite
+ needs: gen_llhttp
+ strategy:
+ matrix:
+ pyver: ['3.14']
+ no-extensions: ['']
+ os: [ubuntu]
+ fail-fast: true
+ runs-on: ${{ matrix.os }}-latest
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v6
+ with:
+ submodules: true
+ - name: Setup Python ${{ matrix.pyver }}
+ id: python-install
+ uses: actions/setup-python@v6
+ with:
+ allow-prereleases: true
+ python-version: ${{ matrix.pyver }}
+ - name: Get pip cache dir
+ id: pip-cache
+ run: |
+ echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
+ shell: bash
+ - name: Cache PyPI
+ uses: actions/cache@v5.0.3
+ with:
+ key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
+ path: ${{ steps.pip-cache.outputs.dir }}
+ restore-keys: |
+ pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-
+ - name: Update pip, wheel, setuptools, build, twine
+ run: |
+ python -m pip install -U pip wheel setuptools build twine
+ - name: Install dependencies
+ env:
+ DEPENDENCY_GROUP: test${{ endsWith(matrix.pyver, 't') && '-ft' || '' }}
+ run: |
+ python -Im pip install -r requirements/${{ env.DEPENDENCY_GROUP }}.in -c requirements/${{ env.DEPENDENCY_GROUP }}.txt
+ - name: Restore llhttp generated files
+ if: ${{ matrix.no-extensions == '' }}
+ uses: actions/download-artifact@v8
+ with:
+ name: llhttp
+ path: vendor/llhttp/build/
+ - name: Cythonize
+ if: ${{ matrix.no-extensions == '' }}
+ run: |
+ make cythonize
+ - name: Install self
+ env:
+ AIOHTTP_NO_EXTENSIONS: ${{ matrix.no-extensions }}
+ run: python -m pip install -e .
+ - name: Run unittests
+ env:
+ COLOR: yes
+ AIOHTTP_NO_EXTENSIONS: ${{ matrix.no-extensions }}
+ PIP_USER: 1
+ run: >-
+ PATH="${HOME}/Library/Python/3.11/bin:${HOME}/.local/bin:${PATH}"
+ pytest --junitxml=junit.xml --numprocesses=0 -m autobahn
+ shell: bash
+ - name: Turn coverage into xml
+ env:
+ COLOR: 'yes'
+ PIP_USER: 1
+ run: |
+ python -m coverage xml
+ - name: Upload coverage
+ uses: codecov/codecov-action@v5
+ with:
+ files: ./coverage.xml
+ flags: >-
+ CI-GHA,OS-${{
+ runner.os
+ }},VM-${{
+ matrix.os
+ }},Py-${{
+ steps.python-install.outputs.python-version
+ }}
+ token: ${{ secrets.CODECOV_TOKEN }}
+ - name: Upload test results to Codecov
+ if: ${{ !cancelled() }}
+ uses: codecov/test-results-action@v1
+ with:
+ token: ${{ secrets.CODECOV_TOKEN }}
+
benchmark:
name: Benchmark
needs:
@@ -315,20 +407,21 @@ jobs:
needs:
- lint
- test
+ - autobahn
runs-on: ubuntu-latest
steps:
+ - name: Decide whether the needed jobs succeeded or failed
+ uses: re-actors/alls-green@release/v1
+ with:
+ jobs: ${{ toJSON(needs) }}
- name: Trigger codecov notification
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
fail_ci_if_error: true
run_command: send-notifications
- - name: Decide whether the needed jobs succeeded or failed
- uses: re-actors/alls-green@release/v1
- with:
- jobs: ${{ toJSON(needs) }}
pre-deploy:
name: Pre-Deploy
diff --git a/CHANGES/12173.contrib.rst b/CHANGES/12173.contrib.rst
new file mode 100644
index 00000000000..7c1bd3d1c1b
--- /dev/null
+++ b/CHANGES/12173.contrib.rst
@@ -0,0 +1 @@
+Fixed and reworked ``autobahn`` tests -- by :user:`Dreamsorcerer`.
diff --git a/setup.cfg b/setup.cfg
index b1bf1464a11..5f02a07c14f 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -61,8 +61,7 @@ addopts =
--cov=aiohttp
--cov=tests/
- # run tests that are not marked with dev_mode
- -m "not dev_mode"
+ -m "not dev_mode and not autobahn and not internal"
filterwarnings =
error
ignore:module 'ssl' has no attribute 'OP_NO_COMPRESSION'. The Python interpreter is compiled against OpenSSL < 1.0.0. Ref. https.//docs.python.org/3/library/ssl.html#ssl.OP_NO_COMPRESSION:UserWarning
@@ -93,6 +92,7 @@ minversion = 3.8.2
testpaths = tests/
xfail_strict = true
markers =
+ autobahn: Autobahn testsuite. Should be run as a separate job.
dev_mode: mark test to run in dev mode.
internal: tests which may cause issues for packagers, but should be run in aiohttp's CI.
skip_blockbuster: mark test to skip the blockbuster fixture.
diff --git a/tests/autobahn/client/client.py b/tests/autobahn/client/client.py
index dfca77d12b2..2a58ff8bf33 100644
--- a/tests/autobahn/client/client.py
+++ b/tests/autobahn/client/client.py
@@ -2,40 +2,31 @@
import asyncio
-import aiohttp
+from aiohttp import ClientSession, WSMsgType
async def client(url: str, name: str) -> None:
- async with aiohttp.ClientSession() as session:
- async with session.ws_connect(url + "/getCaseCount") as ws:
- num_tests = int((await ws.receive()).data)
- print("running %d cases" % num_tests)
+ async with ClientSession(base_url=url) as session:
+ async with session.ws_connect("/getCaseCount") as ws:
+ msg = await ws.receive()
+ assert msg.type is WSMsgType.TEXT
+ num_tests = int(msg.data)
for i in range(1, num_tests + 1):
- print("running test case:", i)
- text_url = url + "/runCase?case=%d&agent=%s" % (i, name)
- async with session.ws_connect(text_url) as ws:
+ async with session.ws_connect(
+ "/runCase", params={"case": i, "agent": name}
+ ) as ws:
async for msg in ws:
- if msg.type == aiohttp.WSMsgType.TEXT:
+ if msg.type is WSMsgType.TEXT:
await ws.send_str(msg.data)
- elif msg.type == aiohttp.WSMsgType.BINARY:
+ elif msg.type is WSMsgType.BINARY:
await ws.send_bytes(msg.data)
else:
break
- url = url + "/updateReports?agent=%s" % name
- async with session.ws_connect(url) as ws:
- print("finally requesting %s" % url)
+ async with session.ws_connect("/updateReports", params={"agent": name}) as ws:
+ pass
-async def run(url: str, name: str) -> None:
- try:
- await client(url, name)
- except Exception:
- import traceback
-
- traceback.print_exc()
-
-
-if __name__ == "__main__":
- asyncio.run(run("http://localhost:9001", "aiohttp"))
+if __name__ == "__main__": # pragma: no branch
+ asyncio.run(client("http://localhost:9001", "aiohttp"))
diff --git a/tests/autobahn/client/fuzzingserver.json b/tests/autobahn/client/fuzzingserver.json
index 5f8bf31d203..8530d15bd36 100644
--- a/tests/autobahn/client/fuzzingserver.json
+++ b/tests/autobahn/client/fuzzingserver.json
@@ -1,12 +1,11 @@
-
{
"url": "ws://localhost:9001",
- "options": {"failByDrop": false},
+ "options": {"failByDrop": true},
"outdir": "./reports/clients",
"webport": 8080,
"cases": ["*"],
- "exclude-cases": ["12.*", "13.*"],
+ "exclude-cases": [],
"exclude-agent-cases": {}
}
diff --git a/tests/autobahn/server/fuzzingclient.json b/tests/autobahn/server/fuzzingclient.json
index 0ed2f84acf8..bf34731a2e8 100644
--- a/tests/autobahn/server/fuzzingclient.json
+++ b/tests/autobahn/server/fuzzingclient.json
@@ -1,16 +1,16 @@
{
- "options": { "failByDrop": false },
+ "options": {"failByDrop": true},
"outdir": "./reports/servers",
"servers": [
{
"agent": "AutobahnServer",
"url": "ws://localhost:9001",
- "options": { "version": 18 }
+ "options": {"version": 18}
}
],
"cases": ["*"],
- "exclude-cases": ["12.*", "13.*"],
+ "exclude-cases": [],
"exclude-agent-cases": {}
}
diff --git a/tests/autobahn/server/server.py b/tests/autobahn/server/server.py
index 86e3b5bfe0c..f03816099ef 100644
--- a/tests/autobahn/server/server.py
+++ b/tests/autobahn/server/server.py
@@ -9,24 +9,15 @@
async def wshandler(request: web.Request) -> web.WebSocketResponse:
ws = web.WebSocketResponse(autoclose=False)
- is_ws = ws.can_prepare(request)
- if not is_ws:
- raise web.HTTPBadRequest()
-
await ws.prepare(request)
request.app[websockets].append(ws)
- while True:
- msg = await ws.receive()
-
- if msg.type == web.WSMsgType.TEXT:
+ async for msg in ws:
+ if msg.type is web.WSMsgType.TEXT:
await ws.send_str(msg.data)
elif msg.type == web.WSMsgType.BINARY:
await ws.send_bytes(msg.data)
- elif msg.type == web.WSMsgType.CLOSE:
- await ws.close()
- break
else:
break
@@ -34,22 +25,17 @@ async def wshandler(request: web.Request) -> web.WebSocketResponse:
async def on_shutdown(app: web.Application) -> None:
- ws_list = app[websockets]
- for ws in set(ws_list):
+ for ws in app[websockets]:
await ws.close(code=WSCloseCode.GOING_AWAY, message=b"Server shutdown")
-if __name__ == "__main__":
+if __name__ == "__main__": # pragma: no branch
logging.basicConfig(
level=logging.DEBUG, format="%(asctime)s %(levelname)s %(message)s"
)
app = web.Application()
- l: list[web.WebSocketResponse] = []
- app[websockets] = l
+ app[websockets] = []
app.router.add_route("GET", "/", wshandler)
app.on_shutdown.append(on_shutdown)
- try:
- web.run_app(app, port=9001)
- except KeyboardInterrupt:
- print("Server stopped at http://127.0.0.1:9001")
+ web.run_app(app, port=9001)
diff --git a/tests/autobahn/test_autobahn.py b/tests/autobahn/test_autobahn.py
index d6498242143..a72cf8dbbb0 100644
--- a/tests/autobahn/test_autobahn.py
+++ b/tests/autobahn/test_autobahn.py
@@ -1,17 +1,22 @@
import json
+import pprint
import subprocess
-import sys
-from collections.abc import Generator
+from collections.abc import Iterator
from pathlib import Path
-from typing import TYPE_CHECKING, Any
+from typing import TYPE_CHECKING
import pytest
from pytest import TempPathFactory
if TYPE_CHECKING:
- import python_on_whales
+ from python_on_whales import DockerException, docker
else:
python_on_whales = pytest.importorskip("python_on_whales")
+ DockerException = python_on_whales.DockerException
+ docker = python_on_whales.docker
+
+# (Test number, test status, test report)
+Result = tuple[str, str, dict[str, object] | None]
@pytest.fixture(scope="session")
@@ -20,16 +25,12 @@ def report_dir(tmp_path_factory: TempPathFactory) -> Path:
@pytest.fixture(scope="session", autouse=True)
-def build_autobahn_testsuite() -> Generator[None, None, None]:
-
- try:
- python_on_whales.docker.build(
- file="tests/autobahn/Dockerfile.autobahn",
- tags=["autobahn-testsuite"],
- context_path=".",
- )
- except python_on_whales.DockerException:
- pytest.skip(msg="The docker daemon is not running.")
+def build_autobahn_testsuite() -> Iterator[None]:
+ docker.build(
+ file="tests/autobahn/Dockerfile.autobahn",
+ tags=["autobahn-testsuite"],
+ context_path=".",
+ )
try:
yield
@@ -37,80 +38,117 @@ def build_autobahn_testsuite() -> Generator[None, None, None]:
python_on_whales.docker.image.remove(x="autobahn-testsuite")
-def get_failed_tests(report_path: str, name: str) -> list[dict[str, Any]]:
- path = Path(report_path)
- result_summary = json.loads((path / "index.json").read_text())[name]
- failed_messages = []
- PASS = {"OK", "INFORMATIONAL"}
- entry_fields = {"case", "description", "expectation", "expected", "received"}
- for results in result_summary.values():
- if results["behavior"] in PASS and results["behaviorClose"] in PASS:
- continue
- report = json.loads((path / results["reportfile"]).read_text())
- failed_messages.append({field: report[field] for field in entry_fields})
- return failed_messages
+def get_report(path: Path, result: dict[str, str]) -> dict[str, object] | None:
+ if result["behaviorClose"] == "OK":
+ return None
+ return json.loads((path / result["reportfile"]).read_text()) # type: ignore[no-any-return]
-@pytest.mark.skipif(sys.platform == "darwin", reason="Don't run on macOS")
-@pytest.mark.xfail
-def test_client(report_dir: Path, request: Any) -> None:
+def get_test_results(path: Path, name: str) -> tuple[Result, ...]:
+ results = json.loads((path / "index.json").read_text())[name]
+ return tuple(
+ (k, r["behaviorClose"], get_report(path, r)) for k, r in results.items()
+ )
+
+
+def process_xfail(
+ results: tuple[Result, ...], xfail: dict[str, str]
+) -> list[dict[str, object]]:
+ failed = []
+ for number, status, details in results:
+ if number in xfail:
+ assert status not in {"OK", "INFORMATIONAL"} # Strict xfail
+ assert details is not None
+ if details["result"] == xfail[number]:
+ continue
+ if status not in {"OK", "INFORMATIONAL"}: # pragma: no cover
+ assert details is not None
+ pprint.pprint(details)
+ failed.append(details)
+ return failed
+
+
+@pytest.mark.autobahn
+def test_client(report_dir: Path, request: pytest.FixtureRequest) -> None:
+ client = subprocess.Popen(
+ (
+ "wait-for-it",
+ "-s",
+ "localhost:9001",
+ "--",
+ "coverage",
+ "run",
+ "-a",
+ "tests/autobahn/client/client.py",
+ )
+ )
try:
- print("Starting autobahn-testsuite server")
- autobahn_container = python_on_whales.docker.run(
+ autobahn_container = docker.run(
detach=True,
image="autobahn-testsuite",
name="autobahn",
publish=[(9001, 9001)],
remove=True,
volumes=[
- (f"{request.fspath.dirname}/client", "/config"),
- (f"{report_dir}", "/reports"),
+ (request.path.parent / "client", "/config"),
+ (report_dir, "/reports"),
],
)
- print("Running aiohttp test client")
- client = subprocess.Popen(
- ["wait-for-it", "-s", "localhost:9001", "--"]
- + [sys.executable]
- + ["tests/autobahn/client/client.py"]
- )
client.wait()
finally:
- print("Stopping client and server")
client.terminate()
client.wait()
# https://github.com/gabrieldemarmiesse/python-on-whales/pull/580
autobahn_container.stop() # type: ignore[union-attr]
- failed_messages = get_failed_tests(f"{report_dir}/clients", "aiohttp")
-
- assert not failed_messages, "\n".join(
- "\n\t".join(
- f"{field}: {msg[field]}"
- for field in ("case", "description", "expectation", "expected", "received")
- )
- for msg in failed_messages
+ results = get_test_results(report_dir / "clients", "aiohttp")
+ xfail = {
+ "3.4": "Actual events match at least one expected.",
+ "7.9.5": "The close code should have been 1002 or empty",
+ "9.1.4": "Did not receive message within 100 seconds.",
+ "9.1.5": "Did not receive message within 100 seconds.",
+ "9.1.6": "Did not receive message within 100 seconds.",
+ "9.2.4": "Did not receive message within 10 seconds.",
+ "9.2.5": "Did not receive message within 100 seconds.",
+ "9.2.6": "Did not receive message within 100 seconds.",
+ "9.3.1": "Did not receive message within 100 seconds.",
+ "9.3.2": "Did not receive message within 100 seconds.",
+ "9.3.3": "Did not receive message within 100 seconds.",
+ "9.3.4": "Did not receive message within 100 seconds.",
+ "9.3.5": "Did not receive message within 100 seconds.",
+ "9.3.6": "Did not receive message within 100 seconds.",
+ "9.3.7": "Did not receive message within 100 seconds.",
+ "9.3.8": "Did not receive message within 100 seconds.",
+ "9.3.9": "Did not receive message within 100 seconds.",
+ "9.4.1": "Did not receive message within 100 seconds.",
+ "9.4.2": "Did not receive message within 100 seconds.",
+ "9.4.3": "Did not receive message within 100 seconds.",
+ "9.4.4": "Did not receive message within 100 seconds.",
+ "9.4.5": "Did not receive message within 100 seconds.",
+ "9.4.6": "Did not receive message within 100 seconds.",
+ "9.4.7": "Did not receive message within 100 seconds.",
+ "9.4.8": "Did not receive message within 100 seconds.",
+ "9.4.9": "Did not receive message within 100 seconds.",
+ }
+ assert not process_xfail(results, xfail)
+
+
+@pytest.mark.autobahn
+def test_server(report_dir: Path, request: pytest.FixtureRequest) -> None:
+ server = subprocess.Popen(
+ ("coverage", "run", "-a", "tests/autobahn/server/server.py")
)
-
-
-@pytest.mark.skipif(sys.platform == "darwin", reason="Don't run on macOS")
-@pytest.mark.xfail
-def test_server(report_dir: Path, request: Any) -> None:
try:
- print("Starting aiohttp test server")
- server = subprocess.Popen(
- [sys.executable] + ["tests/autobahn/server/server.py"]
- )
- print("Starting autobahn-testsuite client")
- python_on_whales.docker.run(
+ docker.run(
image="autobahn-testsuite",
name="autobahn",
remove=True,
volumes=[
- (f"{request.fspath.dirname}/server", "/config"),
- (f"{report_dir}", "/reports"),
+ (request.path.parent / "server", "/config"),
+ (report_dir, "/reports"),
],
- networks=["host"],
- command=[
+ networks=("host",),
+ command=(
"wait-for-it",
"-s",
"localhost:9001",
@@ -120,19 +158,38 @@ def test_server(report_dir: Path, request: Any) -> None:
"fuzzingclient",
"--spec",
"/config/fuzzingclient.json",
- ],
+ ),
)
finally:
- print("Stopping client and server")
server.terminate()
server.wait()
- failed_messages = get_failed_tests(f"{report_dir}/servers", "AutobahnServer")
-
- assert not failed_messages, "\n".join(
- "\n\t".join(
- f"{field}: {msg[field]}"
- for field in ("case", "description", "expectation", "expected", "received")
- )
- for msg in failed_messages
- )
+ results = get_test_results(report_dir / "servers", "AutobahnServer")
+ xfail = {
+ "7.9.5": "The close code should have been 1002 or empty",
+ "9.1.4": "Did not receive message within 100 seconds.",
+ "9.1.5": "Did not receive message within 100 seconds.",
+ "9.1.6": "Did not receive message within 100 seconds.",
+ "9.2.4": "Did not receive message within 10 seconds.",
+ "9.2.5": "Did not receive message within 100 seconds.",
+ "9.2.6": "Did not receive message within 100 seconds.",
+ "9.3.1": "Did not receive message within 100 seconds.",
+ "9.3.2": "Did not receive message within 100 seconds.",
+ "9.3.3": "Did not receive message within 100 seconds.",
+ "9.3.4": "Did not receive message within 100 seconds.",
+ "9.3.5": "Did not receive message within 100 seconds.",
+ "9.3.6": "Did not receive message within 100 seconds.",
+ "9.3.7": "Did not receive message within 100 seconds.",
+ "9.3.8": "Did not receive message within 100 seconds.",
+ "9.3.9": "Did not receive message within 100 seconds.",
+ "9.4.1": "Did not receive message within 100 seconds.",
+ "9.4.2": "Did not receive message within 100 seconds.",
+ "9.4.3": "Did not receive message within 100 seconds.",
+ "9.4.4": "Did not receive message within 100 seconds.",
+ "9.4.5": "Did not receive message within 100 seconds.",
+ "9.4.6": "Did not receive message within 100 seconds.",
+ "9.4.7": "Did not receive message within 100 seconds.",
+ "9.4.8": "Did not receive message within 100 seconds.",
+ "9.4.9": "Did not receive message within 100 seconds.",
+ }
+ assert not process_xfail(results, xfail)
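The ``process_xfail`` helper introduced in this patch implements a strict-xfail policy: every case listed in the ``xfail`` dict must actually fail with the recorded reason, and any other failing case is collected and reported. A self-contained sketch of that filtering logic (simplified types — the real helper compares ``details["result"]`` from the Autobahn JSON reports):

```python
# Each result is (case number, close status, failure reason or None),
# a simplified stand-in for the tuples built from the Autobahn reports.
Result = tuple[str, str, "str | None"]

PASS = {"OK", "INFORMATIONAL"}


def process_xfail(results: "tuple[Result, ...]", xfail: "dict[str, str]") -> "list[str]":
    """Return failures that are neither passing nor expected (strict xfail)."""
    failed = []
    for number, status, reason in results:
        if number in xfail:
            # Strict xfail: the case must still fail, with the recorded reason.
            assert status not in PASS and reason == xfail[number]
            continue
        if status not in PASS:
            failed.append(number)
    return failed


results = (
    ("1.1.1", "OK", None),
    ("7.9.5", "FAILED", "The close code should have been 1002 or empty"),
    ("9.1.4", "FAILED", "Did not receive message within 100 seconds."),
)
xfail = {
    "7.9.5": "The close code should have been 1002 or empty",
    "9.1.4": "Did not receive message within 100 seconds.",
}
print(process_xfail(results, xfail))  # → []
```

Making the xfails strict means a case that starts passing (or failing for a new reason) breaks the build, so the exclusion list cannot silently rot the way the old ``exclude-cases`` entries in the fuzzing configs could.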
From a9ca19bc9624dee329ccc2574faaecfcc769349d Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 5 Mar 2026 10:52:39 +0000
Subject: [PATCH 116/141] Bump pypa/cibuildwheel from 3.3.1 to 3.4.0 (#12200)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [pypa/cibuildwheel](https://github.com/pypa/cibuildwheel) from
3.3.1 to 3.4.0.
Release notes
Sourced from pypa/cibuildwheel's
releases.
v3.4.0
- 🌟 You can now build wheels using `uv` as a build frontend. This should
improve performance, especially if your project has lots of build
dependencies. To use, set `build-frontend` to `uv`. (#2322)
- ⚠️ We no longer support running on Travis CI. It may continue working
but we don't run tests there anymore so we can't be sure. (#2682)
- ✨ Improvements to building Rust wheels on Android (#2650)
- 🐛 Fix bug with the GitHub Action on Windows, where PATH was getting
unnecessarily changed, causing issues with Meson builds. (#2723)
- ✨ Add support for the `quiet` setting on `build` and `uv` from the
cibuildwheel `build-verbosity` setting. (#2737)
- 📚 Docs updates, including guidance on using Meson on Windows (#2718)
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 0dc5293b834..0cc547d887e 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -552,7 +552,7 @@ jobs:
run: |
make cythonize
- name: Build wheels
- uses: pypa/cibuildwheel@v3.3.1
+ uses: pypa/cibuildwheel@v3.4.0
env:
CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
CIBW_ARCHS_MACOS: x86_64 arm64 universal2
From 2655be941be0c0c679e15b2374b4d4695460529a Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sat, 7 Mar 2026 18:13:48 +0000
Subject: [PATCH 117/141] Restrict reason (#12209) (#12211)
(cherry picked from commit 18510482de080bf741c560bf2a190fdc54b4eed4)
---------
Co-authored-by: Dhiral Vyas
---
aiohttp/_http_writer.pyx | 2 +-
aiohttp/http_writer.py | 1 +
aiohttp/web_exceptions.py | 2 ++
aiohttp/web_response.py | 4 ++--
tests/test_web_exceptions.py | 12 +++++++++++-
tests/test_web_response.py | 28 +++++++++++++++++++++++++---
6 files changed, 42 insertions(+), 7 deletions(-)
diff --git a/aiohttp/_http_writer.pyx b/aiohttp/_http_writer.pyx
index 7989c186c89..f168b6c2f7a 100644
--- a/aiohttp/_http_writer.pyx
+++ b/aiohttp/_http_writer.pyx
@@ -131,7 +131,7 @@ def _serialize_headers(str status_line, headers):
_init_writer(&writer, buf)
try:
- if _write_str(&writer, status_line) < 0:
+ if _write_str_raise_on_nlcr(&writer, status_line) < 0:
raise
if _write_byte(&writer, b'\r') < 0:
raise
diff --git a/aiohttp/http_writer.py b/aiohttp/http_writer.py
index 8393d4a7c5e..d388b0d36b5 100644
--- a/aiohttp/http_writer.py
+++ b/aiohttp/http_writer.py
@@ -361,6 +361,7 @@ def _safe_header(string: str) -> str:
def _py_serialize_headers(status_line: str, headers: "CIMultiDict[str]") -> bytes:
+ _safe_header(status_line)
headers_gen = (_safe_header(k) + ": " + _safe_header(v) for k, v in headers.items())
line = status_line + "\r\n" + "\r\n".join(headers_gen) + "\r\n\r\n"
return line.encode("utf-8")
diff --git a/aiohttp/web_exceptions.py b/aiohttp/web_exceptions.py
index 8b914257a26..5f065a86add 100644
--- a/aiohttp/web_exceptions.py
+++ b/aiohttp/web_exceptions.py
@@ -101,6 +101,8 @@ def __init__(
"body argument is deprecated for http web exceptions",
DeprecationWarning,
)
+ if reason is not None and ("\r" in reason or "\n" in reason):
+ raise ValueError("Reason cannot contain \\r or \\n")
Response.__init__(
self,
status=self.status_code,
diff --git a/aiohttp/web_response.py b/aiohttp/web_response.py
index 9706cade9b5..91abbb4b61c 100644
--- a/aiohttp/web_response.py
+++ b/aiohttp/web_response.py
@@ -161,8 +161,8 @@ def _set_status(self, status: int, reason: str | None) -> None:
self._status = int(status)
if reason is None:
reason = REASON_PHRASES.get(self._status, "")
- elif "\n" in reason:
- raise ValueError("Reason cannot contain \\n")
+ elif "\r" in reason or "\n" in reason:
+ raise ValueError("Reason cannot contain \\r or \\n")
self._reason = reason
@property
diff --git a/tests/test_web_exceptions.py b/tests/test_web_exceptions.py
index 40c6bd4e791..81ef8b565d1 100644
--- a/tests/test_web_exceptions.py
+++ b/tests/test_web_exceptions.py
@@ -277,5 +277,15 @@ def test_unicode_text_body_unauthorized() -> None:
def test_multiline_reason() -> None:
- with pytest.raises(ValueError, match=r"Reason cannot contain \\n"):
+ with pytest.raises(ValueError, match=r"Reason cannot contain"):
web.HTTPOk(reason="Bad\r\nInjected-header: foo")
+
+
+def test_reason_with_cr() -> None:
+ with pytest.raises(ValueError, match=r"Reason cannot contain"):
+ web.HTTPOk(reason="OK\rSet-Cookie: evil=1")
+
+
+def test_reason_with_lf() -> None:
+ with pytest.raises(ValueError, match=r"Reason cannot contain"):
+ web.HTTPOk(reason="OK\nSet-Cookie: evil=1")
diff --git a/tests/test_web_response.py b/tests/test_web_response.py
index a6467367d61..799d34ce063 100644
--- a/tests/test_web_response.py
+++ b/tests/test_web_response.py
@@ -14,6 +14,7 @@
from re_assert import Matches
from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs, web
+from aiohttp.abc import AbstractStreamWriter
from aiohttp.helpers import ETag
from aiohttp.http_writer import StreamWriter, _serialize_headers
from aiohttp.multipart import BodyPartReader, MultipartWriter
@@ -1098,6 +1099,27 @@ def test_set_status_with_empty_reason() -> None:
assert resp.reason == ""
+def test_set_status_reason_with_cr() -> None:
+ resp = web.StreamResponse()
+
+ with pytest.raises(ValueError, match="Reason cannot contain"):
+ resp.set_status(200, "OK\rSet-Cookie: evil=1")
+
+
+def test_set_status_reason_with_lf() -> None:
+ resp = web.StreamResponse()
+
+ with pytest.raises(ValueError, match="Reason cannot contain"):
+ resp.set_status(200, "OK\nSet-Cookie: evil=1")
+
+
+def test_set_status_reason_with_crlf() -> None:
+ resp = web.StreamResponse()
+
+ with pytest.raises(ValueError, match="Reason cannot contain"):
+ resp.set_status(200, "OK\r\nSet-Cookie: evil=1")
+
+
async def test_start_force_close() -> None:
req = make_request("GET", "/")
resp = StreamResponse()
@@ -1398,9 +1420,9 @@ async def test_render_with_body(buf, writer) -> None:
)
-async def test_multiline_reason(buf, writer) -> None:
- with pytest.raises(ValueError, match=r"Reason cannot contain \\n"):
- Response(reason="Bad\r\nInjected-header: foo")
+async def test_multiline_reason(buf: bytearray, writer: AbstractStreamWriter) -> None:
+ with pytest.raises(ValueError, match=r"Reason cannot contain \\r or \\n"):
+ web.Response(reason="Bad\r\nInjected-header: foo")
async def test_send_set_cookie_header(buf, writer) -> None:
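The patch above tightens the reason-phrase check from rejecting only `\n` to rejecting both `\r` and `\n`. The motivation is response splitting: the reason phrase is serialized directly into the status line, so an unvalidated CR/LF lets a caller smuggle extra headers. A standalone sketch (illustrative helper names, not aiohttp's actual code) of the problem and the check:

```python
# Sketch of why the reason phrase must reject CR and LF. The validate_reason
# helper mirrors the rule this patch adds; it is an illustration only.

def validate_reason(reason: str) -> str:
    # Reject both characters outright, matching the patched _set_status.
    if "\r" in reason or "\n" in reason:
        raise ValueError("Reason cannot contain \\r or \\n")
    return reason

# Without validation, serializing the status line injects a header:
evil = "OK\r\nSet-Cookie: evil=1"
status_line = "HTTP/1.1 200 " + evil + "\r\n"
# The wire bytes now carry a Set-Cookie header the application never set.
assert b"Set-Cookie: evil=1" in status_line.encode()

# With validation, the same input is rejected before serialization:
try:
    validate_reason(evil)
except ValueError as exc:
    print(exc)  # Reason cannot contain \r or \n
```

Note that a bare `\r` (without `\n`) is enough for some lenient parsers to treat the rest as a new header, which is why the patch rejects each character independently rather than only the CRLF pair.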
From 764a6c83b6763b64b6f8820201b334922646170d Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sat, 7 Mar 2026 18:43:00 +0000
Subject: [PATCH 118/141] Reject null bytes in headers (#12210) (#12213)
(cherry picked from commit bad4131d31a6d4ce71c3a7cc83b9fc01a70dfe55)
Co-authored-by: vmfunc
---
aiohttp/_http_parser.pyx | 7 +++++++
aiohttp/_http_writer.pyx | 4 ++--
aiohttp/http_writer.py | 4 ++--
tests/test_http_parser.py | 9 ++++++++-
tests/test_http_writer.py | 20 ++++++++++++++++----
5 files changed, 35 insertions(+), 9 deletions(-)
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index cc8b13cfbe8..496498d6f58 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -384,6 +384,13 @@ cdef class HttpParser:
name = find_header(self._raw_name)
value = self._raw_value.decode('utf-8', 'surrogateescape')
+ # reject null bytes in header values - matches the Python parser
+ # check at http_parser.py. llhttp in lenient mode doesn't reject
+ # these itself, so we need to catch them here.
+ # ref: RFC 9110 section 5.5 (CTL chars forbidden in field values)
+ if "\x00" in value:
+ raise InvalidHeader(self._raw_value)
+
self._headers.append((name, value))
if len(self._headers) > self._max_headers:
raise BadHttpMessage("Too many headers received")
diff --git a/aiohttp/_http_writer.pyx b/aiohttp/_http_writer.pyx
index f168b6c2f7a..ee8dcd7d2e9 100644
--- a/aiohttp/_http_writer.pyx
+++ b/aiohttp/_http_writer.pyx
@@ -111,9 +111,9 @@ cdef inline int _write_str_raise_on_nlcr(Writer* writer, object s):
out_str = str(s)
for ch in out_str:
- if ch == 0x0D or ch == 0x0A:
+ if ch in {0x0D, 0x0A, 0x00}:
raise ValueError(
- "Newline or carriage return detected in headers. "
+ "Newline, carriage return, or null byte detected in headers. "
"Potential header injection attack."
)
if _write_utf8(writer, ch) < 0:
diff --git a/aiohttp/http_writer.py b/aiohttp/http_writer.py
index d388b0d36b5..574bc9c263f 100644
--- a/aiohttp/http_writer.py
+++ b/aiohttp/http_writer.py
@@ -352,9 +352,9 @@ async def drain(self) -> None:
def _safe_header(string: str) -> str:
- if "\r" in string or "\n" in string:
+ if "\r" in string or "\n" in string or "\x00" in string:
raise ValueError(
- "Newline or carriage return detected in headers. "
+ "Newline, carriage return, or null byte detected in headers. "
"Potential header injection attack."
)
return string
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 8ab991b6ee2..02606db29ef 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -1197,7 +1197,14 @@ def test_http_response_parser_strict_headers(response) -> None:
response.feed_data(b"HTTP/1.1 200 test\r\nFoo: abc\x01def\r\n\r\n")
-def test_http_response_parser_bad_crlf(response) -> None:
+def test_http_response_parser_null_byte_in_header_value(
+ response: HttpResponseParser,
+) -> None:
+ with pytest.raises(http_exceptions.InvalidHeader):
+ response.feed_data(b"HTTP/1.1 200 OK\r\nFoo: abc\x00def\r\n\r\n")
+
+
+def test_http_response_parser_bad_crlf(response: HttpResponseParser) -> None:
"""Still a lot of dodgy servers sending bad requests like this."""
messages, upgrade, tail = response.feed_data(
b"HTTP/1.0 200 OK\nFoo: abc\nBar: def\n\nBODY\n"
diff --git a/tests/test_http_writer.py b/tests/test_http_writer.py
index c01efe2f909..0032f040cf5 100644
--- a/tests/test_http_writer.py
+++ b/tests/test_http_writer.py
@@ -996,10 +996,22 @@ def test_serialize_headers_raises_on_new_line_or_carriage_return(char: str) -> N
with pytest.raises(
ValueError,
- match=(
- "Newline or carriage return detected in headers. "
- "Potential header injection attack."
- ),
+ match="detected in headers",
+ ):
+ _serialize_headers(status_line, headers)
+
+
+def test_serialize_headers_raises_on_null_byte() -> None:
+ status_line = "HTTP/1.1 200 OK"
+ headers = CIMultiDict(
+ {
+ hdrs.CONTENT_TYPE: "text/plain\x00",
+ }
+ )
+
+ with pytest.raises(
+ ValueError,
+ match="null byte detected in headers",
):
_serialize_headers(status_line, headers)
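This patch extends the header-safety check from CR/LF to also cover the NUL byte, which RFC 9110 likewise forbids in field values. A self-contained sketch mirroring the strengthened `_safe_header` logic (illustration only; the real check lives in `aiohttp/http_writer.py`):

```python
# Hedged sketch of the strengthened header check: besides CR and LF,
# a NUL byte in a header name or value is now rejected as well.

def safe_header(value: str) -> str:
    if "\r" in value or "\n" in value or "\x00" in value:
        raise ValueError(
            "Newline, carriage return, or null byte detected in headers. "
            "Potential header injection attack."
        )
    return value

print(safe_header("text/plain"))  # valid values pass through unchanged
for bad in ("a\rb", "a\nb", "a\x00b"):
    try:
        safe_header(bad)
    except ValueError:
        print("rejected", repr(bad))
```

On the parsing side the same idea applies in reverse: the patch's comment notes that llhttp in lenient mode does not reject NUL bytes itself, so the Cython parser adds an explicit `"\x00" in value` check after decoding each header value.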
From db560cfeab32aa35f3d06f7a24238bedc01ffe2b Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sat, 7 Mar 2026 18:53:18 +0000
Subject: [PATCH 119/141] Reject null bytes in headers (#12210) (#12214)
(cherry picked from commit bad4131d31a6d4ce71c3a7cc83b9fc01a70dfe55)
Co-authored-by: vmfunc
---
aiohttp/_http_parser.pyx | 7 +++++++
aiohttp/_http_writer.pyx | 4 ++--
aiohttp/http_writer.py | 4 ++--
tests/test_http_parser.py | 9 ++++++++-
tests/test_http_writer.py | 20 ++++++++++++++++----
5 files changed, 35 insertions(+), 9 deletions(-)
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index cc8b13cfbe8..496498d6f58 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -384,6 +384,13 @@ cdef class HttpParser:
name = find_header(self._raw_name)
value = self._raw_value.decode('utf-8', 'surrogateescape')
+ # reject null bytes in header values - matches the Python parser
+ # check at http_parser.py. llhttp in lenient mode doesn't reject
+ # these itself, so we need to catch them here.
+ # ref: RFC 9110 section 5.5 (CTL chars forbidden in field values)
+ if "\x00" in value:
+ raise InvalidHeader(self._raw_value)
+
self._headers.append((name, value))
if len(self._headers) > self._max_headers:
raise BadHttpMessage("Too many headers received")
diff --git a/aiohttp/_http_writer.pyx b/aiohttp/_http_writer.pyx
index 7989c186c89..209e41c5570 100644
--- a/aiohttp/_http_writer.pyx
+++ b/aiohttp/_http_writer.pyx
@@ -111,9 +111,9 @@ cdef inline int _write_str_raise_on_nlcr(Writer* writer, object s):
out_str = str(s)
for ch in out_str:
- if ch == 0x0D or ch == 0x0A:
+ if ch in {0x0D, 0x0A, 0x00}:
raise ValueError(
- "Newline or carriage return detected in headers. "
+ "Newline, carriage return, or null byte detected in headers. "
"Potential header injection attack."
)
if _write_utf8(writer, ch) < 0:
diff --git a/aiohttp/http_writer.py b/aiohttp/http_writer.py
index a140b218b25..3352e02edf0 100644
--- a/aiohttp/http_writer.py
+++ b/aiohttp/http_writer.py
@@ -352,9 +352,9 @@ async def drain(self) -> None:
def _safe_header(string: str) -> str:
- if "\r" in string or "\n" in string:
+ if "\r" in string or "\n" in string or "\x00" in string:
raise ValueError(
- "Newline or carriage return detected in headers. "
+ "Newline, carriage return, or null byte detected in headers. "
"Potential header injection attack."
)
return string
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index a92a370e0ba..621fc0f24f2 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -1197,7 +1197,14 @@ def test_http_response_parser_strict_headers(response) -> None:
response.feed_data(b"HTTP/1.1 200 test\r\nFoo: abc\x01def\r\n\r\n")
-def test_http_response_parser_bad_crlf(response) -> None:
+def test_http_response_parser_null_byte_in_header_value(
+ response: HttpResponseParser,
+) -> None:
+ with pytest.raises(http_exceptions.InvalidHeader):
+ response.feed_data(b"HTTP/1.1 200 OK\r\nFoo: abc\x00def\r\n\r\n")
+
+
+def test_http_response_parser_bad_crlf(response: HttpResponseParser) -> None:
"""Still a lot of dodgy servers sending bad requests like this."""
messages, upgrade, tail = response.feed_data(
b"HTTP/1.0 200 OK\nFoo: abc\nBar: def\n\nBODY\n"
diff --git a/tests/test_http_writer.py b/tests/test_http_writer.py
index ffd20a0d677..923777291db 100644
--- a/tests/test_http_writer.py
+++ b/tests/test_http_writer.py
@@ -996,10 +996,22 @@ def test_serialize_headers_raises_on_new_line_or_carriage_return(char: str) -> N
with pytest.raises(
ValueError,
- match=(
- "Newline or carriage return detected in headers. "
- "Potential header injection attack."
- ),
+ match="detected in headers",
+ ):
+ _serialize_headers(status_line, headers)
+
+
+def test_serialize_headers_raises_on_null_byte() -> None:
+ status_line = "HTTP/1.1 200 OK"
+ headers = CIMultiDict(
+ {
+ hdrs.CONTENT_TYPE: "text/plain\x00",
+ }
+ )
+
+ with pytest.raises(
+ ValueError,
+ match="null byte detected in headers",
):
_serialize_headers(status_line, headers)
From 53b35a2f8869c37a133e60bf1a82a1c01642ba2b Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sat, 7 Mar 2026 19:00:02 +0000
Subject: [PATCH 120/141] Restrict reason (#12209) (#12212)
(cherry picked from commit 18510482de080bf741c560bf2a190fdc54b4eed4)
---------
Co-authored-by: Dhiral Vyas
---
aiohttp/_http_writer.pyx | 2 +-
aiohttp/http_writer.py | 1 +
aiohttp/web_exceptions.py | 2 ++
aiohttp/web_response.py | 4 ++--
tests/test_web_exceptions.py | 12 +++++++++++-
tests/test_web_response.py | 30 ++++++++++++++++++++++++++----
6 files changed, 43 insertions(+), 8 deletions(-)
diff --git a/aiohttp/_http_writer.pyx b/aiohttp/_http_writer.pyx
index 209e41c5570..ee8dcd7d2e9 100644
--- a/aiohttp/_http_writer.pyx
+++ b/aiohttp/_http_writer.pyx
@@ -131,7 +131,7 @@ def _serialize_headers(str status_line, headers):
_init_writer(&writer, buf)
try:
- if _write_str(&writer, status_line) < 0:
+ if _write_str_raise_on_nlcr(&writer, status_line) < 0:
raise
if _write_byte(&writer, b'\r') < 0:
raise
diff --git a/aiohttp/http_writer.py b/aiohttp/http_writer.py
index 3352e02edf0..8c1c910d741 100644
--- a/aiohttp/http_writer.py
+++ b/aiohttp/http_writer.py
@@ -361,6 +361,7 @@ def _safe_header(string: str) -> str:
def _py_serialize_headers(status_line: str, headers: "CIMultiDict[str]") -> bytes:
+ _safe_header(status_line)
headers_gen = (_safe_header(k) + ": " + _safe_header(v) for k, v in headers.items())
line = status_line + "\r\n" + "\r\n".join(headers_gen) + "\r\n\r\n"
return line.encode("utf-8")
diff --git a/aiohttp/web_exceptions.py b/aiohttp/web_exceptions.py
index ee2c1e72d40..30792c28100 100644
--- a/aiohttp/web_exceptions.py
+++ b/aiohttp/web_exceptions.py
@@ -101,6 +101,8 @@ def __init__(
"body argument is deprecated for http web exceptions",
DeprecationWarning,
)
+ if reason is not None and ("\r" in reason or "\n" in reason):
+ raise ValueError("Reason cannot contain \\r or \\n")
Response.__init__(
self,
status=self.status_code,
diff --git a/aiohttp/web_response.py b/aiohttp/web_response.py
index e5f8b6cd652..364270e4dca 100644
--- a/aiohttp/web_response.py
+++ b/aiohttp/web_response.py
@@ -158,8 +158,8 @@ def _set_status(self, status: int, reason: Optional[str]) -> None:
self._status = int(status)
if reason is None:
reason = REASON_PHRASES.get(self._status, "")
- elif "\n" in reason:
- raise ValueError("Reason cannot contain \\n")
+ elif "\r" in reason or "\n" in reason:
+ raise ValueError("Reason cannot contain \\r or \\n")
self._reason = reason
@property
diff --git a/tests/test_web_exceptions.py b/tests/test_web_exceptions.py
index 3358a947d3d..5a98d0f9423 100644
--- a/tests/test_web_exceptions.py
+++ b/tests/test_web_exceptions.py
@@ -273,5 +273,15 @@ def test_unicode_text_body_unauthorized() -> None:
def test_multiline_reason() -> None:
- with pytest.raises(ValueError, match=r"Reason cannot contain \\n"):
+ with pytest.raises(ValueError, match=r"Reason cannot contain"):
web.HTTPOk(reason="Bad\r\nInjected-header: foo")
+
+
+def test_reason_with_cr() -> None:
+ with pytest.raises(ValueError, match=r"Reason cannot contain"):
+ web.HTTPOk(reason="OK\rSet-Cookie: evil=1")
+
+
+def test_reason_with_lf() -> None:
+ with pytest.raises(ValueError, match=r"Reason cannot contain"):
+ web.HTTPOk(reason="OK\nSet-Cookie: evil=1")
diff --git a/tests/test_web_response.py b/tests/test_web_response.py
index 0525c1584f2..5a4fb7e66d0 100644
--- a/tests/test_web_response.py
+++ b/tests/test_web_response.py
@@ -13,7 +13,8 @@
from multidict import CIMultiDict, CIMultiDictProxy, MultiDict
from re_assert import Matches
-from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs
+from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs, web
+from aiohttp.abc import AbstractStreamWriter
from aiohttp.helpers import ETag
from aiohttp.http_writer import StreamWriter, _serialize_headers
from aiohttp.multipart import BodyPartReader, MultipartWriter
@@ -1008,6 +1009,27 @@ def test_set_status_with_empty_reason() -> None:
assert resp.reason == ""
+def test_set_status_reason_with_cr() -> None:
+ resp = web.StreamResponse()
+
+ with pytest.raises(ValueError, match="Reason cannot contain"):
+ resp.set_status(200, "OK\rSet-Cookie: evil=1")
+
+
+def test_set_status_reason_with_lf() -> None:
+ resp = web.StreamResponse()
+
+ with pytest.raises(ValueError, match="Reason cannot contain"):
+ resp.set_status(200, "OK\nSet-Cookie: evil=1")
+
+
+def test_set_status_reason_with_crlf() -> None:
+ resp = web.StreamResponse()
+
+ with pytest.raises(ValueError, match="Reason cannot contain"):
+ resp.set_status(200, "OK\r\nSet-Cookie: evil=1")
+
+
async def test_start_force_close() -> None:
req = make_request("GET", "/")
resp = StreamResponse()
@@ -1308,9 +1330,9 @@ async def test_render_with_body(buf, writer) -> None:
)
-async def test_multiline_reason(buf, writer) -> None:
- with pytest.raises(ValueError, match=r"Reason cannot contain \\n"):
- Response(reason="Bad\r\nInjected-header: foo")
+async def test_multiline_reason(buf: bytearray, writer: AbstractStreamWriter) -> None:
+ with pytest.raises(ValueError, match=r"Reason cannot contain \\r or \\n"):
+ web.Response(reason="Bad\r\nInjected-header: foo")
async def test_send_set_cookie_header(buf, writer) -> None:
From 03d5a8a32e046cc636ae423f3499ab0917d35cd2 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sat, 7 Mar 2026 19:38:44 +0000
Subject: [PATCH 121/141] [PR #12201/61a661d9 backport][3.14] Drop version from
autobahn config (#12215)
**This is a backport of PR #12201 as merged into master
(61a661d965c4611ac4ada4602541843df0874349).**
Co-authored-by: Sam Bull
---
tests/autobahn/server/fuzzingclient.json | 6 +-----
1 file changed, 1 insertion(+), 5 deletions(-)
diff --git a/tests/autobahn/server/fuzzingclient.json b/tests/autobahn/server/fuzzingclient.json
index bf34731a2e8..0842cfcb887 100644
--- a/tests/autobahn/server/fuzzingclient.json
+++ b/tests/autobahn/server/fuzzingclient.json
@@ -3,11 +3,7 @@
"outdir": "./reports/servers",
"servers": [
- {
- "agent": "AutobahnServer",
- "url": "ws://localhost:9001",
- "options": {"version": 18}
- }
+ {"agent": "AutobahnServer", "url": "ws://localhost:9001"}
],
"cases": ["*"],
From 3007c37005aed0c7778860b065e8172ea939c4c9 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Tue, 10 Mar 2026 20:09:34 +0000
Subject: [PATCH 122/141] Restrict multipart header sizes (#12208) (#12227)
(cherry picked from commit 5fe9dfb64400a574f5ba3f2d3b49b7db47567c29)
---
aiohttp/multipart.py | 25 +++++++++++++++---
aiohttp/streams.py | 16 +++++++-----
aiohttp/test_utils.py | 3 +++
aiohttp/web_protocol.py | 7 ++++++
aiohttp/web_request.py | 7 +++++-
tests/test_multipart.py | 4 +--
tests/test_streams.py | 9 ++++---
tests/test_web_request.py | 53 ++++++++++++++++++++++++++++++++++++++-
8 files changed, 106 insertions(+), 18 deletions(-)
diff --git a/aiohttp/multipart.py b/aiohttp/multipart.py
index bcc01b352ef..5a4f5742fcb 100644
--- a/aiohttp/multipart.py
+++ b/aiohttp/multipart.py
@@ -28,6 +28,7 @@
)
from .helpers import CHAR, TOKEN, parse_mimetype, reify
from .http import HeadersParser
+from .http_exceptions import BadHttpMessage
from .log import internal_logger
from .payload import (
JsonPayload,
@@ -645,7 +646,14 @@ class MultipartReader:
#: Body part reader class for non multipart/* content types.
part_reader_cls = BodyPartReader
- def __init__(self, headers: Mapping[str, str], content: StreamReader) -> None:
+ def __init__(
+ self,
+ headers: Mapping[str, str],
+ content: StreamReader,
+ *,
+ max_field_size: int = 8190,
+ max_headers: int = 128,
+ ) -> None:
self._mimetype = parse_mimetype(headers[CONTENT_TYPE])
assert self._mimetype.type == "multipart", "multipart/* content type expected"
if "boundary" not in self._mimetype.parameters:
@@ -658,6 +666,8 @@ def __init__(self, headers: Mapping[str, str], content: StreamReader) -> None:
self._content = content
self._default_charset: str | None = None
self._last_part: MultipartReader | BodyPartReader | None = None
+ self._max_field_size = max_field_size
+ self._max_headers = max_headers
self._at_eof = False
self._at_bof = True
self._unread: list[bytes] = []
@@ -757,7 +767,12 @@ def _get_part_reader(
if mimetype.type == "multipart":
if self.multipart_reader_cls is None:
return type(self)(headers, self._content)
- return self.multipart_reader_cls(headers, self._content)
+ return self.multipart_reader_cls(
+ headers,
+ self._content,
+ max_field_size=self._max_field_size,
+ max_headers=self._max_headers,
+ )
else:
return self.part_reader_cls(
self._boundary,
@@ -819,12 +834,14 @@ async def _read_boundary(self) -> None:
async def _read_headers(self) -> "CIMultiDictProxy[str]":
lines = []
while True:
- chunk = await self._content.readline()
+ chunk = await self._content.readline(max_line_length=self._max_field_size)
chunk = chunk.rstrip(b"\r\n")
lines.append(chunk)
if not chunk:
break
- parser = HeadersParser()
+ if len(lines) > self._max_headers:
+ raise BadHttpMessage("Too many headers received")
+ parser = HeadersParser(max_field_size=self._max_field_size)
headers, raw_headers = parser.parse_headers(lines)
return headers
diff --git a/aiohttp/streams.py b/aiohttp/streams.py
index 3a424c28558..33cd2113d84 100644
--- a/aiohttp/streams.py
+++ b/aiohttp/streams.py
@@ -12,6 +12,7 @@
set_exception,
set_result,
)
+from .http_exceptions import LineTooLong
from .log import internal_logger
__all__ = (
@@ -363,10 +364,12 @@ async def _wait(self, func_name: str) -> None:
finally:
self._waiter = None
- async def readline(self) -> bytes:
- return await self.readuntil()
+ async def readline(self, *, max_line_length: int | None = None) -> bytes:
+ return await self.readuntil(max_size=max_line_length)
- async def readuntil(self, separator: bytes = b"\n") -> bytes:
+ async def readuntil(
+ self, separator: bytes = b"\n", *, max_size: int | None = None
+ ) -> bytes:
seplen = len(separator)
if seplen == 0:
raise ValueError("Separator should be at least one-byte string")
@@ -377,6 +380,7 @@ async def readuntil(self, separator: bytes = b"\n") -> bytes:
chunk = b""
chunk_size = 0
not_enough = True
+ max_size = max_size or self._high_water
while not_enough:
while self._buffer and not_enough:
@@ -391,8 +395,8 @@ async def readuntil(self, separator: bytes = b"\n") -> bytes:
if ichar:
not_enough = False
- if chunk_size > self._high_water:
- raise ValueError("Chunk too big")
+ if chunk_size > max_size:
+ raise LineTooLong(chunk[:100] + b"...", max_size)
if self._eof:
break
@@ -613,7 +617,7 @@ async def wait_eof(self) -> None:
def feed_data(self, data: bytes, n: int = 0) -> None:
pass
- async def readline(self) -> bytes:
+ async def readline(self, *, max_line_length: int | None = None) -> bytes:
return b""
async def read(self, n: int = -1) -> bytes:
diff --git a/aiohttp/test_utils.py b/aiohttp/test_utils.py
index 64ad80d2b3d..ccc05e1ef5d 100644
--- a/aiohttp/test_utils.py
+++ b/aiohttp/test_utils.py
@@ -755,6 +755,9 @@ def make_mocked_request(
if protocol is sentinel:
protocol = mock.Mock()
+ protocol.max_field_size = 8190
+ protocol.max_line_length = 8190
+ protocol.max_headers = 128
protocol.transport = transport
type(protocol).peername = mock.PropertyMock(
return_value=transport.get_extra_info("peername")
diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py
index eb7aea87a2c..68fb266c786 100644
--- a/aiohttp/web_protocol.py
+++ b/aiohttp/web_protocol.py
@@ -131,6 +131,9 @@ class RequestHandler(BaseProtocol):
"""
__slots__ = (
+ "max_field_size",
+ "max_headers",
+ "max_line_size",
"_request_count",
"_keepalive",
"_manager",
@@ -195,6 +198,10 @@ def __init__(
self._request_handler: _RequestHandler | None = manager.request_handler
self._request_factory: _RequestFactory | None = manager.request_factory
+ self.max_line_size = max_line_size
+ self.max_headers = max_headers
+ self.max_field_size = max_field_size
+
self._tcp_keepalive = tcp_keepalive
# placeholder to be replaced on keepalive timeout setup
self._next_keepalive_close_time = 0.0
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index 38eb0de2905..8e540f64774 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -705,7 +705,12 @@ async def json(self, *, loads: JSONDecoder = DEFAULT_JSON_DECODER) -> Any:
async def multipart(self) -> MultipartReader:
"""Return async iterator to process BODY as multipart."""
- return MultipartReader(self._headers, self._payload)
+ return MultipartReader(
+ self._headers,
+ self._payload,
+ max_field_size=self._protocol.max_field_size,
+ max_headers=self._protocol.max_headers,
+ )
async def post(self) -> "MultiDictProxy[str | bytes | FileField]":
"""Return POST parameters."""
diff --git a/tests/test_multipart.py b/tests/test_multipart.py
index 94a3986acc4..34e745830a9 100644
--- a/tests/test_multipart.py
+++ b/tests/test_multipart.py
@@ -84,7 +84,7 @@ async def read(self, size=None):
def at_eof(self):
return self.content.tell() == len(self.content.getbuffer())
- async def readline(self):
+ async def readline(self, *, max_line_length: int | None = None) -> bytes:
return self.content.readline()
def unread_data(self, data):
@@ -854,7 +854,7 @@ async def read(self, size=None) -> bytes:
def at_eof(self) -> bool:
return not self.content
- async def readline(self) -> bytes:
+ async def readline(self, *, max_line_length: int | None = None) -> bytes:
line = b""
while self.content and b"\n" not in line:
line += self.content.pop(0)
diff --git a/tests/test_streams.py b/tests/test_streams.py
index c5bc6716b7f..93686746ee0 100644
--- a/tests/test_streams.py
+++ b/tests/test_streams.py
@@ -12,6 +12,7 @@
from re_assert import Matches
from aiohttp import streams
+from aiohttp.http_exceptions import LineTooLong
DATA = b"line1\nline2\nline3\n"
@@ -325,7 +326,7 @@ async def test_readline_limit_with_existing_data(self) -> None:
stream.feed_data(b"li")
stream.feed_data(b"ne1\nline2\n")
- with pytest.raises(ValueError):
+ with pytest.raises(LineTooLong):
await stream.readline()
# The buffer should contain the remaining data after exception
stream.feed_eof()
@@ -346,7 +347,7 @@ def cb():
loop.call_soon(cb)
- with pytest.raises(ValueError):
+ with pytest.raises(LineTooLong):
await stream.readline()
data = await stream.read()
assert b"chunk3\n" == data
@@ -436,7 +437,7 @@ async def test_readuntil_limit_with_existing_data(self, separator: bytes) -> Non
stream.feed_data(b"li")
stream.feed_data(b"ne1" + separator + b"line2" + separator)
- with pytest.raises(ValueError):
+ with pytest.raises(LineTooLong):
await stream.readuntil(separator)
# The buffer should contain the remaining data after exception
stream.feed_eof()
@@ -458,7 +459,7 @@ def cb():
loop.call_soon(cb)
- with pytest.raises(ValueError, match="Chunk too big"):
+ with pytest.raises(LineTooLong):
await stream.readuntil(separator)
data = await stream.read()
assert b"chunk3#" == data
diff --git a/tests/test_web_request.py b/tests/test_web_request.py
index 3a6e17a0a74..027e68ca746 100644
--- a/tests/test_web_request.py
+++ b/tests/test_web_request.py
@@ -14,6 +14,7 @@
from aiohttp import HttpVersion
from aiohttp.base_protocol import BaseProtocol
+from aiohttp.http_exceptions import BadHttpMessage, LineTooLong
from aiohttp.http_parser import RawRequestMessage
from aiohttp.streams import StreamReader
from aiohttp.test_utils import make_mocked_request
@@ -980,7 +981,57 @@ async def test_multipart_formdata_file(protocol: BaseProtocol) -> None:
result["a_file"].file.close()
-async def test_make_too_big_request_limit_None(protocol) -> None:
+async def test_multipart_formdata_headers_too_many(protocol: BaseProtocol) -> None:
+ many = b"".join(f"X-{i}: a\r\n".encode() for i in range(130))
+ body = (
+ b"--b\r\n"
+ b'Content-Disposition: form-data; name="a"\r\n' + many + b"\r\n1\r\n"
+ b"--b--\r\n"
+ )
+ content_type = "multipart/form-data; boundary=b"
+ payload = StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ payload.feed_data(body)
+ payload.feed_eof()
+ req = make_mocked_request(
+ "POST",
+ "/",
+ headers={"CONTENT-TYPE": content_type},
+ payload=payload,
+ )
+
+ with pytest.raises(BadHttpMessage, match="Too many headers received"):
+ await req.post()
+
+
+async def test_multipart_formdata_header_too_long(protocol: BaseProtocol) -> None:
+ k = b"t" * 4100
+ body = (
+ b"--b\r\n"
+ b'Content-Disposition: form-data; name="a"\r\n'
+ + k
+ + b":"
+ + k
+ + b"\r\n"
+ + b"\r\n1\r\n"
+ b"--b--\r\n"
+ )
+ content_type = "multipart/form-data; boundary=b"
+ payload = StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ payload.feed_data(body)
+ payload.feed_eof()
+ req = make_mocked_request(
+ "POST",
+ "/",
+ headers={"CONTENT-TYPE": content_type},
+ payload=payload,
+ )
+
+ match = "400, message:\n Got more than 8190 bytes when reading"
+ with pytest.raises(LineTooLong, match=match):
+ await req.post()
+
+
+async def test_make_too_big_request_limit_None(protocol: BaseProtocol) -> None:
payload = StreamReader(protocol, 2**16, loop=asyncio.get_event_loop())
large_file = 1024**2 * b"x"
too_large_file = large_file + b"x"
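The core of this patch is giving `readline()`/`readuntil()` an explicit size cap so a multipart peer cannot force unbounded buffering of a single header line. A simplified, synchronous sketch of that bounded read (names and the `ValueError` are illustrative; aiohttp raises `LineTooLong` from its async `StreamReader`):

```python
# Simplified sketch of the bounded readuntil() this patch introduces:
# stop with an error once more than max_size bytes accumulate before
# the separator is found, instead of buffering indefinitely.

def read_until(buf: bytes, separator: bytes = b"\n", max_size: int = 8190) -> bytes:
    idx = buf.find(separator)
    end = len(buf) if idx < 0 else idx + len(separator)
    if end > max_size:
        # aiohttp raises LineTooLong here; ValueError stands in for it.
        raise ValueError(f"Got more than {max_size} bytes when reading a line")
    if idx < 0:
        raise EOFError("separator not found before EOF")
    return buf[:end]

print(read_until(b"Content-Type: text/plain\r\n"))
try:
    read_until(b"t" * 9000 + b"\n")
except ValueError as exc:
    print(exc)
```

The 8190-byte default and the 128-header cap match the limits the patch wires through `MultipartReader`, which in turn inherits them from the request handler's `max_field_size` and `max_headers` settings.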
From 8a74257b3804c9aac0bf644af93070f68f6c5a6f Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Tue, 10 Mar 2026 20:36:38 +0000
Subject: [PATCH 123/141] Restrict multipart header sizes (#12208) (#12228)
(cherry picked from commit 5fe9dfb64400a574f5ba3f2d3b49b7db47567c29)
---
aiohttp/multipart.py | 29 ++++++++++++++++-----
aiohttp/streams.py | 16 +++++++-----
aiohttp/test_utils.py | 3 +++
aiohttp/web_protocol.py | 7 ++++++
aiohttp/web_request.py | 7 +++++-
tests/test_multipart.py | 5 ++--
tests/test_streams.py | 9 ++++---
tests/test_web_request.py | 53 ++++++++++++++++++++++++++++++++++++++-
8 files changed, 109 insertions(+), 20 deletions(-)
diff --git a/aiohttp/multipart.py b/aiohttp/multipart.py
index 8f1f9c93937..ec8b82b7ed4 100644
--- a/aiohttp/multipart.py
+++ b/aiohttp/multipart.py
@@ -41,6 +41,7 @@
)
from .helpers import CHAR, TOKEN, parse_mimetype, reify
from .http import HeadersParser
+from .http_exceptions import BadHttpMessage
from .log import internal_logger
from .payload import (
JsonPayload,
@@ -658,7 +659,14 @@ class MultipartReader:
#: Body part reader class for non multipart/* content types.
part_reader_cls = BodyPartReader
- def __init__(self, headers: Mapping[str, str], content: StreamReader) -> None:
+ def __init__(
+ self,
+ headers: Mapping[str, str],
+ content: StreamReader,
+ *,
+ max_field_size: int = 8190,
+ max_headers: int = 128,
+ ) -> None:
self._mimetype = parse_mimetype(headers[CONTENT_TYPE])
assert self._mimetype.type == "multipart", "multipart/* content type expected"
if "boundary" not in self._mimetype.parameters:
@@ -669,8 +677,10 @@ def __init__(self, headers: Mapping[str, str], content: StreamReader) -> None:
self.headers = headers
self._boundary = ("--" + self._get_boundary()).encode()
self._content = content
- self._default_charset: Optional[str] = None
- self._last_part: Optional[Union["MultipartReader", BodyPartReader]] = None
+ self._default_charset: str | None = None
+ self._last_part: MultipartReader | BodyPartReader | None = None
+ self._max_field_size = max_field_size
+ self._max_headers = max_headers
self._at_eof = False
self._at_bof = True
self._unread: List[bytes] = []
@@ -770,7 +780,12 @@ def _get_part_reader(
if mimetype.type == "multipart":
if self.multipart_reader_cls is None:
return type(self)(headers, self._content)
- return self.multipart_reader_cls(headers, self._content)
+ return self.multipart_reader_cls(
+ headers,
+ self._content,
+ max_field_size=self._max_field_size,
+ max_headers=self._max_headers,
+ )
else:
return self.part_reader_cls(
self._boundary,
@@ -832,12 +847,14 @@ async def _read_boundary(self) -> None:
async def _read_headers(self) -> "CIMultiDictProxy[str]":
lines = []
while True:
- chunk = await self._content.readline()
+ chunk = await self._content.readline(max_line_length=self._max_field_size)
chunk = chunk.rstrip(b"\r\n")
lines.append(chunk)
if not chunk:
break
- parser = HeadersParser()
+ if len(lines) > self._max_headers:
+ raise BadHttpMessage("Too many headers received")
+ parser = HeadersParser(max_field_size=self._max_field_size)
headers, raw_headers = parser.parse_headers(lines)
return headers
diff --git a/aiohttp/streams.py b/aiohttp/streams.py
index 6cc74fc9cbd..921827eb343 100644
--- a/aiohttp/streams.py
+++ b/aiohttp/streams.py
@@ -21,6 +21,7 @@
set_exception,
set_result,
)
+from .http_exceptions import LineTooLong
from .log import internal_logger
__all__ = (
@@ -372,10 +373,12 @@ async def _wait(self, func_name: str) -> None:
finally:
self._waiter = None
- async def readline(self) -> bytes:
- return await self.readuntil()
+ async def readline(self, *, max_line_length: Optional[int] = None) -> bytes:
+ return await self.readuntil(max_size=max_line_length)
- async def readuntil(self, separator: bytes = b"\n") -> bytes:
+ async def readuntil(
+ self, separator: bytes = b"\n", *, max_size: Optional[int] = None
+ ) -> bytes:
seplen = len(separator)
if seplen == 0:
raise ValueError("Separator should be at least one-byte string")
@@ -386,6 +389,7 @@ async def readuntil(self, separator: bytes = b"\n") -> bytes:
chunk = b""
chunk_size = 0
not_enough = True
+ max_size = max_size or self._high_water
while not_enough:
while self._buffer and not_enough:
@@ -400,8 +404,8 @@ async def readuntil(self, separator: bytes = b"\n") -> bytes:
if ichar:
not_enough = False
- if chunk_size > self._high_water:
- raise ValueError("Chunk too big")
+ if chunk_size > max_size:
+ raise LineTooLong(chunk[:100] + b"...", max_size)
if self._eof:
break
@@ -622,7 +626,7 @@ async def wait_eof(self) -> None:
def feed_data(self, data: bytes, n: int = 0) -> None:
pass
- async def readline(self) -> bytes:
+ async def readline(self, *, max_line_length: Optional[int] = None) -> bytes:
return b""
async def read(self, n: int = -1) -> bytes:
diff --git a/aiohttp/test_utils.py b/aiohttp/test_utils.py
index 87c31427867..34b77d47cd5 100644
--- a/aiohttp/test_utils.py
+++ b/aiohttp/test_utils.py
@@ -729,6 +729,9 @@ def make_mocked_request(
if protocol is sentinel:
protocol = mock.Mock()
+ protocol.max_field_size = 8190
+ protocol.max_line_length = 8190
+ protocol.max_headers = 128
protocol.transport = transport
type(protocol).peername = mock.PropertyMock(
return_value=transport.get_extra_info("peername")
diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py
index 58bbdfd849b..84e70e59088 100644
--- a/aiohttp/web_protocol.py
+++ b/aiohttp/web_protocol.py
@@ -142,6 +142,9 @@ class RequestHandler(BaseProtocol):
"""
__slots__ = (
+ "max_field_size",
+ "max_headers",
+ "max_line_size",
"_request_count",
"_keepalive",
"_manager",
@@ -205,6 +208,10 @@ def __init__(
self._request_handler: Optional[_RequestHandler] = manager.request_handler
self._request_factory: Optional[_RequestFactory] = manager.request_factory
+ self.max_line_size = max_line_size
+ self.max_headers = max_headers
+ self.max_field_size = max_field_size
+
self._tcp_keepalive = tcp_keepalive
# placeholder to be replaced on keepalive timeout setup
self._next_keepalive_close_time = 0.0
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index c652cdb4e97..0b9ce346c20 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -696,7 +696,12 @@ async def json(self, *, loads: JSONDecoder = DEFAULT_JSON_DECODER) -> Any:
async def multipart(self) -> MultipartReader:
"""Return async iterator to process BODY as multipart."""
- return MultipartReader(self._headers, self._payload)
+ return MultipartReader(
+ self._headers,
+ self._payload,
+ max_field_size=self._protocol.max_field_size,
+ max_headers=self._protocol.max_headers,
+ )
async def post(self) -> "MultiDictProxy[Union[str, bytes, FileField]]":
"""Return POST parameters."""
diff --git a/tests/test_multipart.py b/tests/test_multipart.py
index 3e18bd93f97..587d383cfd2 100644
--- a/tests/test_multipart.py
+++ b/tests/test_multipart.py
@@ -3,6 +3,7 @@
import json
import pathlib
import sys
+from typing import Optional
from unittest import mock
import pytest
@@ -85,7 +86,7 @@ async def read(self, size=None):
def at_eof(self):
return self.content.tell() == len(self.content.getbuffer())
- async def readline(self):
+ async def readline(self, *, max_line_length: Optional[int] = None) -> bytes:
return self.content.readline()
def unread_data(self, data):
@@ -856,7 +857,7 @@ async def read(self, size=None) -> bytes:
def at_eof(self) -> bool:
return not self.content
- async def readline(self) -> bytes:
+ async def readline(self, *, max_line_length: int | None = None) -> bytes:
line = b""
while self.content and b"\n" not in line:
line += self.content.pop(0)
diff --git a/tests/test_streams.py b/tests/test_streams.py
index c5bc6716b7f..93686746ee0 100644
--- a/tests/test_streams.py
+++ b/tests/test_streams.py
@@ -12,6 +12,7 @@
from re_assert import Matches
from aiohttp import streams
+from aiohttp.http_exceptions import LineTooLong
DATA = b"line1\nline2\nline3\n"
@@ -325,7 +326,7 @@ async def test_readline_limit_with_existing_data(self) -> None:
stream.feed_data(b"li")
stream.feed_data(b"ne1\nline2\n")
- with pytest.raises(ValueError):
+ with pytest.raises(LineTooLong):
await stream.readline()
# The buffer should contain the remaining data after exception
stream.feed_eof()
@@ -346,7 +347,7 @@ def cb():
loop.call_soon(cb)
- with pytest.raises(ValueError):
+ with pytest.raises(LineTooLong):
await stream.readline()
data = await stream.read()
assert b"chunk3\n" == data
@@ -436,7 +437,7 @@ async def test_readuntil_limit_with_existing_data(self, separator: bytes) -> Non
stream.feed_data(b"li")
stream.feed_data(b"ne1" + separator + b"line2" + separator)
- with pytest.raises(ValueError):
+ with pytest.raises(LineTooLong):
await stream.readuntil(separator)
# The buffer should contain the remaining data after exception
stream.feed_eof()
@@ -458,7 +459,7 @@ def cb():
loop.call_soon(cb)
- with pytest.raises(ValueError, match="Chunk too big"):
+ with pytest.raises(LineTooLong):
await stream.readuntil(separator)
data = await stream.read()
assert b"chunk3#" == data
diff --git a/tests/test_web_request.py b/tests/test_web_request.py
index e4b3979e67b..7778128c19e 100644
--- a/tests/test_web_request.py
+++ b/tests/test_web_request.py
@@ -13,6 +13,7 @@
from aiohttp import HttpVersion
from aiohttp.base_protocol import BaseProtocol
+from aiohttp.http_exceptions import BadHttpMessage, LineTooLong
from aiohttp.http_parser import RawRequestMessage
from aiohttp.streams import StreamReader
from aiohttp.test_utils import make_mocked_request
@@ -896,7 +897,57 @@ async def test_multipart_formdata_file(protocol: BaseProtocol) -> None:
result["a_file"].file.close()
-async def test_make_too_big_request_limit_None(protocol) -> None:
+async def test_multipart_formdata_headers_too_many(protocol: BaseProtocol) -> None:
+ many = b"".join(f"X-{i}: a\r\n".encode() for i in range(130))
+ body = (
+ b"--b\r\n"
+ b'Content-Disposition: form-data; name="a"\r\n' + many + b"\r\n1\r\n"
+ b"--b--\r\n"
+ )
+ content_type = "multipart/form-data; boundary=b"
+ payload = StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ payload.feed_data(body)
+ payload.feed_eof()
+ req = make_mocked_request(
+ "POST",
+ "/",
+ headers={"CONTENT-TYPE": content_type},
+ payload=payload,
+ )
+
+ with pytest.raises(BadHttpMessage, match="Too many headers received"):
+ await req.post()
+
+
+async def test_multipart_formdata_header_too_long(protocol: BaseProtocol) -> None:
+ k = b"t" * 4100
+ body = (
+ b"--b\r\n"
+ b'Content-Disposition: form-data; name="a"\r\n'
+ + k
+ + b":"
+ + k
+ + b"\r\n"
+ + b"\r\n1\r\n"
+ b"--b--\r\n"
+ )
+ content_type = "multipart/form-data; boundary=b"
+ payload = StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+ payload.feed_data(body)
+ payload.feed_eof()
+ req = make_mocked_request(
+ "POST",
+ "/",
+ headers={"CONTENT-TYPE": content_type},
+ payload=payload,
+ )
+
+ match = "400, message:\n Got more than 8190 bytes when reading"
+ with pytest.raises(LineTooLong, match=match):
+ await req.post()
+
+
+async def test_make_too_big_request_limit_None(protocol: BaseProtocol) -> None:
payload = StreamReader(protocol, 2**16, loop=asyncio.get_event_loop())
large_file = 1024**2 * b"x"
too_large_file = large_file + b"x"
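The limits introduced in the patch above (a per-line cap of 8190 bytes and a cap of 128 header lines per multipart part) can be sketched outside aiohttp as a plain guard loop. This is a minimal illustration with hypothetical names, not the aiohttp API; the constants mirror the patch's defaults.

```python
# Minimal sketch of the multipart header limits from the patch above:
# cap both the number of header lines and the length of each line.
MAX_HEADERS = 128       # mirrors the max_headers default
MAX_FIELD_SIZE = 8190   # mirrors the max_field_size default


def read_part_headers(lines):
    """Collect header lines for one multipart part, enforcing limits.

    ``lines`` is an iterable of raw header lines (bytes, CRLF already
    stripped); collection stops at the first empty line, matching the
    blank-line terminator of a header block.
    """
    collected = []
    for line in lines:
        # Reject any single header line longer than the field-size cap.
        if len(line) > MAX_FIELD_SIZE:
            raise ValueError(
                f"Got more than {MAX_FIELD_SIZE} bytes when reading a header line"
            )
        collected.append(line)
        if not line:  # blank line ends the header block
            break
        if len(collected) > MAX_HEADERS:
            raise ValueError("Too many headers received")
    return collected
```

Without such caps, a client could stream an unbounded header block into a single part and force the server to buffer it all, which is exactly what the patch closes off.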
From 9d588220522692d731005b1cbcc79e439dcfa7e6 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Tue, 10 Mar 2026 22:13:46 +0000
Subject: [PATCH 124/141] [PR #12216/9cc4b917 backport][3.14] Check multipart
max_size during iteration (#12230)
**This is a backport of PR #12216 as merged into master
(9cc4b917c54833a22f65edae7963d16a6eeb1f54).**
---------
Co-authored-by: Sam Bull
---
aiohttp/web_request.py | 20 ++++++++++++++------
1 file changed, 14 insertions(+), 6 deletions(-)
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index 8e540f64774..4bdd7aff606 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -778,17 +778,25 @@ async def post(self) -> "MultiDictProxy[str | bytes | FileField]":
out.add(field.name, ff)
else:
# deal with ordinary data
- value = await field.read(decode=True)
+ raw_data = bytearray()
+ while chunk := await field.read_chunk():
+ size += len(chunk)
+ if 0 < max_size < size:
+ raise HTTPRequestEntityTooLarge(
+ max_size=max_size, actual_size=size
+ )
+ raw_data.extend(chunk)
+
+ value = bytearray()
+ # form-data doesn't support compression, so don't need to check size again.
+ async for d in field.decode_iter(raw_data):
+ value.extend(d)
+
if field_ct is None or field_ct.startswith("text/"):
charset = field.get_charset(default="utf-8")
out.add(field.name, value.decode(charset))
else:
out.add(field.name, value)
- size += len(value)
- if 0 < max_size < size:
- raise HTTPRequestEntityTooLarge(
- max_size=max_size, actual_size=size
- )
else:
raise ValueError(
"To decode nested multipart you need to use custom reader",
From cbb774f38330563422ca0c413a71021d7b944145 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Tue, 10 Mar 2026 22:14:02 +0000
Subject: [PATCH 125/141] [PR #12216/9cc4b917 backport][3.13] Check multipart
max_size during iteration (#12229)
**This is a backport of PR #12216 as merged into master
(9cc4b917c54833a22f65edae7963d16a6eeb1f54).**
---------
Co-authored-by: Sam Bull
---
aiohttp/web_request.py | 20 ++++++++++++++------
1 file changed, 14 insertions(+), 6 deletions(-)
diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py
index 0b9ce346c20..f2d4c511e44 100644
--- a/aiohttp/web_request.py
+++ b/aiohttp/web_request.py
@@ -769,17 +769,25 @@ async def post(self) -> "MultiDictProxy[Union[str, bytes, FileField]]":
out.add(field.name, ff)
else:
# deal with ordinary data
- value = await field.read(decode=True)
+ raw_data = bytearray()
+ while chunk := await field.read_chunk():
+ size += len(chunk)
+ if 0 < max_size < size:
+ raise HTTPRequestEntityTooLarge(
+ max_size=max_size, actual_size=size
+ )
+ raw_data.extend(chunk)
+
+ value = bytearray()
+ # form-data doesn't support compression, so don't need to check size again.
+ async for d in field.decode_iter(raw_data):
+ value.extend(d)
+
if field_ct is None or field_ct.startswith("text/"):
charset = field.get_charset(default="utf-8")
out.add(field.name, value.decode(charset))
else:
out.add(field.name, value)
- size += len(value)
- if 0 < max_size < size:
- raise HTTPRequestEntityTooLarge(
- max_size=max_size, actual_size=size
- )
else:
raise ValueError(
"To decode nested multipart you need to use custom reader",
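The change in the two backports above moves the ``max_size`` check inside the read loop so an oversized field fails before it is fully buffered. A minimal synchronous sketch of that pattern (hypothetical helper, not the aiohttp API) looks like this; note the ``0 < max_size < size`` test, which treats ``max_size=0`` as "unlimited" just as the patched code does:

```python
def collect_with_limit(chunks, max_size):
    """Accumulate body chunks, failing as soon as the limit is crossed.

    Checking per chunk (rather than once after the full read, as the old
    code did) bounds memory use: an oversized part is rejected before it
    is completely buffered.
    """
    size = 0
    raw = bytearray()
    for chunk in chunks:
        size += len(chunk)
        if 0 < max_size < size:  # max_size == 0 disables the limit
            raise OverflowError(
                f"body exceeds {max_size} bytes (got {size} so far)"
            )
        raw.extend(chunk)
    return bytes(raw)
```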
From 0341d64536cbd3110793b90467aaea6a6fbf9b28 Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Wed, 11 Mar 2026 20:37:58 -0300
Subject: [PATCH 126/141] [PR #12231/7043bc56 backport][3.14] Adjust header
value character checks to RFC 9110 (#12236)
---
CHANGES/12231.bugfix.rst | 2 ++
aiohttp/http_parser.py | 9 ++++++++-
tests/test_http_parser.py | 10 ++++++++++
3 files changed, 20 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12231.bugfix.rst
diff --git a/CHANGES/12231.bugfix.rst b/CHANGES/12231.bugfix.rst
new file mode 100644
index 00000000000..cd74bd1e7e5
--- /dev/null
+++ b/CHANGES/12231.bugfix.rst
@@ -0,0 +1,2 @@
+Adjusted pure-Python request header value validation to align with RFC 9110 control-character handling, while preserving lax response parser behavior, and added regression tests for Host/header control-character cases.
+-- by :user:`rodrigobnogueira`.
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index 8588e9b4906..0119153cfce 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -70,6 +70,10 @@
VERSRE: Final[Pattern[str]] = re.compile(r"HTTP/(\d)\.(\d)", re.ASCII)
DIGITS: Final[Pattern[str]] = re.compile(r"\d+", re.ASCII)
HEXDIGITS: Final[Pattern[bytes]] = re.compile(rb"[0-9a-fA-F]+")
+# https://www.rfc-editor.org/rfc/rfc9110#section-5.5-5
+_FIELD_VALUE_FORBIDDEN_CTL_RE: Final[Pattern[str]] = re.compile(
+ r"[\x00-\x08\x0a-\x1f\x7f]"
+)
class RawRequestMessage(NamedTuple):
@@ -194,7 +198,10 @@ def parse_headers(
value = bvalue.decode("utf-8", "surrogateescape")
# https://www.rfc-editor.org/rfc/rfc9110.html#section-5.5-5
- if "\n" in value or "\r" in value or "\x00" in value:
+ if self._lax:
+ if "\n" in value or "\r" in value or "\x00" in value:
+ raise InvalidHeader(bvalue)
+ elif _FIELD_VALUE_FORBIDDEN_CTL_RE.search(value):
raise InvalidHeader(bvalue)
headers.add(name, value)
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 02606db29ef..ef058588181 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -221,6 +221,9 @@ def test_bad_header_name(parser: Any, rfc9110_5_6_2_token_delim: str) -> None:
"Foo : bar", # https://www.rfc-editor.org/rfc/rfc9112.html#section-5.1-2
"Foo\t: bar",
"\xffoo: bar",
+ "Foo: abc\x01def", # CTL bytes forbidden per RFC 9110 §5.5
+ "Foo: abc\x7fdef", # DEL is also a CTL byte
+ "Foo: abc\x1fdef",
),
)
def test_bad_headers(parser: Any, hdr: str) -> None:
@@ -229,6 +232,13 @@ def test_bad_headers(parser: Any, hdr: str) -> None:
parser.feed_data(text)
+def test_ctl_host_header_bad_characters(parser: HttpRequestParser) -> None:
+ """CTL byte in Host header must be rejected."""
+ text = b"GET /test HTTP/1.1\r\nHost: trusted.example\x01@bad.test\r\n\r\n"
+ with pytest.raises(http_exceptions.BadHttpMessage):
+ parser.feed_data(text)
+
+
def test_unpaired_surrogate_in_header_py(loop: Any, protocol: Any) -> None:
parser = HttpRequestParserPy(
protocol,
From 9370b9714a7a56003cacd31a9b4ae16eab109ba4 Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Wed, 11 Mar 2026 20:38:17 -0300
Subject: [PATCH 127/141] [PR #12231/7043bc56 backport][3.13] Adjust header
value character checks to RFC 9110 (#12235)
Co-authored-by: rodrigo.nogueira
---
CHANGES/12231.bugfix.rst | 2 ++
aiohttp/http_parser.py | 9 ++++++++-
tests/test_http_parser.py | 10 ++++++++++
3 files changed, 20 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12231.bugfix.rst
diff --git a/CHANGES/12231.bugfix.rst b/CHANGES/12231.bugfix.rst
new file mode 100644
index 00000000000..cd74bd1e7e5
--- /dev/null
+++ b/CHANGES/12231.bugfix.rst
@@ -0,0 +1,2 @@
+Adjusted pure-Python request header value validation to align with RFC 9110 control-character handling, while preserving lax response parser behavior, and added regression tests for Host/header control-character cases.
+-- by :user:`rodrigobnogueira`.
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index ec7d1ee6347..ec7db0868e1 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -81,6 +81,10 @@
# token = 1*tchar
_TCHAR_SPECIALS: Final[str] = re.escape("!#$%&'*+-.^_`|~")
TOKENRE: Final[Pattern[str]] = re.compile(f"[0-9A-Za-z{_TCHAR_SPECIALS}]+")
+# https://www.rfc-editor.org/rfc/rfc9110#section-5.5-5
+_FIELD_VALUE_FORBIDDEN_CTL_RE: Final[Pattern[str]] = re.compile(
+ r"[\x00-\x08\x0a-\x1f\x7f]"
+)
VERSRE: Final[Pattern[str]] = re.compile(r"HTTP/(\d)\.(\d)", re.ASCII)
DIGITS: Final[Pattern[str]] = re.compile(r"\d+", re.ASCII)
HEXDIGITS: Final[Pattern[bytes]] = re.compile(rb"[0-9a-fA-F]+")
@@ -208,7 +212,10 @@ def parse_headers(
value = bvalue.decode("utf-8", "surrogateescape")
# https://www.rfc-editor.org/rfc/rfc9110.html#section-5.5-5
- if "\n" in value or "\r" in value or "\x00" in value:
+ if self._lax:
+ if "\n" in value or "\r" in value or "\x00" in value:
+ raise InvalidHeader(bvalue)
+ elif _FIELD_VALUE_FORBIDDEN_CTL_RE.search(value):
raise InvalidHeader(bvalue)
headers.add(name, value)
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 621fc0f24f2..8ddb3e4ff34 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -221,6 +221,9 @@ def test_bad_header_name(parser: Any, rfc9110_5_6_2_token_delim: str) -> None:
"Foo : bar", # https://www.rfc-editor.org/rfc/rfc9112.html#section-5.1-2
"Foo\t: bar",
"\xffoo: bar",
+ "Foo: abc\x01def", # CTL bytes forbidden per RFC 9110 §5.5
+ "Foo: abc\x7fdef", # DEL is also a CTL byte
+ "Foo: abc\x1fdef",
),
)
def test_bad_headers(parser: Any, hdr: str) -> None:
@@ -229,6 +232,13 @@ def test_bad_headers(parser: Any, hdr: str) -> None:
parser.feed_data(text)
+def test_ctl_host_header_bad_characters(parser: HttpRequestParser) -> None:
+ """CTL byte in Host header must be rejected."""
+ text = b"GET /test HTTP/1.1\r\nHost: trusted.example\x01@bad.test\r\n\r\n"
+ with pytest.raises(http_exceptions.BadHttpMessage):
+ parser.feed_data(text)
+
+
def test_unpaired_surrogate_in_header_py(loop: Any, protocol: Any) -> None:
parser = HttpRequestParserPy(
protocol,
From 52426413e67ab03df04c7796ce6bb186a2510ddc Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 15 Mar 2026 13:33:49 +0000
Subject: [PATCH 128/141] [PR #12240/345d2537 backport][3.14] Reject duplicate
singleton headers in C extension parser (#12242)
**This is a backport of PR #12240 as merged into master
(345d25371562dd56de099f1fcd5720e96c6e7702).**
Co-authored-by: Rodrigo Nogueira
---
CHANGES/12240.bugfix.rst | 5 +++++
aiohttp/_http_parser.pyx | 22 +++++++++++++++++++++
tests/test_http_parser.py | 41 +++++++++++++++++++++++++++++++++++++++
3 files changed, 68 insertions(+)
create mode 100644 CHANGES/12240.bugfix.rst
diff --git a/CHANGES/12240.bugfix.rst b/CHANGES/12240.bugfix.rst
new file mode 100644
index 00000000000..49508b3f595
--- /dev/null
+++ b/CHANGES/12240.bugfix.rst
@@ -0,0 +1,5 @@
+Rejected duplicate singleton headers (``Host``, ``Content-Type``,
+``Content-Length``, etc.) in the C extension HTTP parser to match
+the pure Python parser behavior, preventing potential host-based
+access control bypasses via parser differentials
+-- by :user:`rodrigobnogueira`.
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index 496498d6f58..18c293bfda2 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -71,6 +71,20 @@ cdef object StreamReader = _StreamReader
cdef object DeflateBuffer = _DeflateBuffer
cdef bytes EMPTY_BYTES = b""
+# https://www.rfc-editor.org/rfc/rfc9110.html#section-5.5-6
+cdef tuple SINGLETON_HEADERS = (
+ hdrs.CONTENT_LENGTH,
+ hdrs.CONTENT_LOCATION,
+ hdrs.CONTENT_RANGE,
+ hdrs.CONTENT_TYPE,
+ hdrs.ETAG,
+ hdrs.HOST,
+ hdrs.MAX_FORWARDS,
+ hdrs.SERVER,
+ hdrs.TRANSFER_ENCODING,
+ hdrs.USER_AGENT,
+)
+
cdef inline object extend(object buf, const char* at, size_t length):
cdef Py_ssize_t s
cdef char* ptr
@@ -430,6 +444,14 @@ cdef class HttpParser:
raw_headers = tuple(self._raw_headers)
headers = CIMultiDictProxy(CIMultiDict(self._headers))
+ # https://www.rfc-editor.org/rfc/rfc9110.html#name-collected-abnf
+ bad_hdr = next(
+ (h for h in SINGLETON_HEADERS if len(headers.getall(h, ())) > 1),
+ None,
+ )
+ if bad_hdr is not None:
+ raise BadHttpMessage(f"Duplicate '{bad_hdr}' header found.")
+
if self._cparser.type == cparser.HTTP_REQUEST:
h_upg = headers.get("upgrade", "")
allowed = upgrade and h_upg.isascii() and h_upg.lower() in ALLOWED_UPGRADES
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index ef058588181..569f3cd29fe 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -265,6 +265,47 @@ def test_content_length_transfer_encoding(parser: Any) -> None:
parser.feed_data(text)
+@pytest.mark.parametrize(
+ "hdr",
+ (
+ "Content-Length",
+ "Content-Location",
+ "Content-Range",
+ "Content-Type",
+ "ETag",
+ "Host",
+ "Max-Forwards",
+ "Server",
+ "Transfer-Encoding",
+ "User-Agent",
+ ),
+)
+def test_duplicate_singleton_header_rejected(
+ parser: HttpRequestParser, hdr: str
+) -> None:
+ val1, val2 = ("1", "2") if hdr == "Content-Length" else ("value1", "value2")
+ text = (
+ f"GET /test HTTP/1.1\r\n"
+ f"Host: example.com\r\n"
+ f"{hdr}: {val1}\r\n"
+ f"{hdr}: {val2}\r\n"
+ f"\r\n"
+ ).encode()
+ with pytest.raises(http_exceptions.BadHttpMessage, match="Duplicate"):
+ parser.feed_data(text)
+
+
+def test_duplicate_host_header_rejected(parser: HttpRequestParser) -> None:
+ text = (
+ b"GET /admin HTTP/1.1\r\n"
+ b"Host: admin.example\r\n"
+ b"Host: public.example\r\n"
+ b"\r\n"
+ )
+ with pytest.raises(http_exceptions.BadHttpMessage, match="Duplicate.*Host"):
+ parser.feed_data(text)
+
+
def test_bad_chunked(parser: HttpRequestParser) -> None:
"""Test that invalid chunked encoding doesn't allow content-length to be used."""
text = (
From e00ca3cca92c465c7913c4beb763a72da9ed8349 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sun, 15 Mar 2026 13:58:08 +0000
Subject: [PATCH 129/141] [PR #12240/345d2537 backport][3.13] Reject duplicate
singleton headers in C extension parser (#12241)
**This is a backport of PR #12240 as merged into master
(345d25371562dd56de099f1fcd5720e96c6e7702).**
Co-authored-by: Rodrigo Nogueira
---
CHANGES/12240.bugfix.rst | 5 +++++
aiohttp/_http_parser.pyx | 22 +++++++++++++++++++++
tests/test_http_parser.py | 41 +++++++++++++++++++++++++++++++++++++++
3 files changed, 68 insertions(+)
create mode 100644 CHANGES/12240.bugfix.rst
diff --git a/CHANGES/12240.bugfix.rst b/CHANGES/12240.bugfix.rst
new file mode 100644
index 00000000000..49508b3f595
--- /dev/null
+++ b/CHANGES/12240.bugfix.rst
@@ -0,0 +1,5 @@
+Rejected duplicate singleton headers (``Host``, ``Content-Type``,
+``Content-Length``, etc.) in the C extension HTTP parser to match
+the pure Python parser behavior, preventing potential host-based
+access control bypasses via parser differentials
+-- by :user:`rodrigobnogueira`.
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index 496498d6f58..18c293bfda2 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -71,6 +71,20 @@ cdef object StreamReader = _StreamReader
cdef object DeflateBuffer = _DeflateBuffer
cdef bytes EMPTY_BYTES = b""
+# https://www.rfc-editor.org/rfc/rfc9110.html#section-5.5-6
+cdef tuple SINGLETON_HEADERS = (
+ hdrs.CONTENT_LENGTH,
+ hdrs.CONTENT_LOCATION,
+ hdrs.CONTENT_RANGE,
+ hdrs.CONTENT_TYPE,
+ hdrs.ETAG,
+ hdrs.HOST,
+ hdrs.MAX_FORWARDS,
+ hdrs.SERVER,
+ hdrs.TRANSFER_ENCODING,
+ hdrs.USER_AGENT,
+)
+
cdef inline object extend(object buf, const char* at, size_t length):
cdef Py_ssize_t s
cdef char* ptr
@@ -430,6 +444,14 @@ cdef class HttpParser:
raw_headers = tuple(self._raw_headers)
headers = CIMultiDictProxy(CIMultiDict(self._headers))
+ # https://www.rfc-editor.org/rfc/rfc9110.html#name-collected-abnf
+ bad_hdr = next(
+ (h for h in SINGLETON_HEADERS if len(headers.getall(h, ())) > 1),
+ None,
+ )
+ if bad_hdr is not None:
+ raise BadHttpMessage(f"Duplicate '{bad_hdr}' header found.")
+
if self._cparser.type == cparser.HTTP_REQUEST:
h_upg = headers.get("upgrade", "")
allowed = upgrade and h_upg.isascii() and h_upg.lower() in ALLOWED_UPGRADES
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 8ddb3e4ff34..2bdcdb3b97a 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -265,6 +265,47 @@ def test_content_length_transfer_encoding(parser: Any) -> None:
parser.feed_data(text)
+@pytest.mark.parametrize(
+ "hdr",
+ (
+ "Content-Length",
+ "Content-Location",
+ "Content-Range",
+ "Content-Type",
+ "ETag",
+ "Host",
+ "Max-Forwards",
+ "Server",
+ "Transfer-Encoding",
+ "User-Agent",
+ ),
+)
+def test_duplicate_singleton_header_rejected(
+ parser: HttpRequestParser, hdr: str
+) -> None:
+ val1, val2 = ("1", "2") if hdr == "Content-Length" else ("value1", "value2")
+ text = (
+ f"GET /test HTTP/1.1\r\n"
+ f"Host: example.com\r\n"
+ f"{hdr}: {val1}\r\n"
+ f"{hdr}: {val2}\r\n"
+ f"\r\n"
+ ).encode()
+ with pytest.raises(http_exceptions.BadHttpMessage, match="Duplicate"):
+ parser.feed_data(text)
+
+
+def test_duplicate_host_header_rejected(parser: HttpRequestParser) -> None:
+ text = (
+ b"GET /admin HTTP/1.1\r\n"
+ b"Host: admin.example\r\n"
+ b"Host: public.example\r\n"
+ b"\r\n"
+ )
+ with pytest.raises(http_exceptions.BadHttpMessage, match="Duplicate.*Host"):
+ parser.feed_data(text)
+
+
def test_bad_chunked(parser: HttpRequestParser) -> None:
"""Test that invalid chunked encoding doesn't allow content-length to be used."""
text = (
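The duplicate-singleton check added to the C parser above can be sketched in pure Python. This is an illustrative helper, not the parser's actual code path; field-name comparison is case-insensitive, as HTTP requires:

```python
# Singleton fields per RFC 9110: at most one occurrence per message.
SINGLETON_HEADERS = (
    "Content-Length", "Content-Location", "Content-Range", "Content-Type",
    "ETag", "Host", "Max-Forwards", "Server", "Transfer-Encoding", "User-Agent",
)


def find_duplicate_singleton(headers):
    """Return the first singleton header that occurs more than once.

    ``headers`` is a sequence of (name, value) pairs, e.g. raw parsed
    headers before they are collapsed into a dict.
    """
    counts = {}
    for name, _value in headers:
        key = name.lower()
        counts[key] = counts.get(key, 0) + 1
    for hdr in SINGLETON_HEADERS:
        if counts.get(hdr.lower(), 0) > 1:
            return hdr
    return None
```

The security motivation is the parser differential named in the changelog: if a front-end proxy honors the first ``Host`` while the backend honors the second, host-based access control can be bypassed, so rejecting the message outright is the safe behavior.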
From 5f50bac47d29cb5b0c16cc266000bc60e0a432e1 Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Sun, 15 Mar 2026 18:17:44 -0300
Subject: [PATCH 130/141] [PR #12217 backport][3.14] Raise on redirect with
consumed non-rewindable request bodies (#12245)
---
CHANGES/12195.bugfix.rst | 2 ++
aiohttp/client.py | 13 +++++++-
docs/client_quickstart.rst | 35 ++++++++++++--------
docs/spelling_wordlist.txt | 1 +
tests/test_client_functional.py | 53 ++++++++++++++++++++++++------
tests/test_client_ws_functional.py | 13 ++++----
6 files changed, 87 insertions(+), 30 deletions(-)
create mode 100644 CHANGES/12195.bugfix.rst
diff --git a/CHANGES/12195.bugfix.rst b/CHANGES/12195.bugfix.rst
new file mode 100644
index 00000000000..34729b52f0d
--- /dev/null
+++ b/CHANGES/12195.bugfix.rst
@@ -0,0 +1,2 @@
+Fixed redirects with consumed non-rewindable request bodies to raise
+:class:`aiohttp.ClientPayloadError` instead of silently sending an empty body.
diff --git a/aiohttp/client.py b/aiohttp/client.py
index e3e437d3088..be44fb1ef2c 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -876,7 +876,18 @@ async def _connect_and_send_request(
# For 307/308, always preserve the request body
# For 301/302 with non-POST methods, preserve the request body
# https://www.rfc-editor.org/rfc/rfc9110#section-15.4.3-3.1
- # Use the existing payload to avoid recreating it from a potentially consumed file
+ # Use the existing payload to avoid recreating it from
+ # a potentially consumed file.
+ #
+ # If the payload is already consumed and cannot be replayed,
+ # fail fast instead of silently sending an empty body.
+ if req._body is not None and req._body.consumed:
+ resp.close()
+ raise ClientPayloadError(
+ "Cannot follow redirect with a consumed request "
+ "body. Use bytes, a seekable file-like object, "
+ "or set allow_redirects=False."
+ )
data = req._body
r_url = resp.headers.get(hdrs.LOCATION) or resp.headers.get(
diff --git a/docs/client_quickstart.rst b/docs/client_quickstart.rst
index bd7b0c854c5..d00333c1e48 100644
--- a/docs/client_quickstart.rst
+++ b/docs/client_quickstart.rst
@@ -343,25 +343,34 @@ send large files without reading them into memory.
As a simple case, simply provide a file-like object for your body::
- with open('massive-body', 'rb') as f:
- await session.post('http://httpbin.org/post', data=f)
+ with open("massive-body", "rb") as f:
+ await session.post("https://httpbin.org/post", data=f)
-Or you can use *asynchronous generator*::
+Or you can provide an *asynchronous generator*, for example to generate
+data on the fly::
- async def file_sender(file_name=None):
- async with aiofiles.open(file_name, 'rb') as f:
- chunk = await f.read(64*1024)
- while chunk:
- yield chunk
- chunk = await f.read(64*1024)
+ async def data_generator():
+ for i in range(10):
+ yield f"line {i}\n".encode()
- # Then you can use file_sender as a data provider:
-
- async with session.post('http://httpbin.org/post',
- data=file_sender(file_name='huge_file')) as resp:
+ async with session.post("https://httpbin.org/post",
+ data=data_generator()) as resp:
print(await resp.text())
+.. warning::
+
+ Async generators and other non-rewindable data sources
+ (such as :class:`~aiohttp.StreamReader`) cannot be replayed if a
+ redirect occurs (for example, HTTP 307 or 308). If the request body
+ has already been streamed, :mod:`aiohttp` raises
+ :class:`~aiohttp.ClientPayloadError`.
+
+ If your endpoint may redirect, either:
+
+ * Pass a seekable file-like object or :class:`bytes`.
+ * Disable redirects with ``allow_redirects=False`` and handle them manually.
+
Because the :attr:`~aiohttp.ClientResponse.content` attribute is a
:class:`~aiohttp.StreamReader` (provides async iterator protocol), you
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 50d2a5f8a88..2d39c4a1713 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -304,6 +304,7 @@ sa
Satisfiable
scalability
schemas
+seekable
sendfile
serializable
serializer
diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py
index 545d8075c5e..524b712d4b8 100644
--- a/tests/test_client_functional.py
+++ b/tests/test_client_functional.py
@@ -5033,7 +5033,7 @@ async def final_handler(request: web.Request) -> web.Response:
async def test_async_iterable_payload_redirect(aiohttp_client: AiohttpClient) -> None:
- """Test that AsyncIterablePayload cannot be reused across redirects."""
+ """Test redirecting consumed AsyncIterablePayload raises an error."""
data_received = []
async def redirect_handler(request: web.Request) -> web.Response:
@@ -5061,17 +5061,50 @@ async def async_gen() -> AsyncIterator[bytes]:
payload = AsyncIterablePayload(async_gen())
- resp = await client.post("/redirect", data=payload)
- assert resp.status == 200
- text = await resp.text()
- # AsyncIterablePayload is consumed after first use, so redirect gets empty body
- assert text == "Received: "
+ with pytest.raises(
+ aiohttp.ClientPayloadError,
+ match="Cannot follow redirect with a consumed request body",
+ ):
+ await client.post("/redirect", data=payload)
+
+ # Only the first endpoint should have received data.
+ expected_data = b"".join(chunks)
+ assert data_received == [("redirect", expected_data)]
+
+
+@pytest.mark.parametrize("status", (301, 302))
+async def test_async_iterable_payload_redirect_non_post_301_302(
+ aiohttp_client: AiohttpClient, status: int
+) -> None:
+ """Test consumed async iterable body raises on 301/302 for non-POST methods."""
+ data_received = []
+
+ async def redirect_handler(request: web.Request) -> web.Response:
+ data = await request.read()
+ data_received.append(("redirect", data))
+ return web.Response(status=status, headers={"Location": "/final_destination"})
+
+ app = web.Application()
+ app.router.add_put("/redirect", redirect_handler)
+
+ client = await aiohttp_client(app)
+
+ chunks = [b"chunk1", b"chunk2", b"chunk3"]
+
+ async def async_gen() -> AsyncIterator[bytes]:
+ for chunk in chunks:
+ yield chunk
+
+ payload = AsyncIterablePayload(async_gen())
+
+ with pytest.raises(
+ aiohttp.ClientPayloadError,
+ match="Cannot follow redirect with a consumed request body",
+ ):
+ await client.put("/redirect", data=payload)
- # Only the first endpoint should have received data
expected_data = b"".join(chunks)
- assert len(data_received) == 2
- assert data_received[0] == ("redirect", expected_data)
- assert data_received[1] == ("final", b"") # Empty after being consumed
+ assert data_received == [("redirect", expected_data)]
async def test_buffered_reader_payload_redirect(aiohttp_client: AiohttpClient) -> None:
diff --git a/tests/test_client_ws_functional.py b/tests/test_client_ws_functional.py
index 6a4f2296fe8..2e50d636f9e 100644
--- a/tests/test_client_ws_functional.py
+++ b/tests/test_client_ws_functional.py
@@ -886,12 +886,14 @@ async def test_heartbeat_does_not_timeout_while_receiving_large_frame(
which could cause a ping/pong timeout while bytes were still being received.
"""
payload = b"x" * 2048
- heartbeat = 0.05
+ heartbeat = 0.1
chunk_size = 64
delay = 0.01
async def handler(request: web.Request) -> web.WebSocketResponse:
- ws = web.WebSocketResponse()
+ # Disable auto-PONG so a heartbeat PING during frame streaming would
+ # surface as a timeout/closure on the client side.
+ ws = web.WebSocketResponse(autoping=False)
await ws.prepare(request)
assert ws._writer is not None
@@ -918,10 +920,8 @@ async def handler(request: web.Request) -> web.WebSocketResponse:
client = await aiohttp_client(app)
async with client.ws_connect("/", heartbeat=heartbeat) as resp:
- # If heartbeat was not reset on any incoming bytes, the client would start
- # sending PINGs while we're still streaming the message body, and since the
- # server handler never calls receive(), no PONG would be produced and the
- # client would close with a ping/pong timeout.
+ # If heartbeat were not reset on incoming bytes, the client would send
+ # a PING while this frame is still being streamed.
with mock.patch.object(
resp._writer, "send_frame", wraps=resp._writer.send_frame
) as sf:
@@ -931,6 +931,7 @@ async def handler(request: web.Request) -> web.WebSocketResponse:
), "Heartbeat PING sent while data was still being received"
assert msg.type is WSMsgType.BINARY
assert msg.data == payload
+ assert not resp.closed
async def test_heartbeat_no_pong_after_receive_many_messages(
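The docs warning in this patch notes that async-generator bodies cannot be replayed once a redirect occurs. A stdlib-only sketch of the factory workaround (hypothetical helper names, not aiohttp API): build a fresh generator per attempt, so each request can stream the same bytes from the start.

```python
import asyncio
from typing import AsyncIterator


def make_body() -> AsyncIterator[bytes]:
    # Each call builds a *fresh* async generator, so every request
    # attempt (including one after a redirect) can stream the body
    # from the start.
    async def gen() -> AsyncIterator[bytes]:
        for i in range(3):
            yield f"line {i}\n".encode()

    return gen()


async def consume(body: AsyncIterator[bytes]) -> bytes:
    # Stand-in for "send the request body over the wire".
    return b"".join([chunk async for chunk in body])


async def main() -> None:
    first = await consume(make_body())   # original request
    second = await consume(make_body())  # replayed after a redirect
    assert first == second == b"line 0\nline 1\nline 2\n"


asyncio.run(main())
```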
From 2b256cfe7d92b392683e341abd07a4729f705d04 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Mon, 16 Mar 2026 21:42:28 +0000
Subject: [PATCH 131/141] [PR #12238/24cb8c9c backport][3.14] Skip TLS-in-TLS
warning when proxy is not HTTPS (#12248)
**This is a backport of PR #12238 as merged into master
(24cb8c9ca414dcda009717c0872019e62fc4ef3a).**
Co-authored-by: wavebyrd <160968744+wavebyrd@users.noreply.github.com>
---
CHANGES/10683.bugfix.rst | 1 +
aiohttp/connector.py | 6 ++++++
2 files changed, 7 insertions(+)
create mode 100644 CHANGES/10683.bugfix.rst
diff --git a/CHANGES/10683.bugfix.rst b/CHANGES/10683.bugfix.rst
new file mode 100644
index 00000000000..9631cc5fa05
--- /dev/null
+++ b/CHANGES/10683.bugfix.rst
@@ -0,0 +1 @@
+Fixed misleading TLS-in-TLS warning being emitted when sending HTTPS requests through an HTTP proxy. The warning now only fires when the proxy itself uses HTTPS, which is the only case where TLS-in-TLS actually applies -- by :user:`wavebyrd`.
diff --git a/aiohttp/connector.py b/aiohttp/connector.py
index 0e9e3a5f22a..66cfaaca10a 100644
--- a/aiohttp/connector.py
+++ b/aiohttp/connector.py
@@ -1387,6 +1387,12 @@ def _warn_about_tls_in_tls(
if req.request_info.url.scheme != "https":
return
+ # TLS-in-TLS only applies when the proxy itself is HTTPS.
+ # When the proxy is HTTP, start_tls upgrades a plain TCP connection,
+ # which is standard TLS and works on all event loops and Python versions.
+ if req.proxy is None or req.proxy.scheme != "https":
+ return
+
# Check if uvloop is being used, which supports TLS in TLS,
# otherwise assume that asyncio's native transport is being used.
if type(underlying_transport).__module__.startswith("uvloop"):
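The guard added above can be read as a pure predicate; a sketch (the `should_warn_tls_in_tls` helper below is hypothetical, not aiohttp's actual function signature):

```python
from typing import Optional


def should_warn_tls_in_tls(
    request_scheme: str,
    proxy_scheme: Optional[str],
    transport_module: str,
) -> bool:
    # Mirrors the early-return chain in the patched _warn_about_tls_in_tls.
    if request_scheme != "https":
        return False  # plain-HTTP request: no TLS-in-TLS at all
    if proxy_scheme != "https":
        return False  # HTTP proxy: start_tls upgrades plain TCP (ordinary TLS)
    if transport_module.startswith("uvloop"):
        return False  # uvloop supports TLS in TLS natively
    return True


# HTTPS request through an HTTP proxy no longer warns (the fixed case).
assert not should_warn_tls_in_tls("https", "http", "asyncio.selector_events")
# HTTPS request through an HTTPS proxy on vanilla asyncio still warns.
assert should_warn_tls_in_tls("https", "https", "asyncio.selector_events")
```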
From 7ae276067af4fedd4ab8dbb21e94d4d309770776 Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Wed, 18 Mar 2026 22:08:39 -0300
Subject: [PATCH 132/141] [Backport 3.14] Tokenize Connection header values in
Python HTTP parser (#12249) (#12257)
---
.pre-commit-config.yaml | 2 +-
CHANGES/12249.bugfix.rst | 3 +++
aiohttp/http_parser.py | 22 ++++++++++------
tests/test_http_parser.py | 53 +++++++++++++++++++++++++++++++++++++++
4 files changed, 72 insertions(+), 8 deletions(-)
create mode 100644 CHANGES/12249.bugfix.rst
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index b5a67394b80..cf1200682da 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -101,7 +101,7 @@ repos:
- id: detect-private-key
exclude: ^examples/
- repo: https://github.com/asottile/pyupgrade
- rev: 'v3.15.2'
+ rev: 'v3.21.2'
hooks:
- id: pyupgrade
args: ['--py37-plus']
diff --git a/CHANGES/12249.bugfix.rst b/CHANGES/12249.bugfix.rst
new file mode 100644
index 00000000000..42d90314110
--- /dev/null
+++ b/CHANGES/12249.bugfix.rst
@@ -0,0 +1,3 @@
+Aligned the pure-Python HTTP request parser with the C parser by splitting
+comma-separated and repeated ``Connection`` header values for keep-alive,
+close, and upgrade handling -- by :user:`rodrigobnogueira`.
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index 0119153cfce..d181eefca8d 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -535,16 +535,24 @@ def parse_headers(
if bad_hdr is not None:
raise BadHttpMessage(f"Duplicate '{bad_hdr}' header found.")
- # keep-alive
- conn = headers.get(hdrs.CONNECTION)
- if conn:
- v = conn.lower()
- if v == "close":
+ # keep-alive and protocol switching
+ # RFC 9110 section 7.6.1 defines Connection as a comma-separated list.
+ conn_values = headers.getall(hdrs.CONNECTION, ())
+ if conn_values:
+ conn_tokens = {
+ token.lower()
+ for conn_value in conn_values
+ for token in (part.strip(" \t") for part in conn_value.split(","))
+ if token and token.isascii()
+ }
+
+ if "close" in conn_tokens:
close_conn = True
- elif v == "keep-alive":
+ elif "keep-alive" in conn_tokens:
close_conn = False
+
# https://www.rfc-editor.org/rfc/rfc9110.html#name-101-switching-protocols
- elif v == "upgrade" and headers.get(hdrs.UPGRADE):
+ if "upgrade" in conn_tokens and headers.get(hdrs.UPGRADE):
upgrade = True
# encoding
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 569f3cd29fe..2e37e584310 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -497,6 +497,24 @@ def test_conn_keep_alive_1_1(parser) -> None:
assert not msg.should_close
+def test_conn_close_comma_list(parser) -> None:
+ text = b"GET /test HTTP/1.1\r\nconnection: close, keep-alive\r\n\r\n"
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert msg.should_close
+
+
+def test_conn_close_multiple_headers(parser) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\n"
+ b"connection: keep-alive\r\n"
+ b"connection: close\r\n\r\n"
+ )
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert msg.should_close
+
+
def test_conn_other_1_0(parser) -> None:
text = b"GET /test HTTP/1.0\r\nconnection: test\r\n\r\n"
messages, upgrade, tail = parser.feed_data(text)
@@ -586,6 +604,33 @@ def test_conn_upgrade(parser: Any) -> None:
assert upgrade
+def test_conn_upgrade_comma_list(parser) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\n"
+ b"connection: keep-alive, upgrade\r\n"
+ b"upgrade: websocket\r\n\r\n"
+ )
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert not msg.should_close
+ assert msg.upgrade
+ assert upgrade
+
+
+def test_conn_upgrade_multiple_headers(parser) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\n"
+ b"connection: keep-alive\r\n"
+ b"connection: upgrade\r\n"
+ b"upgrade: websocket\r\n\r\n"
+ )
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert not msg.should_close
+ assert msg.upgrade
+ assert upgrade
+
+
def test_bad_upgrade(parser) -> None:
"""Test not upgraded if missing Upgrade header."""
text = b"GET /test HTTP/1.1\r\nconnection: upgrade\r\n\r\n"
@@ -956,6 +1001,14 @@ def test_http_request_message_after_close(parser: HttpRequestParser) -> None:
parser.feed_data(text)
+def test_http_request_message_after_close_comma_list(parser: HttpRequestParser) -> None:
+ text = b"GET / HTTP/1.1\r\nConnection: close, keep-alive\r\n\r\nInvalid\r\n\r\n"
+ with pytest.raises(
+ http_exceptions.BadHttpMessage, match="Data after `Connection: close`"
+ ):
+ parser.feed_data(text)
+
+
def test_http_request_upgrade(parser: HttpRequestParser) -> None:
text = (
b"GET /test HTTP/1.1\r\n"
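The set comprehension both backports add can be lifted out as a standalone sketch (hypothetical `connection_tokens` helper) to see how comma-separated lists and repeated header lines collapse into one token set:

```python
def connection_tokens(values):
    # RFC 9110 s.7.6.1: Connection carries a comma-separated token list,
    # and the header itself may appear more than once. Lowercase each
    # token, strip optional whitespace, drop empty and non-ASCII entries.
    return {
        token.lower()
        for value in values
        for token in (part.strip(" \t") for part in value.split(","))
        if token and token.isascii()
    }


# Comma-separated list in one header line:
assert connection_tokens(["close, keep-alive"]) == {"close", "keep-alive"}
# Same tokens spread across repeated header lines:
assert connection_tokens(["keep-alive", "Upgrade"]) == {"keep-alive", "upgrade"}
# "close" beats "keep-alive", matching the patch's if/elif ordering:
assert "close" in connection_tokens(["keep-alive", "close"])
```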
From 5279fbd1241aa3d1ccaa09de28a174f42c6e030b Mon Sep 17 00:00:00 2001
From: Rodrigo Nogueira
Date: Wed, 18 Mar 2026 22:09:08 -0300
Subject: [PATCH 133/141] [Backport 3.13] Tokenize Connection header values in
Python HTTP parser (#12249) (#12256)
---
.pre-commit-config.yaml | 2 +-
CHANGES/12249.bugfix.rst | 3 +++
aiohttp/http_parser.py | 22 ++++++++++------
tests/test_http_parser.py | 53 +++++++++++++++++++++++++++++++++++++++
4 files changed, 72 insertions(+), 8 deletions(-)
create mode 100644 CHANGES/12249.bugfix.rst
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index b5a67394b80..cf1200682da 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -101,7 +101,7 @@ repos:
- id: detect-private-key
exclude: ^examples/
- repo: https://github.com/asottile/pyupgrade
- rev: 'v3.15.2'
+ rev: 'v3.21.2'
hooks:
- id: pyupgrade
args: ['--py37-plus']
diff --git a/CHANGES/12249.bugfix.rst b/CHANGES/12249.bugfix.rst
new file mode 100644
index 00000000000..42d90314110
--- /dev/null
+++ b/CHANGES/12249.bugfix.rst
@@ -0,0 +1,3 @@
+Aligned the pure-Python HTTP request parser with the C parser by splitting
+comma-separated and repeated ``Connection`` header values for keep-alive,
+close, and upgrade handling -- by :user:`rodrigobnogueira`.
diff --git a/aiohttp/http_parser.py b/aiohttp/http_parser.py
index ec7db0868e1..4889404bebc 100644
--- a/aiohttp/http_parser.py
+++ b/aiohttp/http_parser.py
@@ -549,16 +549,24 @@ def parse_headers(
if bad_hdr is not None:
raise BadHttpMessage(f"Duplicate '{bad_hdr}' header found.")
- # keep-alive
- conn = headers.get(hdrs.CONNECTION)
- if conn:
- v = conn.lower()
- if v == "close":
+ # keep-alive and protocol switching
+ # RFC 9110 section 7.6.1 defines Connection as a comma-separated list.
+ conn_values = headers.getall(hdrs.CONNECTION, ())
+ if conn_values:
+ conn_tokens = {
+ token.lower()
+ for conn_value in conn_values
+ for token in (part.strip(" \t") for part in conn_value.split(","))
+ if token and token.isascii()
+ }
+
+ if "close" in conn_tokens:
close_conn = True
- elif v == "keep-alive":
+ elif "keep-alive" in conn_tokens:
close_conn = False
+
# https://www.rfc-editor.org/rfc/rfc9110.html#name-101-switching-protocols
- elif v == "upgrade" and headers.get(hdrs.UPGRADE):
+ if "upgrade" in conn_tokens and headers.get(hdrs.UPGRADE):
upgrade = True
# encoding
diff --git a/tests/test_http_parser.py b/tests/test_http_parser.py
index 2bdcdb3b97a..f3ab7e26d66 100644
--- a/tests/test_http_parser.py
+++ b/tests/test_http_parser.py
@@ -497,6 +497,24 @@ def test_conn_keep_alive_1_1(parser) -> None:
assert not msg.should_close
+def test_conn_close_comma_list(parser) -> None:
+ text = b"GET /test HTTP/1.1\r\nconnection: close, keep-alive\r\n\r\n"
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert msg.should_close
+
+
+def test_conn_close_multiple_headers(parser) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\n"
+ b"connection: keep-alive\r\n"
+ b"connection: close\r\n\r\n"
+ )
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert msg.should_close
+
+
def test_conn_other_1_0(parser) -> None:
text = b"GET /test HTTP/1.0\r\nconnection: test\r\n\r\n"
messages, upgrade, tail = parser.feed_data(text)
@@ -586,6 +604,33 @@ def test_conn_upgrade(parser: Any) -> None:
assert upgrade
+def test_conn_upgrade_comma_list(parser) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\n"
+ b"connection: keep-alive, upgrade\r\n"
+ b"upgrade: websocket\r\n\r\n"
+ )
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert not msg.should_close
+ assert msg.upgrade
+ assert upgrade
+
+
+def test_conn_upgrade_multiple_headers(parser) -> None:
+ text = (
+ b"GET /test HTTP/1.1\r\n"
+ b"connection: keep-alive\r\n"
+ b"connection: upgrade\r\n"
+ b"upgrade: websocket\r\n\r\n"
+ )
+ messages, upgrade, tail = parser.feed_data(text)
+ msg = messages[0][0]
+ assert not msg.should_close
+ assert msg.upgrade
+ assert upgrade
+
+
def test_bad_upgrade(parser) -> None:
"""Test not upgraded if missing Upgrade header."""
text = b"GET /test HTTP/1.1\r\nconnection: upgrade\r\n\r\n"
@@ -956,6 +1001,14 @@ def test_http_request_message_after_close(parser: HttpRequestParser) -> None:
parser.feed_data(text)
+def test_http_request_message_after_close_comma_list(parser: HttpRequestParser) -> None:
+ text = b"GET / HTTP/1.1\r\nConnection: close, keep-alive\r\n\r\nInvalid\r\n\r\n"
+ with pytest.raises(
+ http_exceptions.BadHttpMessage, match="Data after `Connection: close`"
+ ):
+ parser.feed_data(text)
+
+
def test_http_request_upgrade(parser: HttpRequestParser) -> None:
text = (
b"GET /test HTTP/1.1\r\n"
From dedf508ffe9c7a393eee3919b2af44d367828e6e Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu, 19 Mar 2026 10:59:19 +0000
Subject: [PATCH 134/141] Bump actions/cache from 5.0.3 to 5.0.4 (#12259)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps [actions/cache](https://github.com/actions/cache) from 5.0.3 to
5.0.4.
Release notes, sourced from actions/cache's releases:

v5.0.4
Full Changelog: https://github.com/actions/cache/compare/v5...v5.0.4

Changelog, sourced from actions/cache's changelog:

5.0.4
- Bump minimatch to v3.1.5 (fixes ReDoS via globstar patterns)
- Bump undici to v6.24.1 (WebSocket decompression bomb protection, header validation fixes)
- Bump fast-xml-parser to v5.5.6
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 0cc547d887e..d74f8031b65 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -71,7 +71,7 @@ jobs:
with:
python-version: 3.11
- name: Cache PyPI
- uses: actions/cache@v5.0.3
+ uses: actions/cache@v5.0.4
with:
key: pip-lint-${{ hashFiles('requirements/*.txt') }}
path: ~/.cache/pip
@@ -120,7 +120,7 @@ jobs:
with:
submodules: true
- name: Cache llhttp generated files
- uses: actions/cache@v5.0.3
+ uses: actions/cache@v5.0.4
id: cache
with:
key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -184,7 +184,7 @@ jobs:
echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
shell: bash
- name: Cache PyPI
- uses: actions/cache@v5.0.3
+ uses: actions/cache@v5.0.4
with:
key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
path: ${{ steps.pip-cache.outputs.dir }}
@@ -295,7 +295,7 @@ jobs:
echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
shell: bash
- name: Cache PyPI
- uses: actions/cache@v5.0.3
+ uses: actions/cache@v5.0.4
with:
key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
path: ${{ steps.pip-cache.outputs.dir }}
From 625f29e11d580f7826a2646a959d629bd8363e46 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Tue, 24 Mar 2026 19:34:48 +0000
Subject: [PATCH 135/141] [PR #12265/b5a51707 backport][3.13] Avoid accessing
Py_buffer after release in HTTP parser (#12269)
Co-authored-by: Dexter.k <164054284+rootvector2@users.noreply.github.com>
---
aiohttp/_http_parser.pyx | 9 ++++++---
1 file changed, 6 insertions(+), 3 deletions(-)
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index 18c293bfda2..d53550b1007 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -556,20 +556,23 @@ cdef class HttpParser:
cdef:
size_t data_len
size_t nb
+ char* base
cdef cparser.llhttp_errno_t errno
PyObject_GetBuffer(data, &self.py_buf, PyBUF_SIMPLE)
+ # Cache buffer pointer before PyBuffer_Release to avoid use-after-release.
+ base = self.py_buf.buf
data_len = self.py_buf.len
errno = cparser.llhttp_execute(
self._cparser,
- self.py_buf.buf,
+ base,
data_len)
if errno is cparser.HPE_PAUSED_UPGRADE:
cparser.llhttp_resume_after_upgrade(self._cparser)
- nb = cparser.llhttp_get_error_pos(self._cparser) - self.py_buf.buf
+ nb = cparser.llhttp_get_error_pos(self._cparser) - base
PyBuffer_Release(&self.py_buf)
@@ -580,7 +583,7 @@ cdef class HttpParser:
self._last_error = None
else:
after = cparser.llhttp_get_error_pos(self._cparser)
- before = data[:after - self.py_buf.buf]
+ before = data[:after - base]
after_b = after.split(b"\r\n", 1)[0]
before = before.rsplit(b"\r\n", 1)[-1]
data = before + after_b
From f983c51665c691f11f9a8e2942030d9d1634f0e0 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Tue, 24 Mar 2026 19:39:27 +0000
Subject: [PATCH 136/141] [PR #12265/b5a51707 backport][3.14] Avoid accessing
Py_buffer after release in HTTP parser (#12270)
Co-authored-by: Dexter.k <164054284+rootvector2@users.noreply.github.com>
---
aiohttp/_http_parser.pyx | 9 ++++++---
1 file changed, 6 insertions(+), 3 deletions(-)
diff --git a/aiohttp/_http_parser.pyx b/aiohttp/_http_parser.pyx
index 18c293bfda2..d53550b1007 100644
--- a/aiohttp/_http_parser.pyx
+++ b/aiohttp/_http_parser.pyx
@@ -556,20 +556,23 @@ cdef class HttpParser:
cdef:
size_t data_len
size_t nb
+ char* base
cdef cparser.llhttp_errno_t errno
PyObject_GetBuffer(data, &self.py_buf, PyBUF_SIMPLE)
+ # Cache buffer pointer before PyBuffer_Release to avoid use-after-release.
+ base = self.py_buf.buf
data_len = self.py_buf.len
errno = cparser.llhttp_execute(
self._cparser,
- self.py_buf.buf,
+ base,
data_len)
if errno is cparser.HPE_PAUSED_UPGRADE:
cparser.llhttp_resume_after_upgrade(self._cparser)
- nb = cparser.llhttp_get_error_pos(self._cparser) - self.py_buf.buf
+ nb = cparser.llhttp_get_error_pos(self._cparser) - base
PyBuffer_Release(&self.py_buf)
@@ -580,7 +583,7 @@ cdef class HttpParser:
self._last_error = None
else:
after = cparser.llhttp_get_error_pos(self._cparser)
- before = data[:after - self.py_buf.buf]
+ before = data[:after - base]
after_b = after.split(b"\r\n", 1)[0]
before = before.rsplit(b"\r\n", 1)[-1]
data = before + after_b
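The Cython fix above caches `self.py_buf.buf` in a local before `PyBuffer_Release`. The same lifetime rule can be demonstrated from pure Python with `memoryview`: copy what you need out of the buffer before releasing it, because the view is unusable afterwards.

```python
data = bytearray(b"GET / HTTP/1.1\r\n")
view = memoryview(data)

# Copy out everything needed *before* releasing the buffer...
length = len(view)
snapshot = bytes(view)

view.release()

# ...because touching a released view raises ValueError:
try:
    view[0]
except ValueError:
    released = True
else:
    released = False

assert released
assert length == 16
assert snapshot.startswith(b"GET")
```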
From 6a86e0f87e18ffaf81d6d40e92bba0596f6a5ed2 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Wed, 25 Mar 2026 14:49:22 +0000
Subject: [PATCH 137/141] [PR #12271/e04da11f backport][3.14] Copy unsafe
setting from session cookie jar to ad-hoc request cookie jar (#12274)
**This is a backport of PR #12271 as merged into master
(e04da11f7a075547bfe8abf6b1361adae19949bf).**
Co-authored-by: Krishna Chaitanya
---
CHANGES/12011.bugfix.rst | 1 +
aiohttp/abc.py | 5 +++++
aiohttp/client.py | 3 ++-
aiohttp/cookiejar.py | 8 ++++++++
tests/test_client_session.py | 26 ++++++++++++++++++++++++++
tests/test_cookiejar.py | 9 +++++++++
6 files changed, 51 insertions(+), 1 deletion(-)
create mode 100644 CHANGES/12011.bugfix.rst
diff --git a/CHANGES/12011.bugfix.rst b/CHANGES/12011.bugfix.rst
new file mode 100644
index 00000000000..701a2ed01e3
--- /dev/null
+++ b/CHANGES/12011.bugfix.rst
@@ -0,0 +1 @@
+Fixed ad-hoc cookies passed to individual requests not being sent when the session's cookie jar has ``unsafe=True`` and the target URL uses an IP address, by copying the ``unsafe`` setting from the session's cookie jar to the temporary cookie jar -- by :user:`Krishnachaitanyakc`.
diff --git a/aiohttp/abc.py b/aiohttp/abc.py
index 00e9280c3b1..d5dcdc00233 100644
--- a/aiohttp/abc.py
+++ b/aiohttp/abc.py
@@ -163,6 +163,11 @@ class AbstractCookieJar(Sized, IterableBase):
def __init__(self, *, loop: asyncio.AbstractEventLoop | None = None) -> None:
self._loop = loop or asyncio.get_running_loop()
+ @property
+ @abstractmethod
+ def unsafe(self) -> bool:
+ """Return True if cookies can be used with IP addresses."""
+
@property
@abstractmethod
def quote_cookie(self) -> bool:
diff --git a/aiohttp/client.py b/aiohttp/client.py
index be44fb1ef2c..d797aa31021 100644
--- a/aiohttp/client.py
+++ b/aiohttp/client.py
@@ -724,7 +724,8 @@ async def _request(
if cookies is not None:
tmp_cookie_jar = CookieJar(
- quote_cookie=self._cookie_jar.quote_cookie
+ unsafe=self._cookie_jar.unsafe,
+ quote_cookie=self._cookie_jar.quote_cookie,
)
tmp_cookie_jar.update_cookies(cookies)
req_cookies = tmp_cookie_jar.filter_cookies(url)
diff --git a/aiohttp/cookiejar.py b/aiohttp/cookiejar.py
index 8a11bdd53ec..098e7ee33a5 100644
--- a/aiohttp/cookiejar.py
+++ b/aiohttp/cookiejar.py
@@ -143,6 +143,10 @@ def __init__(
self._expire_heap: list[tuple[float, tuple[str, str, str]]] = []
self._expirations: dict[tuple[str, str, str], float] = {}
+ @property
+ def unsafe(self) -> bool:
+ return self._unsafe
+
@property
def quote_cookie(self) -> bool:
return self._quote_cookie
@@ -600,6 +604,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
def __len__(self) -> int:
return 0
+ @property
+ def unsafe(self) -> bool:
+ return False
+
@property
def quote_cookie(self) -> bool:
return True
diff --git a/tests/test_client_session.py b/tests/test_client_session.py
index 7ab98c2bee4..c316170d4a0 100644
--- a/tests/test_client_session.py
+++ b/tests/test_client_session.py
@@ -747,6 +747,10 @@ def __init__(self) -> None:
self._clear_domain_mock = mock.Mock()
self._items: list[Any] = []
+ @property
+ def unsafe(self) -> bool:
+ return False
+
@property
def quote_cookie(self) -> bool:
return True
@@ -772,6 +776,7 @@ def __iter__(self) -> Iterator[Any]:
jar = MockCookieJar()
assert jar.quote_cookie is True
+ assert jar.unsafe is False
assert len(jar) == 0
assert list(jar) == []
jar.clear()
@@ -828,6 +833,27 @@ async def handler(_: web.Request) -> web.Response:
assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+async def test_cookies_with_unsafe_cookie_jar(
+ aiohttp_server: AiohttpServer,
+) -> None:
+ async def handler(request: web.Request) -> web.Response:
+ return web.Response()
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ server = await aiohttp_server(app)
+ jar = CookieJar(unsafe=True)
+ # Use an IP-based URL to verify that ad-hoc cookies are sent
+ # when the session cookie jar has unsafe=True.
+ ip_url = server.make_url("/")
+ assert ip_url.host is not None
+ assert ip_url.host.count(".") == 3 # Sanity check it looks like an IP address
+ cookies = {"adhoc": "value"}
+ async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+ async with sess.request("GET", ip_url, cookies=cookies) as resp:
+ assert "adhoc=value" in resp.request_info.headers.get("Cookie", "")
+
+
async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
session = aiohttp.ClientSession()
assert session.version == aiohttp.HttpVersion11
diff --git a/tests/test_cookiejar.py b/tests/test_cookiejar.py
index 9e4f7500afc..00586145524 100644
--- a/tests/test_cookiejar.py
+++ b/tests/test_cookiejar.py
@@ -827,6 +827,7 @@ async def make_jar():
async def test_dummy_cookie_jar() -> None:
cookie = SimpleCookie("foo=bar; Domain=example.com;")
dummy_jar = DummyCookieJar()
+ assert dummy_jar.unsafe is False
assert dummy_jar.quote_cookie is True
assert len(dummy_jar) == 0
dummy_jar.update_cookies(cookie)
@@ -1809,3 +1810,11 @@ async def test_save_load_json_secure_cookies(tmp_path: Path) -> None:
assert cookie["secure"] is True
assert cookie["httponly"] is True
assert cookie["domain"] == "example.com"
+
+
+async def test_cookie_jar_unsafe_property() -> None:
+ jar_safe = CookieJar()
+ assert jar_safe.unsafe is False
+
+ jar_unsafe = CookieJar(unsafe=True)
+ assert jar_unsafe.unsafe is True
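The `unsafe` flag copied to the temporary jar above controls whether cookies may be sent to bare-IP hosts. A stdlib-only sketch of that decision (the `cookies_allowed_for_host` helper is hypothetical, not aiohttp's actual implementation):

```python
import ipaddress


def is_ip_address(host: str) -> bool:
    # ipaddress.ip_address accepts both IPv4 and IPv6 literals.
    try:
        ipaddress.ip_address(host)
        return True
    except ValueError:
        return False


def cookies_allowed_for_host(host: str, unsafe: bool) -> bool:
    # Safe jars refuse to attach cookies to IP-address hosts;
    # unsafe=True lifts that restriction (CookieJar semantics).
    if is_ip_address(host):
        return unsafe
    return True


assert cookies_allowed_for_host("example.com", unsafe=False)
assert not cookies_allowed_for_host("127.0.0.1", unsafe=False)
assert cookies_allowed_for_host("127.0.0.1", unsafe=True)
```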
From a8ff7b5afcf75a36db65478a3c55f65a303488f5 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 27 Mar 2026 11:14:00 +0000
Subject: [PATCH 138/141] Bump codecov/codecov-action from 5 to 6 (#12287)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Bumps
[codecov/codecov-action](https://github.com/codecov/codecov-action) from
5 to 6.
Release notes, sourced from codecov/codecov-action's releases:

v6.0.0
⚠️ This version introduces support for node24, which may cause breaking changes for systems that do not currently support node24. ⚠️
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.5.4...v6.0.0

v5.5.4
This is a mirror of v5.5.2. v6 will be released, which requires node24.
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.5.3...v5.5.4

v5.5.3
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.5.2...v5.5.3

v5.5.2
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.5.1...v5.5.2

v5.5.1
... (truncated)

Changelog, sourced from codecov/codecov-action's changelog:

v5.5.2
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.5.1..v5.5.2

v5.5.1
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.5.0..v5.5.1

v5.5.0
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.4.3..v5.5.0

v5.4.3
Full Changelog: https://github.com/codecov/codecov-action/compare/v5.4.2..v5.4.3

v5.4.2
... (truncated)
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
.github/workflows/ci-cd.yml | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index d74f8031b65..8c28e13baea 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -247,7 +247,7 @@ jobs:
run: |
python -m coverage xml
- name: Upload coverage
- uses: codecov/codecov-action@v5
+ uses: codecov/codecov-action@v6
with:
files: ./coverage.xml
flags: >-
@@ -339,7 +339,7 @@ jobs:
run: |
python -m coverage xml
- name: Upload coverage
- uses: codecov/codecov-action@v5
+ uses: codecov/codecov-action@v6
with:
files: ./coverage.xml
flags: >-
@@ -417,7 +417,7 @@ jobs:
with:
jobs: ${{ toJSON(needs) }}
- name: Trigger codecov notification
- uses: codecov/codecov-action@v5
+ uses: codecov/codecov-action@v6
with:
token: ${{ secrets.CODECOV_TOKEN }}
fail_ci_if_error: true
From 8bdc7c7fc002c4a7774b295923be42aac6dd9eb7 Mon Sep 17 00:00:00 2001
From: Stefan Agner
Date: Fri, 27 Mar 2026 14:11:31 +0100
Subject: [PATCH 139/141] Raise ConnectionResetError if transport is None
(#11761) (#12283)
Backport cherry-picked from commit
b26c9aee4cf1cb280fa771855ab7be95f1ae1d22
---
CHANGES/11761.bugfix.rst | 4 ++
aiohttp/web_fileresponse.py | 3 +-
aiohttp/web_ws.py | 3 +-
tests/test_web_sendfile_functional.py | 63 ++++++++++++++++++++++++++
tests/test_web_websocket_functional.py | 57 ++++++++++++++++++++++-
5 files changed, 127 insertions(+), 3 deletions(-)
create mode 100644 CHANGES/11761.bugfix.rst
diff --git a/CHANGES/11761.bugfix.rst b/CHANGES/11761.bugfix.rst
new file mode 100644
index 00000000000..d4661c6d4a1
--- /dev/null
+++ b/CHANGES/11761.bugfix.rst
@@ -0,0 +1,4 @@
+Fixed ``AssertionError`` when the transport is ``None`` during WebSocket
+preparation or file response sending (e.g. when a client disconnects
+immediately after connecting). A ``ConnectionResetError`` is now raised
+instead -- by :user:`agners`.
diff --git a/aiohttp/web_fileresponse.py b/aiohttp/web_fileresponse.py
index 73be9b3f090..d0a6972fdea 100644
--- a/aiohttp/web_fileresponse.py
+++ b/aiohttp/web_fileresponse.py
@@ -141,7 +141,8 @@ async def _sendfile(
loop = request._loop
transport = request.transport
- assert transport is not None
+ if transport is None:
+ raise ConnectionResetError("Connection lost")
try:
await loop.sendfile(transport, fobj, offset, count)
diff --git a/aiohttp/web_ws.py b/aiohttp/web_ws.py
index 1f305c16193..4c540eea422 100644
--- a/aiohttp/web_ws.py
+++ b/aiohttp/web_ws.py
@@ -358,7 +358,8 @@ def _pre_start(self, request: BaseRequest) -> tuple[str | None, WebSocketWriter]
self.force_close()
self._compress = compress
transport = request._protocol.transport
- assert transport is not None
+ if transport is None:
+ raise ConnectionResetError("Connection lost")
writer = WebSocketWriter(
request._protocol,
transport,
diff --git a/tests/test_web_sendfile_functional.py b/tests/test_web_sendfile_functional.py
index 56899d118ee..b0a51b08dbb 100644
--- a/tests/test_web_sendfile_functional.py
+++ b/tests/test_web_sendfile_functional.py
@@ -1,5 +1,6 @@
import asyncio
import bz2
+import contextlib
import gzip
import pathlib
import socket
@@ -13,6 +14,7 @@
from aiohttp import web
from aiohttp.compression_utils import ZLibBackend
from aiohttp.pytest_plugin import AiohttpClient
+from aiohttp.web_fileresponse import NOSENDFILE
try:
import brotlicffi as brotli
@@ -1151,3 +1153,64 @@ async def handler(request):
await resp.release()
await client.close()
+
+
+@pytest.mark.skipif(NOSENDFILE, reason="OS sendfile not available")
+async def test_sendfile_after_client_disconnect(
+ aiohttp_client: AiohttpClient, tmp_path: pathlib.Path
+) -> None:
+ """Test ConnectionResetError when client disconnects before sendfile.
+
+ Reproduces the race condition where:
+ - Client sends a GET request for a file
+ - Handler does async work (e.g. auth check) before returning a FileResponse
+ - Client disconnects while the handler is busy
+ - Server then calls sendfile() → ConnectionResetError (not AssertionError)
+
+ _send_headers_immediately is set to False so that super().prepare()
+ only buffers the headers without writing to the transport. Otherwise
+ _write() raises ClientConnectionResetError first and _sendfile()'s own
+ transport check is never reached.
+ """
+ filepath = tmp_path / "test.txt"
+ filepath.write_bytes(b"x" * 1024)
+
+ handler_started = asyncio.Event()
+ prepare_done = asyncio.Event()
+ captured_protocol = None
+
+ async def handler(request: web.Request) -> web.Response:
+ nonlocal captured_protocol
+ resp = web.FileResponse(filepath)
+ resp._send_headers_immediately = False
+ captured_protocol = request._protocol
+ handler_started.set()
+ # Simulate async work (e.g., auth check) during which client disconnects.
+ await asyncio.sleep(0)
+ with pytest.raises(ConnectionResetError, match="Connection lost"):
+ await resp.prepare(request)
+ prepare_done.set()
+ return web.Response(status=503)
+
+ app = web.Application()
+ app.router.add_get("/", handler)
+ client = await aiohttp_client(app)
+
+ request_task = asyncio.create_task(client.get("/"))
+
+ # Wait until the handler is running but has not yet returned the response.
+ await handler_started.wait()
+ assert captured_protocol is not None
+
+ # Simulate the client disconnecting by setting transport to None directly.
+ # We cannot use force_close() because closing the TCP transport triggers
+ # connection_lost() which cancels the handler task before it can call
+ # prepare() and hit the ConnectionResetError in _sendfile().
+ captured_protocol.transport = None
+
+ # Wait for the handler to resume, call prepare(), and hit ConnectionResetError.
+ await asyncio.wait_for(prepare_done.wait(), timeout=1)
+
+ request_task.cancel()
+ with contextlib.suppress(asyncio.CancelledError):
+ await request_task
diff --git a/tests/test_web_websocket_functional.py b/tests/test_web_websocket_functional.py
index d52282ab774..1f34631279d 100644
--- a/tests/test_web_websocket_functional.py
+++ b/tests/test_web_websocket_functional.py
@@ -11,7 +11,7 @@
import pytest
import aiohttp
-from aiohttp import web
+from aiohttp import hdrs, web
from aiohttp.http import WSCloseCode, WSMsgType
from aiohttp.pytest_plugin import AiohttpClient
@@ -1608,3 +1608,58 @@ async def websocket_handler(
assert msg.type is aiohttp.WSMsgType.TEXT
assert msg.data == "success"
await ws.close()
+
+
+async def test_prepare_after_client_disconnect(aiohttp_client: AiohttpClient) -> None:
+ """Test ConnectionResetError when client disconnects before ws.prepare().
+
+ Reproduces the race condition where:
+ - Client connects and sends a WebSocket upgrade request
+ - Handler starts async work (e.g. authentication) before calling ws.prepare()
+ - Client disconnects while the handler is busy
+ - Handler then calls ws.prepare() → ConnectionResetError (not AssertionError)
+ """
+ handler_started = asyncio.Event()
+ captured_protocol = None
+
+ async def handler(request: web.Request) -> web.Response:
+ nonlocal captured_protocol
+ ws = web.WebSocketResponse()
+ captured_protocol = request._protocol
+ handler_started.set()
+ # Simulate async work (e.g., auth check) during which client disconnects.
+ await asyncio.sleep(0)
+ with pytest.raises(ConnectionResetError, match="Connection lost"):
+ await ws.prepare(request)
+ return web.Response(status=503)
+
+ app = web.Application()
+ app.router.add_route("GET", "/", handler)
+ client = await aiohttp_client(app)
+
+ request_task = asyncio.create_task(
+ client.session.get(
+ client.make_url("/"),
+ headers={
+ hdrs.UPGRADE: "websocket",
+ hdrs.CONNECTION: "Upgrade",
+ hdrs.SEC_WEBSOCKET_KEY: "dGhlIHNhbXBsZSBub25jZQ==",
+ hdrs.SEC_WEBSOCKET_VERSION: "13",
+ },
+ )
+ )
+
+ # Wait until the handler is running but has not yet called ws.prepare().
+ await handler_started.wait()
+ assert captured_protocol is not None
+
+ # Simulate the client disconnecting abruptly.
+ captured_protocol.force_close()
+
+ # Yield so the handler can resume and hit the ConnectionResetError.
+ await asyncio.sleep(0)
+
+ with contextlib.suppress(
+ aiohttp.ServerDisconnectedError, aiohttp.ClientConnectionResetError
+ ):
+ await request_task
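The patch above swaps an `assert transport is not None` for an explicit `ConnectionResetError`. As a minimal sketch (not aiohttp's actual code, and using a plain list as a stand-in transport), the pattern looks like this: a vanished transport surfaces as the network-level error callers already handle, rather than an `AssertionError` that would also be silently skipped under `python -O`:

```python
def send_payload(transport, data):
    """Write *data* to *transport*, treating a vanished transport as a reset."""
    if transport is None:
        # The peer disconnected before we got here; raise the same
        # network-level error callers already catch for resets.
        raise ConnectionResetError("Connection lost")
    transport.append(data)  # stand-in for transport.write(data)
    return len(data)

# A live "transport" works normally...
buf = []
assert send_payload(buf, b"hello") == 5

# ...while a disconnected one raises ConnectionResetError, not AssertionError.
try:
    send_payload(None, b"hello")
except ConnectionResetError as exc:
    assert str(exc) == "Connection lost"
```

This mirrors why the tests set `protocol.transport = None` (or `force_close()`) mid-handler: the race only matters when the handler resumes after the disconnect and hits the guard.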
From e65d1ff8cd5c666cb90185d694f0b1c495515eff Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Sat, 28 Mar 2026 15:42:42 +0000
Subject: [PATCH 140/141] Bump sigstore/gh-action-sigstore-python from 3.2.0 to
3.3.0 (#12288)
---
.github/workflows/ci-cd.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/.github/workflows/ci-cd.yml b/.github/workflows/ci-cd.yml
index 8c28e13baea..0e513530162 100644
--- a/.github/workflows/ci-cd.yml
+++ b/.github/workflows/ci-cd.yml
@@ -620,7 +620,7 @@ jobs:
uses: pypa/gh-action-pypi-publish@release/v1
- name: Sign the dists with Sigstore
- uses: sigstore/gh-action-sigstore-python@v3.2.0
+ uses: sigstore/gh-action-sigstore-python@v3.3.0
with:
inputs: >-
./dist/*.tar.gz
From 9f7c7ab6f2924824e4a61a3b5c5c36475f4e25c6 Mon Sep 17 00:00:00 2001
From: Sam Bull
Date: Sat, 28 Mar 2026 16:08:45 +0000
Subject: [PATCH 141/141] Release v3.13.4 (#12291)
---
CHANGES.rst | 217 ++++++++++++++++++++++++++++++++++++
CHANGES/10596.bugfix.rst | 4 -
CHANGES/10795.doc.rst | 4 -
CHANGES/11283.bugfix.rst | 3 -
CHANGES/11701.bugfix.rst | 3 -
CHANGES/11737.contrib.rst | 3 -
CHANGES/11859.bugfix.rst | 1 -
CHANGES/11898.bugfix.rst | 10 --
CHANGES/11955.feature.rst | 1 -
CHANGES/11972.bugfix.rst | 2 -
CHANGES/11992.contrib.rst | 1 -
CHANGES/12027.misc.rst | 1 -
CHANGES/12042.doc.rst | 2 -
CHANGES/12069.packaging.rst | 1 -
CHANGES/12096.bugfix.rst | 1 -
CHANGES/12097.bugfix.rst | 1 -
CHANGES/12106.feature.rst | 1 -
CHANGES/12136.bugfix.rst | 2 -
CHANGES/12170.misc.rst | 1 -
CHANGES/12231.bugfix.rst | 2 -
CHANGES/12240.bugfix.rst | 5 -
CHANGES/12249.bugfix.rst | 3 -
aiohttp/__init__.py | 2 +-
23 files changed, 218 insertions(+), 53 deletions(-)
delete mode 100644 CHANGES/10596.bugfix.rst
delete mode 100644 CHANGES/10795.doc.rst
delete mode 100644 CHANGES/11283.bugfix.rst
delete mode 100644 CHANGES/11701.bugfix.rst
delete mode 100644 CHANGES/11737.contrib.rst
delete mode 100644 CHANGES/11859.bugfix.rst
delete mode 100644 CHANGES/11898.bugfix.rst
delete mode 100644 CHANGES/11955.feature.rst
delete mode 100644 CHANGES/11972.bugfix.rst
delete mode 100644 CHANGES/11992.contrib.rst
delete mode 100644 CHANGES/12027.misc.rst
delete mode 100644 CHANGES/12042.doc.rst
delete mode 100644 CHANGES/12069.packaging.rst
delete mode 100644 CHANGES/12096.bugfix.rst
delete mode 100644 CHANGES/12097.bugfix.rst
delete mode 100644 CHANGES/12106.feature.rst
delete mode 100644 CHANGES/12136.bugfix.rst
delete mode 100644 CHANGES/12170.misc.rst
delete mode 100644 CHANGES/12231.bugfix.rst
delete mode 100644 CHANGES/12240.bugfix.rst
delete mode 100644 CHANGES/12249.bugfix.rst
diff --git a/CHANGES.rst b/CHANGES.rst
index 466fce8795d..156b34231c1 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -10,6 +10,223 @@
.. towncrier release notes start
+3.13.4 (2026-03-28)
+===================
+
+Features
+--------
+
+- Added ``max_headers`` parameter to limit the number of headers that should be read from a response -- by :user:`Dreamsorcerer`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11955`.
+
+
+
+- Added a ``dns_cache_max_size`` parameter to ``TCPConnector`` to limit the size of the cache -- by :user:`Dreamsorcerer`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12106`.
+
+
+
+Bug fixes
+---------
+
+- Fixed server hanging indefinitely when chunked transfer encoding chunk-size
+ does not match actual data length. The server now raises
+ ``TransferEncodingError`` instead of waiting forever for data that will
+ never arrive -- by :user:`Fridayai700`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`10596`.
+
+
+
+- Fixed access log timestamps ignoring daylight saving time (DST) changes. The
+ previous implementation used :py:data:`time.timezone` which is a constant and
+ does not reflect DST transitions -- by :user:`nightcityblade`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11283`.
+
+
+
+- Fixed ``RuntimeError: An event loop is running`` error when using ``aiohttp.GunicornWebWorker``
+ or ``aiohttp.GunicornUVLoopWebWorker`` on Python >=3.14.
+ -- by :user:`Tasssadar`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11701`.
+
+
+
+- Fixed :exc:`ValueError` when creating a TLS connection with ``ClientTimeout(total=0)`` by converting ``0`` to ``None`` before passing to ``ssl_handshake_timeout`` in :py:meth:`asyncio.loop.start_tls` -- by :user:`veeceey`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11859`.
+
+
+
+- Restored :py:meth:`~aiohttp.BodyPartReader.decode` as a synchronous method
+ for backward compatibility. The method was inadvertently changed to async
+ in 3.13.3 as part of the decompression bomb security fix. A new
+ :py:meth:`~aiohttp.BodyPartReader.decode_iter` method is now available
+ for non-blocking decompression of large payloads using an async generator.
+ Internal aiohttp code uses the async variant to maintain security protections.
+
+ Changed multipart processing chunk sizes from 64 KiB to 256KiB, to better
+ match aiohttp internals
+ -- by :user:`bdraco` and :user:`Dreamsorcerer`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11898`.
+
+
+
+- Fixed false-positive :py:class:`DeprecationWarning` for passing ``enable_cleanup_closed=True`` to :py:class:`~aiohttp.TCPConnector` specifically on Python 3.12.7.
+ -- by :user:`Robsdedude`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11972`.
+
+
+
+- Fixed _sendfile_fallback over-reading beyond requested count -- by :user:`bysiber`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12096`.
+
+
+
+- Fixed digest auth dropping challenge fields with empty string values -- by :user:`bysiber`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12097`.
+
+
+
+- ``ClientConnectorCertificateError.os_error`` no longer raises :exc:`AttributeError`
+ -- by :user:`themylogin`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12136`.
+
+
+
+- Adjusted pure-Python request header value validation to align with RFC 9110 control-character handling, while preserving lax response parser behavior, and added regression tests for Host/header control-character cases.
+ -- by :user:`rodrigobnogueira`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12231`.
+
+
+
+- Rejected duplicate singleton headers (``Host``, ``Content-Type``,
+ ``Content-Length``, etc.) in the C extension HTTP parser to match
+ the pure Python parser behaviour, preventing potential host-based
+ access control bypasses via parser differentials
+ -- by :user:`rodrigobnogueira`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12240`.
+
+
+
+- Aligned the pure-Python HTTP request parser with the C parser by splitting
+ comma-separated and repeated ``Connection`` header values for keep-alive,
+ close, and upgrade handling -- by :user:`rodrigobnogueira`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12249`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Documented :exc:`asyncio.TimeoutError` for ``WebSocketResponse.receive()``
+ and related methods -- by :user:`veeceey`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12042`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Upgraded llhttp to 3.9.1 -- by :user:`Dreamsorcerer`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12069`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The benchmark CI job now runs only in the upstream repository -- by :user:`Cycloctane`.
+
+ It used to always fail in forks, which this change fixed.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11737`.
+
+
+
+- Fixed flaky performance tests by using appropriate fixed thresholds that account for CI variability -- by :user:`rodrigobnogueira`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`11992`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed ``test_invalid_idna`` to work with ``idna`` 3.11 by using an invalid character (``\u0080``) that is rejected by ``yarl`` during URL construction -- by :user:`rodrigobnogueira`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12027`.
+
+
+
+- Fixed race condition in ``test_data_file`` on Python 3.14 free-threaded builds -- by :user:`rodrigobnogueira`.
+
+
+ *Related issues and pull requests on GitHub:*
+ :issue:`12170`.
+
+
+
+
+----
+
+
3.13.3 (2026-01-03)
===================
diff --git a/CHANGES/10596.bugfix.rst b/CHANGES/10596.bugfix.rst
deleted file mode 100644
index f96a0215de3..00000000000
--- a/CHANGES/10596.bugfix.rst
+++ /dev/null
@@ -1,4 +0,0 @@
-Fixed server hanging indefinitely when chunked transfer encoding chunk-size
-does not match actual data length. The server now raises
-``TransferEncodingError`` instead of waiting forever for data that will
-never arrive -- by :user:`Fridayai700`.
diff --git a/CHANGES/10795.doc.rst b/CHANGES/10795.doc.rst
deleted file mode 100644
index 91094f70c13..00000000000
--- a/CHANGES/10795.doc.rst
+++ /dev/null
@@ -1,4 +0,0 @@
-Replaced the deprecated ``ujson`` library with ``orjson`` in the
-client quickstart documentation. ``ujson`` has been put into
-maintenance-only mode; ``orjson`` is the recommended alternative.
--- by :user:`indoor47`
diff --git a/CHANGES/11283.bugfix.rst b/CHANGES/11283.bugfix.rst
deleted file mode 100644
index 966b9afbd00..00000000000
--- a/CHANGES/11283.bugfix.rst
+++ /dev/null
@@ -1,3 +0,0 @@
-Fixed access log timestamps ignoring daylight saving time (DST) changes. The
-previous implementation used :py:data:`time.timezone` which is a constant and
-does not reflect DST transitions -- by :user:`nightcityblade`.
diff --git a/CHANGES/11701.bugfix.rst b/CHANGES/11701.bugfix.rst
deleted file mode 100644
index adf1e529d57..00000000000
--- a/CHANGES/11701.bugfix.rst
+++ /dev/null
@@ -1,3 +0,0 @@
-Fixed ``RuntimeError: An event loop is running`` error when using ``aiohttp.GunicornWebWorker``
-or ``aiohttp.GunicornUVLoopWebWorker`` on Python >=3.14.
--- by :user:`Tasssadar`.
diff --git a/CHANGES/11737.contrib.rst b/CHANGES/11737.contrib.rst
deleted file mode 100644
index 2b793f41d6c..00000000000
--- a/CHANGES/11737.contrib.rst
+++ /dev/null
@@ -1,3 +0,0 @@
-The benchmark CI job now runs only in the upstream repository -- by :user:`Cycloctane`.
-
-It used to always fail in forks, which this change fixed.
diff --git a/CHANGES/11859.bugfix.rst b/CHANGES/11859.bugfix.rst
deleted file mode 100644
index 7e32de1a4fa..00000000000
--- a/CHANGES/11859.bugfix.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fixed :exc:`ValueError` when creating a TLS connection with ``ClientTimeout(total=0)`` by converting ``0`` to ``None`` before passing to ``ssl_handshake_timeout`` in :py:meth:`asyncio.loop.start_tls` -- by :user:`veeceey`.
diff --git a/CHANGES/11898.bugfix.rst b/CHANGES/11898.bugfix.rst
deleted file mode 100644
index 2a2e41d037b..00000000000
--- a/CHANGES/11898.bugfix.rst
+++ /dev/null
@@ -1,10 +0,0 @@
-Restored :py:meth:`~aiohttp.BodyPartReader.decode` as a synchronous method
-for backward compatibility. The method was inadvertently changed to async
-in 3.13.3 as part of the decompression bomb security fix. A new
-:py:meth:`~aiohttp.BodyPartReader.decode_iter` method is now available
-for non-blocking decompression of large payloads using an async generator.
-Internal aiohttp code uses the async variant to maintain security protections.
-
-Changed multipart processing chunk sizes from 64 KiB to 256KiB, to better
-match aiohttp internals
--- by :user:`bdraco` and :user:`Dreamsorcerer`.
diff --git a/CHANGES/11955.feature.rst b/CHANGES/11955.feature.rst
deleted file mode 100644
index eaea1016e60..00000000000
--- a/CHANGES/11955.feature.rst
+++ /dev/null
@@ -1 +0,0 @@
-Added ``max_headers`` parameter to limit the number of headers that should be read from a response -- by :user:`Dreamsorcerer`.
diff --git a/CHANGES/11972.bugfix.rst b/CHANGES/11972.bugfix.rst
deleted file mode 100644
index 8a6c2f56f28..00000000000
--- a/CHANGES/11972.bugfix.rst
+++ /dev/null
@@ -1,2 +0,0 @@
-Fixed false-positive :py:class:`DeprecationWarning` for passing ``enable_cleanup_closed=True`` to :py:class:`~aiohttp.TCPConnector` specifically on Python 3.12.7.
--- by :user:`Robsdedude`.
diff --git a/CHANGES/11992.contrib.rst b/CHANGES/11992.contrib.rst
deleted file mode 100644
index c56c2ab7059..00000000000
--- a/CHANGES/11992.contrib.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fixed flaky performance tests by using appropriate fixed thresholds that account for CI variability -- by :user:`rodrigobnogueira`.
diff --git a/CHANGES/12027.misc.rst b/CHANGES/12027.misc.rst
deleted file mode 100644
index 0b14de408a9..00000000000
--- a/CHANGES/12027.misc.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fixed ``test_invalid_idna`` to work with ``idna`` 3.11 by using an invalid character (``\u0080``) that is rejected by ``yarl`` during URL construction -- by :user:`rodrigobnogueira`.
diff --git a/CHANGES/12042.doc.rst b/CHANGES/12042.doc.rst
deleted file mode 100644
index 50c30f28cdf..00000000000
--- a/CHANGES/12042.doc.rst
+++ /dev/null
@@ -1,2 +0,0 @@
-Documented :exc:`asyncio.TimeoutError` for ``WebSocketResponse.receive()``
-and related methods -- by :user:`veeceey`.
diff --git a/CHANGES/12069.packaging.rst b/CHANGES/12069.packaging.rst
deleted file mode 100644
index a5c79a7cebe..00000000000
--- a/CHANGES/12069.packaging.rst
+++ /dev/null
@@ -1 +0,0 @@
-Upgraded llhttp to 3.9.1 -- by :user:`Dreamsorcerer`.
diff --git a/CHANGES/12096.bugfix.rst b/CHANGES/12096.bugfix.rst
deleted file mode 100644
index 945b452309a..00000000000
--- a/CHANGES/12096.bugfix.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fixed _sendfile_fallback over-reading beyond requested count -- by :user:`bysiber`.
diff --git a/CHANGES/12097.bugfix.rst b/CHANGES/12097.bugfix.rst
deleted file mode 100644
index 1ea88a8c087..00000000000
--- a/CHANGES/12097.bugfix.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fixed digest auth dropping challenge fields with empty string values -- by :user:`bysiber`.
diff --git a/CHANGES/12106.feature.rst b/CHANGES/12106.feature.rst
deleted file mode 100644
index daa9088eed6..00000000000
--- a/CHANGES/12106.feature.rst
+++ /dev/null
@@ -1 +0,0 @@
-Added a ``dns_cache_max_size`` parameter to ``TCPConnector`` to limit the size of the cache -- by :user:`Dreamsorcerer`.
diff --git a/CHANGES/12136.bugfix.rst b/CHANGES/12136.bugfix.rst
deleted file mode 100644
index 14ad7edf326..00000000000
--- a/CHANGES/12136.bugfix.rst
+++ /dev/null
@@ -1,2 +0,0 @@
-``ClientConnectorCertificateError.os_error`` no longer raises :exc:`AttributeError`
--- by :user:`themylogin`.
diff --git a/CHANGES/12170.misc.rst b/CHANGES/12170.misc.rst
deleted file mode 100644
index 23c63db50e9..00000000000
--- a/CHANGES/12170.misc.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fixed race condition in ``test_data_file`` on Python 3.14 free-threaded builds -- by :user:`rodrigobnogueira`.
diff --git a/CHANGES/12231.bugfix.rst b/CHANGES/12231.bugfix.rst
deleted file mode 100644
index cd74bd1e7e5..00000000000
--- a/CHANGES/12231.bugfix.rst
+++ /dev/null
@@ -1,2 +0,0 @@
-Adjusted pure-Python request header value validation to align with RFC 9110 control-character handling, while preserving lax response parser behavior, and added regression tests for Host/header control-character cases.
--- by :user:`rodrigobnogueira`.
diff --git a/CHANGES/12240.bugfix.rst b/CHANGES/12240.bugfix.rst
deleted file mode 100644
index 49508b3f595..00000000000
--- a/CHANGES/12240.bugfix.rst
+++ /dev/null
@@ -1,5 +0,0 @@
-Rejected duplicate singleton headers (``Host``, ``Content-Type``,
-``Content-Length``, etc.) in the C extension HTTP parser to match
-the pure Python parser behavior, preventing potential host-based
-access control bypasses via parser differentials
--- by :user:`rodrigobnogueira`.
diff --git a/CHANGES/12249.bugfix.rst b/CHANGES/12249.bugfix.rst
deleted file mode 100644
index 42d90314110..00000000000
--- a/CHANGES/12249.bugfix.rst
+++ /dev/null
@@ -1,3 +0,0 @@
-Aligned the pure-Python HTTP request parser with the C parser by splitting
-comma-separated and repeated ``Connection`` header values for keep-alive,
-close, and upgrade handling -- by :user:`rodrigobnogueira`.
diff --git a/aiohttp/__init__.py b/aiohttp/__init__.py
index 357baf019de..1a22b728e38 100644
--- a/aiohttp/__init__.py
+++ b/aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.13.3"
+__version__ = "3.13.4"
from typing import TYPE_CHECKING, Tuple
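One of the release-note entries above (issue 11859) describes converting a ``ClientTimeout(total=0)`` to ``None`` before handing it to ``ssl_handshake_timeout`` in :py:meth:`asyncio.loop.start_tls`, which raises :exc:`ValueError` for a zero timeout. A hedged sketch of that conversion, with an illustrative helper name that is not aiohttp's actual API:

```python
def handshake_timeout(total):
    """Map a 0/None 'unlimited' timeout to the None asyncio expects.

    asyncio's start_tls() requires ssl_handshake_timeout to be a
    positive number or None, so a 0 sentinel must become None.
    """
    return total if total else None

assert handshake_timeout(0) is None
assert handshake_timeout(None) is None
assert handshake_timeout(30) == 30
```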