Remove list.__add__ overloads. #14282


Draft

randolf-scholz wants to merge 1 commit into python:main from randolf-scholz:polymorphic_overload_test

Conversation

@randolf-scholz (Contributor) commented Jun 16, 2025 (edited)
See #14283.

The overloads on list.__add__ lead to divergence of mypy and pyright. Code sample in pyright playground, https://mypy-play.net/?mypy=latest&python=3.12&gist=abf6a8834020af17a16bd8cfb44b2f10

from typing import Any, overload

class ListA[T]:  # emulates builtins list
    @overload
    def __add__(self, other: "ListA[T]", /) -> "ListA[T]": return ListA()
    @overload
    def __add__[S](self, other: "ListA[S]", /) -> "ListA[T | S]": return ListA()

class ListB[T]:  # without overloads
    def __add__[S](self, other: "ListB[S]", /) -> "ListB[T | S]": return ListB()

#                                            mypy              | pyright
reveal_type(list[str]()  + list[str]()  )  # list[str]         | list[str]          ✅
reveal_type(list[str]()  + list[int]()  )  # list[str | int]   | list[str | int]    ✅
reveal_type(list[str]()  + list[Any]()  )  # list[Any]         | list[str]          ❌
reveal_type(ListA[str]() + ListA[str]() )  # ListA[str]        | ListA[str]         ✅
reveal_type(ListA[str]() + ListA[int]() )  # ListA[str | int]  | ListA[str | int]   ✅
reveal_type(ListA[str]() + ListA[Any]() )  # ListA[Any]        | ListA[str]         ❌
reveal_type(ListB[str]() + ListB[str]() )  # ListB[str]        | ListB[str]         ✅
reveal_type(ListB[str]() + ListB[int]() )  # ListB[str | int]  | ListB[str | int]   ✅
reveal_type(ListB[str]() + ListB[Any]() )  # ListB[str | Any]  | ListB[str | Any]   ✅

A comment added 3 years ago in #8293 states that

# Overloading looks unnecessary, but is needed to work around complex mypy problems

I want to see the impact on mypy primer, and whether this comment is still valid given that mypy has had several major releases since.
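For reference, a rough sketch of what the stub could look like with the overloads collapsed into the single signature that mirrors ListB above (the actual diff in this PR may differ; _T and _S are the TypeVars already used in the typeshed stub):

def __add__(self, value: list[_S], /) -> list[_S | _T]: ...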

@randolf-scholz changed the title do not merge → Check usefulness of list.__add__ overloads. Jun 16, 2025
@github-actions (GitHub Actions) commented:

Diff from mypy_primer, showing the effect of this PR on open source code:

meson (https://github.com/mesonbuild/meson)
+ mesonbuild/scripts/gettext.py:54:70: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/utils/universal.py:2365:87: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/cmake/interpreter.py:1061:69: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/cmake/interpreter.py:1199:60: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ run_mypy.py:154:62: error: Unsupported operand types for + ("list[str | Any]" and "list[str]")  [operator]
+ mesonbuild/modules/gnome.py:1548:23: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/gnome.py:1565:77: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/depfixer.py:436:66: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:62:40: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:73:40: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:83:35: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:120:35: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:128:35: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:134:59: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:140:64: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:147:64: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:156:47: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:172:40: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:453:114: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:474:126: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:872:86: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:916:173: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/mtest.py:2250:47: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

pydantic (https://github.com/pydantic/pydantic)
- pydantic/aliases.py:29: error: Incompatible types in assignment (expression has type "list[str]", variable has type "list[int | str]")  [assignment]
- pydantic/aliases.py:29: note: "list" is invariant -- see https://mypy.readthedocs.io/en/stable/common_issues.html#variance
- pydantic/aliases.py:29: note: Consider using "Sequence" instead, which is covariant
- pydantic/aliases.py:29: error: Argument 1 to "list" has incompatible type "tuple[str | int, ...]"; expected "Iterable[str]"  [arg-type]
+ pydantic/aliases.py:29: error: Argument 1 to "list" has incompatible type "tuple[str | int, ...]"; expected "Iterable[int]"  [arg-type]

cloud-init (https://github.com/canonical/cloud-init)
+ cloudinit/config/cc_mounts.py:420: error: Incompatible return value type (got "list[list[str | Any | None]]", expected "list[list[str]]")  [return-value]

zulip (https://github.com/zulip/zulip)
+ zerver/lib/export.py:1703: error: Unsupported operand types for + ("list[dict[str, Any]]" and "list[dict[str, Any]]")  [operator]

pandas (https://github.com/pandas-dev/pandas)
+ pandas/io/stata.py:1458: error: List comprehension has incompatible type List[object]; expected List[str | dtype[Any]]  [misc]
+ pandas/core/frame.py:11090: error: Incompatible return value type (got "Any | DataFrame | Series", expected "DataFrame")  [return-value]
+ pandas/core/groupby/ops.py:800: error: No overload variant of "unique" matches argument types "list[int]", "bool"  [call-overload]
+ pandas/core/groupby/ops.py:800: note: Possible overload variants:
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> ndarray[tuple[Any, ...], dtype[_ScalarT]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> ndarray[tuple[Any, ...], dtype[Any]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/tests/strings/conftest.py:6: error: Need type annotation for "_any_string_method"  [var-annotated]
+ pandas/tests/groupby/conftest.py:130: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ pandas/tests/frame/test_constructors.py:2318: error: Unsupported operand types for + ("list[ExtensionDtype | str | dtype[Any] | type[object] | type[str] | type[complex] | type[bool]]" and "list[ExtensionDtype | str | dtype[Any] | type[str] | type[complex] | type[bool] | type[object]]")  [operator]
+ pandas/tests/frame/test_constructors.py:2328: error: Unsupported operand types for + ("list[ExtensionDtype | str | dtype[Any] | type[object] | type[str] | type[complex] | type[bool]]" and "list[ExtensionDtype | str | dtype[Any] | type[str] | type[complex] | type[bool] | type[object]]")  [operator]
+ pandas/tests/arrays/test_datetimes.py:69: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ pandas/conftest.py:1638: error: Unsupported operand types for + ("list[ExtensionDtype | str | dtype[Any] | type[object] | type[str] | type[complex] | type[bool]]" and "list[ExtensionDtype | str | dtype[Any] | type[str] | type[complex] | type[bool] | type[object]]")  [operator]

freqtrade (https://github.com/freqtrade/freqtrade)
+ freqtrade/exchange/exchange_utils.py:119: error: Incompatible types (expression has type "list[dict[str, Any] | dict[str, str]]", TypedDict item "trade_modes" has type "list[TradeModeType]")  [typeddict-item]
+ freqtrade/data/entryexitanalysis.py:287: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ freqtrade/data/entryexitanalysis.py:292: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ freqtrade/data/entryexitanalysis.py:298: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ freqtrade/data/entryexitanalysis.py:299: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

spark (https://github.com/apache/spark)
+ python/pyspark/pandas/groupby.py:3682: error: Unsupported operand types for + ("list[Series[Any]]" and "list[Series[Any]]")  [operator]

mkosi (https://github.com/systemd/mkosi)
+ mkosi/qemu.py:447:22: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

paasta (https://github.com/yelp/paasta)
+ paasta_tools/config_utils.py:101: error: Unsupported operand types for + ("List[str]" and "List[str]")  [operator]

mypy (https://github.com/python/mypy)
+ mypyc/crash.py:29: error: Unsupported operand types for + ("list[FrameSummary]" and "list[FrameSummary]")  [operator]
+ mypyc/crash.py:29: note: See https://mypy.rtfd.io/en/stable/_refs.html#code-operator for more info
+ mypy/strconv.py:454: error: Unsupported operand types for + ("list[Any]" and "list[Union[str, tuple[str, list[Any]]]]")  [operator]
+ mypy/errors.py:1285: error: Unsupported operand types for + ("list[FrameSummary]" and "StackSummary")  [operator]
+ mypy/fastparse.py:1612: error: Unsupported operand types for + ("list[expr]" and "list[expr]")  [operator]
+ mypy/test/testcmdline.py:77: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

core (https://github.com/home-assistant/core)
+ homeassistant/components/monzo/sensor.py:91: error: Unsupported operand types for + ("list[MonzoSensor]" and "list[MonzoSensor]")  [operator]

materialize (https://github.com/MaterializeInc/materialize)
+ misc/python/materialize/cli/ci_upload_heap_profiles.py:42: error: Unsupported operand types for + ("list[str | Any]" and "list[str]")  [operator]

spack (https://github.com/spack/spack)
+ lib/spack/spack/environment/environment.py:1525: error: Incompatible return value type (got "tuple[list[Any], list[Any], list[tuple[Any, Any] | tuple[Any, None]]]", expected "tuple[list[Spec], list[Spec], list[tuple[Spec, Spec]]]")  [return-value]

prefect (https://github.com/PrefectHQ/prefect)
- src/prefect/utilities/callables.py:579: error: Incompatible types in assignment (expression has type "list[None]", variable has type "list[Optional[expr]]")  [assignment]
- src/prefect/utilities/callables.py:579: note: "list" is invariant -- see https://mypy.readthedocs.io/en/stable/common_issues.html#variance
- src/prefect/utilities/callables.py:579: note: Consider using "Sequence" instead, which is covariant

tornado (https://github.com/tornadoweb/tornado)
+ tornado/netutil.py:158: error: Incompatible types in assignment (expression has type "tuple[Union[int, Any], ...]", variable has type "Union[tuple[str, int], tuple[str, int, int, int], tuple[int, bytes]]")  [assignment]
+ tornado/autoreload.py:234: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

scipy (https://github.com/scipy/scipy)
+ scipy/fft/_pocketfft/tests/test_basic.py:473: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fft/_pocketfft/tests/test_basic.py:483: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fft/_pocketfft/tests/test_basic.py:502: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fft/_pocketfft/tests/test_basic.py:512: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:429: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:439: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:458: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:468: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/optimize/tests/test_optimize.py:1289: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

graphql-core (https://github.com/graphql-python/graphql-core)
+ src/graphql/type/validate.py:409: error: Unsupported operand types for + ("list[NamedTypeNode]" and "list[NamedTypeNode]")  [operator]
+ src/graphql/validation/rules/overlapping_fields_can_be_merged.py:89: error: Unsupported operand types for + ("list[FieldNode]" and "list[FieldNode]")  [operator]

@randolf-scholz changed the title Check usefulness of list.__add__ overloads. → Remove list.__add__ overloads. Jun 16, 2025
@randolf-scholz (Contributor, Author) commented Jun 16, 2025 (edited)

There must be a strange bug in mypy. Take this example from the mypy primer above.

If I change the offending line (Unsupported operand types for + ("list[FrameSummary]" and "list[FrameSummary]")) from

for s in traceback.format_list(tb + tb2):

to

dummy = tb + tb2
for s in traceback.format_list(dummy):

Then the file passes without issue (using python -m mypy.stubtest --custom-typeshed-dir=../typeshed/ tmp). For reference, these are the overloads in question:

# Overloading looks unnecessary, but is needed to work around complex mypy problems
@overload
def __add__(self, value: list[_T], /) -> list[_T]: ...
@overload
def __add__(self, value: list[_S], /) -> list[_S | _T]: ...
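
As a minimal, self-contained sketch of the workaround described above (the variable names and the use of traceback.extract_stack() are illustrative assumptions, not the actual code from mypyc/crash.py):

import traceback
from traceback import FrameSummary

# Hypothetical setup; in mypyc/crash.py these lists come from real tracebacks.
tb: list[FrameSummary] = list(traceback.extract_stack())
tb2: list[FrameSummary] = list(traceback.extract_stack())

# With the overloads removed, mypy flags the inline addition ...
for s in traceback.format_list(tb + tb2):
    print(s, end="")

# ... but accepts the same addition once it is bound to a temporary first.
dummy = tb + tb2
for s in traceback.format_list(dummy):
    print(s, end="")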
@jorenham (Contributor) commented:
maybe something like this could work?

Suggested change:
- def __add__(self, value: list[_S], /) -> list[_S | _T]: ...
+ def __add__(self: list[_S], value: list[_S], /) -> list[_S]: ...

@randolf-scholz (Contributor, Author) commented Jun 26, 2025 (edited)

With this, wouldn't we get list[str] + list[int] = list[object] when checking with mypy, rather than list[str | int]?

I'd prefer python/mypy#19304 be fixed; then I think most of the mypy-primer issues here would go away. If you have seen this before and know an easier way to reproduce, please comment.

@jorenham (Contributor) commented:

> With this, wouldn't we get list[str] + list[int] = list[object] when checking with mypy, rather than list[str | int]?
> I'd prefer python/mypy#19304 be fixed; then I think most of the mypy-primer issues here would go away. If you have seen this before and know an easier way to reproduce, please comment.

That's what I indeed expect here, given mypy's join-vs-union behavior. But if that's how they decide to behave, then that's up to them, afaik. At least it won't be rejected this way (that's the hope).
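
For illustration, a small sketch of the scenario being discussed (hypothetical code, not the typeshed stub), using a standalone generic class shaped like the suggested signature:

from typing import Generic, TypeVar

_T = TypeVar("_T")
_S = TypeVar("_S")

class MyList(Generic[_T]):
    # Same shape as the suggested stub signature: self and value share
    # the method-scoped type variable _S.
    def __add__(self: "MyList[_S]", value: "MyList[_S]", /) -> "MyList[_S]":
        return MyList()

# Per the discussion above, mypy is expected to solve _S via a join,
# giving MyList[object] here rather than the union MyList[str | int]
# that the current overloads (and pyright) produce.
reveal_type(MyList[str]() + MyList[int]())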

@randolf-scholz (Contributor, Author) commented:

Well, my goal with this PR was to reduce type-checker divergence, not to cement it.

@jorenham (Contributor) commented:
fair enough

Reviewers: @jorenham (left review comments)
2 participants: @randolf-scholz, @jorenham
