This PEP proposes extending list, set, and dictionary comprehensions, as well as generator expressions, to allow unpacking notation (* and **) at the start of the expression, providing a concise way of combining an arbitrary number of iterables into one list or set or generator, or an arbitrary number of dictionaries into one dictionary, for example:
[*it for it in its]    # list with the concatenation of iterables in 'its'
{*it for it in its}    # set with the union of iterables in 'its'
{**d for d in dicts}   # dict with the combination of dicts in 'dicts'
(*it for it in its)    # generator of the concatenation of iterables in 'its'
Extended unpacking notation (* and **) from PEP 448 makes it easy to combine a few iterables or dictionaries:
[*it1, *it2, *it3]           # list with the concatenation of three iterables
{*it1, *it2, *it3}           # set with the union of three iterables
{**dict1, **dict2, **dict3}  # dict with the combination of three dicts
But if we want to similarly combine an arbitrary number of iterables, we cannot use unpacking in this same way.
That said, we do have a few options for combining multiple iterables. We could, for example, use explicit looping structures and built-in means of combination:
new_list = []
for it in its:
    new_list.extend(it)

new_set = set()
for it in its:
    new_set.update(it)

new_dict = {}
for d in dicts:
    new_dict.update(d)

def new_generator():
    for it in its:
        yield from it
Or, we could be more concise by using a comprehension with two loops:
[x for it in its for x in it]
{x for it in its for x in it}
{key: value for d in dicts for key, value in d.items()}
(x for it in its for x in it)
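As a concrete illustration (using hypothetical example data standing in for its and dicts), the double-loop forms behave as follows:

```python
# Hypothetical example data standing in for 'its' and 'dicts' above
its = [[1, 2], [], [3, 4]]
dicts = [{"a": 1}, {"a": 2, "b": 3}]

# Note the clause order: the outer loop over 'its' comes first
flat_list = [x for it in its for x in it]
print(flat_list)  # [1, 2, 3, 4]

combined = {key: value for d in dicts for key, value in d.items()}
print(combined)  # {'a': 2, 'b': 3}
```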
Or, we could use itertools.chain or itertools.chain.from_iterable:
list(itertools.chain(*its))
set(itertools.chain(*its))
dict(itertools.chain(*(d.items() for d in dicts)))
itertools.chain(*its)

list(itertools.chain.from_iterable(its))
set(itertools.chain.from_iterable(its))
dict(itertools.chain.from_iterable(d.items() for d in dicts))
itertools.chain.from_iterable(its)
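For example (again with hypothetical example data), the two chain-based spellings produce the same result:

```python
import itertools

its = [[1, 2], [], [3, 4]]  # hypothetical example data

a = list(itertools.chain(*its))
b = list(itertools.chain.from_iterable(its))
print(a)       # [1, 2, 3, 4]
print(a == b)  # True
```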
Or, for all but the generator, we could use functools.reduce:
functools.reduce(operator.iconcat, its, (new_list := []))
functools.reduce(operator.ior, its, (new_set := set()))
functools.reduce(operator.ior, dicts, (new_dict := {}))
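These reduce-based forms can be verified today; note that operator.iconcat and operator.ior mutate the initial collection in place, which is why the walrus binding above captures the result. A small sanity check, with hypothetical example data (and without the walrus operator):

```python
import functools
import operator

its = [[1, 2], [3]]           # hypothetical lists to concatenate
sets = [{1, 2}, {2, 3}]       # hypothetical sets to union
dicts = [{"a": 1}, {"b": 2}]  # hypothetical dicts to combine

new_list = functools.reduce(operator.iconcat, its, [])
new_set = functools.reduce(operator.ior, sets, set())
new_dict = functools.reduce(operator.ior, dicts, {})

print(new_list)  # [1, 2, 3]
print(new_set)   # {1, 2, 3}
print(new_dict)  # {'a': 1, 'b': 2}
```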
This PEP proposes allowing unpacking operations to be used in comprehensions as an additional alternative:
[*it for it in its]    # list with the concatenation of iterables in 'its'
{*it for it in its}    # set with the union of iterables in 'its'
{**d for d in dicts}   # dict with the combination of dicts in 'dicts'
(*it for it in its)    # generator of the concatenation of iterables in 'its'
This proposal also extends to asynchronous comprehensions and generator expressions, such that, for example, (*ait async for ait in aits()) is equivalent to (x async for ait in aits() for x in ait).
Combining multiple iterable objects together into a single object is a common task. For example, one StackOverflow post asking about flattening a list of lists has been viewed 4.6 million times, and there are several examples of code from the standard library that perform this operation (see Code Examples). While Python provides a means of combining a small, known number of iterables using extended unpacking from PEP 448, no comparable syntax currently exists for combining an arbitrary number of iterables.
This proposal represents a natural extension of the language, paralleling existing syntactic structures: where [x, y, z] creates a list from a fixed number of values, [item for item in items] creates a list from an arbitrary number of values; this proposal extends that notion to the construction of lists that involve unpacking, making [*item for item in items] analogous to [*x, *y, *z].
We expect this syntax to be intuitive and familiar to programmers already comfortable with both comprehensions and unpacking notation. This proposal was motivated in part by a written exam in a Python programming class, where several students used the proposed notation (specifically the set version) in their solutions, assuming that it already existed in Python. This suggests that the notation represents a logical, consistent extension to Python’s existing syntax. By contrast, the existing double-loop version [x for it in its for x in it] is one that students often get wrong, the natural impulse for many students being to reverse the order of the for clauses. The intuitiveness of the proposed syntax is further supported by the comment section of a Reddit post made following the initial publication of this PEP, which demonstrates support from a broader community.
The grammar should be changed to allow the expression in list/set comprehensions and generator expressions to be preceded by a *, and to allow an alternative form of dictionary comprehension in which a double-starred expression can be used in place of a key: value pair.
This can be accomplished by updating the listcomp and setcomp rules to use star_named_expression instead of named_expression:
listcomp[expr_ty]:
    | '[' a=star_named_expression b=for_if_clauses ']'
setcomp[expr_ty]:
    | '{' a=star_named_expression b=for_if_clauses '}'

The rule for genexp would similarly need to be modified to allow a starred_expression:
genexp[expr_ty]:
    | '(' a=(assignment_expression | expression !':=' | starred_expression) b=for_if_clauses ')'

The rule for dictionary comprehensions would need to be adjusted as well, to allow for this new form:
dictcomp[expr_ty]:
    | '{' a=double_starred_kvpair b=for_if_clauses '}'

No change should be made to the way that argument unpacking is handled in function calls, i.e., the general rule that generator expressions provided as the sole argument to functions do not require additional redundant parentheses should be retained. Note that this implies that, for example, f(*x for x in it) is equivalent to f((*x for x in it)) (see Starred Generators as Function Arguments for more discussion).
* and ** should only be allowed at the top-most level of the expression in the comprehension (see Further Generalizing Unpacking Operators for more discussion).
The meaning of a starred expression in a list comprehension [*expr for x in it] is to treat each expression as an iterable, and concatenate them, in the same way as if they were explicitly listed via [*expr1, *expr2, ...]. Similarly, {*expr for x in it} forms a set union, as if the expressions were explicitly listed via {*expr1, *expr2, ...}; and {**expr for x in it} combines dictionaries, as if the expressions were explicitly listed via {**expr1, **expr2, ...}. These operations should retain all of the equivalent semantics for combining collections in this way (including, for example, later values replacing earlier ones in the case of a duplicated key when combining dictionaries).
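The duplicate-key behavior mirrors what PEP 448 unpacking already does today when the dictionaries are listed explicitly:

```python
d1 = {"a": 1, "b": 2}
d2 = {"b": 3, "c": 4}

merged = {**d1, **d2}
print(merged)  # {'a': 1, 'b': 3, 'c': 4} -- the later value for 'b' wins
```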
Said another way, the objects created by the following comprehensions:
new_list = [*expr for x in its]
new_set = {*expr for x in its}
new_dict = {**expr for d in dicts}
should be equivalent to the objects created by the following pieces of code, respectively:
new_list = []
for x in its:
    new_list.extend(expr)

new_set = set()
for x in its:
    new_set.update(expr)

new_dict = {}
for d in dicts:
    new_dict.update(expr)
Generator expressions using the unpacking syntax should form new generators producing values from the concatenation of the iterables given by the expressions. Specifically, the behavior is defined to be equivalent to the following (though without defining or referencing the looping variable i):
# equivalent to g = (*expr for x in it)
def generator():
    for x in it:
        for i in expr:
            yield i

g = generator()
# equivalent to g = (*expr async for x in ait())
async def generator():
    async for x in ait():
        for i in expr:
            yield i

g = generator()
See Alternative Generator Expression Semantics for more discussion of these semantics.
Note that this proposal does not suggest changing the order of evaluation of the various pieces of the comprehension, nor any rules about scoping. This is particularly relevant for generator expressions that make use of the “walrus operator” := from PEP 572, which, when used in a comprehension or a generator expression, performs its variable binding in the containing scope rather than locally to the comprehension.
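This containing-scope binding can already be observed with existing comprehension syntax, which the proposal leaves unchanged; a minimal example:

```python
squares = [y := x * x for x in range(4)]
print(squares)  # [0, 1, 4, 9]

# y was bound in the containing scope by the walrus operator,
# so it remains accessible after the comprehension finishes:
print(y)  # 9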
As an example, consider the generator that results from evaluating the expression (*(y := [i, i+1]) for i in (0, 2, 4)). This is approximately equivalent to the following generator, except that in its generator expression form, y will be bound in the containing scope instead of locally:
def generator():
    for i in (0, 2, 4):
        for j in (y := [i, i+1]):
            yield j
In this example, the subexpression (y := [i, i+1]) is evaluated exactly three times before the generator is exhausted: just after assigning i in the comprehension to 0, 2, and 4, respectively. Thus, y (in the containing scope) will be modified at those points in time:
>>> g = (*(y := [i, i+1]) for i in (0, 2, 4))
>>> y
Traceback (most recent call last):
  File "<python-input-1>", line 1, in <module>
    y
NameError: name 'y' is not defined
>>> next(g)
0
>>> y
[0, 1]
>>> next(g)
1
>>> y
[0, 1]
>>> next(g)
2
>>> y
[2, 3]
Currently, the proposed syntax generates a SyntaxError. Allowing these forms to be recognized as syntactically valid requires adjusting the grammar rules for invalid_comprehension and invalid_dict_comprehension to allow the use of * and **, respectively.
Additional specific error messages should be provided in at least the following cases:
Using ** in a list comprehension or generator expression should report that dictionary unpacking cannot be used in those structures, for example:

>>> [**x for x in y]
  File "<stdin>", line 1
    [**x for x in y]
     ^^^
SyntaxError: cannot use dict unpacking in list comprehension

>>> (**x for x in y)
  File "<stdin>", line 1
    (**x for x in y)
     ^^^
SyntaxError: cannot use dict unpacking in generator expression
The current error messages for attempting to use * on a dictionary key/value should be retained, but similar messages should be reported when attempting to use ** unpacking on a dictionary key or value, for example:

>>> {*k: v for k, v in items}
  File "<stdin>", line 1
    {*k: v for k, v in items}
     ^^
SyntaxError: cannot use a starred expression in a dictionary key

>>> {k: *v for k, v in items}
  File "<stdin>", line 1
    {k: *v for k, v in items}
        ^^
SyntaxError: cannot use a starred expression in a dictionary value

>>> {**k: v for k, v in items}
  File "<stdin>", line 1
    {**k: v for k, v in items}
     ^^^
SyntaxError: cannot use dict unpacking in a dictionary key

>>> {k: **v for k, v in items}
  File "<stdin>", line 1
    {k: **v for k, v in items}
        ^^^
SyntaxError: cannot use dict unpacking in a dictionary value
Unpacking applied to all or part of a conditional expression should produce messages like the following:

>>> [*x if x else y]
  File "<stdin>", line 1
    [*x if x else y]
     ^^^^^^^^^^^^^^
SyntaxError: invalid starred expression. Did you forget to wrap the conditional expression in parentheses?

>>> {**x if x else y}
  File "<stdin>", line 1
    {**x if x else y}
     ^^^^^^^^^^^^^^^
SyntaxError: invalid double starred expression. Did you forget to wrap the conditional expression in parentheses?

>>> [x if x else *y]
  File "<stdin>", line 1
    [x if x else *y]
                 ^
SyntaxError: cannot unpack only part of a conditional expression

>>> {x if x else **y}
  File "<stdin>", line 1
    {x if x else **y}
                 ^^
SyntaxError: cannot use dict unpacking on only part of a conditional expression
The reference implementation implements this functionality, including draft documentation and additional test cases.
The behavior of all comprehensions that are currently syntactically valid would be unaffected by this change, so we do not anticipate much in the way of backwards-incompatibility concerns. In principle, this change would only affect code that relied on the fact that attempting to use unpacking operations in comprehensions would raise a SyntaxError, or that relied on the particular phrasing of any of the old error messages being replaced, which we expect to be rare.
One related concern is that a hypothetical future decision to change the semantics of generator expressions to make use of yield from during unpacking (delegating to generators that are being unpacked) would not be backwards-compatible because it would affect the behavior of the resulting generators when used with .send()/.asend(), .throw()/.athrow(), and .close()/.aclose(). That said, despite being backwards-incompatible, such a change would be unlikely to have a large impact because it would only affect the behavior of structures that, under this proposal, are not particularly useful. See Alternative Generator Expression Semantics for more discussion.
This section shows some illustrative examples of how small pieces of code from the standard library could be rewritten to make use of this new syntax. The Reference Implementation continues to pass all tests with these replacements made.
Replacing explicit loops compresses multiple lines into one, and avoids the need for defining and referencing an auxiliary variable.
email/_header_value_parser.py:

# current:
comments = []
for token in self:
    comments.extend(token.comments)
return comments

# proposed:
return [*token.comments for token in self]
shutil.py:

# current:
ignored_names = []
for pattern in patterns:
    ignored_names.extend(fnmatch.filter(names, pattern))
return set(ignored_names)

# proposed:
return {*fnmatch.filter(names, pattern) for pattern in patterns}
http/cookiejar.py:

# current:
cookies = []
for domain in self._cookies.keys():
    cookies.extend(self._cookies_for_domain(domain, request))
return cookies

# proposed:
return [*self._cookies_for_domain(domain, request) for domain in self._cookies.keys()]
While not always the right choice, replacing itertools.chain.from_iterable and map can avoid an extra level of indirection, resulting in code that follows conventional wisdom that comprehensions are generally more readable than map/filter.
dataclasses.py:

# current:
inherited_slots = set(
    itertools.chain.from_iterable(map(_get_slots, cls.__mro__[1:-1]))
)

# proposed:
inherited_slots = {*_get_slots(c) for c in cls.__mro__[1:-1]}
importlib/metadata/__init__.py:

# current:
return itertools.chain.from_iterable(
    path.search(prepared) for path in map(FastPath, paths)
)

# proposed:
return (*FastPath(path).search(prepared) for path in paths)
collections/__init__.py (Counter class):

# current:
return _chain.from_iterable(_starmap(_repeat, self.items()))

# proposed:
return (*_repeat(elt, num) for elt, num in self.items())
zipfile/_path/__init__.py:

# current:
parents = itertools.chain.from_iterable(map(_parents, names))

# proposed:
parents = (*_parents(name) for name in names)
_pyrepl/_module_completer.py:

# current:
search_locations = set(chain.from_iterable(
    getattr(spec, 'submodule_search_locations', [])
    for spec in specs if spec
))

# proposed:
search_locations = {*getattr(spec, 'submodule_search_locations', [])
                    for spec in specs if spec}
Replacing double loops in comprehensions avoids the need for defining and referencing an auxiliary variable.
importlib/resources/readers.py:

# current:
children = (child for path in self._paths for child in path.iterdir())

# proposed:
children = (*path.iterdir() for path in self._paths)
asyncio/base_events.py:

# current:
exceptions = [exc for sub in exceptions for exc in sub]

# proposed:
exceptions = [*sub for sub in exceptions]
_weakrefset.py:

# current:
return self.__class__(e for s in (self, other) for e in s)

# proposed:
return self.__class__(*s for s in (self, other))
Currently, a common way to introduce the notion of comprehensions (which is employed by the Python Tutorial) is to demonstrate equivalent code. For example, this method would say that out = [expr for x in it] is equivalent to the following code:
out = []
for x in it:
    out.append(expr)
Taking this approach, we can introduce out = [*expr for x in it] as instead being equivalent to the following (which uses extend instead of append):
out = []
for x in it:
    out.extend(expr)
Set and dict comprehensions that make use of unpacking can also be introduced by a similar analogy:
# equivalent to out = {expr for x in it}
out = set()
for x in it:
    out.add(expr)

# equivalent to out = {*expr for x in it}
out = set()
for x in it:
    out.update(expr)

# equivalent to out = {k_expr: v_expr for x in it}
out = {}
for x in it:
    out[k_expr] = v_expr

# equivalent to out = {**expr for x in it}, provided that expr evaluates to
# a mapping that can be unpacked with **
out = {}
for x in it:
    out.update(expr)
And we can take a similar approach to illustrate the behavior of generator expressions that involve unpacking:
# equivalent to g = (expr for x in it)
def generator():
    for x in it:
        yield expr

g = generator()

# equivalent to g = (*expr for x in it)
def generator():
    for x in it:
        yield from expr

g = generator()
We can then generalize from these specific examples to the idea that, wherever a non-starred comprehension/genexp would use an operator that adds a single element to a collection, the starred version would instead use an operator that adds multiple elements to that collection.
Alternatively, we don’t need to think of the two ideas as separate; instead, with the new syntax, we can think of out = [...x... for x in it] as equivalent to the following [1] (where ...x... is a stand-in for arbitrary code), regardless of whether or not ...x... uses *:
out = []
for x in it:
    out.extend([...x...])
Similarly, we can think of out = {...x... for x in it} as equivalent to the following code, regardless of whether or not ...x... uses * or ** or ::
out = set()  # or out = {}
for x in it:
    out.update({...x...})
These examples are equivalent in the sense that the output they produce would be the same in both the version with the comprehension and the version without it, but note that the non-comprehension version is slightly less efficient due to making new lists/sets/dictionaries before each extend or update, which is unnecessary in the version that uses comprehensions.
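This unified reading can be sanity-checked with today's (non-starred) comprehensions; for example:

```python
it = [1, 2, 3]

# comprehension form
out_comp = [x * 10 for x in it]

# unified loop form: extend with a one-element list per iteration
out_loop = []
for x in it:
    out_loop.extend([x * 10])

print(out_comp == out_loop)  # True
```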
The primary goal when thinking through the specification above was consistency with existing norms around unpacking and comprehensions / generator expressions. One way to interpret this is that the goal was to write the specification so as to require the smallest possible change(s) to the existing grammar and code generation, letting the existing code inform the surrounding semantics.
Below we discuss some of the common concerns/alternative proposals that came up in discussions but that are not included in this proposal.
One common concern that has arisen multiple times (not only in the discussion threads linked above but also in previous discussions around this same idea) is a possible syntactical ambiguity when passing a starred generator as the sole argument to a function: f(*x for x in y). In the original PEP 448, this ambiguity was cited as a reason for not including a similar generalization as part of the proposal.
This proposal suggests that f(*x for x in y) should be interpreted as f((*x for x in y)) and should not attempt further unpacking of the resulting generator, but several alternatives were suggested in our discussion (and/or have been suggested in the past), including:
- interpreting f(*x for x in y) as f(*(x for x in y)),
- interpreting f(*x for x in y) as f(*(*x for x in y)), or
- raising a SyntaxError for f(*x for x in y) even if the other aspects of this proposal are accepted.

The reason to prefer this proposal over these alternatives is the preservation of existing conventions for punctuation around generator expressions. Currently, the general rule is that generator expressions must be wrapped in parentheses except when provided as the sole argument to a function, and this proposal suggests maintaining that rule even as we allow more kinds of generator expressions. This option maintains a full symmetry between comprehensions and generator expressions that use unpacking and those that don’t.
Currently, we have the following conventions:
f([x for x in y])    # pass in a single list
f({x for x in y})    # pass in a single set
f(x for x in y)      # pass in a single generator (no additional parentheses required around genexp)
f(*[x for x in y])   # pass in elements from the list separately
f(*{x for x in y})   # pass in elements from the set separately
f(*(x for x in y))   # pass in elements from the generator separately (parentheses required)
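These conventions are easy to confirm with a hypothetical variadic function that just counts its positional arguments:

```python
def f(*args):
    # hypothetical helper: report how many positional arguments arrived
    return len(args)

y = [10, 20, 30]

print(f([x for x in y]))   # 1 -- a single list argument
print(f(x for x in y))     # 1 -- a single generator argument
print(f(*[x for x in y]))  # 3 -- list elements passed separately
print(f(*(x for x in y)))  # 3 -- generator elements passed separately
```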
This proposal opts to maintain those conventions even when the comprehensions make use of unpacking:
f([*x for x in y])    # pass in a single list
f({*x for x in y})    # pass in a single set
f(*x for x in y)      # pass in a single generator (no additional parentheses required around genexp)
f(*[*x for x in y])   # pass in elements from the list separately
f(*{*x for x in y})   # pass in elements from the set separately
f(*(*x for x in y))   # pass in elements from the generator separately (parentheses required)
Another suggestion that came out of the discussion involved further generalizing the * beyond simply allowing it to be used to unpack the expression in a comprehension. Two main flavors of this extension were considered:
- making * and ** true unary operators that create a new kind of Unpackable object (or similar), which comprehensions could treat by unpacking it but which could also be used in other contexts; or
- keeping * and ** only in the places they are allowed elsewhere in this proposal (expression lists, comprehensions, generator expressions, and argument lists), but also allowing them to be used in subexpressions within a comprehension, allowing, for example, the following as a way to flatten a list that contains some iterables but some non-iterable objects:

  [*x if isinstance(x, Iterable) else x for x in [[1, 2, 3], 4]]
These variants were considered substantially more complex (both to understand and to implement) and of only marginal utility, so neither is included in this PEP. As such, these forms should continue to raise a SyntaxError, but with a new error message as described above, though they should not be ruled out as a consideration for future proposals.
Another point of discussion centered around the semantics of unpacking in generator expressions, particularly the relationship between the semantics of synchronous and asynchronous generator expressions given that async generators do not support yield from (see the section of PEP 525 on Asynchronous yield from).
The core question centered around whether sync and async generator expressions should use yield from (or an equivalent) when unpacking, as opposed to an explicit loop. The main difference between these options is whether the resulting generator delegates to the objects being unpacked, which would affect the behavior of these generator expressions when used with .send()/.asend(), .throw()/.athrow(), and .close()/.aclose() in the case where the objects being unpacked are themselves generators. The differences between these options are summarized in Appendix: Semantics of Generator Delegation.
Several reasonable options were considered, none of which was a clear winner in a poll in the Discourse thread. Beyond the proposal outlined above, the following were also considered:
Using yield from for unpacking in synchronous generator expressions but using an explicit loop in asynchronous generator expressions (as proposed in the original draft of this PEP).

This strategy would have allowed unpacking in generator expressions to closely mimic a popular way of writing generators that perform this operation (using yield from), but it would also have created an asymmetry between synchronous and asynchronous versions, and also between this new syntax and itertools.chain and the double-loop version.
Using yield from for unpacking in synchronous generator expressions and mimicking the behavior of yield from for unpacking in async generator expressions.

This strategy would also make unpacking in synchronous and asynchronous generators behave symmetrically, but it would also be more complex, enough so that the cost may not be worth the benefit, particularly in the absence of a compelling use case for delegation.
Using yield from for unpacking in synchronous generator expressions, and disallowing unpacking in asynchronous generator expressions until they support yield from.

This strategy could possibly reduce friction if asynchronous generator expressions do gain support for yield from in the future by making sure that any decision made at that point would be fully backwards-compatible, but in the meantime, it would result in an even bigger discrepancy between synchronous and asynchronous generator expressions than option 1.
Disallowing unpacking in both synchronous and asynchronous generator expressions.

This would retain symmetry between the two cases, but with the downside of losing an expressive form and reducing symmetry between list/set comprehensions and generator expressions.
Each of these options (including the one presented in this PEP) has its benefits and drawbacks, with no option being clearly superior on all fronts. The semantics proposed in Semantics: Generator Expressions above represent a reasonable compromise by allowing exactly the same kind of unpacking in synchronous and asynchronous generator expressions and retaining an existing property of generator expressions (that they do not delegate to subgenerators).
This decision should be revisited in the event that asynchronous generators receive support for yield from in the future, in which case adjusting the semantics of unpacking in generator expressions to use yield from should be considered.
Although the general consensus from the discussion thread seemed to be that this syntax was clear and intuitive, several concerns and potential downsides were raised as well. This section aims to summarize those concerns.
- f(*x for x in y) may initially appear ambiguous, as it’s not obvious whether the intent is to unpack the generator or to pass it as a single argument. Although this proposal retains existing conventions by treating that form as equivalent to f((*x for x in y)), that equivalence may not be immediately obvious.

- The * and ** notation may make particularly complex uses even more difficult to read and understand at a glance. For example, while these situations are likely rare, comprehensions that use unpacking in multiple ways can make it difficult to know what’s being unpacked and when, e.g., f(*(*x for *x, _ in list_of_lists)).

Quite a few other languages support this kind of flattening with syntax similar to what is already available in Python, but support for using unpacking syntax within comprehensions is rare. This section provides a brief summary of support for similar syntax in a few other languages.
Many languages that support comprehensions support double loops:
# python
[x for xs in [[1, 2, 3], [], [4, 5]] for x in xs * 2]
-- haskell
[x | xs <- [[1, 2, 3], [], [4, 5]], x <- xs ++ xs]
# julia
[x for xs in [[1, 2, 3], [], [4, 5]] for x in [xs; xs]]
; clojure
(for [xs [[1 2 3] [] [4 5]] x (concat xs xs)] x)
Several other languages (even those without comprehensions) support these operations via a built-in function or method to support flattening of nested structures:
# python
list(itertools.chain(*(xs * 2 for xs in [[1, 2, 3], [], [4, 5]])))
// javascript
[[1, 2, 3], [], [4, 5]].flatMap(xs => [...xs, ...xs])
-- haskell
concat (map (\x -> x ++ x) [[1, 2, 3], [], [4, 5]])
# ruby
[[1, 2, 3], [], [4, 5]].flat_map { |e| e * 2 }
However, languages that support both comprehension and unpacking do not tend to allow unpacking within a comprehension. For example, the following expression in Julia currently leads to a syntax error:
[xs... for xs in [[1, 2, 3], [], [4, 5]]]
As one counterexample, support for a similar syntax was recently added to Civet. For example, the following is a valid comprehension in Civet, making use of JavaScript’s ... syntax for unpacking:
for xs of [[1, 2, 3], [], [4, 5]] then ...(xs ++ xs)
One of the common questions about the semantics outlined above had to do with the difference between using yield from when unpacking inside of a generator expression, versus using an explicit loop. Because this is a fairly advanced feature of generators, this appendix attempts to summarize some of the key differences between generators that use yield from and those that use explicit loops.
For simple iteration over values, which we expect to be by far the most common use of unpacking in generator expressions, both approaches produce identical results:
def yield_from(iterables):
    for iterable in iterables:
        yield from iterable

def explicit_loop(iterables):
    for iterable in iterables:
        for item in iterable:
            yield item

# Both produce the same sequence of values
x = list(yield_from([[1, 2], [3, 4]]))
y = list(explicit_loop([[1, 2], [3, 4]]))
print(x == y)  # prints True
The differences become apparent when using the advanced generator protocol methods .send(), .throw(), and .close(), and when the sub-iterables are themselves generators rather than simple sequences. In these cases, the yield from version results in the associated signal reaching the subgenerator, but the version with the explicit loop does not.
.send()

def sub_generator():
    x = yield "first"
    yield f"received: {x}"
    yield "last"

def yield_from():
    yield from sub_generator()

def explicit_loop():
    for item in sub_generator():
        yield item

# With yield from, values are passed through to sub-generator
gen1 = yield_from()
print(next(gen1))          # prints "first"
print(gen1.send("hello"))  # prints "received: hello"
print(next(gen1))          # prints "last"

# With explicit loop, .send() affects the outer generator; values don't reach the sub-generator
gen2 = explicit_loop()
print(next(gen2))          # prints "first"
print(gen2.send("hello"))  # prints "received: None" (sub-generator receives None instead of "hello")
print(next(gen2))          # prints "last"
.throw()

def sub_generator_with_exception_handling():
    try:
        yield "first"
        yield "second"
    except ValueError as e:
        yield f"caught: {e}"

def yield_from():
    yield from sub_generator_with_exception_handling()

def explicit_loop():
    for item in sub_generator_with_exception_handling():
        yield item

# With yield from, exceptions are passed to sub-generator
gen1 = yield_from()
print(next(gen1))                      # prints "first"
print(gen1.throw(ValueError("test")))  # prints "caught: test"

# With explicit loop, exceptions affect the outer generator only
gen2 = explicit_loop()
print(next(gen2))                      # prints "first"
print(gen2.throw(ValueError("test")))  # ValueError is raised; sub-generator doesn't see it
.close()

# hold references to sub-generators so GC doesn't close the explicit loop version
references = []

def sub_generator_with_cleanup():
    try:
        yield "first"
        yield "second"
    finally:
        print("sub-generator received GeneratorExit")

def yield_from():
    try:
        g = sub_generator_with_cleanup()
        references.append(g)
        yield from g
    finally:
        print("outer generator received GeneratorExit")

def explicit_loop():
    try:
        g = sub_generator_with_cleanup()
        references.append(g)
        for item in g:
            yield item
    finally:
        print("outer generator received GeneratorExit")

# With yield from, GeneratorExit is passed through to sub-generator
gen1 = yield_from()
print(next(gen1))  # prints "first"
gen1.close()       # closes sub-generator and then outer generator

# With explicit loop, GeneratorExit goes to outer generator only
gen2 = explicit_loop()
print(next(gen2))  # prints "first"
gen2.close()       # only closes outer generator

print('program finished; GC will close the explicit loop subgenerator')
# second inner generator closes when GC closes it at the end
This document is placed in the public domain or under the CC0-1.0-Universal license, whichever is more permissive.
Source: https://github.com/python/peps/blob/main/peps/pep-0798.rst

Last modified: 2025-12-27 18:15:18 GMT