TypeAdapter
Bases: Generic[T]
Usage Documentation
Type adapters provide a flexible way to perform validation and serialization based on a Python type.
A TypeAdapter instance exposes some of the functionality from BaseModel instance methods for types that do not have such methods (such as dataclasses, primitive types, and more).
Note: TypeAdapter instances are not types, and cannot be used as type annotations for fields.
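For example, an adapter built for list[int] can validate and serialize plain lists. A minimal sketch of typical usage (the type and values are illustrative):

```python
from pydantic import TypeAdapter

adapter = TypeAdapter(list[int])
print(adapter.validate_python(['1', '2', 3]))  # [1, 2, 3]
print(adapter.dump_json([1, 2, 3]))            # b'[1,2,3]'
```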
Parameters:
Name | Type | Description | Default |
---|---|---|---|
type | Any | The type associated with the TypeAdapter. | required |
config | ConfigDict | None | Configuration for the TypeAdapter. Note: you cannot provide a configuration when instantiating a TypeAdapter if the type you're using has its own configuration that cannot be overridden (e.g. BaseModel, TypedDict, and dataclass); a type-adapter-config-unused error is raised in this case. | None |
_parent_depth | int | Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema building, by looking for the globals and locals of this frame. Defaults to 2, which will result in the frame where the TypeAdapter was instantiated. Note: this parameter is named with an underscore to suggest its private nature and discourage use. It may be deprecated in a minor version, so we only recommend using it if you're comfortable with potential change in behavior/support. Its default value is 2 because internally, the TypeAdapter class makes another call to fetch the frame. | 2 |
module | str | None | The module that is passed to the plugin, if provided. | None |
Attributes:
Name | Type | Description |
---|---|---|
core_schema | CoreSchema | The core schema for the type. |
validator | SchemaValidator | PluggableSchemaValidator | The schema validator for the type. |
serializer | SchemaSerializer | The schema serializer for the type. |
pydantic_complete | bool | Whether the core schema for the type is successfully built. |
Compatibility with mypy
Depending on the type used, mypy might raise an error when instantiating a TypeAdapter. As a workaround, you can explicitly annotate your variable:
```python
from typing import Union

from pydantic import TypeAdapter

ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int])  # type: ignore[arg-type]
```
Namespace management nuances and implementation details
Here, we collect some notes on namespace management, and subtle differences from BaseModel:
BaseModel uses its own __module__ to find out where it was defined and then looks for symbols to resolve forward references in those globals. On the other hand, TypeAdapter can be initialized with arbitrary objects, which may not be types and thus do not have a __module__ available. So instead we look at the globals in our parent stack frame.
It is expected that the ns_resolver passed to this function will have the correct namespace for the type we're adapting. See the source code for TypeAdapter.__init__ and TypeAdapter.rebuild for various ways to construct this namespace.
This works for the case where this function is called in a module that has the target of forward references in its scope, but does not always work for more complex cases.
For example, take the following two modules:
```python
# a.py
IntList = list[int]
OuterDict = dict[str, 'IntList']
```
```python
# another module (e.g. b.py)
from a import OuterDict

from pydantic import TypeAdapter

IntList = int  # replaces the symbol the forward reference is looking for

v = TypeAdapter(OuterDict)
v.validate_python({'x': 1})  # should fail but doesn't
```
If OuterDict were a BaseModel, this would work because it would resolve the forward reference within the a.py namespace. But TypeAdapter(OuterDict) can't determine what module OuterDict came from.
In other words, the assumption that all forward references exist in the module we are being called from is not technically always true. Although most of the time it is and it works fine for recursive models and such, BaseModel's behavior isn't perfect either and can break in similar ways, so there is no right or wrong between the two.
But at the very least this behavior is subtly different from BaseModel's.
Source code in pydantic/type_adapter.py
rebuild¶
```python
rebuild(
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None
```
Try to rebuild the pydantic-core schema for the adapter's type.
This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
force | bool | Whether to force the rebuilding of the type adapter's schema, defaults to False. | False |
raise_errors | bool | Whether to raise errors, defaults to True. | True |
_parent_namespace_depth | int | Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema rebuilding, by looking for the locals of this frame. Defaults to 2, which will result in the frame where the method was called. | 2 |
_types_namespace | MappingNamespace | None | An explicit types namespace to use, instead of using the local namespace from the parent frame. Defaults to None. | None |
Returns:
Type | Description |
---|---|
bool | None | Returns None if the schema is already "complete" and rebuilding was not required. |
bool | None | If rebuilding was required, returns True if rebuilding was successful, otherwise False. |
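A minimal sketch of the rebuild mechanics. It assumes a recent Pydantic version where defer_build in the config also applies to TypeAdapter; the typical real-world trigger is an unresolved forward reference rather than an explicit deferred build:

```python
from pydantic import ConfigDict, TypeAdapter

# Schema building is deferred, so the adapter starts out incomplete.
ta = TypeAdapter(list[int], config=ConfigDict(defer_build=True))
print(ta.pydantic_complete)  # False

ta.rebuild()                 # build the core schema now
print(ta.pydantic_complete)  # True
print(ta.validate_python(['1', 2]))  # [1, 2]
```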
Source code in pydantic/type_adapter.py
validate_python¶
```python
validate_python(
    object: Any,
    /,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal["off", "on", "trailing-strings"] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T
```
Validate a Python object against the model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
object | Any | The Python object to validate against the model. | required |
strict | bool | None | Whether to strictly check types. | None |
from_attributes | bool | None | Whether to extract data from object attributes. | None |
context | dict[str,Any] | None | Additional context to pass to the validator. | None |
experimental_allow_partial | bool | Literal['off', 'on', 'trailing-strings'] | Experimental: whether to enable partial validation, e.g. to process streams. * False / 'off': default behavior, no partial validation. * True / 'on': enable partial validation. * 'trailing-strings': enable partial validation and allow trailing strings in the input. | False |
by_alias | bool | None | Whether to use the field's alias when validating against the provided input data. | None |
by_name | bool | None | Whether to use the field's name when validating against the provided input data. | None |
Note: When using TypeAdapter with a Pydantic dataclass, the use of the from_attributes argument is not supported.
Returns:
Type | Description |
---|---|
T | The validated object. |
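For example (the type and inputs are illustrative):

```python
from pydantic import TypeAdapter, ValidationError

ta = TypeAdapter(list[int])
print(ta.validate_python(['1', 2]))  # [1, 2] -- lax mode coerces the string

try:
    ta.validate_python(['1', 2], strict=True)
except ValidationError as exc:
    print(exc.errors()[0]['loc'])  # (0,) -- the string is rejected in strict mode
```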
Source code in pydantic/type_adapter.py
validate_json¶
```python
validate_json(
    data: str | bytes | bytearray,
    /,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal["off", "on", "trailing-strings"] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T
```
Usage Documentation
Validate a JSON string or bytes against the model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
data | str |bytes |bytearray | The JSON data to validate against the model. | required |
strict | bool | None | Whether to strictly check types. | None |
context | dict[str,Any] | None | Additional context to use during validation. | None |
experimental_allow_partial | bool | Literal['off', 'on', 'trailing-strings'] | Experimental: whether to enable partial validation, e.g. to process streams. * False / 'off': default behavior, no partial validation. * True / 'on': enable partial validation. * 'trailing-strings': enable partial validation and allow trailing strings in the input. | False |
by_alias | bool | None | Whether to use the field's alias when validating against the provided input data. | None |
by_name | bool | None | Whether to use the field's name when validating against the provided input data. | None |
Returns:
Type | Description |
---|---|
T | The validated object. |
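For example (the type and payload are illustrative):

```python
from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, list[int]])
print(ta.validate_json('{"a": [1, 2], "b": []}'))  # {'a': [1, 2], 'b': []}
```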
Source code in pydantic/type_adapter.py
validate_strings¶
```python
validate_strings(
    obj: Any,
    /,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal["off", "on", "trailing-strings"] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T
```
Validate an object containing string data against the model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
obj | Any | The object containing string data to validate. | required |
strict | bool | None | Whether to strictly check types. | None |
context | dict[str,Any] | None | Additional context to use during validation. | None |
experimental_allow_partial | bool | Literal['off', 'on', 'trailing-strings'] | Experimental: whether to enable partial validation, e.g. to process streams. * False / 'off': default behavior, no partial validation. * True / 'on': enable partial validation. * 'trailing-strings': enable partial validation and allow trailing strings in the input. | False |
by_alias | bool | None | Whether to use the field's alias when validating against the provided input data. | None |
by_name | bool | None | Whether to use the field's name when validating against the provided input data. | None |
Returns:
Type | Description |
---|---|
T | The validated object. |
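For example, validating data whose leaf values are all strings, as they would be when read from, say, query parameters (the type and values are illustrative):

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, date])
print(ta.validate_strings({'when': '2024-01-01'}))
# {'when': datetime.date(2024, 1, 1)}
```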
Source code in pydantic/type_adapter.py
get_default_value¶
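Its signature, reconstructed from the parameters and return type documented below, is approximately:

```python
get_default_value(
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Some[T] | None
```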
Get the default value for the wrapped type.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
strict | bool | None | Whether to strictly check types. | None |
context | dict[str,Any] | None | Additional context to pass to the validator. | None |
Returns:
Type | Description |
---|---|
Some[T] | None | The default value, wrapped in a Some if there is one, or None if not. |
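A minimal sketch, assuming a default attached via Annotated and Field (illustrative; Some comes from pydantic_core and exposes the wrapped value via .value):

```python
from typing import Annotated

from pydantic import Field, TypeAdapter

ta = TypeAdapter(Annotated[int, Field(default=42)])
default = ta.get_default_value()
if default is not None:
    print(default.value)  # 42
```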
Source code in pydantic/type_adapter.py
dump_python¶
```python
dump_python(
    instance: T,
    /,
    *,
    mode: Literal["json", "python"] = "python",
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal["none", "warn", "error"] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> Any
```
Dump an instance of the adapted type to a Python object.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
instance | T | The Python object to serialize. | required |
mode | Literal['json', 'python'] | The output format. | 'python' |
include | IncEx | None | Fields to include in the output. | None |
exclude | IncEx | None | Fields to exclude from the output. | None |
by_alias | bool | None | Whether to use alias names for field names. | None |
exclude_unset | bool | Whether to exclude unset fields. | False |
exclude_defaults | bool | Whether to exclude fields with default values. | False |
exclude_none | bool | Whether to exclude fields with None values. | False |
round_trip | bool | Whether to output the serialized data in a way that is compatible with deserialization. | False |
warnings | bool | Literal['none', 'warn', 'error'] | How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | True |
fallback | Callable[[Any], Any] | None | A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised. | None |
serialize_as_any | bool | Whether to serialize fields with duck-typing serialization behavior. | False |
context | dict[str,Any] | None | Additional context to pass to the serializer. | None |
Returns:
Type | Description |
---|---|
Any | The serialized object. |
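For example (the type and values are illustrative):

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(list[date])
items = ta.validate_python(['2024-01-01'])
print(ta.dump_python(items))               # [datetime.date(2024, 1, 1)]
print(ta.dump_python(items, mode='json'))  # ['2024-01-01']
```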
Source code in pydantic/type_adapter.py
dump_json¶
```python
dump_json(
    instance: T,
    /,
    *,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal["none", "warn", "error"] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> bytes
```
Usage Documentation
Serialize an instance of the adapted type to JSON.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
instance | T | The instance to be serialized. | required |
indent | int | None | Number of spaces for JSON indentation. | None |
include | IncEx | None | Fields to include. | None |
exclude | IncEx | None | Fields to exclude. | None |
by_alias | bool | None | Whether to use alias names for field names. | None |
exclude_unset | bool | Whether to exclude unset fields. | False |
exclude_defaults | bool | Whether to exclude fields with default values. | False |
exclude_none | bool | Whether to exclude fields with a value of None. | False |
round_trip | bool | Whether to serialize and deserialize the instance to ensure round-tripping. | False |
warnings | bool | Literal['none', 'warn', 'error'] | How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | True |
fallback | Callable[[Any], Any] | None | A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised. | None |
serialize_as_any | bool | Whether to serialize fields with duck-typing serialization behavior. | False |
context | dict[str,Any] | None | Additional context to pass to the serializer. | None |
Returns:
Type | Description |
---|---|
bytes | The JSON representation of the given instance as bytes. |
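For example (the type and values are illustrative):

```python
from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, int])
print(ta.dump_json({'a': 1, 'b': 2}))            # b'{"a":1,"b":2}'
print(ta.dump_json({'a': 1}, indent=2).decode())  # pretty-printed JSON
```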
Source code in pydantic/type_adapter.py
json_schema¶
```python
json_schema(
    *,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = "validation",
) -> dict[str, Any]
```
Generate a JSON schema for the adapted type.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
by_alias | bool | Whether to use alias names for field names. | True |
ref_template | str | The format string used for generating $ref strings. | DEFAULT_REF_TEMPLATE |
schema_generator | type[GenerateJsonSchema] | The generator class used for creating the schema. | GenerateJsonSchema |
mode | JsonSchemaMode | The mode to use for schema generation. | 'validation' |
Returns:
Type | Description |
---|---|
dict[str,Any] | The JSON schema for the model as a dictionary. |
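For example (the type is illustrative):

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])
print(ta.json_schema())
# {'items': {'type': 'integer'}, 'type': 'array'}
```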
Source code in pydantic/type_adapter.py
json_schemas staticmethod¶
```python
json_schemas(
    inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]],
    /,
    *,
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[
    dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue],
    JsonSchemaValue,
]
```
Generate a JSON schema including definitions from multiple type adapters.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
inputs | Iterable[tuple[JsonSchemaKeyT,JsonSchemaMode,TypeAdapter[Any]]] | Inputs to schema generation. The first two items will form the keys of the (first)output mapping; the type adapters will provide the core schemas that get converted intodefinitions in the output JSON schema. | required |
by_alias | bool | Whether to use alias names. | True |
title | str | None | The title for the schema. | None |
description | str | None | The description for the schema. | None |
ref_template | str | The format string used for generating $ref strings. | DEFAULT_REF_TEMPLATE |
schema_generator | type[GenerateJsonSchema] | The generator class used for creating the schema. | GenerateJsonSchema |
Returns:
Type | Description |
---|---|
tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue] | A tuple where the first element is a dictionary whose keys are the (JSON schema key, mode) pairs from inputs and whose values are the corresponding JSON schemas (which may contain $ref references to definitions in the second element), and the second element is a JSON schema containing all referenced definitions, along with the optional title and description. |
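A minimal sketch; the key strings ('ints', 'mapping'), the title, and the types are arbitrary illustrative choices:

```python
from pydantic import TypeAdapter

adapters = [
    ('ints', 'validation', TypeAdapter(list[int])),
    ('mapping', 'serialization', TypeAdapter(dict[str, int])),
]
schemas_by_key, definitions_schema = TypeAdapter.json_schemas(
    adapters, title='My components'
)
print(schemas_by_key[('ints', 'validation')])
```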
Source code in pydantic/type_adapter.py