
TypeAdapter

Bases: Generic[T]

Usage Documentation

TypeAdapter

Type adapters provide a flexible way to perform validation and serialization based on a Python type.

A TypeAdapter instance exposes some of the functionality from BaseModel instance methods for types that do not have such methods (such as dataclasses, primitive types, and more).

Note: TypeAdapter instances are not types, and cannot be used as type annotations for fields.
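For illustration, a minimal sketch of the core idea, adapting a plain list[int] that has no BaseModel-style methods of its own:

```python
from pydantic import TypeAdapter

# Adapt a plain Python type that has no BaseModel-style methods of its own.
ta = TypeAdapter(list[int])

# Validation coerces compatible input (lax mode) and raises ValidationError otherwise.
validated = ta.validate_python(['1', 2])
assert validated == [1, 2]

# Serialization works through the same adapter.
assert ta.dump_json(validated) == b'[1,2]'
```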

Parameters:

- type (Any, required): The type associated with the TypeAdapter.

- config (ConfigDict | None, default: None): Configuration for the TypeAdapter, should be a dictionary conforming to ConfigDict.

  Note: You cannot provide a configuration when instantiating a TypeAdapter if the type you're using has its own config that cannot be overridden (e.g. BaseModel, TypedDict, and dataclass). A type-adapter-config-unused error will be raised in this case.

- _parent_depth (int, default: 2): Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema building, by looking for the globals and locals of this frame. Defaults to 2, which will result in the frame where the TypeAdapter was instantiated.

  Note: This parameter is named with an underscore to suggest its private nature and discourage use. It may be deprecated in a minor version, so we only recommend using it if you're comfortable with potential changes in behavior/support. Its default value is 2 because internally, the TypeAdapter class makes another call to fetch the frame.

- module (str | None, default: None): The module to pass to the plugin, if provided.

Attributes:

- core_schema (CoreSchema): The core schema for the type.

- validator (SchemaValidator | PluggableSchemaValidator): The schema validator for the type.

- serializer (SchemaSerializer): The schema serializer for the type.

- pydantic_complete (bool): Whether the core schema for the type was successfully built.

Compatibility with mypy

Depending on the type used, mypy might raise an error when instantiating a TypeAdapter. As a workaround, you can explicitly annotate your variable:

```python
from typing import Union

from pydantic import TypeAdapter

ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int])  # type: ignore[arg-type]
```
Namespace management nuances and implementation details

Here, we collect some notes on namespace management, and subtle differences from BaseModel:

BaseModel uses its own __module__ to find out where it was defined and then looks for symbols to resolve forward references in those globals. On the other hand, TypeAdapter can be initialized with arbitrary objects, which may not be types and thus do not have a __module__ available. So instead we look at the globals in our parent stack frame.

It is expected that the ns_resolver passed to this function will have the correct namespace for the type we're adapting. See the source code for TypeAdapter.__init__ and TypeAdapter.rebuild for various ways to construct this namespace.

This works for the case where this function is called in a module that has the target of forward references in its scope, but does not always work for more complex cases.

For example, take the following:

a.py:

```python
IntList = list[int]
OuterDict = dict[str, 'IntList']
```

b.py:

```python
from a import OuterDict

from pydantic import TypeAdapter

IntList = int  # replaces the symbol the forward reference is looking for

v = TypeAdapter(OuterDict)
v({'x': 1})  # should fail but doesn't
```

If OuterDict were a BaseModel, this would work because it would resolve the forward reference within the a.py namespace. But TypeAdapter(OuterDict) can't determine what module OuterDict came from.

In other words, the assumption that all forward references exist in the module we are being called from is not technically always true. Although most of the time it is and it works fine for recursive models and such, BaseModel's behavior isn't perfect either and can break in similar ways, so there is no right or wrong between the two.

But at the very least this behavior is subtly different from BaseModel's.

Source code in pydantic/type_adapter.py:

```python
def __init__(
    self,
    type: Any,
    *,
    config: ConfigDict | None = None,
    _parent_depth: int = 2,
    module: str | None = None,
) -> None:
    if _type_has_config(type) and config is not None:
        raise PydanticUserError(
            'Cannot use `config` when the type is a BaseModel, dataclass or TypedDict.'
            ' These types can have their own config and setting the config via the `config`'
            ' parameter to TypeAdapter will not override it, thus the `config` you passed to'
            ' TypeAdapter becomes meaningless, which is probably not what you want.',
            code='type-adapter-config-unused',
        )

    self._type = type
    self._config = config
    self._parent_depth = _parent_depth
    self.pydantic_complete = False

    parent_frame = self._fetch_parent_frame()
    if parent_frame is not None:
        globalns = parent_frame.f_globals
        # Do not provide a local ns if the type adapter happens to be instantiated at the module level:
        localns = parent_frame.f_locals if parent_frame.f_locals is not globalns else {}
    else:
        globalns = {}
        localns = {}

    self._module_name = module or cast(str, globalns.get('__name__', ''))
    self._init_core_attrs(
        ns_resolver=_namespace_utils.NsResolver(
            namespaces_tuple=_namespace_utils.NamespacesTuple(locals=localns, globals=globalns),
            parent_namespace=localns,
        ),
        force=False,
    )
```

rebuild

```python
rebuild(
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None
```

Try to rebuild the pydantic-core schema for the adapter's type.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

- force (bool, default: False): Whether to force the rebuilding of the type adapter's schema, defaults to False.

- raise_errors (bool, default: True): Whether to raise errors, defaults to True.

- _parent_namespace_depth (int, default: 2): Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema rebuilding, by looking for the locals of this frame. Defaults to 2, which will result in the frame where the method was called.

- _types_namespace (MappingNamespace | None, default: None): An explicit types namespace to use, instead of using the local namespace from the parent frame. Defaults to None.

Returns:

- bool | None: Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/type_adapter.py:

```python
def rebuild(
    self,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: _namespace_utils.MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the adapter's type.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the type adapter's schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: Depth at which to search for the [parent frame][frame-objects]. This
            frame is used when resolving forward annotations during schema rebuilding, by looking for
            the locals of this frame. Defaults to 2, which will result in the frame where the method
            was called.
        _types_namespace: An explicit types namespace to use, instead of using the local namespace
            from the parent frame. Defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    if not force and self.pydantic_complete:
        return None

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    # we have to manually fetch globals here because there's no type on the stack of the NsResolver
    # and so we skip the globalns = get_module_ns_of(typ) call that would normally happen
    globalns = sys._getframe(max(_parent_namespace_depth - 1, 1)).f_globals
    ns_resolver = _namespace_utils.NsResolver(
        namespaces_tuple=_namespace_utils.NamespacesTuple(locals=rebuild_ns, globals=globalns),
        parent_namespace=rebuild_ns,
    )
    return self._init_core_attrs(ns_resolver=ns_resolver, force=True, raise_errors=raise_errors)
```

validate_python

```python
validate_python(
    object: Any,
    /,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T
```

Validate a Python object against the model.

Parameters:

- object (Any, required): The Python object to validate against the model.

- strict (bool | None, default: None): Whether to strictly check types.

- from_attributes (bool | None, default: None): Whether to extract data from object attributes.

- context (dict[str, Any] | None, default: None): Additional context to pass to the validator.

- experimental_allow_partial (bool | Literal['off', 'on', 'trailing-strings'], default: False): Experimental: whether to enable partial validation, e.g. to process streams.
  - False / 'off': Default behavior, no partial validation.
  - True / 'on': Enable partial validation.
  - 'trailing-strings': Enable partial validation and allow trailing strings in the input.

- by_alias (bool | None, default: None): Whether to use the field's alias when validating against the provided input data.

- by_name (bool | None, default: None): Whether to use the field's name when validating against the provided input data.

Note: When using TypeAdapter with a Pydantic dataclass, the use of the from_attributes argument is not supported.

Returns:

- T: The validated object.

Source code in pydantic/type_adapter.py:

```python
def validate_python(
    self,
    object: Any,
    /,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T:
    """Validate a Python object against the model.

    Args:
        object: The Python object to validate against the model.
        strict: Whether to strictly check types.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        experimental_allow_partial: **Experimental** whether to enable
            [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams.
            * False / 'off': Default behavior, no partial validation.
            * True / 'on': Enable partial validation.
            * 'trailing-strings': Enable partial validation and allow trailing strings in the input.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    !!! note
        When using `TypeAdapter` with a Pydantic `dataclass`, the use of the `from_attributes`
        argument is not supported.

    Returns:
        The validated object.
    """
    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return self.validator.validate_python(
        object,
        strict=strict,
        from_attributes=from_attributes,
        context=context,
        allow_partial=experimental_allow_partial,
        by_alias=by_alias,
        by_name=by_name,
    )
```
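A short sketch of the strict parameter's effect on validation:

```python
from pydantic import TypeAdapter, ValidationError

ta = TypeAdapter(list[int])

# Lax mode (the default) coerces compatible input:
assert ta.validate_python(['1', 2]) == [1, 2]

# Strict mode rejects the string element instead of coercing it:
try:
    ta.validate_python(['1', 2], strict=True)
except ValidationError as exc:
    assert exc.error_count() == 1
```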

validate_json

```python
validate_json(
    data: str | bytes | bytearray,
    /,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T
```

Usage Documentation

JSON Parsing

Validate a JSON string or bytes against the model.

Parameters:

- data (str | bytes | bytearray, required): The JSON data to validate against the model.

- strict (bool | None, default: None): Whether to strictly check types.

- context (dict[str, Any] | None, default: None): Additional context to use during validation.

- experimental_allow_partial (bool | Literal['off', 'on', 'trailing-strings'], default: False): Experimental: whether to enable partial validation, e.g. to process streams.
  - False / 'off': Default behavior, no partial validation.
  - True / 'on': Enable partial validation.
  - 'trailing-strings': Enable partial validation and allow trailing strings in the input.

- by_alias (bool | None, default: None): Whether to use the field's alias when validating against the provided input data.

- by_name (bool | None, default: None): Whether to use the field's name when validating against the provided input data.

Returns:

- T: The validated object.

Source code in pydantic/type_adapter.py:

```python
def validate_json(
    self,
    data: str | bytes | bytearray,
    /,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate a JSON string or bytes against the model.

    Args:
        data: The JSON data to validate against the model.
        strict: Whether to strictly check types.
        context: Additional context to use during validation.
        experimental_allow_partial: **Experimental** whether to enable
            [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams.
            * False / 'off': Default behavior, no partial validation.
            * True / 'on': Enable partial validation.
            * 'trailing-strings': Enable partial validation and allow trailing strings in the input.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated object.
    """
    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return self.validator.validate_json(
        data,
        strict=strict,
        context=context,
        allow_partial=experimental_allow_partial,
        by_alias=by_alias,
        by_name=by_name,
    )
```
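A quick sketch of parsing JSON directly through the adapter, skipping the intermediate json.loads step:

```python
from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, int])

# JSON is parsed and validated in one pass; bytes input works the same way.
result = ta.validate_json('{"a": 1}')
assert result == {'a': 1}
```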

validate_strings

```python
validate_strings(
    obj: Any,
    /,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T
```

Validate an object containing string data against the model.

Parameters:

- obj (Any, required): The object containing string data to validate.

- strict (bool | None, default: None): Whether to strictly check types.

- context (dict[str, Any] | None, default: None): Additional context to use during validation.

- experimental_allow_partial (bool | Literal['off', 'on', 'trailing-strings'], default: False): Experimental: whether to enable partial validation, e.g. to process streams.
  - False / 'off': Default behavior, no partial validation.
  - True / 'on': Enable partial validation.
  - 'trailing-strings': Enable partial validation and allow trailing strings in the input.

- by_alias (bool | None, default: None): Whether to use the field's alias when validating against the provided input data.

- by_name (bool | None, default: None): Whether to use the field's name when validating against the provided input data.

Returns:

- T: The validated object.

Source code in pydantic/type_adapter.py:

```python
def validate_strings(
    self,
    obj: Any,
    /,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
    experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> T:
    """Validate object contains string data against the model.

    Args:
        obj: The object contains string data to validate.
        strict: Whether to strictly check types.
        context: Additional context to use during validation.
        experimental_allow_partial: **Experimental** whether to enable
            [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams.
            * False / 'off': Default behavior, no partial validation.
            * True / 'on': Enable partial validation.
            * 'trailing-strings': Enable partial validation and allow trailing strings in the input.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated object.
    """
    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return self.validator.validate_strings(
        obj,
        strict=strict,
        context=context,
        allow_partial=experimental_allow_partial,
        by_alias=by_alias,
        by_name=by_name,
    )
```
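A small sketch of string-data validation, useful for sources like query parameters or environment variables where every value arrives as a string:

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, date])

# All input values are strings; validate_strings coerces them to the target types.
result = ta.validate_strings({'d': '2024-01-01'})
assert result == {'d': date(2024, 1, 1)}
```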

get_default_value

```python
get_default_value(
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Some[T] | None
```

Get the default value for the wrapped type.

Parameters:

- strict (bool | None, default: None): Whether to strictly check types.

- context (dict[str, Any] | None, default: None): Additional context to pass to the validator.

Returns:

- Some[T] | None: The default value wrapped in a Some if there is one, or None if not.

Source code in pydantic/type_adapter.py:

```python
def get_default_value(self, *, strict: bool | None = None, context: dict[str, Any] | None = None) -> Some[T] | None:
    """Get the default value for the wrapped type.

    Args:
        strict: Whether to strictly check types.
        context: Additional context to pass to the validator.

    Returns:
        The default value wrapped in a `Some` if there is one or None if not.
    """
    return self.validator.get_default_value(strict=strict, context=context)
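A sketch of the Some-vs-None distinction. This assumes a default attached via Field metadata in Annotated is picked up by the adapter's schema, which is the usual way to give a bare type a default:

```python
from typing import Annotated

from pydantic import Field, TypeAdapter

# Attach a default to a plain int via Annotated metadata (assumed pattern):
ta = TypeAdapter(Annotated[int, Field(default=42)])

default = ta.get_default_value()
assert default is not None
assert default.value == 42  # Some wraps the value; unwrap via .value

# A type with no default yields None, distinguishing "no default" from "default is None":
assert TypeAdapter(int).get_default_value() is None
```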

dump_python

```python
dump_python(
    instance: T,
    /,
    *,
    mode: Literal['json', 'python'] = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> Any
```

Dump an instance of the adapted type to a Python object.

Parameters:

- instance (T, required): The Python object to serialize.

- mode (Literal['json', 'python'], default: 'python'): The output format.

- include (IncEx | None, default: None): Fields to include in the output.

- exclude (IncEx | None, default: None): Fields to exclude from the output.

- by_alias (bool | None, default: None): Whether to use alias names for field names.

- exclude_unset (bool, default: False): Whether to exclude unset fields.

- exclude_defaults (bool, default: False): Whether to exclude fields with default values.

- exclude_none (bool, default: False): Whether to exclude fields with None values.

- round_trip (bool, default: False): Whether to output the serialized data in a way that is compatible with deserialization.

- warnings (bool | Literal['none', 'warn', 'error'], default: True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

- fallback (Callable[[Any], Any] | None, default: None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

- serialize_as_any (bool, default: False): Whether to serialize fields with duck-typing serialization behavior.

- context (dict[str, Any] | None, default: None): Additional context to pass to the serializer.

Returns:

- Any: The serialized object.

Source code in pydantic/type_adapter.py:

```python
def dump_python(
    self,
    instance: T,
    /,
    *,
    mode: Literal['json', 'python'] = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> Any:
    """Dump an instance of the adapted type to a Python object.

    Args:
        instance: The Python object to serialize.
        mode: The output format.
        include: Fields to include in the output.
        exclude: Fields to exclude from the output.
        by_alias: Whether to use alias names for field names.
        exclude_unset: Whether to exclude unset fields.
        exclude_defaults: Whether to exclude fields with default values.
        exclude_none: Whether to exclude fields with None values.
        round_trip: Whether to output the serialized data in a way that is compatible with deserialization.
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.
        context: Additional context to pass to the serializer.

    Returns:
        The serialized object.
    """
    return self.serializer.to_python(
        instance,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
        context=context,
    )
```
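A short sketch of the mode parameter: 'python' keeps rich Python objects, while 'json' produces only JSON-compatible values:

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(dict[str, date])
v = {'d': date(2024, 1, 1)}

# mode='python' (the default) preserves the date object:
assert ta.dump_python(v) == {'d': date(2024, 1, 1)}

# mode='json' converts it to a JSON-compatible string:
assert ta.dump_python(v, mode='json') == {'d': '2024-01-01'}
```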

dump_json

```python
dump_json(
    instance: T,
    /,
    *,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> bytes
```

Usage Documentation

JSON Serialization

Serialize an instance of the adapted type to JSON.

Parameters:

- instance (T, required): The instance to be serialized.

- indent (int | None, default: None): Number of spaces for JSON indentation.

- include (IncEx | None, default: None): Fields to include.

- exclude (IncEx | None, default: None): Fields to exclude.

- by_alias (bool | None, default: None): Whether to use alias names for field names.

- exclude_unset (bool, default: False): Whether to exclude unset fields.

- exclude_defaults (bool, default: False): Whether to exclude fields with default values.

- exclude_none (bool, default: False): Whether to exclude fields with a value of None.

- round_trip (bool, default: False): Whether to serialize and deserialize the instance to ensure round-tripping.

- warnings (bool | Literal['none', 'warn', 'error'], default: True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

- fallback (Callable[[Any], Any] | None, default: None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

- serialize_as_any (bool, default: False): Whether to serialize fields with duck-typing serialization behavior.

- context (dict[str, Any] | None, default: None): Additional context to pass to the serializer.

Returns:

- bytes: The JSON representation of the given instance as bytes.

Source code in pydantic/type_adapter.py:

```python
def dump_json(
    self,
    instance: T,
    /,
    *,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: dict[str, Any] | None = None,
) -> bytes:
    """!!! abstract "Usage Documentation"
        [JSON Serialization](../concepts/json.md#json-serialization)

    Serialize an instance of the adapted type to JSON.

    Args:
        instance: The instance to be serialized.
        indent: Number of spaces for JSON indentation.
        include: Fields to include.
        exclude: Fields to exclude.
        by_alias: Whether to use alias names for field names.
        exclude_unset: Whether to exclude unset fields.
        exclude_defaults: Whether to exclude fields with default values.
        exclude_none: Whether to exclude fields with a value of `None`.
        round_trip: Whether to serialize and deserialize the instance to ensure round-tripping.
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.
        context: Additional context to pass to the serializer.

    Returns:
        The JSON representation of the given instance as bytes.
    """
    return self.serializer.to_json(
        instance,
        indent=indent,
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
        context=context,
    )
```
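A quick sketch showing that dump_json returns compact JSON bytes, with indent opting into pretty-printing:

```python
from pydantic import TypeAdapter

ta = TypeAdapter(list[int])

# Compact bytes by default:
assert ta.dump_json([1, 2, 3]) == b'[1,2,3]'

# indent produces pretty-printed output (still bytes):
pretty = ta.dump_json([1, 2, 3], indent=2)
assert isinstance(pretty, bytes)
```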

json_schema

```python
json_schema(
    *,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]
```

Generate a JSON schema for the adapted type.

Parameters:

- by_alias (bool, default: True): Whether to use alias names for field names.

- ref_template (str, default: DEFAULT_REF_TEMPLATE): The format string used for generating $ref strings.

- schema_generator (type[GenerateJsonSchema], default: GenerateJsonSchema): The generator class used for creating the schema.

- mode (JsonSchemaMode, default: 'validation'): The mode to use for schema generation.

Returns:

- dict[str, Any]: The JSON schema for the model as a dictionary.

Source code in pydantic/type_adapter.py:

```python
def json_schema(
    self,
    *,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]:
    """Generate a JSON schema for the adapted type.

    Args:
        by_alias: Whether to use alias names for field names.
        ref_template: The format string used for generating $ref strings.
        schema_generator: The generator class used for creating the schema.
        mode: The mode to use for schema generation.

    Returns:
        The JSON schema for the model as a dictionary.
    """
    schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template)
    if isinstance(self.core_schema, _mock_val_ser.MockCoreSchema):
        self.core_schema.rebuild()
    assert not isinstance(self.core_schema, _mock_val_ser.MockCoreSchema), 'this is a bug! please report it'
    return schema_generator_instance.generate(self.core_schema, mode=mode)
```
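A minimal sketch of schema generation for a non-model type:

```python
from pydantic import TypeAdapter

# JSON schema generation works for arbitrary adapted types, not just models:
schema = TypeAdapter(list[int]).json_schema()
assert schema == {'type': 'array', 'items': {'type': 'integer'}}
```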

json_schemas (staticmethod)

```python
json_schemas(
    inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]],
    /,
    *,
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[
    dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue],
    JsonSchemaValue,
]
```

Generate a JSON schema including definitions from multiple type adapters.

Parameters:

- inputs (Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]], required): Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema.

- by_alias (bool, default: True): Whether to use alias names.

- title (str | None, default: None): The title for the schema.

- description (str | None, default: None): The description for the schema.

- ref_template (str, default: DEFAULT_REF_TEMPLATE): The format string used for generating $ref strings.

- schema_generator (type[GenerateJsonSchema], default: GenerateJsonSchema): The generator class used for creating the schema.

Returns:

- tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]: A tuple where:
  - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  - The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.

Source code in pydantic/type_adapter.py:

```python
@staticmethod
def json_schemas(
    inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]],
    /,
    *,
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]:
    """Generate a JSON schema including definitions from multiple type adapters.

    Args:
        inputs: Inputs to schema generation. The first two items will form the keys of the (first)
            output mapping; the type adapters will provide the core schemas that get converted into
            definitions in the output JSON schema.
        by_alias: Whether to use alias names.
        title: The title for the schema.
        description: The description for the schema.
        ref_template: The format string used for generating $ref strings.
        schema_generator: The generator class used for creating the schema.

    Returns:
        A tuple where:

            - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and
                whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have
                JsonRef references to definitions that are defined in the second returned element.)
            - The second element is a JSON schema containing all definitions referenced in the first returned
                element, along with the optional title and description keys.
    """
    schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template)

    inputs_ = []
    for key, mode, adapter in inputs:
        # This is the same pattern we follow for model json schemas - we attempt a core schema rebuild if we detect a mock
        if isinstance(adapter.core_schema, _mock_val_ser.MockCoreSchema):
            adapter.core_schema.rebuild()
        assert not isinstance(adapter.core_schema, _mock_val_ser.MockCoreSchema), (
            'this is a bug! please report it'
        )
        inputs_.append((key, mode, adapter.core_schema))

    json_schemas_map, definitions = schema_generator_instance.generate_definitions(inputs_)

    json_schema: dict[str, Any] = {}
    if definitions:
        json_schema['$defs'] = definitions
    if title:
        json_schema['title'] = title
    if description:
        json_schema['description'] = description

    return json_schemas_map, json_schema
```
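A sketch of combining several adapters into one schema-generation pass. The string keys 'int' and 'str' here are arbitrary labels chosen for illustration:

```python
from pydantic import TypeAdapter

int_ta = TypeAdapter(int)
str_ta = TypeAdapter(str)

# Each input is (key, mode, adapter); (key, mode) becomes the lookup key in the result map.
schemas_map, defs_schema = TypeAdapter.json_schemas(
    [('int', 'validation', int_ta), ('str', 'validation', str_ta)],
    title='My schemas',
)

assert schemas_map[('int', 'validation')] == {'type': 'integer'}
# No shared definitions here, so defs_schema carries only the optional title:
assert defs_schema == {'title': 'My schemas'}
```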
