# JSON

## JSON Parsing
API Documentation

- `pydantic.main.BaseModel.model_validate_json`
- `pydantic.type_adapter.TypeAdapter.validate_json`
- `pydantic_core.from_json`
Pydantic provides builtin JSON parsing, which helps achieve:

- Significant performance improvements without the cost of using a 3rd party library
- Support for custom errors
- Support for `strict` specifications
Here's an example of Pydantic's builtin JSON parsing via the `model_validate_json` method, showcasing the support for `strict` specifications while parsing JSON data that doesn't match the model's type annotations:
```python
from datetime import date

from pydantic import BaseModel, ConfigDict, ValidationError


class Event(BaseModel):
    model_config = ConfigDict(strict=True)

    when: date
    where: tuple[int, int]


json_data = '{"when": "1987-01-28", "where": [51, -1]}'
print(Event.model_validate_json(json_data))  # (1)!
#> when=datetime.date(1987, 1, 28) where=(51, -1)

try:
    Event.model_validate({'when': '1987-01-28', 'where': [51, -1]})  # (2)!
except ValidationError as e:
    print(e)
    """
    2 validation errors for Event
    when
      Input should be a valid date [type=date_type, input_value='1987-01-28', input_type=str]
    where
      Input should be a valid tuple [type=tuple_type, input_value=[51, -1], input_type=list]
    """
```
1. JSON has no `date` or tuple types, but Pydantic knows that, so it allows strings and arrays as inputs respectively when parsing JSON directly.
2. If you pass the same values to the `model_validate` method, Pydantic will raise a validation error because the `strict` configuration is enabled.
In v2.5.0 and above, Pydantic uses `jiter`, a fast and iterable JSON parser, to parse JSON data. Using `jiter` compared to `serde` results in modest performance improvements that will get even better in the future.

The `jiter` JSON parser is almost entirely compatible with the `serde` JSON parser, with one noticeable enhancement being that `jiter` supports deserialization of `inf` and `NaN` values. In the future, `jiter` is intended to enable validation errors to include the location in the original JSON input which contained the invalid value.
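As a minimal sketch of that enhancement, the example below parses the non-standard `NaN` and `Infinity` tokens with `pydantic_core.from_json`; it assumes the parser's `allow_inf_nan` option (enabled by default) is available, so treat it as illustrative rather than definitive:

```python
from pydantic_core import from_json

# `NaN` and `Infinity` are not part of standard JSON, but jiter can parse them
# into Python's float('nan') and float('inf') when allow_inf_nan is enabled.
data = from_json('{"value": NaN, "limit": Infinity}', allow_inf_nan=True)
print(data)
#> {'value': nan, 'limit': inf}
```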
### Partial JSON Parsing
Starting in v2.7.0, Pydantic's JSON parser offers support for partial JSON parsing, which is exposed via `pydantic_core.from_json`. Here's an example of this feature in action:
```python
from pydantic_core import from_json

partial_json_data = '["aa", "bb", "c'  # (1)!

try:
    result = from_json(partial_json_data, allow_partial=False)
except ValueError as e:
    print(e)  # (2)!
    #> EOF while parsing a string at line 1 column 15

result = from_json(partial_json_data, allow_partial=True)
print(result)  # (3)!
#> ['aa', 'bb']
```
1. The JSON list is incomplete - it's missing a closing `"]`.
2. When `allow_partial` is set to `False` (the default), a parsing error occurs.
3. When `allow_partial` is set to `True`, part of the input is deserialized successfully.
This also works for deserializing partial dictionaries. For example:
```python
from pydantic_core import from_json

partial_dog_json = '{"breed": "lab", "name": "fluffy", "friends": ["buddy", "spot", "rufus"], "age'

dog_dict = from_json(partial_dog_json, allow_partial=True)
print(dog_dict)
#> {'breed': 'lab', 'name': 'fluffy', 'friends': ['buddy', 'spot', 'rufus']}
```
Validating LLM Output

This feature is particularly beneficial for validating LLM outputs. We've written some blog posts about this topic, which you can find here.
In future versions of Pydantic, we expect to expand support for this feature through either Pydantic's other JSON validation functions (`pydantic.main.BaseModel.model_validate_json` and `pydantic.type_adapter.TypeAdapter.validate_json`) or model configuration. Stay tuned 🚀!

For now, you can use `pydantic_core.from_json` in combination with `pydantic.main.BaseModel.model_validate` to achieve the same result. Here's an example:
```python
from pydantic_core import from_json

from pydantic import BaseModel


class Dog(BaseModel):
    breed: str
    name: str
    friends: list


partial_dog_json = '{"breed": "lab", "name": "fluffy", "friends": ["buddy", "spot", "rufus"], "age'
dog = Dog.model_validate(from_json(partial_dog_json, allow_partial=True))
print(repr(dog))
#> Dog(breed='lab', name='fluffy', friends=['buddy', 'spot', 'rufus'])
```
Tip
For partial JSON parsing to work reliably, all fields on the model should have default values.
Check out the following example for a more in-depth look at how to use default values with partial JSON parsing:
Using default values with partial JSON parsing
```python
from typing import Annotated, Any, Optional

import pydantic_core

from pydantic import BaseModel, ValidationError, WrapValidator


def default_on_error(v, handler) -> Any:
    """
    Raise a PydanticUseDefault exception if the value is missing.

    This is useful for avoiding errors from partial JSON preventing successful validation.
    """
    try:
        return handler(v)
    except ValidationError as exc:
        # there might be other types of errors resulting from partial JSON parsing
        # that you allow here, feel free to customize as needed
        if all(e['type'] == 'missing' for e in exc.errors()):
            raise pydantic_core.PydanticUseDefault()
        else:
            raise


class NestedModel(BaseModel):
    x: int
    y: str


class MyModel(BaseModel):
    foo: Optional[str] = None
    bar: Annotated[Optional[tuple[str, int]], WrapValidator(default_on_error)] = None
    nested: Annotated[Optional[NestedModel], WrapValidator(default_on_error)] = None


m = MyModel.model_validate(
    pydantic_core.from_json('{"foo": "x", "bar": ["world",', allow_partial=True)
)
print(repr(m))
#> MyModel(foo='x', bar=None, nested=None)

m = MyModel.model_validate(
    pydantic_core.from_json(
        '{"foo": "x", "bar": ["world", 1], "nested": {"x":', allow_partial=True
    )
)
print(repr(m))
#> MyModel(foo='x', bar=('world', 1), nested=None)
```
### Caching Strings
Starting in v2.7.0, Pydantic's JSON parser offers support for configuring how Python strings are cached during JSON parsing and validation (when Python strings are constructed from Rust strings during Python validation, e.g. after `strip_whitespace=True`). The `cache_strings` setting is exposed via both model config and `pydantic_core.from_json`.
The `cache_strings` setting can take any of the following values:

- `True` or `'all'` (the default): cache all strings
- `'keys'`: cache only dictionary keys; this only applies when used with `pydantic_core.from_json` or when parsing JSON using `Json`
- `False` or `'none'`: no caching
Using the string caching feature results in performance improvements, but increases memory usage slightly.
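Here's a minimal sketch of both entry points, assuming the `cache_strings` keyword argument of `pydantic_core.from_json` and the model config key of the same name described above:

```python
from pydantic import BaseModel, ConfigDict
from pydantic_core import from_json

# Cache only dictionary keys when parsing with from_json directly
print(from_json('{"breed": "lab", "name": "fluffy"}', cache_strings='keys'))
#> {'breed': 'lab', 'name': 'fluffy'}


# Or disable string caching for a specific model via its config
class Dog(BaseModel):
    model_config = ConfigDict(cache_strings=False)

    breed: str
    name: str


print(Dog.model_validate_json('{"breed": "lab", "name": "fluffy"}'))
#> breed='lab' name='fluffy'
```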
String Caching Details

- Strings are cached using a fully associative cache with a size of 16,384.
- Only strings where `len(string) < 64` are cached.
- There is some overhead to looking up the cache, which is normally worth it to avoid constructing strings. However, if you know there will be very few repeated strings in your data, you might get a performance boost by disabling this setting with `cache_strings=False`.
## JSON Serialization
API Documentation

- `pydantic.main.BaseModel.model_dump_json`
- `pydantic.type_adapter.TypeAdapter.dump_json`
- `pydantic_core.to_json`
For more information on JSON serialization, see the Serialization Concepts page.
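As a quick illustration of the serialization counterparts listed above (a sketch of default Pydantic v2 behavior; see the Serialization page for the full picture), `model_dump_json` produces a compact JSON string, serializing dates as ISO 8601 strings and tuples as arrays:

```python
from datetime import date

from pydantic import BaseModel


class Event(BaseModel):
    when: date
    where: tuple[int, int]


event = Event(when=date(1987, 1, 28), where=(51, -1))
print(event.model_dump_json())
#> {"when":"1987-01-28","where":[51,-1]}
```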