Pydantic Evals report can't be displayed if logfire is not installed #2209

Status: Open
Assignee: Kludex
Opened by: @snakingfire

Initial Checks

Description

Similar to: #1399

When attempting to print the report object from an evaluation run, if logfire is not installed, the report will fail to render with this exception:

  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 258, in __str__    table = self.console_table()            ^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 252, in console_table    return renderer.build_table(self)           ^^^^^^^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 923, in build_table    table.add_row(*case_renderer.build_row(case))                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 596, in build_row    row.append(self._render_durations(case))               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 739, in _render_durations    return self._render_dict(           ^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 791, in _render_dict    rendered = renderers[key].render_value(key if include_names else None, val)               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 403, in render_value    result = self._get_value_str(v)             ^^^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/__init__.py", line 475, in _get_value_str    return self.value_formatter(value)           ^^^^^^^^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/render_numbers.py", line 104, in default_render_duration    return _render_duration(seconds, False)           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^  File "venv/lib/python3.11/site-packages/pydantic_evals/reporting/render_numbers.py", line 174, in _render_duration    if (abs_seconds := abs(seconds)) < 1e-3:       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^TypeError: '<' not supported between instances of 'MagicMock' and 'float'

This is because the task_duration and total_duration attributes of the ReportCase are MagicMock objects, which appear to be returned at some point while the spans are processed to calculate those durations.

This only happens when the logfire package is not installed. When it is installed, the duration values are populated and the report renders correctly.
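
For reference, the TypeError at the bottom of the traceback can be reproduced directly with a bare MagicMock, which is what ends up being passed to the duration formatter. This is a minimal illustration only, not pydantic-evals code:

from unittest.mock import MagicMock

# A MagicMock's __lt__ returns NotImplemented by default, so comparing it to a
# float raises the same TypeError that _render_duration hits.
seconds = MagicMock()
try:
    _ = seconds < 1e-3
except TypeError as exc:
    print(exc)  # '<' not supported between instances of 'MagicMock' and 'float'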

Example Code

# Create dataset
dataset = create_dataset(...)  # Create any valid Pydantic Evals dataset

report = await dataset.evaluate()

# Print the report
print("\n=== Evaluation Results ===")
print(report)  # Error occurs on this line
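
Until this is fixed, one possible workaround (a sketch only, assuming the report object from the snippet above) is to skip the table rendering when logfire is not importable, using only the standard library:

import importlib.util

print("\n=== Evaluation Results ===")
if importlib.util.find_spec("logfire") is not None:
    # Durations are populated, so the table renders normally.
    print(report)
else:
    # Avoid the MagicMock TypeError by not rendering the table.
    print("logfire is not installed; skipping report table rendering")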

Python, Pydantic AI & LLM client version

Python 3.11
pydantic-ai == 0.4.2
pydantic-evals == 0.4.2

