[Feature Request] Example of using onnxruntime models within .net TPL data pipelines #16148

Open
Labels
feature request (request for unsupported feature or enhancement)
@vadimkantorov

Description


Describe the feature request

I have an existing TPL Dataflow pipeline (https://learn.microsoft.com/en-us/dotnet/api/system.threading.tasks.dataflow.dataflowblockoptions.taskscheduler?view=net-7.0, https://github.com/dotnet/runtime/tree/main/src/libraries/System.Threading.Tasks.Dataflow). I'd like to use CPU-backed ONNX models in its threads.

How can I do it properly?

I guess there should be some custom per-thread initialization function that loads the ONNX model on each ThreadPool thread?

If the per-item functions are lightweight, then it's critical that every thread loads its own ONNX model only once and reuses it, so that model loading doesn't dominate the work.
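A minimal sketch of one way this could look, assuming the `Microsoft.ML.OnnxRuntime` and `System.Threading.Tasks.Dataflow` NuGet packages: a `TransformBlock` runs inference with `MaxDegreeOfParallelism > 1`, and a `ThreadLocal<InferenceSession>` ensures each worker thread loads the model at most once. The model path `model.onnx` and the input name `"input"` are placeholders. (ONNX Runtime documents `InferenceSession.Run` as safe for concurrent calls, so sharing a single session across all threads may also be sufficient; the `ThreadLocal` variant below matches the per-thread loading described above.)

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

class OnnxDataflowSketch
{
    static async Task Main()
    {
        const string modelPath = "model.onnx"; // placeholder path

        // One InferenceSession per worker thread, created lazily on first use.
        // trackAllValues lets us dispose every per-thread session at shutdown.
        using var localSession = new ThreadLocal<InferenceSession>(
            () => new InferenceSession(modelPath), trackAllValues: true);

        var inferBlock = new TransformBlock<float[], float[]>(features =>
        {
            var session = localSession.Value; // loads the model at most once per thread

            var input = new DenseTensor<float>(features, new[] { 1, features.Length });
            using var results = session.Run(new[]
            {
                NamedOnnxValue.CreateFromTensor("input", input) // "input" is an assumed input name
            });
            return results.First().AsEnumerable<float>().ToArray();
        },
        new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = Environment.ProcessorCount });

        // Push a few dummy items through and drain the outputs.
        for (int i = 0; i < 8; i++)
            inferBlock.Post(new float[] { i, i + 1, i + 2 });
        inferBlock.Complete();

        while (await inferBlock.OutputAvailableAsync())
            Console.WriteLine((await inferBlock.ReceiveAsync())[0]);
        await inferBlock.Completion;

        // Dispose every per-thread session that was actually created.
        foreach (var s in localSession.Values)
            s.Dispose();
    }
}
```

If the existing pipeline sets a custom `DataflowBlockOptions.TaskScheduler`, the same approach should still apply, since the `ThreadLocal` value is keyed to whichever thread actually executes the block's delegate.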

Describe scenario use case

Migrating an existing TPL pipeline to use Python/ONNX-exported graphs in its components.
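For the migration scenario, a sketch of how an ONNX-backed block might be wired into an existing Dataflow graph; `Preprocess`, `Postprocess`, and `runModel` are hypothetical placeholders standing in for existing pipeline components and the inference delegate from the sketch above:

```csharp
using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

class PipelineWiringSketch
{
    // Placeholder stages standing in for the existing pipeline's components.
    static float[] Preprocess(string record) => new float[] { record.Length };
    static string Postprocess(float[] scores) => $"score={scores[0]}";

    static async Task Main()
    {
        // Placeholder for the ONNX-backed delegate shown in the previous sketch.
        Func<float[], float[]> runModel = features => features;

        var preprocess  = new TransformBlock<string, float[]>(Preprocess);
        var infer       = new TransformBlock<float[], float[]>(runModel,
            new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = Environment.ProcessorCount });
        var postprocess = new TransformBlock<float[], string>(Postprocess);
        var report      = new ActionBlock<string>(Console.WriteLine);

        var link = new DataflowLinkOptions { PropagateCompletion = true };
        preprocess.LinkTo(infer, link);
        infer.LinkTo(postprocess, link);
        postprocess.LinkTo(report, link);

        foreach (var line in new[] { "a", "bb", "ccc" })
            preprocess.Post(line);
        preprocess.Complete();

        await report.Completion;
    }
}
```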

