async-openai

Async Rust library for OpenAI

Logo created by this repo itself

Overview

async-openai is an unofficial Rust library for OpenAI.

  • It's based on the OpenAI OpenAPI spec
  • Current features:
    • Assistants (v2)
    • Audio
    • Batch
    • Chat
    • Completions (Legacy)
    • Embeddings
    • Files
    • Fine-Tuning
    • Images
    • Models
    • Moderations
    • Organizations | Administration (partially implemented)
    • Realtime (Beta) (partially implemented)
    • Responses (partially implemented)
    • Uploads
  • Bring your own custom types for Request or Response objects.
  • SSE streaming on available APIs (see the streaming sketch after this list)
  • Requests (except SSE streaming), including form submissions, are retried with exponential backoff when rate limited.
  • Ergonomic builder pattern for all request objects.
  • Microsoft Azure OpenAI Service (only for APIs matching OpenAI spec)
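
As referenced in the feature list above, here is a minimal streaming sketch (not from the original README). It assumes OPENAI_API_KEY is set as described in the Usage section below, the model name is only a placeholder, and the `futures` crate is added for StreamExt; type and method names (CreateChatCompletionRequestArgs, create_stream) follow the crate's chat API as I understand it, so treat this as a sketch rather than canonical usage.

```rust
use async_openai::{
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};
use futures::StreamExt; // assumes the `futures` crate is a dependency

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Reads OPENAI_API_KEY from the environment (see Usage below).
    let client = Client::new();

    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-4o-mini") // placeholder model name
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Write one line about Rust.")
            .build()?
            .into()])
        .build()?;

    // create_stream yields one item per SSE chunk; print deltas as they arrive.
    let mut stream = client.chat().create_stream(request).await?;
    while let Some(result) = stream.next().await {
        let chunk = result?;
        if let Some(delta) = chunk.choices.first().and_then(|c| c.delta.content.as_deref()) {
            print!("{delta}");
        }
    }
    println!();

    Ok(())
}
```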

Usage

The library reads the API key from the environment variable OPENAI_API_KEY.

```bash
# On macOS/Linux
export OPENAI_API_KEY='sk-...'

# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
```
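
If you prefer not to rely on the environment variable, the key can also be supplied programmatically. A minimal sketch using the crate's OpenAIConfig builder (method names as I recall them; verify against the config module):

```rust
use async_openai::{config::OpenAIConfig, Client};

// Build a config with an explicit API key instead of OPENAI_API_KEY.
let config = OpenAIConfig::new().with_api_key("sk-...");

// A client that uses this config for every request.
let client = Client::with_config(config);
```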

Realtime API

Only types for the Realtime API are implemented, and they can be enabled with the feature flag realtime. These types were written before OpenAI released official specs.

Image Generation Example

```rust
use async_openai::{
    types::{CreateImageRequestArgs, ImageResponseFormat, ImageSize},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // create client, reads OPENAI_API_KEY environment variable for API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ImageResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to ./data directory.
    // Each url is downloaded and saved in dedicated Tokio task.
    // Directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
```

Scaled up for README, actual size 256x256

Bring Your Own Types

Enable methods whose inputs and outputs are generic with the byot feature. It creates a new method with the same name and a _byot suffix.

For example, to use serde_json::Value as the request and response type:

```rust
let response: Value = client
    .chat()
    .create_byot(json!({
        "messages": [
            {
                "role": "developer",
                "content": "You are a helpful assistant"
            },
            {
                "role": "user",
                "content": "What do you think about life?"
            }
        ],
        "model": "gpt-4o",
        "store": false
    }))
    .await?;
```

This can be useful in many scenarios:

  • To use this library with other OpenAI compatible APIs whose types don't exactly match OpenAI.
  • Extend existing types in this crate with new fields using serde (see the sketch after this list).
  • To avoid verbose types.
  • To escape deserialization errors.
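
As referenced in the list above, here is a sketch (not from the original README) of pairing byot with your own trimmed-down response type. The structs and field selection are hypothetical; only the fields you declare are deserialized.

```rust
use async_openai::Client;
use serde::Deserialize;
use serde_json::json;

// Hypothetical response types that keep only the fields this caller needs.
#[derive(Debug, Deserialize)]
struct SlimMessage {
    content: Option<String>,
}

#[derive(Debug, Deserialize)]
struct SlimChoice {
    message: SlimMessage,
}

#[derive(Debug, Deserialize)]
struct SlimChatResponse {
    choices: Vec<SlimChoice>,
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new();

    // With the byot feature, the request is any serializable value and the
    // response deserializes into our own type instead of the crate's.
    let response: SlimChatResponse = client
        .chat()
        .create_byot(json!({
            "model": "gpt-4o",
            "messages": [{ "role": "user", "content": "Say hello" }]
        }))
        .await?;

    if let Some(content) = response
        .choices
        .first()
        .and_then(|c| c.message.content.as_deref())
    {
        println!("{content}");
    }

    Ok(())
}
```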

Visit the examples/bring-your-own-type directory to learn more.

Dynamic Dispatch for Different Providers

For any struct that implements the Config trait, you can wrap it in a smart pointer and cast the pointer to a dyn Config trait object; then your client can accept any wrapped configuration type.

For example,

```rust
use async_openai::{Client, config::Config, config::OpenAIConfig};

let openai_config = OpenAIConfig::default();

// You can use `std::sync::Arc` to wrap the config as well
let config = Box::new(openai_config) as Box<dyn Config>;
let client: Client<Box<dyn Config>> = Client::with_config(config);
```
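
One place this comes in handy is selecting the provider at runtime behind a single client type. The sketch below is illustrative only: make_client is a hypothetical helper, AzureConfig and its with_* builder methods are taken from the crate's config module as I recall them, and the endpoint values are placeholders you would fill in.

```rust
use async_openai::{
    config::{AzureConfig, Config, OpenAIConfig},
    Client,
};

// Hypothetical helper: pick a provider at runtime behind one client type.
fn make_client(use_azure: bool) -> Client<Box<dyn Config>> {
    let config: Box<dyn Config> = if use_azure {
        Box::new(
            AzureConfig::new()
                .with_api_base("https://<your-resource>.openai.azure.com") // placeholder
                .with_deployment_id("<your-deployment>") // placeholder
                .with_api_version("<api-version>") // placeholder
                .with_api_key("<azure-key>"), // placeholder
        )
    } else {
        // Falls back to OPENAI_API_KEY from the environment.
        Box::new(OpenAIConfig::new())
    };
    Client::with_config(config)
}
```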

Contributing

Thank you for taking the time to contribute and improve the project. I'd be happy to have you!

All forms of contribution, such as new feature requests, bug fixes, issues, documentation, testing, comments, examples, etc., are welcome.

A good starting point would be to look at existing open issues.

To maintain the quality of the project, the following is the minimum required for a code contribution:

  • Names & Documentation: All struct names, field names and doc comments are from OpenAPI spec. Nested objects in spec without names leaves room for making appropriate name.
  • Tested: For changes supporting test(s) and/or example is required. Existing examples, doc tests, unit tests, and integration tests should be made to work with the changes if applicable.
  • Scope: Keep scope limited to APIs available in official documents such asAPI Reference orOpenAPI spec. Other LLMs or AI Providers offer OpenAI-compatible APIs, yet they may not always have full parity. In such cases, the OpenAI spec takes precedence.
  • Consistency: Keep code style consistent across all the "APIs" that library exposes; it creates a great developer experience.

This project adheres to the Rust Code of Conduct.

Complementary Crates

  • openai-func-enums provides procedural macros that make it easier to use this library with the OpenAI API's tool calling feature. It also provides derive macros you can add to existing clap application subcommands for natural language use of command line tools. It also supports OpenAI's parallel tool calls and allows you to choose between running multiple tool calls concurrently or on their own OS threads.
  • async-openai-wasm provides WASM support.

License

This project is licensed under the MIT license.

