ifsheldon/async-openai-wasm
Async Rust library for OpenAI on WASM

Overview

async-openai-wasm is a FORK of async-openai that supports WASM targets by targeting wasm32-unknown-unknown. That means >99% of the codebase should be attributed to the original project. Synchronization with the original project is and will be done manually whenever async-openai releases a new version. Versions are kept in sync with async-openai releases: when async-openai releases x.y.z, async-openai-wasm also releases an x.y.z version.

async-openai-wasm is an unofficial Rust library for OpenAI.

  • It's based on the OpenAI OpenAPI spec
  • Current features:
    • Assistants (v2)
    • Audio
    • Batch
    • Chat
    • Completions (Legacy)
    • Embeddings
    • Files
    • Fine-Tuning
    • Images
    • Models
    • Moderations
    • Organizations | Administration (partially implemented)
    • Realtime (Beta) (partially implemented)
    • Uploads
    • Responses (partially implemented)
    • WASM support
    • Reasoning Model Support: supports models like DeepSeek R1 via broader support for OpenAI-compatible endpoints; see examples/reasoning
  • SSE streaming on available APIs
  • Ergonomic builder pattern for all request objects.
  • Microsoft Azure OpenAI Service (only for APIs matching OpenAI spec)
  • Bring your own custom types for Request or Response objects.
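The APIs above share the same builder-pattern ergonomics, and streaming APIs deliver results over SSE. As an illustration only (the `create_stream` call and message helpers are assumed to mirror the upstream async-openai API, and the model name is a placeholder), a streamed chat completion might look like this:

```rust
use async_openai_wasm::{
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};
use futures::StreamExt;

// Hedged sketch: assumes OPENAI_API_KEY is set and that this fork
// mirrors upstream async-openai's streaming API; the model name is
// illustrative, not a recommendation.
async fn stream_chat() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new();

    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-4o-mini")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Say hello in one short sentence.")
            .build()?
            .into()])
        .build()?;

    // Each SSE event arrives as one stream item carrying a content delta.
    let mut stream = client.chat().create_stream(request).await?;
    while let Some(result) = stream.next().await {
        let response = result?;
        for choice in response.choices {
            if let Some(content) = choice.delta.content {
                print!("{content}");
            }
        }
    }
    Ok(())
}
```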

Note on Azure OpenAI Service (AOS): async-openai-wasm primarily implements the OpenAI spec and doesn't try to maintain parity with the AOS spec, just like async-openai.
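For the subset of APIs that do match the OpenAI spec, the client can be pointed at AOS through an Azure-specific configuration. A minimal sketch, assuming this fork keeps upstream async-openai's `AzureConfig` builder (the endpoint, API version, and deployment values are placeholders):

```rust
use async_openai_wasm::{config::AzureConfig, Client};

// Hedged sketch: all values below are placeholders, and `AzureConfig`
// is assumed to follow the upstream async-openai builder API.
let config = AzureConfig::new()
    .with_api_base("https://my-resource.openai.azure.com")
    .with_api_version("2024-02-01")
    .with_deployment_id("my-gpt4-deployment")
    .with_api_key("...");

let client = Client::with_config(config);
```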

Differences from async-openai

```diff
+ WASM support
+ WASM examples
+ Realtime API: does not bundle a specific WS implementation. You need to convert a client event into a WS message yourself, which is just `your_ws_impl::Message::Text(some_client_event.into_text())`
+ Broader support for OpenAI-compatible endpoints
+ Reasoning model support
- Tokio
- Non-wasm examples: please refer to the original project [async-openai](https://github.com/64bit/async-openai/)
- Builtin backoff retries: due to [this issue](https://github.com/ihrwein/backoff/issues/61)
-   Recommended: use `backon` with the `gloo-timers-sleep` feature instead
- File saving: `wasm32-unknown-unknown` on browsers doesn't have access to the filesystem
```

Usage

The library reads the API key from the environment variable OPENAI_API_KEY.

```shell
# On macOS/Linux
export OPENAI_API_KEY='sk-...'

# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
```
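Alternatively, the key can be supplied programmatically rather than via the environment. A sketch, assuming upstream async-openai's `OpenAIConfig` builder carries over to this fork (the base-URL override is optional and illustrative, e.g. for an OpenAI-compatible endpoint):

```rust
use async_openai_wasm::{config::OpenAIConfig, Client};

// Hedged sketch: pass the key (and optionally an OpenAI-compatible
// base URL) directly, without environment variables.
let config = OpenAIConfig::new()
    .with_api_key("sk-...")
    .with_api_base("https://api.openai.com/v1");

let client = Client::with_config(config);
```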

Realtime API

Only the types for the Realtime API are implemented; they can be enabled with the feature flag realtime. These types were written before OpenAI released official specs.

Again, the types do not bundle a specific WS implementation. You need to convert a client event into a WS message yourself, which is simply your_ws_impl::Message::Text(some_client_event.into_text()).
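Concretely, with a browser-friendly WS crate such as gloo-net the conversion could look like the sketch below. Only the `Message::Text(event.into_text())` conversion is taken from this README; the module path for the client-event type and the gloo-net wiring are assumptions:

```rust
use futures::SinkExt;
use gloo_net::websocket::{futures::WebSocket, Message};

// Hedged sketch: `ClientEvent` path is an assumption about this crate's
// module layout; swap in whichever realtime client event type you use.
use async_openai_wasm::types::realtime::ClientEvent;

async fn send_event(ws: &mut WebSocket, event: ClientEvent) {
    // `into_text()` serializes the client event to its JSON wire form;
    // errors are ignored for brevity in this sketch.
    let _ = ws.send(Message::Text(event.into_text())).await;
}
```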

Image Generation Example

```rust
use async_openai_wasm::{
    types::{CreateImageRequestArgs, ImageResponseFormat, ImageSize},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // create client, reads OPENAI_API_KEY environment variable for API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ImageResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai-wasm")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to ./data directory.
    // Each url is downloaded and saved in a dedicated Tokio task.
    // Directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
```

Scaled up for README, actual size 256x256

Dynamic Dispatch for Different Providers

For any struct that implements the Config trait, you can wrap it in a smart pointer and cast the pointer to a dyn Config trait object; then your client can accept any wrapped configuration type.

For example,

```rust
use async_openai_wasm::{config::Config, config::OpenAIConfig, Client};

let openai_config = OpenAIConfig::default();
// You can use `std::sync::Arc` to wrap the config as well
let config = Box::new(openai_config) as Box<dyn Config>;
let client: Client<Box<dyn Config>> = Client::with_config(config);
```

Contributing

This repo will only accept issues and PRs related to WASM support. For other issues and PRs, please visit the original project async-openai.

This project adheres to the Rust Code of Conduct.

Complimentary Crates

  • openai-func-enums provides procedural macros that make it easier to use this library with the OpenAI API's tool calling feature. It also provides derive macros you can add to existing clap application subcommands for natural-language use of command-line tools. It supports OpenAI's parallel tool calls and lets you choose between running multiple tool calls concurrently or on their own OS threads.

Why async-openai-wasm

Because I wanted to develop and release a crate that depends on the wasm feature in the experiments branch of async-openai, but the wasm feature was being stabilized at a different pace than I expected.

License

The additional modifications are licensed under the MIT license. The original project is also licensed under the MIT license.
