Laminar - open-source all-in-one platform for engineering AI products. Create data flywheel for your AI app. Traces, Evals, Datasets, Labels. YC S24.

# Laminar

Laminar is the open-source platform for tracing and evaluating AI applications.

- **Tracing**
  - OpenTelemetry-based automatic tracing of common AI frameworks and SDKs (LangChain, OpenAI, Anthropic, ...) with just 2 lines of code (powered by OpenLLMetry).
  - Trace input/output, latency, cost, and token count.
  - Function tracing with the `observe` decorator/wrapper.
  - Image tracing.
- **Evals**
  - Run evals in parallel with a simple SDK.
- **Datasets**
  - Export production trace data to datasets.
  - Run evals on hosted datasets.
- **Built for scale**
  - Written in Rust 🦀
  - Traces are sent via gRPC, ensuring the best performance and lowest overhead.
- **Modern open-source stack**
  - RabbitMQ for the message queue, Postgres for data, ClickHouse for analytics.
- **Dashboards** for statistics / traces / evaluations / tags.
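To make the "input/output, latency, cost, token count" bullet concrete, here is an illustrative sketch of the kind of fields a single trace span carries. This is a hypothetical shape for intuition only, not Laminar's actual span schema, and the per-token prices in the example are made up:

```python
from dataclasses import dataclass, field, asdict


@dataclass
class SpanSketch:
    """Hypothetical shape of one trace span; NOT Laminar's real schema."""
    name: str           # e.g. the instrumented call ("openai.chat")
    input: dict         # request payload (model, messages, ...)
    output: str         # model response text
    latency_ms: float   # wall-clock duration of the call
    input_tokens: int
    output_tokens: int
    cost_usd: float = field(default=0.0)


span = SpanSketch(
    name="openai.chat",
    input={"model": "gpt-4o-mini",
           "messages": [{"role": "user", "content": "hi"}]},
    output="Hello!",
    latency_ms=412.3,
    input_tokens=8,
    output_tokens=3,
    # Hypothetical per-million-token pricing, purely for illustration.
    cost_usd=(8 * 0.15 + 3 * 0.60) / 1_000_000,
)

print(asdict(span)["name"])
```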


## Documentation

Check out the full documentation at [docs.lmnr.ai](https://docs.lmnr.ai).

## Getting started

The fastest and easiest way to get started is with our managed platform at [lmnr.ai](https://lmnr.ai).

### Self-hosting with Docker Compose

For a quick start, clone the repo and start the services with Docker Compose:

```sh
git clone https://github.com/lmnr-ai/lmnr
cd lmnr
docker compose up -d
```

This will spin up a lightweight version of the stack with Postgres, ClickHouse, app-server, and frontend, which is good for a quickstart or for lightweight usage. You can access the UI at http://localhost:5667 in your browser.

You will also need to properly configure the SDK, with `baseUrl` and the correct ports. See https://docs.lmnr.ai/self-hosting/setup.
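As a rough sketch, pointing the Python SDK at a self-hosted stack might look like the following. Note that the `base_url` parameter name here is an assumption on my part; consult https://docs.lmnr.ai/self-hosting/setup for the exact initializer options and port configuration:

```python
from lmnr import Laminar

# Hypothetical self-hosted configuration -- the `base_url` parameter
# name is an assumption; check https://docs.lmnr.ai/self-hosting/setup
# for the exact options and the ports exposed by your compose file.
Laminar.initialize(
    project_api_key="<LMNR_PROJECT_API_KEY>",
    base_url="http://localhost",  # assumed option pointing at the self-hosted stack
)
```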

For a production environment, we recommend using our [managed platform](https://lmnr.ai) or `docker compose -f docker-compose-full.yml up -d`.

docker-compose-full.yml is heavier, but it enables all the features:

- app-server – core Rust backend
- rabbitmq – message queue for reliable trace processing
- frontend – Next.js frontend and backend
- postgres – Postgres database for all the application data
- clickhouse – columnar OLAP database for more efficient trace and tag analytics

## Contributing

For running and building Laminar locally, or to learn more about the docker compose files, follow the Contributing guide.

## TS quickstart

First, create a project and generate a project API key. Then:

```sh
npm add @lmnr-ai/lmnr
```

This will install the Laminar TS SDK and all instrumentation packages (OpenAI, Anthropic, LangChain, ...).

To start tracing LLM calls, just add:

```javascript
import { Laminar } from '@lmnr-ai/lmnr';

Laminar.initialize({ projectApiKey: process.env.LMNR_PROJECT_API_KEY });
```

To trace inputs / outputs of functions, use the `observe` wrapper.

```javascript
import { OpenAI } from 'openai';
import { observe } from '@lmnr-ai/lmnr';

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const poemWriter = observe({ name: 'poemWriter' }, async (topic) => {
  const response = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: `write a poem about ${topic}` }],
  });
  return response.choices[0].message.content;
});

await poemWriter('laminar flow');
```

## Python quickstart

First, create a project and generate a project API key. Then:

```sh
pip install --upgrade 'lmnr[all]'
```

This will install the Laminar Python SDK and all instrumentation packages. See the list of all instruments in the docs.

To start tracing LLM calls, just add:

```python
from lmnr import Laminar

Laminar.initialize(project_api_key="<LMNR_PROJECT_API_KEY>")
```

To trace inputs / outputs of functions, use the `@observe()` decorator.

```python
import os

from openai import OpenAI
from lmnr import Laminar, observe

Laminar.initialize(project_api_key="<LMNR_PROJECT_API_KEY>")

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


@observe()  # annotate all functions you want to trace
def poem_writer(topic):
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "user", "content": f"write a poem about {topic}"},
        ],
    )
    poem = response.choices[0].message.content
    return poem


if __name__ == "__main__":
    print(poem_writer(topic="laminar flow"))
```
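Conceptually, an `@observe()`-style decorator just wraps the function, records its name, inputs, outputs, and latency, and hands that record to an exporter. The following is a minimal pure-Python sketch of that idea, not the real SDK's implementation; `record_span` is a hypothetical stand-in for the exporter that, in Laminar, would ship spans over gRPC/OpenTelemetry:

```python
import functools
import time

# Collected spans; in a real SDK these would be exported, not kept in a list.
SPANS = []


def record_span(span):
    """Hypothetical stand-in for the SDK's span exporter."""
    SPANS.append(span)


def observe_sketch(name=None):
    """Minimal sketch of an @observe()-style tracing decorator."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            # Record what the wrapped call did and how long it took.
            record_span({
                "name": name or fn.__name__,
                "input": {"args": args, "kwargs": kwargs},
                "output": result,
                "latency_s": time.perf_counter() - start,
            })
            return result
        return wrapper
    return decorator


@observe_sketch()
def add(a, b):
    return a + b


add(2, 3)
print(SPANS[0]["name"], SPANS[0]["output"])  # the span captured by the wrapper
```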

Running the code above will result in a trace of the `poem_writer` call with the nested LLM call inside it.

## Client libraries

To learn more about instrumenting your code, check out our client libraries: [`@lmnr-ai/lmnr`](https://www.npmjs.com/package/@lmnr-ai/lmnr) on npm and [`lmnr`](https://pypi.org/project/lmnr/) on PyPI.
