# Open-source observability for your LLM application
🎉 New: Our semantic conventions are now part of OpenTelemetry! Join the discussion and help us shape the future of LLM observability.
OpenLLMetry-JS is a set of extensions built on top of OpenTelemetry that gives you complete observability over your LLM application. Because it uses OpenTelemetry under the hood, it can be connected to your existing observability solutions - Datadog, Honeycomb, and others.
It's built and maintained by Traceloop under the Apache 2.0 license.
The repo contains standard OpenTelemetry instrumentations for LLM providers and Vector DBs, as well as a Traceloop SDK that makes it easy to get started with OpenLLMetry-JS, while still outputting standard OpenTelemetry data that can be connected to your observability stack. If you already have OpenTelemetry instrumented, you can just add any of our instrumentations directly.
## Getting Started

The easiest way to get started is to use our SDK. For a complete guide, go to our docs.

Install the SDK:

```bash
npm install --save @traceloop/node-server-sdk
```
Then, to start instrumenting your code, just add these two lines:

```js
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize();
```
Make sure to import the SDK before importing any LLM module.
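As a sketch, assuming you use the official `openai` package, a minimal ESM entry point could look like this (static imports are hoisted in ESM, so a dynamic `import` is one way to make the ordering explicit):

```js
import * as traceloop from "@traceloop/node-server-sdk";

// Initialize tracing before any LLM module is loaded.
traceloop.initialize();

// A dynamic import keeps the ordering explicit in ESM, where static
// imports are hoisted and would otherwise run before initialize().
const { default: OpenAI } = await import("openai");

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
```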
That's it. You're now tracing your code with OpenLLMetry-JS! If you're running this locally, you may want to disable batch sending, so you can see the traces immediately:

```js
traceloop.initialize({ disableBatch: true });
```
## Supported destinations

Now, you need to decide where to export the traces to.
- ✅ Traceloop
- ✅ Dynatrace
- ✅ Datadog
- ✅ New Relic
- ✅ Honeycomb
- ✅ Grafana Tempo
- ✅ HyperDX
- ✅ SigNoz
- ✅ Splunk
- ✅ OpenTelemetry Collector
See our docs for instructions on connecting to each one.
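As a hedged sketch, assuming your SDK version accepts a custom OpenTelemetry `SpanExporter` in its initialize options (check the docs for the exact option name in your version), sending traces to a local OpenTelemetry Collector could look like:

```js
import * as traceloop from "@traceloop/node-server-sdk";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Point the SDK at a local OpenTelemetry Collector instead of the default
// Traceloop endpoint. The `exporter` option is an assumption here; verify
// it against the docs for your SDK version.
traceloop.initialize({
  exporter: new OTLPTraceExporter({
    url: "http://localhost:4318/v1/traces",
  }),
});
```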
## What do we instrument?

OpenLLMetry-JS can instrument everything that OpenTelemetry already instruments - so things like your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your Vector DB like Pinecone, Chroma, or Weaviate. Once the SDK is initialized, these calls are traced with no extra code on your part - see the sketch after the list below.
- ✅ OpenAI
- ✅ Azure OpenAI
- ✅ Anthropic
- ✅ Cohere
- ⏳ Replicate
- ⏳ HuggingFace
- ✅ Vertex AI (GCP)
- ✅ Bedrock (AWS)
- ✅ Pinecone
- ✅ Chroma
- ✅ Qdrant
- ⏳ Weaviate
- ⏳ Milvus
- ✅ LangChain
- ✅ LlamaIndex
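For instance, once `traceloop.initialize()` has run, an ordinary OpenAI call is traced automatically. A minimal sketch, assuming the official `openai` package (the model name is just an example):

```js
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize({ disableBatch: true });

const { default: OpenAI } = await import("openai");
const openai = new OpenAI();

// This call is instrumented automatically; no tracing code needed here.
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Say hello" }],
});

console.log(completion.choices[0].message.content);
```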
## Telemetry

The SDK provided with OpenLLMetry (not the instrumentations) contains a telemetry feature that collects anonymous usage information.
You can opt out of telemetry by setting the `TRACELOOP_TELEMETRY` environment variable to `FALSE`.
- The primary purpose is to detect exceptions within instrumentations. Since LLM providers frequently update their APIs, this helps us quickly identify and fix any breaking changes.
- We only collect anonymous data, with no personally identifiable information. You can view exactly what data we collect in our Privacy documentation.
- Telemetry is only collected in the SDK. If you use the instrumentations directly without the SDK, no telemetry is collected.
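As a quick sketch of opting out in code (this assumes the variable is read when `initialize()` runs; setting it in your shell or deployment environment works in any case):

```js
import * as traceloop from "@traceloop/node-server-sdk";

// Opt out of the SDK's anonymous telemetry. Assumes TRACELOOP_TELEMETRY
// is read when initialize() runs; `export TRACELOOP_TELEMETRY=FALSE`
// in your shell achieves the same thing.
process.env.TRACELOOP_TELEMETRY = "FALSE";
traceloop.initialize();
```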
## Contributing

Whether it's big or small, we love contributions ❤️ Check out our guide to see how to get started.
Not sure where to get started? You can:
- Book a free pairing session with one of our teammates!
- Join our Slack, and ask us any questions there.
## Community

- Slack (For live discussion with the community and the Traceloop team)
- GitHub Discussions (For help with building and deeper conversations about features)
- GitHub Issues (For any bugs and errors you encounter using OpenLLMetry)
- Twitter (Get news fast)