deepset-ai/haystack

[Logo: Haystack, by deepset]

Haystack is an end-to-end LLM framework that allows you to build applications powered by LLMs, Transformer models, vector search and more. Whether you want to perform retrieval-augmented generation (RAG), document search, question answering or answer generation, Haystack can orchestrate state-of-the-art embedding models and LLMs into pipelines to build end-to-end NLP applications and solve your use case.

Installation

The simplest way to get Haystack is via pip:

pip install haystack-ai

Install from the main branch to try the newest features:

pip install git+https://github.com/deepset-ai/haystack.git@main

Haystack supports multiple installation methods including Docker images. For a comprehensive guide, please refer to the documentation.

Documentation

If you're new to the project, check out "What is Haystack?", then go through the "Get Started Guide" and build your first LLM application in a matter of minutes. Keep learning with the tutorials. For more advanced use cases, or just to get some inspiration, you can browse our Haystack recipes in the Cookbook.

At any point, check the documentation to learn more about Haystack, what it can do for you, and the technology behind it.

Features

Important

You are currently looking at the readme of Haystack 2.0. We are still maintaining Haystack 1.x to give everyone enough time to migrate to 2.0. Switch to Haystack 1.x here.

  • Technology agnostic: Allow users the flexibility to decide what vendor or technology they want and make it easy to switch out any component for another. Haystack allows you to use and compare models available from OpenAI, Cohere and Hugging Face, as well as your own local models or models hosted on Azure, Bedrock and SageMaker.
  • Explicit: Make it transparent how different moving parts can “talk” to each other so it's easier to fit your tech stack and use case.
  • Flexible: Haystack provides all tooling in one place: database access, file conversion, cleaning, splitting, training, eval, inference, and more. And whenever custom behavior is desirable, it's easy to create custom components.
  • Extensible: Provide a uniform and easy way for the community and third parties to build their own components and foster an open ecosystem around Haystack (a minimal custom-component sketch follows this list).
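
To illustrate the last two points, here is a minimal sketch of a custom component, assuming the Haystack 2.x @component decorator; the class name and its input/output fields are made up for illustration:

from haystack import component

@component
class WelcomeTextGenerator:
    """Toy custom component that greets a user by name (illustrative only)."""

    @component.output_types(welcome_text=str)
    def run(self, name: str):
        # A component's run() returns a dict whose keys match the declared output types.
        return {"welcome_text": f"Welcome to Haystack, {name}!"}

A component defined this way can be run on its own (WelcomeTextGenerator().run(name="Ada")) or wired into a pipeline with add_component and connect, just like the built-in components.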

Some examples of what you can do with Haystack:

  • Build retrieval-augmented generation (RAG) by making use of one of the available vector databases and customizing your LLM interaction (see the pipeline sketch after this list); the sky is the limit 🚀
  • Perform question answering in natural language to find granular answers in your documents.
  • Perform semantic search and retrieve documents according to meaning.
  • Build applications that make complex decisions to answer complex queries: for example, systems that resolve involved customer requests or search knowledge across many disconnected resources.
  • Scale to millions of docs using retrievers and production-scale components.
  • Use off-the-shelf models or fine-tune them to your data.
  • Use user feedback to evaluate, benchmark, and continuously improve your models.
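
As a concrete example of the RAG bullet above, here is a minimal pipeline sketch, assuming the Haystack 2.x API with the in-memory document store, the BM25 retriever and the OpenAI generator (an OPENAI_API_KEY must be set); the documents, prompt template and model name are placeholders:

from haystack import Document, Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

# A small in-memory store stands in for a real vector database.
document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="Haystack is an LLM orchestration framework by deepset."),
    Document(content="Pipelines connect components such as retrievers and generators."),
])

template = """
Answer the question using only the context below.
Context:
{% for doc in documents %}{{ doc.content }}
{% endfor %}
Question: {{ question }}
Answer:
"""

pipeline = Pipeline()
pipeline.add_component("retriever", InMemoryBM25Retriever(document_store=document_store))
pipeline.add_component("prompt_builder", PromptBuilder(template=template))
pipeline.add_component("generator", OpenAIGenerator(model="gpt-4o-mini"))  # model name is a placeholder
pipeline.connect("retriever", "prompt_builder.documents")
pipeline.connect("prompt_builder", "generator")

question = "What does Haystack do?"
result = pipeline.run({"retriever": {"query": question}, "prompt_builder": {"question": question}})
print(result["generator"]["replies"][0])

Swapping the generator or the document store for another vendor only changes the component you plug in; the pipeline wiring stays the same.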

Tip

Are you looking for a managed solution that benefits from Haystack? deepset Cloud is our fully managed, end-to-end platform to integrate LLMs with your data, which uses Haystack for its LLM pipeline architecture.

Tip

Would you like to deploy and serve Haystack pipelines as REST APIs yourself? Hayhooks provides a simple way to wrap your pipelines with custom logic and expose them via HTTP endpoints, including OpenAI-compatible chat completion endpoints and compatibility with fully-featured chat interfaces like open-webui.
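
Hayhooks itself is not shown here, but a common first step when moving a pipeline into a serving setup is to serialize it. The sketch below only uses Haystack's own YAML serialization (Pipeline.dump / Pipeline.load); whether a given serving setup consumes this YAML directly is an assumption to check against the Hayhooks docs:

from haystack import Pipeline

pipeline = Pipeline()  # assume components were added as in the RAG sketch above

# Write the pipeline definition to YAML...
with open("rag_pipeline.yaml", "w") as f:
    pipeline.dump(f)

# ...and restore it elsewhere, e.g. in a serving process.
with open("rag_pipeline.yaml") as f:
    restored = Pipeline.load(f)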

🆕 deepset Studio: Your Development Environment for Haystack

Use deepset Studio to visually create, deploy, and test your Haystack pipelines. Learn more about it in our announcement post.

[Screenshot: deepset Studio]

👉 Sign up!

Telemetry

Haystack collects anonymous usage statistics of pipeline components. We receive an event every time these components are initialized. This way, we know which components are most relevant to our community.

Read more about telemetry in Haystack, and how you can opt out, in the Haystack docs.
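
For example, assuming the telemetry switch documented for Haystack 2.x (the HAYSTACK_TELEMETRY_ENABLED environment variable; double-check the name against the docs linked above), opting out could look like this:

import os

# Disable anonymous usage statistics before importing Haystack components.
os.environ["HAYSTACK_TELEMETRY_ENABLED"] = "False"

from haystack import Pipeline  # telemetry stays off for this process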

🖖 Community

If you have a feature request or a bug report, feel free to open an issue on GitHub. We regularly check these and you can expect a quick response. If you'd like to discuss a topic, or get more general advice on how to make Haystack work for your project, you can start a thread in GitHub Discussions or our Discord channel. We also check 𝕏 (Twitter) and Stack Overflow.

Contributing to Haystack

We are very open to the community's contributions - be it a quick fix of a typo or a completely new feature! You don't need to be a Haystack expert to provide meaningful improvements. To learn how to get started, check out our Contributor Guidelines first.

There are several ways you can contribute to Haystack.

Who Uses Haystack

Here's a list of projects and companies using Haystack. Want to add yours? Open a PR, add it to the list and let the world know that you use Haystack!
