OPEA [Open Platform for Enterprise AI]
- 1k followers
- United States of America
- https://opea.dev
- https://wiki.lfaidata.foundation/display/DL/OPEA+Home
- info@opea.dev
OPEA is an open platform project that lets you create open, multi-provider, robust, and composable GenAI solutions that harness the best innovation across the ecosystem.
The OPEA platform includes:
- A detailed framework of composable building blocks for state-of-the-art generative AI systems, including LLMs, data stores, and prompt engines
- Architectural blueprints of the retrieval-augmented generative AI component stack structure and end-to-end workflows (a minimal sketch of such a pipeline follows this list)
- A four-step assessment for grading generative AI systems on performance, features, trustworthiness, and enterprise-grade readiness
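As a rough illustration of how these composable building blocks chain into a retrieval-augmented pipeline, the sketch below calls a sequence of OPEA-style microservices over HTTP. The host, ports, endpoint paths, and payload field names are illustrative assumptions rather than a definitive OPEA contract; real deployments define their own service interfaces.

```python
# Minimal sketch of a retrieval-augmented flow built from separate
# microservices. All hosts, ports, endpoints, and JSON field names below
# are assumptions for illustration; adjust them to your deployment.
import requests

HOST = "http://localhost"  # assumed: all services reachable on one machine

def rag_answer(question: str) -> str:
    # 1. Embed the user question (embedding microservice).
    emb = requests.post(f"{HOST}:6000/v1/embeddings",
                        json={"text": question}, timeout=30).json()

    # 2. Fetch related chunks from the vector store (retriever microservice).
    docs = requests.post(f"{HOST}:7000/v1/retrieval",
                         json={"text": question, "embedding": emb["embedding"]},
                         timeout=30).json()

    # 3. Reorder the chunks by relevance (reranking microservice).
    ranked = requests.post(f"{HOST}:8000/v1/reranking",
                           json={"initial_query": question,
                                 "retrieved_docs": docs["retrieved_docs"]},
                           timeout=30).json()

    # 4. Generate the final answer from the reranked context (LLM microservice).
    answer = requests.post(f"{HOST}:9000/v1/chat/completions",
                           json={"query": question,
                                 "documents": ranked["documents"]},
                           timeout=120).json()
    return answer["text"]

if __name__ == "__main__":
    print(rag_answer("What does the OPEA platform provide?"))
```

In the OPEA repositories, this same chaining is expressed by composing microservices into a single "megaservice" rather than by hand-written HTTP calls.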
Read more about OPEA at opea.dev and explore the OPEA technical documentation at opea-project.github.io.
Popular repositories
- GenAIExamples (Public): Generative AI Examples is a collection of GenAI examples, such as ChatQnA and Copilot, that illustrate the pipeline capabilities of the Open Platform for Enterprise AI (OPEA) project (a query sketch follows this list).
- GenAIComps (Public): GenAI components at the microservice level, plus a GenAI service composer for creating megaservices (a toy composition sketch appears at the end of this page).
- GenAIStudio (Public): GenAI Studio is a low-code platform that enables users to construct, evaluate, and benchmark GenAI applications. The platform also provides the capability to export a developed application as a ready-to-deploy package for immediate enterprise integration.
- Enterprise-RAG (Public): Intel® AI for Enterprise RAG converts enterprise data into actionable insights with excellent TCO. It uses Intel Gaudi AI accelerators and Intel Xeon processors to ensure streamlined deployment.
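For a sense of how the GenAIExamples pipelines are used once deployed, the snippet below posts a question to a running ChatQnA megaservice. The port, endpoint path, and payload shape shown here are assumptions based on the example's typical defaults and may differ in your deployment.

```python
# Query a deployed ChatQnA megaservice. The localhost:8888 address,
# /v1/chatqna path, and {"messages": ...} payload are assumed defaults;
# check the GenAIExamples ChatQnA README for your deployment's values.
import requests

response = requests.post(
    "http://localhost:8888/v1/chatqna",
    json={"messages": "What is the Open Platform for Enterprise AI?"},
    timeout=120,
)
response.raise_for_status()
print(response.text)  # the service may stream tokens or return plain text
```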
Repositories
- Enterprise-RAG (Public)
- Enterprise-Inference (Public): Intel® AI for Enterprise Inference optimizes AI inference services on Intel hardware using Kubernetes orchestration. It automates LLM model deployment, resource provisioning, and optimal settings to deliver faster inference, simplify processes, and reduce manual work.
- opea-project.github.io (Public)
- GenAIComps (Public)
- GenAIExamples (Public)
- GenAIStudio (Public)
- LangChain-OPEA (Public)
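The GenAIComps description above mentions a GenAI service composer that turns individual components into a megaservice. The toy sketch below mirrors that composition idea with in-process callables instead of real HTTP microservices; it is not the GenAIComps API, only an illustration of wiring steps into a flow.

```python
# Toy illustration of composing microservice steps into a "megaservice".
# This is NOT the GenAIComps API; it only mirrors the concept of adding
# components and declaring the flow between them.
from typing import Callable, Dict, List, Optional

class MegaService:
    """Compose named steps into a flow and run a request through it."""

    def __init__(self) -> None:
        self.steps: Dict[str, Callable[[dict], dict]] = {}
        self.flows: Dict[str, List[str]] = {}
        self.entry: Optional[str] = None

    def add(self, name: str, step: Callable[[dict], dict]) -> "MegaService":
        if self.entry is None:
            self.entry = name  # first step added becomes the entry point
        self.steps[name] = step
        return self

    def flow_to(self, src: str, dst: str) -> "MegaService":
        self.flows.setdefault(src, []).append(dst)
        return self

    def run(self, payload: dict) -> dict:
        name = self.entry
        while name is not None:
            payload = self.steps[name](payload)
            nxt = self.flows.get(name, [])
            name = nxt[0] if nxt else None  # toy: follow the first edge only
        return payload

# Stand-ins for microservices; in OPEA these would be separate services.
embed    = lambda p: {**p, "embedding": [0.1, 0.2, 0.3]}
retrieve = lambda p: {**p, "docs": ["a document about OPEA"]}
rerank   = lambda p: {**p, "docs": p["docs"][:1]}
generate = lambda p: {**p, "answer": f"Answer grounded in: {p['docs'][0]}"}

mega = (MegaService()
        .add("embedding", embed)
        .add("retriever", retrieve)
        .add("reranker", rerank)
        .add("llm", generate))
mega.flow_to("embedding", "retriever") \
    .flow_to("retriever", "reranker") \
    .flow_to("reranker", "llm")

print(mega.run({"query": "What is OPEA?"})["answer"])
```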