Integrate cutting-edge LLM technology quickly and easily into your apps
Semantic Kernel is an SDK that integrates Large Language Models (LLMs) like OpenAI, Azure OpenAI, and Hugging Face with conventional programming languages like C#, Python, and Java. Semantic Kernel achieves this by allowing you to define plugins that can be chained together in just a few lines of code.
What makes Semantic Kernel special, however, is its ability to automatically orchestrate plugins with AI. With Semantic Kernel planners, you can ask an LLM to generate a plan that achieves a user's unique goal. Afterwards, Semantic Kernel will execute the plan for the user.
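For example, prompt-based plugins can be defined and chained in a few lines. The snippet below is a minimal sketch using the Python SDK; it assumes the 1.x `semantic-kernel` package, an OpenAI key exported as `OPENAI_API_KEY`, and a placeholder model id.

```python
# Minimal sketch (assumes semantic-kernel 1.x and OPENAI_API_KEY in the environment).
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion


async def main() -> None:
    kernel = Kernel()
    # Register a chat model as the kernel's default AI service
    # (the model id below is a placeholder; the API key is read from the environment).
    kernel.add_service(OpenAIChatCompletion(ai_model_id="gpt-4o-mini"))

    # Define two prompt-based functions from plain strings.
    summarize = kernel.add_function(
        plugin_name="writer",
        function_name="summarize",
        prompt="Summarize the following text in one sentence: {{$input}}",
    )
    translate = kernel.add_function(
        plugin_name="writer",
        function_name="translate",
        prompt="Translate the following text to French: {{$input}}",
    )

    # Chain them: the summary becomes the input of the translation.
    summary = await kernel.invoke(summarize, input="Semantic Kernel is an SDK that ...")
    french = await kernel.invoke(translate, input=str(summary))
    print(french)


asyncio.run(main())
```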
It provides:
- abstractions for AI services (such as chat, text to images, audio to text, etc.) and memory stores
- implementations of those abstractions for services from OpenAI, Azure OpenAI, Hugging Face, local models, and more, and for a multitude of vector databases, such as those from Chroma, Qdrant, Milvus, and Azure
- a common representation for plugins, which can then be orchestrated automatically by AI
- the ability to create such plugins from a multitude of sources, including from OpenAPI specifications, prompts, and arbitrary code written in the target language (see the sketch after this list)
- extensible support for prompt management and rendering, including built-in handling of common formats like Handlebars and Liquid
- and a wealth of functionality layered on top of these abstractions, such as filters for responsible AI, dependency injection integration, and more.
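As an illustration of the plugin model, ordinary application code can be exposed as a plugin that the AI can call. The following is a sketch using the Python SDK (assuming the 1.x API; the `LightsPlugin` class and its function are purely illustrative, not part of the library):

```python
# Sketch of a native-code plugin (assumes semantic-kernel 1.x; names are illustrative).
from typing import Annotated

from semantic_kernel import Kernel
from semantic_kernel.functions import kernel_function


class LightsPlugin:
    """Arbitrary application code exposed to the AI as a plugin."""

    @kernel_function(name="toggle", description="Turn a light on or off.")
    def toggle(
        self,
        room: Annotated[str, "The room whose light should change"],
        on: Annotated[bool, "True to turn the light on"],
    ) -> str:
        # Real code would call your device or service API here.
        return f"Light in {room} is now {'on' if on else 'off'}."


kernel = Kernel()
kernel.add_plugin(LightsPlugin(), plugin_name="lights")
```

Once registered, such a function can be referenced from prompt templates (for example `{{lights.toggle}}`) or selected automatically when the kernel orchestrates plugins with AI.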
Semantic Kernel is used by enterprises for its flexibility, modularity, and observability. Backed by capabilities such as telemetry support, hooks, and filters, it helps you deliver responsible AI solutions at scale with confidence. Semantic Kernel was designed to be future-proof, easily connecting your code to the latest AI models and evolving with the technology as it advances. When new models are released, you can simply swap them out without rewriting your entire codebase.
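For instance, switching a kernel from OpenAI to Azure OpenAI is typically just a change to the service registration, with the rest of your code unchanged. A sketch, assuming the Python SDK's 1.x API, credentials read from environment variables, and a placeholder deployment name:

```python
# Sketch of swapping AI providers (assumes semantic-kernel 1.x; only the
# service registration changes, downstream kernel code stays the same).
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, OpenAIChatCompletion

kernel = Kernel()
# Before: OpenAI (key read from the environment; model id is a placeholder).
# kernel.add_service(OpenAIChatCompletion(ai_model_id="gpt-4o-mini"))
# After: Azure OpenAI (endpoint and key read from the environment; deployment name is a placeholder).
kernel.add_service(AzureChatCompletion(deployment_name="my-chat-deployment"))
```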
The Semantic Kernel SDK is available in C#, Python, and Java. To get started, choose your preferred language below. See the Feature Matrix for a breakdown of feature parity between our currently supported languages.
The quickest way to get started with the basics is to get an API key from either OpenAI or Azure OpenAI and to run one of the C#, Python, or Java console applications/scripts below; a minimal Python script is sketched after these steps.
For C#:
- Go to the Quick start page here and follow the steps to dive in.
- After installing the SDK, we advise you to follow the steps and code detailed there to write your first console app.

For Python:
- Go to the Quick start page here and follow the steps to dive in.
- You'll need to ensure that you toggle to Python in the "Choose a programming language" table at the top of the page.
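A minimal first Python script might look like the following. This is a sketch, assuming the 1.x `semantic-kernel` package and an OpenAI key exported as `OPENAI_API_KEY`; the model id is a placeholder.

```python
# Minimal console script (assumes semantic-kernel 1.x and OPENAI_API_KEY in the environment).
import asyncio

from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    # The model id is a placeholder; the API key is read from the environment.
    chat = OpenAIChatCompletion(ai_model_id="gpt-4o-mini")

    history = ChatHistory()
    history.add_user_message("Write a haiku about Semantic Kernel.")

    reply = await chat.get_chat_message_content(
        chat_history=history,
        settings=OpenAIChatPromptExecutionSettings(),
    )
    print(reply)


asyncio.run(main())
```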
The Java code is in the semantic-kernel-java repository. See semantic-kernel-java build for instructions on how to build and run the Java code.
Please file Java-specific Semantic Kernel issues in the semantic-kernel-java repository.
The fastest way to learn how to use Semantic Kernel is with our C# and Python Jupyter notebooks. These notebooks demonstrate how to use Semantic Kernel with code snippets that you can run with a push of a button.
Once you've finished the getting started notebooks, you can then check out the main walkthroughs on our Learn site. Each sample comes with a completed C# and Python project that you can run locally.
Finally, refer to our API references for more details on the C# and Python APIs:
- C# API reference
- Python API reference
- Java API reference (coming soon)
The Semantic Kernel extension for Visual Studio Code makes it easy to design and test semantic functions. The extension provides an interface for designing semantic functions and allows you to test them with the push of a button with your existing models and data.
We welcome your contributions and suggestions to the SK community! One of the easiest ways to participate is to engage in discussions in the GitHub repository. Bug reports and fixes are welcome!
For new features, components, or extensions, please open an issue and discuss with us before sending a PR. This is to avoid rejection, as we might be taking the core in a different direction, but also to consider the impact on the larger ecosystem.
To learn more and get started:
- Read the documentation
- Learn how to contribute to the project
- Ask questions in the GitHub discussions
- Ask questions in the Discord community
- Follow the team on our blog
This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the MIT license.