onlyphantom/llm-python


A set of instructional materials, code samples and Python scripts featuring LLMs (GPT etc) through interfaces like llamaindex, LangChain, OpenAI's Agent SDK, Chroma (Chromadb), Pinecone etc.

The code examples are aimed at helping you learn how to build LLM applications and Agents using Python. The code is designed to be self-contained and singularly focused, so you can pick and choose the usage patterns most relevant to your needs.

Many examples have accompanying videos on my YouTube channel.

LangChain YouTube tutorials

Learn LangChain from my YouTube channel (~9 hours of hands-on LLM-building tutorials). Each lesson is accompanied by the corresponding code in this repo and is designed to be self-contained, while still focusing on key concepts in LLM (large language model) development and tooling.

Feel free to pick and choose your starting point based on your learning goals:

| Part | LLM Tutorial | Link | Video Duration |
|------|--------------|------|----------------|
| 1 | OpenAI tutorial and video walkthrough | Tutorial Video | 26:56 |
| 2 | LangChain + OpenAI tutorial: Building a Q&A system w/ own text data | Tutorial Video | 20:00 |
| 3 | LangChain + OpenAI to chat w/ (query) own Database / CSV | Tutorial Video | 19:30 |
| 4 | LangChain + HuggingFace's Inference API (no OpenAI credits required!) | Tutorial Video | 24:36 |
| 5 | Understanding Embeddings in LLMs | Tutorial Video | 29:22 |
| 6 | Query any website with LlamaIndex + GPT3 (ft. Chromadb, Trafilatura) | Tutorial Video | 11:11 |
| 7 | Locally-hosted, offline LLM w/ LlamaIndex + OPT (open source, instruction-tuning LLM) | Tutorial Video | 32:27 |
| 8 | Building an AI Language Tutor: Pinecone + LlamaIndex + GPT-3 + BeautifulSoup | Tutorial Video | 51:08 |
| 9 | Building a queryable journal 💬 w/ OpenAI, markdown & LlamaIndex 🦙 | Tutorial Video | 40:29 |
| 10 | Making a Sci-Fi game w/ Cohere LLM + Stability.ai: Generative AI tutorial | Tutorial Video | 1:02:20 |
| 11 | GPT builds entire party invitation app from prompt (ft. SMOL Developer) | Tutorial Video | 41:33 |
| 12 | A language for LLM prompt design: Guidance | Tutorial Video | 43:15 |
| 13 | You should use LangChain's Caching! | Tutorial Video | 25:37 |
| 14 | Build Chat AI apps with Streamlit + LangChain | Tutorial Video | 32:11 |

The full lesson playlist can be found here.

Updates

Multi-Agent and Agentic Patterns Update: May 3rd 2025

I've pushed 6 new scripts to the repo, 19_agents_handsoff.py to 24_agents_guardrails.py, intended to be used as code reference for this public course:

These additions to the repo illustrate 6 key patterns in building AI Agents (especially multi-agent systems) and use the latest version of OpenAI's Agents SDK (openai-agents) as of May 2025.

These 6 Agentic Patterns are, in order of appearance in this repo (a minimal sketch of the first one follows the list):

  1. The Hand-off and Delegation Pattern (19_agents_handsoff.py)
  2. The Tool-Use and Function Calling Pattern (20_agents_tooluse.py)
  3. The Deterministic and Sequential Chain Pattern (21_agents_deterministic.py)
  4. The Judge and Critic Pattern (22_agents_critic.py)
  5. The Parallelization Pattern (23_agents_parallelization.py)
  6. The Guardrails Pattern (24_agents_guardrails.py)
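
To give a concrete feel for the first pattern, here is a minimal, illustrative sketch of a hand-off between agents using the openai-agents SDK. The agent names and instructions are invented for illustration and this is not the exact code in 19_agents_handsoff.py; it assumes OPENAI_API_KEY is set in your environment.

```python
# Hedged sketch of the hand-off / delegation pattern with the openai-agents SDK.
# The agents below are illustrative, not the repo's exact 19_agents_handsoff.py code.
from agents import Agent, Runner

spanish_agent = Agent(
    name="Spanish agent",
    instructions="You only respond in Spanish.",
)
english_agent = Agent(
    name="English agent",
    instructions="You only respond in English.",
)

# The triage agent can delegate the conversation to either specialist
triage_agent = Agent(
    name="Triage agent",
    instructions="Hand off to the agent that matches the language of the request.",
    handoffs=[spanish_agent, english_agent],
)

result = Runner.run_sync(triage_agent, "Hola, ¿puedes ayudarme con mi factura?")
print(result.final_output)  # answered by the Spanish agent after the hand-off
```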

Update: Feb 5th 2025

I've pushed 4 new scripts to the repo, 15_sql.py to 18_chroma.py, which are intended to be used as code reference for this public course:

Additionally, I'm now also hosting example code in this repo for the following Generative AI Series by Sectors.

  1. Generative AI for Finance: An overview of designing Generative AI systems for the finance industry and the motivation for retrieval-augmented generation (RAG) systems.

  2. Tool-Use Retrieval Augmented Generation (RAG): A practical guide to building RAG systems that leverage information retrieval tools (known as "tool-use" or "function calling" in LLM development); see the sketch after this list.

  3. Structured Output from AIs: Whether we use Generative AI to extract information from unstructured data or to perform actions like database queries, API calls, and JSON parsing, we need schema and structure in the AI's output.

  4. Tool-use ReAct Agents w/ Streaming: Updated for LangChain v0.3.2, we explore streaming, LCEL expressions and ReAct agents following the most up-to-date practices for creating conversational AI agents.

  5. Conversational Memory AI Agents: Updated for LangChain v0.2.3, we dive into creating AI agents with conversational memory.
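
As a rough illustration of the tool-use / function-calling idea behind these lessons, here is a minimal LangChain sketch. The get_stock_price tool is a made-up stand-in (the actual course code calls the Sectors API), and it assumes langchain-openai is installed and OPENAI_API_KEY is set.

```python
# Hedged sketch of tool-use / function calling with LangChain.
# get_stock_price is a dummy stand-in for a real data API such as Sectors.
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def get_stock_price(ticker: str) -> str:
    """Return the latest (dummy) closing price for a stock ticker."""
    return f"{ticker}: 10,250 IDR"

llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([get_stock_price])

# The model decides whether the question needs the tool; tool_calls carries
# the structured, schema-conforming arguments it wants to pass to it.
msg = llm_with_tools.invoke("What is the latest price of BBCA?")
print(msg.tool_calls)
```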

Both of these series are public and free to access. The code in this repo is intended to be used as a reference for these courses.
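
For the vector-store side of these lessons (e.g. 18_chroma.py), the core Chroma workflow looks roughly like the sketch below. It is illustrative only, uses Chroma's default embedding function, and the collection name and documents are made up.

```python
# Hedged sketch of a basic Chroma (chromadb) workflow; not the repo's exact 18_chroma.py code.
import chromadb

client = chromadb.Client()  # in-memory; use chromadb.PersistentClient(path="db") to persist
collection = client.create_collection(name="articles")

# Documents are embedded with Chroma's default embedding function unless one is supplied
collection.add(
    documents=[
        "Bank earnings rose in the first quarter.",
        "The central bank held interest rates steady.",
    ],
    ids=["doc1", "doc2"],
)

# Retrieve the most semantically similar document to the query
results = collection.query(query_texts=["What happened to interest rates?"], n_results=1)
print(results["documents"])
```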

Quick Start

  1. Clone this repo
  2. Install requirements: pip install -r requirements.txt
  3. Some sample data is provided for you in the news folder, but you can use your own data by replacing the content (or adding to it) with your own text files.
  4. Create a .env file that contains your OpenAI API key. You can get one from here. HUGGINGFACEHUB_API_TOKEN and PINECONE_API_KEY are optional, but they are used in some of the lessons.
    • Lesson 10 uses Cohere and Stability AI, both of which offer a free tier (no credit card required). You can add the respective keys as COHERE_API_KEY and STABILITY_API_KEY in the .env file.
    • Some of the most advanced examples, featuring tool-use and function-calling agents, will require you to work with a real-world financial data API. My team at Supertype and I built an LLM-first financial API platform called Sectors. You can register for a free account, then read our API documentation and Generative AI 5-course series to learn how to use the API to build sophisticated LLM applications. Examples of these applications are all in the repo.

Your .env file should look like this:

# recommended
OPENAI_API_KEY=...

# optional but useful
SECTORS_API_KEY=...
GROQ_API_KEY=...

# completely optional (pick and choose based on your needs)
HUGGINGFACEHUB_API_TOKEN=...
PINECONE_API_KEY=...
DEEPSEEK_API_KEY=...
COHERE_API_KEY=...
STABILITY_API_KEY=...

HuggingFace and Pinecone are optional but recommended if you want to use the Inference API and explore those models outside of the OpenAI ecosystem. This is demonstrated in Part 3 of the tutorial series.
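
For reference, the scripts generally read these keys from the environment; a minimal sketch of doing that with python-dotenv (assuming it is installed) looks like this. Which variables you actually need depends on the lesson you run.

```python
# Minimal sketch of loading API keys from .env, assuming python-dotenv is installed.
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current directory into the process environment

openai_key = os.environ["OPENAI_API_KEY"]           # required by most lessons
pinecone_key = os.environ.get("PINECONE_API_KEY")   # optional; only some lessons use it
```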

  5. Run the examples in any order you want. For example, python 6_team.py will run the website Q&A example, which uses GPT-3 to answer questions about a company and the team of people working at Supertype.ai. Watch the corresponding video to follow along with each of the examples.

Dependencies

💡 Thanks to the work of @VanillaMacchiato, this project is updated as of 2023-06-30 to use the latest versions of LlamaIndex (0.6.31) and LangChain (0.0.209). Installing the dependencies should be as simple as pip install -r requirements.txt. If you encounter any issues, please let me know.

If you're watching the LLM video tutorials, they may have very minor differences (typically 1-2 lines of code that need to be changed) from the code in this repo, since the videos were released with the library versions current at the time of recording (LlamaIndex 0.5.7 and LangChain 0.0.157). Please refer to the code in this repo for the latest version of the code.

I will try to keep this repo up to date with the latest version of the libraries, but if you encounter any issues, please: (1) raise a discussion through Issues or (2) volunteer a PR to update the code.

NOTE: The triton package is supported only for the x86_64 architecture. If you have problems installing it, see the triton compatibility guide. This applies specifically to errors like ERROR: Could not find a version that satisfies the requirement triton (from versions: none) and ERROR: No matching distribution found for triton. Running uname -p should give you your processor's name.

Mentorship and Support

I run a mentorship program under Supertype Fellowship. The program is self-paced and free, with a community of other learners and practitioners around the world (English-speaking). You can optionally book a 1-on-1 session with my team of mentors to help you through video tutoring and code reviews.

License

MIT © Supertype 2024
