
Snowflake Cortex

Snowflake Cortex gives you instant access to industry-leading large language models (LLMs) trained by researchers at companies like Mistral, Reka, Meta, and Google, including Snowflake Arctic, an open enterprise-grade model developed by Snowflake.

This example goes over how to use LangChain to interact with Snowflake Cortex.

Installation and setup

We start by installing the snowflake-snowpark-python library, using the command below. Then we configure the credentials for connecting to Snowflake, either as environment variables or by passing them directly when instantiating the model.

%pip install --upgrade --quiet snowflake-snowpark-python
import getpass
import os

# First step is to set up the environment variables to connect to Snowflake.
# You can also pass these Snowflake credentials while instantiating the model.

if os.environ.get("SNOWFLAKE_ACCOUNT") is None:
    os.environ["SNOWFLAKE_ACCOUNT"] = getpass.getpass("Account: ")

if os.environ.get("SNOWFLAKE_USERNAME") is None:
    os.environ["SNOWFLAKE_USERNAME"] = getpass.getpass("Username: ")

if os.environ.get("SNOWFLAKE_PASSWORD") is None:
    os.environ["SNOWFLAKE_PASSWORD"] = getpass.getpass("Password: ")

if os.environ.get("SNOWFLAKE_DATABASE") is None:
    os.environ["SNOWFLAKE_DATABASE"] = getpass.getpass("Database: ")

if os.environ.get("SNOWFLAKE_SCHEMA") is None:
    os.environ["SNOWFLAKE_SCHEMA"] = getpass.getpass("Schema: ")

if os.environ.get("SNOWFLAKE_WAREHOUSE") is None:
    os.environ["SNOWFLAKE_WAREHOUSE"] = getpass.getpass("Warehouse: ")

if os.environ.get("SNOWFLAKE_ROLE") is None:
    os.environ["SNOWFLAKE_ROLE"] = getpass.getpass("Role: ")
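If you want to confirm the credentials before handing them to LangChain, one option is to open a Snowpark session directly. This is an optional sanity check, not something ChatSnowflakeCortex requires; a minimal sketch, assuming the environment variables set above:

from snowflake.snowpark import Session

# Optional sanity check: build a Snowpark session from the same credentials.
connection_parameters = {
    "account": os.environ["SNOWFLAKE_ACCOUNT"],
    "user": os.environ["SNOWFLAKE_USERNAME"],
    "password": os.environ["SNOWFLAKE_PASSWORD"],
    "database": os.environ["SNOWFLAKE_DATABASE"],
    "schema": os.environ["SNOWFLAKE_SCHEMA"],
    "warehouse": os.environ["SNOWFLAKE_WAREHOUSE"],
    "role": os.environ["SNOWFLAKE_ROLE"],
}
session = Session.builder.configs(connection_parameters).create()
print(session.get_current_database())  # should echo your database name
session.close()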
from langchain_community.chat_models import ChatSnowflakeCortex
from langchain_core.messages import HumanMessage, SystemMessage

# By default, we'll be using the Cortex-provided model `mistral-large`, with function `complete`
chat = ChatSnowflakeCortex()

The above cell assumes that your Snowflake credentials are set in your environment variables. If you would rather manually specify them, use the following code:

chat = ChatSnowflakeCortex(
    # Change the default cortex model and function
    model="mistral-large",
    cortex_function="complete",

    # Change the default generation parameters
    temperature=0,
    max_tokens=10,
    top_p=0.95,

    # Specify your Snowflake Credentials
    account="YOUR_SNOWFLAKE_ACCOUNT",
    username="YOUR_SNOWFLAKE_USERNAME",
    password="YOUR_SNOWFLAKE_PASSWORD",
    database="YOUR_SNOWFLAKE_DATABASE",
    schema="YOUR_SNOWFLAKE_SCHEMA",
    role="YOUR_SNOWFLAKE_ROLE",
    warehouse="YOUR_SNOWFLAKE_WAREHOUSE",
)
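The same constructor works with other models served by Cortex. As an illustrative sketch (snowflake-arctic is an example here; which models you can call depends on your Snowflake region and account entitlements, so check the SNOWFLAKE.CORTEX.COMPLETE documentation):

# Point the wrapper at a different Cortex model; `snowflake-arctic` is
# an example and may not be enabled in every region.
arctic_chat = ChatSnowflakeCortex(model="snowflake-arctic")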

Calling the chat model

We can now call the chat model using the invoke or stream methods.

messages = [
    SystemMessage(content="You are a friendly assistant."),
    HumanMessage(content="What are large language models?"),
]
chat.invoke(messages)
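invoke returns an AIMessage; the generated text is on its content attribute, so in a script (rather than a notebook cell that echoes the return value) you would typically print it explicitly:

# Capture the AIMessage returned by invoke and print its text.
response = chat.invoke(messages)
print(response.content)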

Stream

# Sample input prompt
messages = [
    SystemMessage(content="You are a friendly assistant."),
    HumanMessage(content="What are large language models?"),
]

# Invoke the stream method and print each chunk as it arrives
print("Stream Method Response:")
for chunk in chat._stream(messages):
    print(chunk.message.content)
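The loop above calls the private _stream method, which yields ChatGenerationChunk objects. For application code, the public stream method available on every LangChain chat model is the more conventional entry point; it yields AIMessageChunk objects whose text is on content:

# Same request through the public streaming API.
for chunk in chat.stream(messages):
    print(chunk.content, end="")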
