Taiga
This notebook provides a quick overview for getting started with the Taiga tooling in `langchain_taiga`. For more details on each tool and configuration, see the docstrings in your repository or the relevant doc pages.
Overview
Integration details
| Class | Package | Serializable | JS support | Package latest |
| --- | --- | --- | --- | --- |
| `create_entity_tool`, `search_entities_tool`, `get_entity_by_ref_tool`, `update_entity_by_ref_tool`, `add_comment_by_ref_tool`, `add_attachment_by_ref_tool` | langchain-taiga | N/A | TBD | |
Tool features
- `create_entity_tool`: Creates user stories, tasks and issues in Taiga.
- `search_entities_tool`: Searches for user stories, tasks and issues in Taiga.
- `get_entity_by_ref_tool`: Gets a user story, task or issue by reference.
- `update_entity_by_ref_tool`: Updates a user story, task or issue by reference.
- `add_comment_by_ref_tool`: Adds a comment to a user story, task or issue.
- `add_attachment_by_ref_tool`: Adds an attachment to a user story, task or issue.
Setup
The integration lives in the `langchain-taiga` package.
%pip install --quiet -U langchain-taiga
Credentials
This integration requires you to set `TAIGA_URL`, `TAIGA_API_URL`, `TAIGA_USERNAME`, `TAIGA_PASSWORD` and `OPENAI_API_KEY` as environment variables to authenticate with Taiga.
export TAIGA_URL="https://taiga.xyz.org/"
export TAIGA_API_URL="https://taiga.xyz.org/"
export TAIGA_USERNAME="username"
export TAIGA_PASSWORD="pw"
export OPENAI_API_KEY="OPENAI_API_KEY"
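If you are working in a notebook rather than a shell, the same variables can be set from Python. This is a minimal sketch using the placeholder values from the exports above; `os.environ.setdefault` only fills in a variable that is not already set, so it won't overwrite real credentials.

```python
import os

# Placeholder values for illustration; substitute your real Taiga instance and credentials.
os.environ.setdefault("TAIGA_URL", "https://taiga.xyz.org/")
os.environ.setdefault("TAIGA_API_URL", "https://taiga.xyz.org/")
os.environ.setdefault("TAIGA_USERNAME", "username")
os.environ.setdefault("TAIGA_PASSWORD", "pw")
os.environ.setdefault("OPENAI_API_KEY", "OPENAI_API_KEY")

print("TAIGA_URL" in os.environ)  # → True
```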
It's also helpful (but not needed) to set up LangSmith for best-in-class observability:
# import getpass
# import os

# os.environ["LANGSMITH_TRACING"] = "true"
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass()
Instantiation
Below is an example showing how to instantiate the Taiga tools in `langchain_taiga`. Adjust as needed for your specific usage.
from langchain_taiga.tools.taiga_tools import create_entity_tool, search_entities_tool

create_tool = create_entity_tool
search_tool = search_entities_tool
Invocation
Direct invocation with args
Below is a simple example of calling the tool with keyword arguments in a dictionary.
from langchain_taiga.tools.taiga_tools import (
    add_attachment_by_ref_tool,
    add_comment_by_ref_tool,
    create_entity_tool,
    get_entity_by_ref_tool,
    search_entities_tool,
    update_entity_by_ref_tool,
)

response = create_entity_tool.invoke(
    {
        "project_slug": "slug",
        "entity_type": "us",
        "subject": "subject",
        "status": "new",
        "description": "desc",
        "parent_ref": 5,
        "assign_to": "user",
        "due_date": "2022-01-01",
        "tags": ["tag1", "tag2"],
    }
)
response = search_entities_tool.invoke(
    {"project_slug": "slug", "query": "query", "entity_type": "task"}
)
response = get_entity_by_ref_tool.invoke(
    {"entity_type": "user_story", "project_id": 1, "ref": "1"}
)
response = update_entity_by_ref_tool.invoke(
    {"project_slug": "slug", "entity_ref": 555, "entity_type": "us"}
)
response = add_comment_by_ref_tool.invoke(
    {"project_slug": "slug", "entity_ref": 3, "entity_type": "us", "comment": "new"}
)
response = add_attachment_by_ref_tool.invoke(
    {
        "project_slug": "slug",
        "entity_ref": 3,
        "entity_type": "us",
        "attachment_url": "url",
        "content_type": "png",
        "description": "desc",
    }
)
Invocation with ToolCall
If you have a model-generated `ToolCall`, pass it to `tool.invoke()` in the format shown below.
# This is usually generated by a model, but we'll create a tool call directly for demo purposes.
model_generated_tool_call = {
    "args": {"project_slug": "slug", "query": "query", "entity_type": "task"},
    "id": "1",
    "name": search_entities_tool.name,
    "type": "tool_call",
}
search_entities_tool.invoke(model_generated_tool_call)
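As a structural reference, a `ToolCall` is just a dict with four keys, where `"args"` must match the tool's input schema. This is a minimal sketch that builds one by hand (the tool name is written as a literal string here, an assumption that avoids importing the package) and checks its shape without contacting Taiga:

```python
# A ToolCall is a plain dict; "args" must match the tool's input schema.
# The name is a literal stand-in for search_entities_tool.name.
tool_call = {
    "name": "search_entities_tool",
    "args": {"project_slug": "slug", "query": "query", "entity_type": "task"},
    "id": "1",
    "type": "tool_call",
}

# Every ToolCall carries exactly these four keys.
required_keys = {"name", "args", "id", "type"}
missing = required_keys - tool_call.keys()
print(missing)  # → set()
```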
Chaining
Below is a more complete example showing how you might integrate the `create_entity_tool` and `search_entities_tool` tools in a chain or agent with an LLM. This example assumes you have a function (like `create_react_agent`) that sets up a LangChain-style agent capable of calling tools when appropriate.
# Example: Using Taiga Tools in an Agent
from langchain_taiga.tools.taiga_tools import create_entity_tool, search_entities_tool
from langgraph.prebuilt import create_react_agent

# 1. Instantiate or configure your language model
#    (Replace with your actual LLM, e.g., ChatOpenAI(temperature=0))
llm = ...

# 2. Build an agent that has access to these tools
agent_executor = create_react_agent(llm, [create_entity_tool, search_entities_tool])

# 3. Formulate a user query that may invoke one or both tools
example_query = "Please create a new user story with the subject 'subject' in slug project: 'slug'"

# 4. Execute the agent in streaming mode (or however your code is structured)
events = agent_executor.stream(
    {"messages": [("user", example_query)]},
    stream_mode="values",
)

# 5. Print out the model's responses (and any tool outputs) as they arrive
for event in events:
    event["messages"][-1].pretty_print()
API reference
See the docstrings in the `langchain-taiga` package modules (e.g. `langchain_taiga.tools.taiga_tools`) for usage details, parameters, and advanced configurations.
Related
- Tool conceptual guide
- Tool how-to guides