Researcher agent to write blog posts/articles using Amazon Bedrock and web search.
Bedrock Deep Research is a Streamlit-based application using Amazon Bedrock, LangGraph, and LangChain AWS libraries that automates article/report generation through AI-powered research, content writing, and image generation. It combines web research, structured content generation, and human feedback to produce comprehensive, well-researched articles with accompanying header images (generated by Amazon Bedrock Nova Canvas). This repo is inspired by LangChain's Deep Researcher.
- Automated Research: Performs targeted web searches to gather relevant information
- Structured Content Generation: Creates cohesive article outlines and detailed section content
- Interactive Feedback Loop: Incorporates human feedback to refine article outlines
- AI-Generated Imagery: Produces a relevant header image for visual appeal (see the Nova Canvas sketch below)
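The header image comes from Amazon Nova Canvas on Bedrock. The project's own node for this lives in bedrock_deep_research/nodes/article_head_image_generator.py; the snippet below is only a minimal, hedged sketch of what a Nova Canvas text-to-image call looks like, with a placeholder prompt and image size rather than the repo's actual code.

```python
import base64
import json

import boto3

# Bedrock runtime client in the region the project assumes (us-east-1).
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Minimal Nova Canvas text-to-image request; prompt and dimensions are illustrative.
body = {
    "taskType": "TEXT_IMAGE",
    "textToImageParams": {"text": "Abstract header image about cloud-based AI research"},
    "imageGenerationConfig": {"numberOfImages": 1, "width": 1280, "height": 720},
}

response = bedrock_runtime.invoke_model(
    modelId="amazon.nova-canvas-v1:0",
    body=json.dumps(body),
)

# The response body contains base64-encoded images.
payload = json.loads(response["body"].read())
image_bytes = base64.b64decode(payload["images"][0])

with open("header.png", "wb") as f:
    f.write(image_bytes)
```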
```
bedrock_deep_research/
├── bedrock_deep_research.py            # Main Streamlit application entry point
├── bedrock_deep_research/
│   ├── config.py                       # Configuration settings and parameters
│   ├── graph.py                        # Core workflow orchestration using LangGraph
│   ├── model.py                        # Data models for articles and sections
│   ├── nodes/                          # Individual workflow components
│   │   ├── article_head_image_generator.py  # Header image generation
│   │   ├── article_outline_generator.py     # Article outline creation
│   │   ├── section_writer.py                # Section content generation
│   │   └── [other node files]               # Additional workflow components
│   ├── utils.py                        # Utility functions
│   └── web_search.py                   # Web research integration using Tavily API
├── poetry.lock                         # Poetry dependency lock file
└── pyproject.toml                      # Project configuration and dependencies
```
The application follows a sequential workflow from topic input to final article generation, with feedback loops for refinement.
Key components of the graph (a simplified wiring sketch follows this list):
- Initial Researcher: performs initial web searches to gather context
- Article Outline Generator: creates a structured outline using the research data
- Human Feedback Provider: incorporates human feedback on the outline
- Section Writer: a subgraph that generates section content after targeted web research
- Compilation: combines all sections into a cohesive article
- Final Section Generation: writes the overview and concluding paragraph based on the other sections
- Header Image Generator: creates a relevant header image
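The actual orchestration is implemented in bedrock_deep_research/graph.py. The sketch below is only an illustrative outline of that shape using LangGraph's StateGraph: the state fields, node names, and node bodies are placeholders (and the feedback loop around the outline is omitted), not the repo's real implementation.

```python
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class ArticleState(TypedDict, total=False):
    topic: str
    research_notes: str
    outline: str
    sections: list[str]
    article: str
    header_image_path: str


def initial_research(state: ArticleState) -> dict:
    # Stub: run initial web searches for the topic.
    return {"research_notes": f"notes about {state['topic']}"}


def generate_outline(state: ArticleState) -> dict:
    return {"outline": "1. Intro\n2. Body\n3. Conclusion"}


def human_feedback(state: ArticleState) -> dict:
    # In the app this step pauses for the user's feedback on the outline.
    return {}


def write_sections(state: ArticleState) -> dict:
    # In the app this is a subgraph that researches and writes each section.
    return {"sections": ["intro...", "body...", "conclusion..."]}


def compile_article(state: ArticleState) -> dict:
    return {"article": "\n\n".join(state["sections"])}


def generate_header_image(state: ArticleState) -> dict:
    return {"header_image_path": "header.png"}


builder = StateGraph(ArticleState)
builder.add_node("initial_research", initial_research)
builder.add_node("generate_outline", generate_outline)
builder.add_node("human_feedback", human_feedback)
builder.add_node("write_sections", write_sections)
builder.add_node("compile_article", compile_article)
builder.add_node("generate_header_image", generate_header_image)

builder.add_edge(START, "initial_research")
builder.add_edge("initial_research", "generate_outline")
builder.add_edge("generate_outline", "human_feedback")
builder.add_edge("human_feedback", "write_sections")
builder.add_edge("write_sections", "compile_article")
builder.add_edge("compile_article", "generate_header_image")
builder.add_edge("generate_header_image", END)

graph = builder.compile()
print(graph.invoke({"topic": "Serverless architectures"})["article"])
```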
The setup is meant to be used locally with AWS authentication, as well as within Amazon SageMaker, either in a JupyterLab or Code Editor instance.
Note: the current setup assumes the us-east-1 region (as defined in the env.tmp file).
- Python 3.12 (to install, visit https://www.python.org/downloads/). Check your Python version using `python --version`. If your global Python isn't set to 3.12, follow the steps at https://python-poetry.org/docs/managing-environments/
- Poetry for dependency management
- Model access enabled in Amazon Bedrock in the us-east-1 region. The supported models are listed in the SUPPORTED_MODELS variable in ./bedrock_deep_research/config.py (a quick connectivity check follows this list)
- Tavily API key for web research capabilities
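The following snippet is not part of the repo, just a hedged convenience check that your credentials can reach Bedrock in us-east-1 and which model IDs exist in the region; enabling access to specific models still happens in the Bedrock console.

```python
import boto3

# Control-plane Bedrock client in the region the project assumes.
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Lists foundation models available in the region; compare the IDs against
# SUPPORTED_MODELS in ./bedrock_deep_research/config.py. Note that a model
# appearing here does not by itself mean access has been granted to your account.
response = bedrock.list_foundation_models(byOutputModality="TEXT")
for summary in response["modelSummaries"]:
    print(summary["modelId"])
```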
```bash
# Clone the repository
git clone https://github.com/aws-samples/sample-bedrock-deep-researcher.git
cd sample-bedrock-deep-researcher

# Activate the virtual environment
poetry shell

# Install dependencies using Poetry
poetry install
```
Go to https://app.tavily.com/home and create a free API key. Copy the API key and paste it into the env.tmp file.
Copy the environment variables into the local environment:

```bash
cp env.tmp .env
```
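To sanity-check the key, you can run a minimal Tavily call. This snippet is not part of the repo, and it assumes the variable is named TAVILY_API_KEY in the .env file (check env.tmp for the exact name).

```python
import os

from dotenv import load_dotenv
from tavily import TavilyClient

# Load variables from the .env file created above.
load_dotenv()

# TAVILY_API_KEY is assumed to be the variable name used in env.tmp.
client = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])
results = client.search("Amazon Bedrock Nova Canvas", max_results=3)

for item in results["results"]:
    print(item["title"], "->", item["url"])
```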
- Start the Streamlit application:
```bash
streamlit run bedrock_deep_research.py
```
Writing Guideline samples: You could include instructions like:

- Strict 150-200 word limit
- Start with your most important insight in **bold**
- Include code examples where relevant
- Focus on practical implementation
Web Research Configuration:

```python
number_of_queries = 2   # Number of search queries per section
max_search_depth = 2    # Maximum research iterations per section
```
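These two settings bound how much web research each section receives. Purely as a schematic of that interplay (the real loop lives in graph.py and web_search.py; the functions below are illustrative stubs, not the repo's code):

```python
# Schematic only: generate_queries() and web_search() are stand-in stubs.
number_of_queries = 2   # search queries generated per research pass
max_search_depth = 2    # research passes allowed per section


def generate_queries(topic: str, notes: list[str], n: int) -> list[str]:
    # Stub: in the app an LLM proposes queries from the topic and notes so far.
    return [f"{topic} detail {i}" for i in range(n)]


def web_search(query: str) -> list[str]:
    # Stub: in the app this calls the Tavily API.
    return [f"snippet for '{query}'"]


def research_section(topic: str) -> list[str]:
    notes: list[str] = []
    for _ in range(max_search_depth):
        for query in generate_queries(topic, notes, number_of_queries):
            notes.extend(web_search(query))
    return notes


print(len(research_section("vector databases")))  # 2 passes x 2 queries = 4 snippets
```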
Debug Mode:

```bash
# Enable debug logging
export LOG_LEVEL=DEBUG
streamlit run bedrock_deep_research.py
```
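Assuming the application maps LOG_LEVEL onto Python's standard logging levels (the exact wiring is in the source), the variable behaves roughly like this:

```python
import logging
import os

# Resolve LOG_LEVEL (e.g. DEBUG, INFO, WARNING), falling back to INFO.
level_name = os.getenv("LOG_LEVEL", "INFO").upper()
logging.basicConfig(level=getattr(logging, level_name, logging.INFO))

logging.getLogger(__name__).debug("Debug logging is enabled")
```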
Contributions are welcome! Please open an issue or submit a pull request if you have any improvements or bug fixes. Read CONTRIBUTING.md for more details.
This library is licensed under the MIT-0 License. See the LICENSE file.