
This repository features three demos that can be effortlessly integrated into your AWS environment. They serve as a practical guide to leveraging AWS services for crafting a sophisticated Large Language Model (LLM) Generative AI, geared towards creating a responsive Question and Answer Bot and localizing content generation.


aws-samples/pace-genai-demos

This repository provides code samples for three Generative AI demos, licensed under the MIT-0 License.

  1. Amazon Kendra with foundational LLM: Utilizes the deep search capabilities of Amazon Kendra combined with the expansive knowledge of Large Language Models. This integration provides precise and context-aware answers to complex queries by drawing from a diverse range of sources.

  2. Embeddings model with foundational LLM: Merges the power of embeddings, a technique for capturing the semantic meaning of words and phrases, with the vast knowledge base of LLMs. This synergy enables more accurate topic modeling, content recommendation, and semantic search capabilities.

Embeddings Foundational

  3. Foundation Models Pharma Ad Generator: A specialized application tailored for the pharmaceutical industry. Harnessing the generative capabilities of foundation models, this tool creates convincing and compliant pharmaceutical advertisements, ensuring content adheres to industry standards and regulations.

Pharma Ad Generator
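The semantic-search idea behind the embeddings demo (item 2 above) can be sketched in a few lines. This is a minimal, self-contained illustration with toy vectors, not the demo's actual code: a real deployment would obtain high-dimensional embeddings from a hosted model rather than the hand-written 3-dimensional stand-ins used here.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, corpus):
    """Rank (document, vector) pairs by similarity to the query vector."""
    ranked = sorted(
        corpus,
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [doc for doc, _ in ranked]

# Toy 3-dimensional "embeddings", purely for illustration.
corpus = [
    ("dosage guidelines", [0.9, 0.1, 0.0]),
    ("side effects",      [0.1, 0.8, 0.2]),
    ("pricing",           [0.0, 0.1, 0.9]),
]

# A query vector close to "dosage guidelines" ranks that document first.
results = semantic_search([0.85, 0.2, 0.05], corpus)
print(results[0])  # → dosage guidelines
```

Because similarity is computed on meaning-bearing vectors rather than exact keywords, a query phrased differently from the source documents can still retrieve the right passage, which is what enables the more accurate search and recommendation described above.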

These demos can be seamlessly deployed in your AWS account, offering foundational insights and guidance on utilizing AWS services to create a state-of-the-art Large Language Model (LLM) Generative AI Question and Answer Bot and to generate content.

You can deploy these demos independently of each other. Please refer to the README files in each of the folders for deployment instructions.

Refer to the blog post for details on how these solutions work.

Authors

Troubleshoot

Unzipped size must be smaller than 262144000 bytes (Service: AWSLambdaInternal; Status Code: 400)

  1. Delete the Existing Lambda Layer Folder: Begin by removing the lambda_langchain_layer folder from your project. This action ensures that any corrupted or oversized files are cleared.
  2. Recreate the Layer: After deletion, recreate the lambda layer using the deploy.sh command. This process should generate a fresh, size-compliant layer with the necessary components.
  3. Clean Docker Resources: It's also important to ensure that no residual Docker images or containers are occupying unnecessary space. Remove unused Docker images and containers to free up space and avoid potential conflicts.
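The steps above can be sketched as the following shell commands, in the order given. The folder and script names come from the steps above; the guards around `deploy.sh` and `docker` are defensive additions so the snippet degrades gracefully when either is absent.

```shell
# Step 1: delete the existing Lambda layer folder (no error if already absent).
rm -rf lambda_langchain_layer

# Step 2: recreate the layer with the repo's deploy script, if present.
if [ -x ./deploy.sh ]; then
  ./deploy.sh
fi

# Step 3: remove unused Docker images and containers to reclaim space,
# if the Docker CLI is available on this machine.
if command -v docker >/dev/null 2>&1; then
  docker system prune --all --force
fi
```

Note that `docker system prune --all` removes all unused images, not just dangling ones, so run it only if nothing else on the machine depends on cached images.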

Reporting Bugs/Feature Requests

When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can. Details like these are incredibly useful:

  • A reproducible test case or series of steps
  • The version of our code being used
  • Any modifications you've made relevant to the bug
  • Anything unusual about your environment or deployment

License

This library is licensed under the MIT-0 License. See the LICENSE file.

