<AIConversation>
Example: a mocked <AIConversation> rendering a short greeting between a user and the assistant, each message labeled with a username and a timestamp (for example, "Sun, May 21 at 3:23 PM").
Note: the example is a mocked component and not hooked up to a live backend.
Introduction
The <AIConversation> component is highly customizable to fit into any application. It is built to work with the useAIConversation hook, which manages the state and lifecycle of the conversation; the component by itself is just a renderer for the conversation state the hook provides. The <AIConversation> component requires two props:

messages: an array of the messages in the conversation
handleSendMessage: a handler that is called when a user message is sent

The useAIConversation hook provides these values and manages the messages state as user messages are sent and assistant responses are streamed back.
import { AIConversation } from '@aws-amplify/ui-react-ai';

export default function Chat() {
  return (
    <AIConversation
      messages={[]}
      handleSendMessage={() => {}}
    />
  );
}
The code above won't do much on its own, but if you want to play around with the component or visually test how it will look, you can do that by passing in your own set of messages.
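For instance, here is a minimal sketch of hard-coded messages for visual testing. The field names in the mock messages (id, conversationId, role, content, createdAt) are assumptions about the message shape; the sketch derives the type from the component's own props so you don't have to guess at an exported type name.

import * as React from 'react';
import { AIConversation } from '@aws-amplify/ui-react-ai';

// Derive the message type from the component's props.
// The field names below are assumptions about the message shape.
type Messages = React.ComponentProps<typeof AIConversation>['messages'];

const mockMessages: Messages = [
  {
    id: '1',
    conversationId: 'mock-conversation',
    role: 'user',
    content: [{ text: 'Hello' }],
    createdAt: new Date().toISOString(),
  },
  {
    id: '2',
    conversationId: 'mock-conversation',
    role: 'assistant',
    content: [{ text: 'Hello! I am your virtual assistant, how may I help you?' }],
    createdAt: new Date().toISOString(),
  },
];

export default function MockChat() {
  return (
    <AIConversation
      messages={mockMessages}
      handleSendMessage={() => {}}
    />
  );
}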
Getting started
Make sure to first follow our getting started guide for the Amplify AI kit to set up your Amplify AI backend.
Conversations require a logged-in user, so we recommend using the <Authenticator> component to easily add authentication flows to your app.
import { Amplify } from 'aws-amplify';
import { generateClient } from 'aws-amplify/api';
import { Authenticator } from '@aws-amplify/ui-react';
import { AIConversation, createAIHooks } from '@aws-amplify/ui-react-ai';
import '@aws-amplify/ui-react/styles.css';
import outputs from '../amplify_outputs.json';
import { Schema } from '../amplify/data/resource';

Amplify.configure(outputs);
const client = generateClient<Schema>({ authMode: 'userPool' });
const { useAIConversation } = createAIHooks(client);

export default function App() {
  const [
    {
      data: { messages },
      isLoading,
    },
    handleSendMessage,
  ] = useAIConversation('chat');
  // 'chat' is based on the key for the conversation route in your schema.

  return (
    <Authenticator>
      <AIConversation
        messages={messages}
        isLoading={isLoading}
        handleSendMessage={handleSendMessage}
      />
    </Authenticator>
  );
}
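For reference, the 'chat' key above corresponds to a conversation route defined in your data schema. A minimal sketch of what that route might look like in amplify/data/resource.ts is shown below; the model name and system prompt are placeholders, and the getting started guide linked above is the authoritative source for this setup.

// amplify/data/resource.ts (sketch; see the Amplify AI kit getting started guide)
import { type ClientSchema, a, defineData } from '@aws-amplify/backend';

const schema = a.schema({
  // The key 'chat' is what you pass to useAIConversation('chat').
  chat: a.conversation({
    aiModel: a.ai.model('Claude 3 Haiku'), // placeholder model choice
    systemPrompt: 'You are a helpful assistant',
  }).authorization((allow) => allow.owner()),
});

export type Schema = ClientSchema<typeof schema>;

export const data = defineData({ schema });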
Formatting Markdown
LLMs can respond with markdown. The <AIConversation> component does not have built-in markdown rendering, but it does let you pass in your own markdown renderer.
import ReactMarkdown from 'react-markdown';

<AIConversation
  messageRenderer={{
    text: ({ text }) => <ReactMarkdown>{text}</ReactMarkdown>,
  }}
/>
The messageRenderer property lets you customize how markdown is rendered within the chat according to your application's needs. The example below demonstrates how to add code syntax highlighting by using ReactMarkdown with rehypeHighlight.
import ReactMarkdown from 'react-markdown';
import rehypeHighlight from 'rehype-highlight';

<AIConversation
  messageRenderer={{
    text: ({ text }) => (
      <ReactMarkdown rehypePlugins={[rehypeHighlight]}>{text}</ReactMarkdown>
    ),
  }}
/>
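Note that rehype-highlight only adds highlight.js class names to the rendered code; to actually see colors you also need a highlight.js theme stylesheet in your app. One common choice:

// Import any highlight.js theme so the hljs classes added by rehype-highlight get styled.
import 'highlight.js/styles/github.css';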
Rendering images
The <AIConversation> component renders images in the conversation history by default. You can also customize how images are rendered with messageRenderer, similar to the text example above.
// Note: the image in a message comes in as a byte array,
// so you will need to convert it to base64.
function convertBufferToBase64(
  buffer: ArrayBuffer,
  format: 'png' | 'jpeg' | 'gif' | 'webp'
): string {
  const base64string = Buffer.from(new Uint8Array(buffer)).toString('base64');
  return `data:image/${format};base64,${base64string}`;
}

<AIConversation
  messageRenderer={{
    image: ({ image }) => (
      <img
        className="testing"
        width={200}
        height={200}
        src={convertBufferToBase64(image.source.bytes, image.format)}
        alt=""
      />
    ),
  }}
/>
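Buffer is a Node.js global, so the helper above assumes your bundler polyfills it. If yours doesn't, a plain-browser sketch of the same conversion using btoa could look like this:

// Browser-only variant: build a binary string and base64-encode it with btoa.
function convertBufferToBase64InBrowser(
  buffer: ArrayBuffer,
  format: 'png' | 'jpeg' | 'gif' | 'webp'
): string {
  const bytes = new Uint8Array(buffer);
  let binary = '';
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return `data:image/${format};base64,${btoa(binary)}`;
}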
Welcome message
You can have the <AIConversation> component display a welcome message when a user starts a new conversation.
import { Card, Text } from '@aws-amplify/ui-react';

<AIConversation
  welcomeMessage={
    <Card variation="outlined">
      <Text>I am your virtual assistant, ask me any questions you like!</Text>
    </Card>
  }
/>
The welcome message will disappear once a message has been sent.
Customizing the timestamp
All messages have a timestamp associated with them that is displayed next to the username. To customize how the timestamp is displayed, you can pass a custom text formatter function called getMessageTimestampText into the displayText property on the <AIConversation> component. This function receives a Date object as its argument and should return a string.
Browsers have a handy built-in date/time formatter you can use called Intl.DateTimeFormat.
Example: the same mocked conversation rendered with short timestamps ("3:23 PM", "3:24 PM") next to the usernames instead of the full date.
<AIConversation
  displayText={{
    getMessageTimestampText: (date) =>
      new Intl.DateTimeFormat('en-US', {
        timeStyle: 'short',
        hour12: true,
        timeZone: 'UTC',
      }).format(date),
  }}
/>
You could also return an empty string if you wanted to hide the timestamps altogether.
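For example, a formatter that hides timestamps entirely:

<AIConversation
  displayText={{
    // Returning an empty string hides the timestamp next to each message.
    getMessageTimestampText: () => '',
  }}
/>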
Attachments
Some of the newer LLMs, like the Claude 3 family of models from Anthropic, support multi-modal input, so you can send images in your message and the model can respond based on them. To enable this functionality in the component, there is an allowAttachments prop you can turn on.
There are some limitations on the file type and size of attached images: each file should be below 400 KB when base64 encoded, and the currently supported file types are png, jpg, gif, and webp.
<AIConversation
  // ...
  allowAttachments
/>
Avatars
You can customize the usernames and avatars used in the AIConversation component with the avatars prop. This lets you control what your AI assistant looks like in the chat and what your user's username and avatar are.
There are two avatars, user and ai, and each has a username and an avatar attribute. The avatar is a React node and the username is a string.

Example: the same mocked conversation rendered with custom avatars and the usernames "danny" and "Amplify assistant".
import { Avatar } from '@aws-amplify/ui-react';

<AIConversation
  avatars={{
    user: {
      avatar: <Avatar src="/images/user.jpg" />,
      username: "danny",
    },
    ai: {
      avatar: <Avatar src="/images/ai.jpg" />,
      username: "Amplify assistant",
    },
  }}
/>
Response components
Response components are a way to define custom UI components that the LLM can respond with in the conversation. This creates a richer experience than text-only responses, so the conversation can be more interactive and engaging. To define a response component, you take any React component and give it a name, a description, and a definition of the props the LLM should know about.
Example: the user asks "What's the weather in San Jose?" and the assistant replies "Let me get the weather for San Jose for you."
<AIConversation
  responseComponents={{
    WeatherCard: {
      description: 'Used to display the weather of a given city to the user',
      component: ({ city }) => {
        return <Card>{city}</Card>;
      },
      props: {
        city: {
          type: 'string',
          required: true,
        },
      },
    },
  }}
/>
Response components are just plain React components; they can have their own interactive state, fetch data, update shared state, or really anything you can think of. You can pair response components with data tools, so the LLM can query for some data and then use a component to display that data. Or your response component could fetch data itself.
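As a sketch of the second approach, here is a WeatherCard response component that fetches its own data. The /api/weather endpoint and its response shape are hypothetical stand-ins for whatever data source your app uses; you would register the component under responseComponents the same way as in the example above.

import * as React from 'react';
import { Card } from '@aws-amplify/ui-react';

// Hypothetical response component: the LLM only supplies the `city` prop,
// and the component fetches and renders the weather itself.
function WeatherCard({ city }: { city: string }) {
  const [weather, setWeather] = React.useState<string | null>(null);

  React.useEffect(() => {
    // `/api/weather` is a placeholder endpoint, not part of the Amplify AI kit.
    fetch(`/api/weather?city=${encodeURIComponent(city)}`)
      .then((res) => res.json())
      .then((data) => setWeather(data.summary))
      .catch(() => setWeather('Unable to load weather'));
  }, [city]);

  return (
    <Card variation="outlined">
      {city}: {weather ?? 'Loading...'}
    </Card>
  );
}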
Adding a fallback
Because response components are defined at runtime and conversation histories are stored in a database, there can be times when there is a response component in the message history that the current application does not have. Response components are saved in the message history as a "toolUse" block, similar to how an LLM would respond when it wants to call a tool. The toolUse block contains the name of the component, and the props the LLM wanted to pass to the component. The LLM is never directly sending UI code, but rather an abstract representation of what it wants to render.
If the AIConversation component receives a message referencing a response component that it was not given, by default it will not render anything. However, if you want to render a fallback component when no component matches the name, you can use the FallbackResponseComponent prop. You can think of this like a 404 page for response components.
<AIConversation
  FallbackResponseComponent={(props) => (
    <Card variation="outlined">{JSON.stringify(props, null, 2)}</Card>
  )}
/>