# Flutter AI Toolkit
Learn how to add the AI Toolkit chatbot to your Flutter application.
Hello and welcome to the Flutter AI Toolkit!
The AI Toolkit is a set of AI chat-related widgets that make it easy to add an AI chat window to your Flutter app. It's organized around an abstract LLM provider API, so you can easily swap out the LLM provider that your chat uses. Out of the box, it comes with support for Firebase AI Logic.
## Key features

- Multiturn chat: Maintains context across multiple interactions.
- Streaming responses: Displays AI responses in real-time as they are generated.
- Rich text display: Supports formatted text in chat messages.
- Voice input: Allows users to input prompts using speech.
- Multimedia attachments: Enables sending and receiving various media types.
- Function calling: Supports tool calls to the LLM provider.
- Custom styling: Offers extensive customization to match your app's design.
- Chat serialization/deserialization: Store and retrieve conversations between app sessions.
- Custom response widgets: Introduce specialized UI components to present LLM responses.
- Pluggable LLM support: Implement a simple interface to plug in your own LLM; see the sketch after this list.
- Cross-platform support: Compatible with Android, iOS, web, and macOS platforms.
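
To give a feel for the pluggable-provider idea, here's a minimal sketch of a provider that just echoes the user's prompt back. The member names used here (`LlmProvider`, `generateStream`, `sendMessageStream`, `history`, `ChatMessage`, `Attachment`) reflect one plausible shape of the toolkit's provider interface and should be treated as assumptions; consult the package's API reference for the actual contract.

```dart
import 'package:flutter/foundation.dart';
import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';

/// A toy provider that streams the user's prompt back one word at a time.
/// The interface members implemented here are assumptions for illustration;
/// check the flutter_ai_toolkit API reference for the real contract.
class EchoProvider extends ChangeNotifier implements LlmProvider {
  final List<ChatMessage> _history = [];

  @override
  Stream<String> generateStream(
    String prompt, {
    Iterable<Attachment> attachments = const [],
  }) async* {
    // Stream the "response" chunk by chunk, like a real LLM would.
    for (final word in prompt.split(' ')) {
      yield '$word ';
    }
  }

  @override
  Stream<String> sendMessageStream(
    String prompt, {
    Iterable<Attachment> attachments = const [],
  }) async* {
    // Record the user turn, then build up the LLM turn as chunks arrive,
    // so the chat view can maintain multiturn context.
    _history.add(ChatMessage.user(prompt, attachments));
    final llmMessage = ChatMessage.llm();
    _history.add(llmMessage);
    await for (final chunk in generateStream(prompt)) {
      llmMessage.append(chunk);
      yield chunk;
    }
    notifyListeners();
  }

  @override
  Iterable<ChatMessage> get history => _history;

  @override
  set history(Iterable<ChatMessage> value) {
    // Because the provider owns the transcript, chat serialization amounts
    // to saving and restoring this list between app sessions.
    _history
      ..clear()
      ..addAll(value);
    notifyListeners();
  }
}
```

With a provider like this in hand, hosting it in the UI is a one-liner, `LlmChatView(provider: EchoProvider())`, just as with the Firebase provider shown later on this page.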
## Demo

Here's what the demo example looks like hosting the AI Toolkit:

The source code for this demo is available in the repo on GitHub.
Or, you can open it in Firebase Studio, Google's full-stack AI workspace and IDE that runs in the cloud.
## Get started

- Installation
Add the following dependencies to your `pubspec.yaml` file:

```yaml
dependencies:
  flutter_ai_toolkit: ^latest_version
  firebase_ai: ^latest_version
  firebase_core: ^latest_version
```
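
Alternatively, you can have pub resolve the latest versions for you; this is the standard pub workflow rather than anything toolkit-specific:

```sh
flutter pub add flutter_ai_toolkit firebase_ai firebase_core
```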
- Configuration

The AI Toolkit supports both the Gemini endpoint (for prototyping) and the Vertex endpoint (for production). Both require a Firebase project and the `firebase_core` package to be initialized, as described in the Get started with the Gemini API using the Firebase AI Logic SDKs docs.

Once that's complete, integrate the new Firebase project into your Flutter app using the `flutterfire` CLI tool, as described in the Add Firebase to your Flutter app docs.
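
In case it helps to see the shape of that step, the CLI flow typically looks like the following; the exact prompts and flags can change between CLI versions, so treat the linked docs as authoritative:

```sh
# Install the flutterfire CLI once, then run it from your project root.
dart pub global activate flutterfire_cli
flutterfire configure
```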
After following these instructions, you're ready to use Firebase to integrate AI in your Flutter app. Start by initializing Firebase:

```dart
import 'package:firebase_core/firebase_core.dart';
import 'package:firebase_ai/firebase_ai.dart';
import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';
// ... other imports

import 'firebase_options.dart'; // from `flutterfire config`

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await Firebase.initializeApp(
    options: DefaultFirebaseOptions.currentPlatform,
  );
  runApp(const App());
}

// ... app stuff here
```

With Firebase properly initialized in your Flutter app, you're now ready to create an instance of the Firebase provider. You can do this in two ways. For prototyping, consider the Gemini AI endpoint:
```dart
import 'package:firebase_ai/firebase_ai.dart';
import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';

// ... app stuff here

class ChatPage extends StatelessWidget {
  const ChatPage({super.key});

  @override
  Widget build(BuildContext context) => Scaffold(
    appBar: AppBar(title: const Text(App.title)),
    // create the chat view, passing in the Firebase provider
    body: LlmChatView(
      provider: FirebaseProvider(
        // Use the Google AI endpoint
        model: FirebaseAI.googleAI().generativeModel(
          model: 'gemini-2.5-flash',
        ),
      ),
    ),
  );
}
```
The `FirebaseProvider` class exposes the Firebase AI Logic SDK to the `LlmChatView`. Note that you provide a model name (you have several options from which to choose), but you do not provide an API key. All of that is handled as part of the Firebase project.

For production workloads, it's easy to swap in the Vertex AI endpoint:
```dart
class ChatPage extends StatelessWidget {
  const ChatPage({super.key});

  @override
  Widget build(BuildContext context) => Scaffold(
    appBar: AppBar(title: const Text(App.title)),
    body: LlmChatView(
      provider: FirebaseProvider(
        // Use the Vertex AI endpoint
        model: FirebaseAI.vertexAI().generativeModel(
          model: 'gemini-2.5-flash',
        ),
      ),
    ),
  );
}
```

For a complete example, check out the gemini.dart and vertex.dart examples.
- Set up device permissions
To enable your users to take advantage of features like voice input and media attachments, ensure that your app has the necessary permissions:
**Network access**: To enable network access on macOS, add the following to your `*.entitlements` files:

```xml
<plist version="1.0">
<dict>
  ...
  <key>com.apple.security.network.client</key>
  <true/>
</dict>
</plist>
```

To enable network access on Android, ensure that your `AndroidManifest.xml` file contains the following:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  ...
  <uses-permission android:name="android.permission.INTERNET" />
</manifest>
```

**Microphone access**: Configure according to the record package's permission setup instructions.
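
For instance, on iOS the standard requirement for microphone access is a usage-description entry in your app's `Info.plist`; this is general iOS configuration rather than anything toolkit-specific, and the record package's linked instructions remain the authoritative list:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to capture voice prompts.</string>
```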
**File selection**: Follow the file_selector plugin's instructions.
**Image selection**: To take a picture on, or select a picture from, their device, refer to the image_picker plugin's installation instructions.
**Web photo**: To take a picture on the web, configure the app according to the camera plugin's setup instructions.
## Examples

### firebase_options.dart
To use the Vertex AI example app, place your Firebase configuration details into the `example/lib/firebase_options.dart` file. You can do this with the `flutterfire` CLI tool, as described in the Add Firebase to your Flutter app docs, from within the `example` directory.
:::note
Be careful not to check the `firebase_options.dart` file into your git repo.
:::
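
One lightweight way to enforce that, assuming the example-app layout described above (this is a suggestion, not something the toolkit requires), is a `.gitignore` entry:

```gitignore
# Generated by the flutterfire CLI; contains project-specific identifiers.
example/lib/firebase_options.dart
```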
## Feedback

Along the way, as you use this package, please log issues and feature requests, as well as submit any code you'd like to contribute. We want your feedback and contributions to ensure that the AI Toolkit is as robust and useful as it can be for your real-world apps.