A react-based starter app for using the Live API over websockets with Gemini
This repository contains a react-based starter app for using the Live API over a websocket. It provides modules for streaming audio playback, recording user media (such as from a microphone, webcam, or screen capture), and a unified log view to aid in the development of your application.
Watch the demo of the Live API here.
To get started, create a free Gemini API key and add it to the .env file. Then:
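Create React App only exposes environment variables prefixed with `REACT_APP_`; a minimal `.env` sketch follows (the variable name is assumed from this starter's CRA setup — verify it against the repo's source):

```shell
# .env — the value is a placeholder; paste your own Gemini API key here
REACT_APP_GEMINI_API_KEY=your-api-key-here
```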
```shell
$ npm install && npm start
```

We have provided several example applications on other branches of this repository.
Below is an example of an entire application that will use Google Search grounding and then render graphs using vega-embed:
```typescript
import { type FunctionDeclaration, SchemaType } from "@google/generative-ai";
import { useEffect, useRef, useState, memo } from "react";
import vegaEmbed from "vega-embed";
import { useLiveAPIContext } from "../../contexts/LiveAPIContext";
// ToolCall comes from the repo's shared type definitions
import { ToolCall } from "../../multimodal-live-types";

export const declaration: FunctionDeclaration = {
  name: "render_altair",
  description: "Displays an altair graph in json format.",
  parameters: {
    type: SchemaType.OBJECT,
    properties: {
      json_graph: {
        type: SchemaType.STRING,
        description:
          "JSON STRING representation of the graph to render. Must be a string, not a json object",
      },
    },
    required: ["json_graph"],
  },
};

export function Altair() {
  const [jsonString, setJSONString] = useState<string>("");
  const { client, setConfig } = useLiveAPIContext();

  useEffect(() => {
    setConfig({
      model: "models/gemini-2.0-flash-exp",
      systemInstruction: {
        parts: [
          {
            text: 'You are my helpful assistant. Any time I ask you for a graph call the "render_altair" function I have provided you. Dont ask for additional information just make your best judgement.',
          },
        ],
      },
      tools: [{ googleSearch: {} }, { functionDeclarations: [declaration] }],
    });
  }, [setConfig]);

  useEffect(() => {
    const onToolCall = (toolCall: ToolCall) => {
      console.log(`got toolcall`, toolCall);
      const fc = toolCall.functionCalls.find(
        (fc) => fc.name === declaration.name
      );
      if (fc) {
        const str = (fc.args as any).json_graph;
        setJSONString(str);
      }
    };
    client.on("toolcall", onToolCall);
    return () => {
      client.off("toolcall", onToolCall);
    };
  }, [client]);

  const embedRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    if (embedRef.current && jsonString) {
      vegaEmbed(embedRef.current, JSON.parse(jsonString));
    }
  }, [embedRef, jsonString]);

  return <div className="vega-embed" ref={embedRef} />;
}
```
This project was bootstrapped with Create React App. The project consists of:
- an Event-emitting websocket-client to ease communication between the websocket and the front-end
- communication layer for processing audio in and out
- a boilerplate view for starting to build your apps and view logs
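The event-emitting websocket client in the first bullet follows a familiar pattern: the client extends an event emitter so the front-end can subscribe to incoming messages by name. A minimal sketch of that pattern is below; `TinyEmitter` and the `"log"` event are illustrative, not the repo's actual API:

```typescript
type Listener = (...args: any[]) => void;

// A minimal event emitter in the spirit of the client this repo provides.
class TinyEmitter {
  private listeners = new Map<string, Set<Listener>>();

  // Register a listener for a named event.
  on(event: string, fn: Listener): this {
    if (!this.listeners.has(event)) this.listeners.set(event, new Set());
    this.listeners.get(event)!.add(fn);
    return this;
  }

  // Remove a previously registered listener.
  off(event: string, fn: Listener): this {
    this.listeners.get(event)?.delete(fn);
    return this;
  }

  // Invoke every listener registered for the event.
  emit(event: string, ...args: any[]): void {
    this.listeners.get(event)?.forEach((fn) => fn(...args));
  }
}

// In the real client, emits would be driven by incoming websocket messages.
const client = new TinyEmitter();
const seen: string[] = [];
const onLog = (msg: string) => seen.push(msg);

client.on("log", onLog);
client.emit("log", "hello");   // recorded
client.off("log", onLog);
client.emit("log", "ignored"); // no listener any more
```

This is the same subscribe/unsubscribe shape the Altair example uses with `client.on("toolcall", …)` and `client.off("toolcall", …)` inside a `useEffect` cleanup.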
In the project directory, you can run:
`npm start`

Runs the app in the development mode.
Open http://localhost:3000 to view it in the browser.
The page will reload if you make edits.
You will also see any lint errors in the console.
`npm run build`

Builds the app for production to the `build` folder.
It correctly bundles React in production mode and optimizes the build for the best performance.
The build is minified and the filenames include the hashes.
Your app is ready to be deployed!
See the section about deployment for more information.
This is an experiment showcasing the Live API, not an official Google product. We’ll do our best to support and maintain this experiment but your mileage may vary. We encourage open sourcing projects as a way of learning from each other. Please respect our and other creators' rights, including copyright and trademark rights when present, when sharing these works and creating derivative work. If you want more info on Google's policy, you can find that here.