Recently I experimented with Google Home, trying to voice-control LEDs. Broadly, the whole thing can be split into two parts:
- A custom command that makes a web POST request to fetch the result.
- A simple Flask app that receives POST requests with parameters and glows some LEDs based on the request data.
For part one, the custom commands were possible thanks to the Google Actions APIs. I used API.AI for my purpose since they have good documentation. I won't go into detail explaining the form fields in API.AI; they have done a good job documenting and explaining that part, so I will just share screenshots of my configuration for quick reference and understanding. In API.AI, conversations are broken into intents. I used one intent (Default Welcome Intent) and a followup intent (Default Welcome Intent – custom) for my application.
Here's my first intent, which greets the user and asks for an LED colour when the custom command “glow LEDs” is activated.
As you can see, User says is what defines my command; you can add multiple statements with which you want to activate the command. The Action and Contexts are set when you create a followup intent. Text response is the part your Google Home will use as its response.
Next is the followup intent, which takes the user's response as its input context (handled automatically when you create the followup intent), looks for the required parameters and tries to process the request.
Here the expected User says would be a colour; red, blue and green are what I allowed. In API.AI you can use their ML to process the speech and extract the parameters and values you need. I needed colours, hence I used @sys.color. There are other entities like @sys.address, @sys.flight, etc. If these entities don't serve your purpose, you might want to go vanilla and process the speech on your web API end. The latter part of the followup intent is a bit different: here we are fulfilling the user request via a webhook. The Response is only the fallback response in case the web request fails; the success response is received from the webhook response body.
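For reference, the v1 webhook exchange looks roughly like the sketch below. The field values are hypothetical and trimmed down; the exact structure is in the API.AI webhook docs linked at the end.

```python
# Rough shape of an API.AI (v1) webhook exchange; values are hypothetical.
# API.AI POSTs JSON like this to your webhook:
webhook_request = {
    "result": {
        "resolvedQuery": "red",          # the raw text API.AI matched
        "parameters": {"color": "red"},  # extracted via the @sys.color entity
        "contexts": [],
    }
}

# Your webhook replies with JSON like this; "speech" is what Google Home
# speaks on success (the Text response above is only the fallback):
webhook_response = {
    "speech": "Glowing red LED.",
    "displayText": "Glowing red LED.",
}
```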
The fulfilment option won't be activated until you add your webhook in the Fulfillment section. That's all for part one. You can also use the Google Web Simulator to test your application on the go.
In part two, I used a Raspberry Pi, 3 LEDs (red, blue, green), a 1K ohm resistor, some wires, a breadboard (optional) and a T-cobbler board (optional). Now, we will write a Flask application that accepts a POST request and turns the required GPIO pin output high/low.
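A minimal sketch of such an app, assuming the RPi.GPIO library and hypothetical BCM pins 17, 27 and 22 for the red, green and blue LEDs:

```python
# Minimal sketch: map a colour word from the webhook payload to a GPIO pin.
# Pin numbers are hypothetical; adjust to your wiring.
from flask import Flask, request, jsonify
import RPi.GPIO as GPIO

app = Flask(__name__)

PINS = {"red": 17, "green": 27, "blue": 22}  # BCM numbering, assumed wiring

GPIO.setmode(GPIO.BCM)
for pin in PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def glow(colour):
    # Turn the requested LED on and the others off
    for name, pin in PINS.items():
        GPIO.output(pin, GPIO.HIGH if name == colour else GPIO.LOW)

@app.route("/webhook", methods=["GET", "POST"])
def webhook():
    if request.method == "POST":
        # API.AI (v1) sends the matched text in result.resolvedQuery
        payload = request.get_json(silent=True) or {}
        colour = payload.get("result", {}).get("resolvedQuery", "").lower()
    else:
        # GET support so the endpoint can be tested from a browser
        colour = request.args.get("colour", "").lower()

    if colour not in PINS:
        return jsonify(speech="Sorry, I can only glow red, green or blue.")

    glow(colour)
    msg = "Glowing %s LED." % colour
    return jsonify(speech=msg, displayText=msg)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```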
This application receives the calls from the API.AI webhook and triggers the targeted LED depending on the resolvedQuery; you can check the exact request and response structure you need from the API.AI docs. I used pagekite.net to tunnel and expose my Flask application to the external world. The code above also accepts GET requests so that I could test everything locally first.
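For a quick local sanity check before tunnelling, something like this (hypothetical URL, matching the sketch above) exercises both paths:

```python
# Hypothetical local test: simulate the API.AI webhook POST and a plain GET.
import requests

BASE = "http://localhost:5000/webhook"  # assumed local Flask address/route

# Simulate the API.AI webhook call
r = requests.post(BASE, json={"result": {"resolvedQuery": "red"}})
print(r.json())  # -> {"speech": "Glowing red LED.", ...}

# Quick GET check, the same thing a browser or curl would do
print(requests.get(BASE, params={"colour": "blue"}).json())
```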
Following is the circuit diagram for the connections.
And here is the result:
Some more Reads:
- https://arstechnica.com/gadgets/2016/12/google-assistant-api-launches-today-so-we-tested-some-custom-voice-commands/
- https://docs.api.ai/docs/actions-on-google-integration
- https://developers.google.com/actions/develop/conversation
- https://developers.google.com/actions/develop/apiai/tutorials/getting-started
- https://developers.google.com/actions/samples/
- https://docs.api.ai/docs/webhook
- https://docs.api.ai/docs/concept-intents#user-says