[Update: Developer Preview] Google announces the Local Home SDK, full-screen app support for smart displays, and third-party availability for App Actions


Update (7/9/19 @ 2:25 PM ET): Google is launching the Local Home SDK Developer Preview today.

Google I/O is wrapping up, but there are still some announcements to cover that flew under the radar a bit. After all, there are so many talks and events during the conference that it’s hard to keep up with it all. One such area that was overlooked is Actions on Google. At Google I/O 2019, the company announced a new Local Home SDK, full-screen app support for smart displays, and third-party access to App Actions.

For those who may be unfamiliar with Actions on Google, it’s basically the developer side of Google Assistant integrations. It’s what allows developers to create the awesome integrations with Assistant that we use every day, and Google is always expanding the functionality. The platform is getting new tools for web, mobile, and smart home. Let’s take a look at what all of this means.

Local Home SDK

Smart home integration is a big part of Google Assistant and Google says there are now more than 30,000 compatible connected devices. The Local Home SDK is a step toward making the integration with smart devices even better.

The Local Home SDK allows smart home code to be run locally on Google Home speakers and Nest Displays, which can then use their radios to communicate locally with smart devices. This speeds up commands and makes them more reliable by reducing the number of cloud calls.
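The Local Home SDK is TypeScript-based, and local fulfillment revolves around two handlers: IDENTIFY (match a device found on the local network to its cloud-known ID) and EXECUTE (carry out the command locally). The sketch below is a loose illustration of that flow, not Google's actual API surface: the `LocalHomeApp` stub stands in for the platform-injected SDK object, and the device IDs and response shapes are hypothetical.

```typescript
// Illustrative sketch of the Local Home SDK's IDENTIFY/EXECUTE flow.
// On a real Google Home or Nest device the platform injects the SDK app
// object; this stub class only stands in for it so the sketch is
// self-contained. Device IDs and payload shapes are hypothetical.
class LocalHomeApp {
  constructor(public version: string) {}
  onIdentify(handler: (req: any) => any) { return this; }
  onExecute(handler: (req: any) => any) { return this; }
  listen(): Promise<void> { return Promise.resolve(); }
}

// IDENTIFY: map a device discovered on the local network (e.g. via mDNS
// or UDP broadcast) to the device ID the cloud integration already knows.
const identifyHandler = (request: { requestId: string }) => ({
  requestId: request.requestId,
  payload: { device: { id: "light-1", verificationId: "local-light-1" } },
});

// EXECUTE: fulfill the command over the speaker's local radio link
// instead of a cloud round trip — this is what cuts latency.
const executeHandler = (request: { requestId: string }) => ({
  requestId: request.requestId,
  payload: { commands: [{ ids: ["light-1"], status: "SUCCESS" }] },
});

new LocalHomeApp("1.0.0")
  .onIdentify(identifyHandler)
  .onExecute(executeHandler)
  .listen();
```

The key design point is the split: the cloud integration still handles account linking and device sync, while the local app only short-circuits discovery and command execution.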

The Local Home SDK also improves the setup experience for smart devices. Google already started down this path with GE last year, letting you set up its lights right from the Google Home app. This is a much easier and more seamless experience for users. Google has already begun working with partners including Philips, Wemo, and LIFX on this SDK.

Full-screen Apps

Smart displays are becoming a bigger part of the Google Assistant hardware ecosystem. At I/O this year, Google launched the Nest Hub Max with a big 10-inch display. Google is letting developers take full advantage of these displays with a preview of “Interactive Canvas.” This allows apps to use the full screen for voice, visuals, and touch, and it’s not limited to smart displays — it also works on Android phones. Interactive Canvas is available for games right now (like HQ University), but Google will add more categories soon.

More App Actions

Lastly, let’s talk about new features for App Actions. App Actions were announced at last year’s Google I/O, but the feature has been pretty limited so far. Now Google is opening it up to more apps. App Actions allow developers to use intents from Assistant to deep link into specific parts of apps. Essentially, they’re voice-launched shortcuts, but a lot more powerful.

Google announced four new categories for these intents: Health & Fitness, Finance and Banking, Ridesharing, and Food Ordering. One example of a new use is starting a workout in a fitness app. You can say “Hey Google, start my run in Nike Run Club” and the app will open and start tracking your run. No need to find the app and manually start the workout.

Google says it’s incredibly easy for developers to add these integrations. Apparently, the Nike Run Club feature was implemented in less than a day with the addition of an Actions.xml file. In the example above, Assistant jumped right into the app, but it can also show cards (Slices) right in the Assistant conversation.
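To give a sense of what that Actions.xml file looks like, here is a minimal sketch for the workout example above. The intent name follows Google’s built-in intent naming for fitness, but the deep-link URL and parameter wiring here are placeholders, not Nike’s actual implementation:

```xml
<!-- Hypothetical actions.xml sketch; the fulfillment URL scheme is a
     placeholder, not Nike Run Club's real deep link. -->
<actions>
  <!-- Built-in intent that handles "start my run"-style queries -->
  <action intentName="actions.intent.START_EXERCISE">
    <!-- Deep link Assistant uses to open the app at the right screen -->
    <fulfillment urlTemplate="example://workout{?exerciseType}">
      <parameter-mapping
          intentParameter="exercise.name"
          urlParameter="exerciseType" />
    </fulfillment>
  </action>
</actions>
```

The app ships this file in its APK; Assistant matches the user’s phrase to the built-in intent and fires the mapped deep link, which is why no server-side conversation code is needed.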


These tools will allow developers to do more with Google Assistant, which is great for consumers. Homes are only getting smarter, displays will only become more prevalent, and users will rely on voice assistants more than ever to get stuff done. Check out the Actions website to learn more about building apps with these tools.

Source: Google


Update: Developer Preview

After announcing the Local Home SDK back in May during Google I/O, the company is now launching the SDK in a developer preview. Google has been testing the platform with partners and is ready to bring more developers into the fold. As mentioned during I/O, the SDK will allow developers to deeply integrate their smart devices into Assistant. Google has published an API reference, developer guides, and samples to help people get started. Feedback during testing can be submitted through the bug tracker and /r/GoogleAssistantDev.

Source: Google


