Android P represents a big AI bet from Google, and the company is adding special features throughout the OS to highlight its advances in AI and machine learning. One of the new improvements in P is how Android can predict your next step and surface actions and information from apps inside other apps and Google Assistant.

Google announced two new features – “App Actions” and “Slices” – that prepare the smartphone for your next step by predicting what you might want to do. Going beyond basic search predictions and guessing which apps you might open based on past usage, Google is bringing predictive actions: suggestions for things you might want to do within specific apps.

Android P App Actions

Google plans to learn from the history and patterns of how you use your smartphone, such as what time of day you take a particular action or how much time you spend on a particular app activity.

For instance, connecting headphones will bring up a suggestion to open your favorite music app, without actually opening the app. Google will surface these actions in the app drawer and other places in the OS. We have already seen some of this in action on the Settings page in Android P.


Google will be implementing App Actions across Google Search, the launcher, the Play Store, Google Assistant, and even smart text selection. So the next time you search for a movie, Google Search will suggest watching its trailer or booking tickets even before you’ve found what you’re looking for.
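For developers, the idea behind App Actions is to declare which intents an app can fulfill so the system can surface them as suggestions. A rough sketch of what such a declaration could look like as an XML resource – the intent name, deep-link template, and parameter names here are illustrative assumptions, not a confirmed final format:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/actions.xml (illustrative): tells the system which actions
     this app can handle and how to deep-link into it. -->
<actions>
    <!-- Hypothetical built-in intent for looking up movie showtimes. -->
    <action intentName="actions.intent.GET_MOVIE_SHOWTIMES">
        <!-- Maps the movie name from the user's query onto a deep link. -->
        <fulfillment urlTemplate="example://showtimes{?title}">
            <parameter-mapping
                intentParameter="movie.name"
                urlParameter="title" />
        </fulfillment>
    </action>
</actions>
```

With a mapping like this in place, a search for a movie could let the system jump straight to that film’s showtimes screen inside the app.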

Android P Slices

Slices give users the ability to interact with apps without even opening them. So when you search for an app, the AI will give you suggestions based on the ways you use it. Google appears to have taken cues from Samsung’s Bixby (and refined it to a user-friendly level) with this feature.


As part of the on-stage demo, Google teased how searching for the Lyft app would let users book a ride to their most-visited locations, with on-device machine learning deciding which destinations to surface based on what you ask Google. And since Slices are interactive, you can quickly take action within the app without leaving the search results.
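On the developer side, an app exposes a Slice through a content provider that builds a small interactive template for the system to render. A minimal Kotlin sketch using the androidx Slice builders – the class name, URI scheme, row text, and the `R.drawable.ic_ride` icon resource are all hypothetical, standing in for whatever a ride app like Lyft would actually serve:

```kotlin
import android.app.PendingIntent
import android.content.Intent
import android.net.Uri
import androidx.core.graphics.drawable.IconCompat
import androidx.slice.Slice
import androidx.slice.SliceProvider
import androidx.slice.builders.ListBuilder
import androidx.slice.builders.SliceAction

// Hypothetical provider serving a "book a ride" Slice for search to embed.
class RideSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean = true

    override fun onBindSlice(sliceUri: Uri): Slice? {
        val context = context ?: return null

        // Tapping the row fires this (illustrative) deep link into the app.
        val pendingIntent = PendingIntent.getActivity(
            context, 0,
            Intent(Intent.ACTION_VIEW, Uri.parse("example://ride/work")),
            PendingIntent.FLAG_IMMUTABLE
        )
        val tapAction = SliceAction.create(
            pendingIntent,
            IconCompat.createWithResource(context, R.drawable.ic_ride),
            ListBuilder.ICON_IMAGE,
            "Book ride"
        )

        // One interactive row: title, subtitle, and a tap action.
        return ListBuilder(context, sliceUri, ListBuilder.INFINITY)
            .addRow(
                ListBuilder.RowBuilder()
                    .setTitle("Ride to Work")
                    .setSubtitle("Driver a few minutes away")
                    .setPrimaryAction(tapAction)
            )
            .build()
    }
}
```

Because the Slice is just a template, search (or any other surface) can render the row inline and route the tap back into the app, which is what makes the “book a ride without leaving search results” demo possible.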

So what do you think of Slices and App Actions? Do you see them improving the way you use your phone? And what is your take on everything new in Android P that Google announced at I/O 2018? Let us know in the comments below!