Dependencies and Prerequisites
- Android 5.0 (API level 21) or higher
This class teaches you how to:
- Provide Audio Services
- Configure Your Manifest
- Build a Browser Service
- Enable Playback Control
- Support Voice Actions
Drivers want to access their music and other audio content on the road. Audio books, podcasts, sports commentary, and recorded talks can make a long trip educational, inspirational, and enjoyable. The Android framework allows you to extend your audio app so users can listen to their favorite tunes and audio content using a simple, yet customizable user interface.
Apps running on mobile devices with Android 5.0 or higher can provide audio services for dashboard systems running Android Auto. By configuring your app with a few settings and implementing a service for accessing music tracks, you can enable Auto devices to discover your app and provide a browse and playback interface for your app's audio content.
This class assumes that you have built an app that plays audio through an Android device's integrated speakers or connected headphones. It describes how to extend your app so that Auto devices can browse your content listings and play your audio through a car stereo system.
Provide Audio Services
Audio apps do not directly control a car dashboard device that runs Android Auto. When the user connects an Android mobile device to a dashboard system, Android Auto discovers your app through manifest entries that indicate what audio services your app can provide. The dashboard system displays a launcher icon for your app as a music provider, and the user can choose to use your app's services. If the user launches your app, the Auto device queries your app to see what content is available, displays your content items to the user, and sends requests to your app to control playback with actions such as play, pause, or skip track.
To enable your app to provide audio content for Auto devices, you need to:
- Configure your app manifest to do the following:
- Declare that your app can provide audio content for Auto devices.
- Define a service that provides a browsable list of your audio tracks.
- Build a service that provides audio track listing information by extending MediaBrowserService.
- Register a MediaSession object and implement the MediaSession.Callback object to enable playback controls.
Configure Your Manifest
When a user plugs an Android mobile device into a dashboard device running Auto, the system requests a list of installed apps that include app manifest entries indicating that they support services for Auto devices and how to access them. This section describes how to configure your app manifest to indicate that your app supports audio services for Auto devices, and to allow the dashboard system to connect with your app.
Declare Auto audio support
You indicate that your app supports car capabilities using the following manifest entry:
<application>
    ...
    <meta-data android:name="com.google.android.gms.car.application"
        android:resource="@xml/automotive_app_desc"/>
    ...
</application>
This manifest entry refers to a secondary XML file, where you declare what Auto capabilities your app supports. For an app that supports audio for cars, add an XML file named automotive_app_desc.xml to the res/xml/ resources directory, with the following content:
<automotiveApp>
    <uses name="media"/>
</automotiveApp>
For more information about declaring capabilities for Auto devices, see Getting Started with Auto.
Declare your media browser service
Auto devices expect to connect to a service in order to browse audio track listings. You declare this service in your manifest to allow the dashboard system to discover this service and connect to your app.
The following code example shows how to declare this listing browser service in your manifest:
<application>
    ...
    <service android:name=".MyMediaBrowserService"
        android:exported="true">
        <intent-filter>
            <action android:name="android.media.browse.MediaBrowserService"/>
        </intent-filter>
    </service>
    ...
</application>
The service your app provides for browsing audio tracks must extend the MediaBrowserService class. The implementation of this service is discussed in the Build a Browser Service section.
Note: Aside from Auto devices, other clients can also contact your app's browser service. These media clients might be other apps on a user's mobile device, or they might be remote clients.
Specify a notification icon
The Auto user interface shows notifications about your audio app to the user during the course of operation. For example, if the user has a navigation app running, and one song finishes and a new song starts, the Auto device shows the user a notification to indicate the change with an icon from your app. You can specify an icon that is used to represent your app for these notifications using the following manifest declaration:
<application>
    ...
    <meta-data android:name="com.google.android.gms.car.notification.SmallIcon"
        android:resource="@drawable/ic_notification" />
    ...
</application>
Note: The icon you provide should have transparency enabled, so the icon's background gets filled in with the app's primary color.
Build a Browser Service
Auto devices interact with your app by contacting its implementation of a MediaBrowserService, which you declare in your app manifest. This service allows Auto devices to find out what content your app provides. Connected Auto devices can also query your app's media browser service to contact the MediaSession provided by your app, which handles content playback commands.
You create a media browser service by extending the MediaBrowserService class.
Connected Auto devices can contact your service to do the following:
- Browse your app's content hierarchy, in order to present a menu to the user
- Get the token for your app's MediaSession object, in order to control audio playback
Media browser service workflow
1. When your app's audio services are requested by a user through a connected Auto device, the dashboard system contacts your app's media browser service. In your implementation of the onCreate() method, you must create and register a MediaSession object and its callback object (a minimal service skeleton is sketched after this list).
2. The Auto device calls the browser service's onGetRoot() method to get the top node of your content hierarchy. The node retrieved by this call is not used as a menu item; it is only used to retrieve its child nodes, which are subsequently displayed as the top menu items.
3. Auto invokes the onLoadChildren() method to get the children of the root node, and uses this information to present a menu to the user.
4. If the user selects a submenu, Auto invokes onLoadChildren() again to retrieve the child nodes of the selected menu item.
5. If the user begins playback, Auto invokes the appropriate media session callback method to perform that action. For more information, see the Enable Playback Control section.
Building your content hierarchy
Auto devices acting as audio clients call your app's MediaBrowserService to find out what content you have available. You need to implement two methods in your browser service to support this: onGetRoot() and onLoadChildren().
Each node in your content hierarchy is represented by a MediaBrowser.MediaItem object. Each of these objects is identified by a unique ID string. The client treats these ID strings as opaque tokens. When a client wants to browse to a submenu, or play a content item, it passes the ID token. Your app is responsible for associating the ID token with the appropriate menu node or content item.
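As an illustration of this ID convention, a node's media ID can encode where the item sits in the hierarchy so your service can later map the opaque token back to a menu or a track. The sketch below is one possible scheme, not a required format; the helper name and the "category/trackId" ID layout are assumptions (it uses the framework classes android.media.MediaDescription and android.media.browse.MediaBrowser).

// Hypothetical helper that builds a playable MediaItem. The "category/trackId"
// media ID is just one possible convention for the opaque token.
private MediaBrowser.MediaItem buildTrackItem(String category, String trackId,
                                              String title, String artist) {
    MediaDescription description = new MediaDescription.Builder()
            .setMediaId(category + "/" + trackId)  // opaque ID for this node
            .setTitle(title)
            .setSubtitle(artist)
            .build();
    // FLAG_PLAYABLE marks a leaf (track); use FLAG_BROWSABLE for submenu nodes.
    return new MediaBrowser.MediaItem(description,
            MediaBrowser.MediaItem.FLAG_PLAYABLE);
}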
Note: You should consider providing different content hierarchies depending on what client is making the query. In particular, Auto applications have strict limits on how large a menu they can display. This is intended to minimize distracting the driver, and to make it easy for the driver to operate the app via voice commands. For more information on the Auto user experience restrictions, see the Auto Audio Apps guidelines.
Your implementation of onGetRoot() returns information about the root node of the menu hierarchy. This root node is the parent of the top items in your browse hierarchy. The method is passed information about the calling client. You can use this information to decide if the client should have access to your content at all. For example, if you want to limit your app's content to a list of approved clients, you can compare the passed clientPackageName to your whitelist. If the caller isn't an approved package, you can return null to deny access to your content.
A typical implementation of onGetRoot() might look like this:
@Override
public BrowserRoot onGetRoot(String clientPackageName, int clientUid,
                             Bundle rootHints) {
    // To ensure you are not allowing any arbitrary app to browse your app's
    // contents, you need to check the origin:
    if (!PackageValidator.isCallerAllowed(this, clientPackageName, clientUid)) {
        // If the request comes from an untrusted package, return null.
        // No further calls will be made to other media browsing methods.
        LogHelper.w(TAG, "OnGetRoot: IGNORING request from untrusted package "
                + clientPackageName);
        return null;
    }
    if (ANDROID_AUTO_PACKAGE_NAME.equals(clientPackageName)) {
        // Optional: if your app needs to adapt ads, music library or anything
        // else that needs to run differently when connected to the car, this
        // is where you should handle it.
    }
    return new BrowserRoot(MEDIA_ID_ROOT, null);
}
The Auto device client builds the top-level menu by calling onLoadChildren() with the root node object and getting its children. The client builds submenus by calling the same method with other child nodes. The following example code shows a simple implementation of the onLoadChildren() method:
@Override
public void onLoadChildren(final String parentMediaId,
                           final Result<List<MediaBrowser.MediaItem>> result) {
    // Assume for example that the music catalog is already loaded/cached.
    List<MediaBrowser.MediaItem> mediaItems = new ArrayList<>();
    // Check if this is the root menu:
    if (MEDIA_ID_ROOT.equals(parentMediaId)) {
        // Build the MediaItem objects for the top level,
        // and put them in the mediaItems list.
    } else {
        // Examine the passed parentMediaId to see which submenu we're at,
        // and put the children of that menu in the mediaItems list.
    }
    // Send the list back to the caller.
    result.sendResult(mediaItems);
}
Enable Playback Control
Auto devices use MediaSession objects to pass playback control commands to an app that is providing audio services. Your audio app must create an instance of this object to pass to the dashboard device and implement callback methods to enable remote control of audio playback.
Register a media session
An Auto device using your app as an audio service needs to obtain a MediaSession object from your app. The Auto device uses the session object to send playback commands requested by the Auto user back to your app.
When you initialize your browser service, you register that session object with your MediaBrowserService by calling the setSessionToken() method. This step allows clients such as an Auto device to retrieve that object by calling your browser service's getSessionToken() method.
In your browser service's onCreate() method, create a MediaSession. You can then query the MediaSession to get its token, and register the token with your browser service:
@Override
public void onCreate() {
    super.onCreate();
    ...
    // Start a new MediaSession.
    MediaSession mSession = new MediaSession(this, "session tag");
    setSessionToken(mSession.getSessionToken());
    // Set a callback object to handle play control requests, which
    // implements MediaSession.Callback.
    mSession.setCallback(new MyMediaSessionCallback());
    ...
}
When you create the media session object, you set a callback object that is used to handle playback control requests. You create this callback object by providing an implementation of the MediaSession.Callback class for your app. The next section discusses how to implement this object.
Implement play commands
When an Auto device requests playback of an audio track from your app, it uses the MediaSession.Callback class from your app's MediaSession object, which it obtained from your app's media browser service. When an Auto user wants to play content or control content playback, such as pausing play or skipping to the next track, Auto invokes one of the callback object's methods.
To handle content playback, your app must extend the abstract MediaSession.Callback class and implement the methods that your app supports. The most important callback methods are as follows:
- onPlay() - Invoked if the user chooses play without choosing a specific item. Your app should play its default content. If playback was paused with onPause(), your app should resume playback.
- onPlayFromMediaId() - Invoked when the user chooses to play a specific item. The method is passed the item's media ID, which you assigned to the item in the content hierarchy.
- onPlayFromSearch() - Invoked when the user chooses to play from a search query. The app should make an appropriate choice based on the passed search string.
- onPause() - Pause playback.
- onSkipToNext() - Skip to the next item.
- onSkipToPrevious() - Skip to the previous item.
- onStop() - Stop playback.
Note: Google Play requires your app not to play music immediately when it launches. For more information on this and other requirements, see Auto App Quality.
Your app should override these methods to provide any desired functionality. You do not need to implement a method that your app does not support. For example, if your app plays a live stream (such as a sports broadcast), skipping to the next track might not make sense. In that case, you can simply use the default implementation of onSkipToNext().
When your app receives a request to play content, it should play audio the same way it would in a non-Auto situation (as if the user was listening through a device speaker or connected headphones). The audio content is automatically sent to the dashboard system to be played over the car's speakers.
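As a concrete illustration, the sketch below shows one way the MyMediaSessionCallback registered earlier might be written, assuming playback is backed by a single MediaPlayer that the callback owns and whose data source is prepared elsewhere. A real app would delegate to its existing playback engine and keep the session's PlaybackState up to date; the player handling here is an assumption, not part of the framework API.

import android.media.MediaPlayer;
import android.media.session.MediaSession;
import android.os.Bundle;

// Minimal sketch of a MediaSession.Callback. Playback is backed by a plain
// MediaPlayer for illustration; setting and preparing its data source
// (for example in onPlayFromMediaId()) is assumed to happen elsewhere.
public class MyMediaSessionCallback extends MediaSession.Callback {

    private final MediaPlayer mMediaPlayer = new MediaPlayer();

    @Override
    public void onPlay() {
        // Resume paused playback, or start the default content,
        // exactly as the app would for on-device playback.
        if (!mMediaPlayer.isPlaying()) {
            mMediaPlayer.start();
        }
    }

    @Override
    public void onPlayFromMediaId(String mediaId, Bundle extras) {
        // mediaId is the opaque token assigned in the content hierarchy;
        // look up the matching track, set it as the data source, prepare
        // the player, then start playback.
    }

    @Override
    public void onPause() {
        if (mMediaPlayer.isPlaying()) {
            mMediaPlayer.pause();
        }
    }

    @Override
    public void onSkipToNext() {
        // Advance the app's queue, prepare the next track, and play it.
    }

    @Override
    public void onSkipToPrevious() {
        // Step the queue back one track, prepare it, and play it.
    }

    @Override
    public void onStop() {
        mMediaPlayer.stop();
    }
}

Because the same callback handles on-device controls, hardware buttons, and voice-initiated commands, no Auto-specific playback code is needed here.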
For more information about playing audio content, see Media Playback, Managing Audio Playback, and ExoPlayer.
Support Voice Actions
To reduce driver distractions, you can add voice actions in your audio playback app. With voice action support, users can launch your app and play audio by providing voice input on Auto screens. If your audio playback app is already active and the user says “Play a song”, the system starts playing music without requiring the user to look at or touch the screen.
Enable your app to handle audio playback requests
Enable your audio app to launch with a voice command such as "Play [search query] on [your app name]" by adding the following entry in your manifest:
<activity>
    <intent-filter>
        <action android:name="android.media.action.MEDIA_PLAY_FROM_SEARCH" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
When the user says “Play music on [your app name]” on an Auto screen, Auto attempts to launch your app and play audio by calling your app’s MediaSession.Callback.onPlayFromSearch() method. If the user has not specified criteria such as a track name or music genre, the onPlayFromSearch() method receives an empty query parameter. Your app should respond by immediately playing audio, such as a song from a random queue or the most recent playlist.
Parse the voice query to build the playback queue
When a user searches with specific criteria, such as “Play jazz on [your app name]” or “Listen to [song title]”, the onPlayFromSearch() callback method receives the voice search results in the query parameter and an extras bundle. For more information on how to handle search queries to play audio content, see Play music based on a search query.
To parse the voice search query to play back audio content in your app, follow these steps:
- Use the extras bundle and search query string returned from the voice search to filter results.
- Build the audio content queue based on these results.
- Play the audio content.
The onPlayFromSearch() method takes an extras parameter with more detailed information from the voice search. These extras help you find the audio content in your app for playback. If the search results are unable to provide this data, you can implement logic to parse the raw search query and play the appropriate tracks based on the query.
The following extras are supported in Android Auto:
- android.intent.extra.album
- android.intent.extra.artist
- android.intent.extra.genre
- android.intent.extra.playlist
- android.intent.extra.title
The following snippet shows how to override the onPlayFromSearch() method in your MediaSession.Callback implementation to handle the search query and extras for playing audio content in your app:
@Override
public void onPlayFromSearch(String query, Bundle extras) {
    // Local state for parsing the voice search results.
    boolean isArtistFocus = false;
    boolean isAlbumFocus = false;
    String artist = null;
    String album = null;

    if (TextUtils.isEmpty(query)) {
        // The user provided generic string e.g. 'Play music'
        // Build appropriate playlist queue
    } else {
        // Build a queue based on songs that match "query" or "extras" param
        String mediaFocus = extras.getString(MediaStore.EXTRA_MEDIA_FOCUS);
        if (TextUtils.equals(mediaFocus,
                MediaStore.Audio.Artists.ENTRY_CONTENT_TYPE)) {
            isArtistFocus = true;
            artist = extras.getString(MediaStore.EXTRA_MEDIA_ARTIST);
        } else if (TextUtils.equals(mediaFocus,
                MediaStore.Audio.Albums.ENTRY_CONTENT_TYPE)) {
            isAlbumFocus = true;
            album = extras.getString(MediaStore.EXTRA_MEDIA_ALBUM);
        }
        // Implement additional "extras" param filtering
    }

    // Implement your logic to retrieve the queue. The result type depends on
    // your app's own search helpers (hypothetical here).
    List<MediaMetadata> result = null;
    if (isArtistFocus) {
        result = searchMusicByArtist(artist);
    } else if (isAlbumFocus) {
        result = searchMusicByAlbum(album);
    }

    if (result == null) {
        // No focus found, search by query for song title
        result = searchMusicBySongTitle(query);
    }

    if (result != null && !result.isEmpty()) {
        // Immediately start playing from the beginning of the search results
        // Implement your logic to start playing music
        playMusic(result);
    } else {
        // Handle no queue found. Stop playing if the app
        // is currently playing a song
    }
}
Note: To minimize driver distractions, immediately initiate audio content playback in the onPlayFromSearch() method when you have generated the audio content queue based on the user's request.
For a more detailed example on how to implement voice search to play audio content in your app, see the Universal Media Player sample.
Implement playback control actions
To provide a hands-free experience while users drive and listen to audio content in Android Auto, your app should allow users to control audio content playback with voice actions. When users speak commands such as “Next song”, “Pause music”, or “Resume music”, the system triggers the corresponding callback method where you implement the playback control action.
To provide voice-enabled playback controls, first enable the hardware controls by setting these flags in your app’s MediaSession object:
mSession.setFlags(MediaSession.FLAG_HANDLES_MEDIA_BUTTONS | MediaSession.FLAG_HANDLES_TRANSPORT_CONTROLS);
Then, implement the callback methods with the playback controls that you support in your app. Here’s a list of voice-enabled playback controls supported by Android Auto:
| Example phrase | Callback method |
|---|---|
| "Next song" | onSkipToNext() |
| "Previous song" | onSkipToPrevious() |
| "Pause music" | onPause() |
| "Stop music" | onStop() |
| "Resume music" | onPlay() |
For a more detailed example on how to implement voice-enabled playback actions in your app, see the Universal Media Player sample.