What Is Audio Routing & How Does It Work in Android?

Audio routing refers to how audio signals are directed between audio sources and audio sinks on an Android device. It determines which audio inputs are routed to which audio outputs (for example, routing microphone audio to the network uplink and incoming call audio to the earpiece during a phone call).

Android’s audio routing system allows apps to manage audio sessions and define routing policies for app-specific audio. Apps interact with routing through the AudioManager API, which connects audio sources such as microphones and MediaPlayer outputs to audio sinks such as speakers, headphones, and Bluetooth devices; the actual signal routing is performed lower in the stack by the audio framework. This API lets developers configure how audio flows between sources and sinks.

Proper audio routing is important for Android apps that handle audio playback or recording. With configurable routing policies, apps can tailor the audio experience to the use case, for example routing music to headphones but call audio to the earpiece or speakerphone. Understanding Android audio routing helps developers build robust audio apps.


[1] https://source.android.com/docs/core/audio/combined-audio-routing

Audio Routing Components

The main components involved in audio routing in Android include:

Media framework – Provides the high-level APIs for media functionalities like audio playback and recording. Most apps interact with the media framework APIs in android.media package.

AudioFlinger service – Acts as the mediator between the media framework and the audio hardware. It handles audio mixing, routing, audio effects, volume control etc. (source 1)

Audio HAL – The Hardware Abstraction Layer provides standard interfaces that connect the AudioFlinger to actual audio drivers. Different devices can have different audio drivers but implement the same HAL interfaces. (source 2)

Audio drivers – Low-level software components that interface with the audio hardware/chipset and expose control APIs to the higher layers. They handle hardware-specific operations.

Audio Sources

The main audio sources on Android devices include the built-in microphone, wired headsets, Bluetooth headsets, and USB audio devices. Note that the MediaRecorder.AudioSource class does not name physical devices: its constants (MIC, VOICE_COMMUNICATION, CAMCORDER, and so on) describe the intended capture use case, and the audio system then routes data from the appropriate physical input to the recording app.

The built-in microphone is represented by the MediaRecorder.AudioSource.MIC constant. This allows apps to capture audio using the microphone(s) built into the Android device. Microphone data is generally routed to sinks like phone calls, voice assistants, video recording, and audio recording apps.

Wired headsets connected via the 3.5mm jack or USB-C port appear as input devices of type AudioDeviceInfo.TYPE_WIRED_HEADSET (there is no corresponding MediaRecorder.AudioSource constant; capture still uses a source like MIC, and the framework routes it to the headset microphone when one is attached). Audio played on the device can be routed to wired headsets for private listening, and the headset’s microphone can be used as an input. Wired connections typically have higher quality and lower latency than Bluetooth.

Bluetooth headsets and headphones paired to the device reach the audio system over the SCO link (device type AudioDeviceInfo.TYPE_BLUETOOTH_SCO); apps typically record with AudioSource.MIC or VOICE_COMMUNICATION after enabling SCO audio, historically via AudioManager.startBluetoothSco() and on API 31+ via setCommunicationDevice(). For playback, the system transmits audio over Bluetooth Classic (A2DP) or the newer LE Audio profiles. Bluetooth audio has more latency than wired connections but offers wireless freedom.

Finally, USB headsets, microphones, and sound cards connected via USB appear as devices of type AudioDeviceInfo.TYPE_USB_DEVICE or TYPE_USB_HEADSET. (MediaRecorder.AudioSource.VOICE_UPLINK and VOICE_DOWNLINK are unrelated: they capture the transmit and receive legs of a phone call and require system permissions.) USB offers high-quality, often low-latency audio input and output compared to other external sources.
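To see how these inputs surface to an app, here is a minimal sketch (assuming API 23+, where AudioManager.getDevices() and AudioDeviceInfo were introduced, and API 26+ for TYPE_USB_HEADSET) that enumerates the available capture devices; the class name is illustrative:

```java
import android.content.Context;
import android.media.AudioDeviceInfo;
import android.media.AudioManager;

public class InputDeviceLister {
    // Logs each available capture device with a human-readable type label.
    public static void listInputDevices(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        AudioDeviceInfo[] inputs = am.getDevices(AudioManager.GET_DEVICES_INPUTS);
        for (AudioDeviceInfo device : inputs) {
            String kind;
            switch (device.getType()) {
                case AudioDeviceInfo.TYPE_BUILTIN_MIC:   kind = "built-in microphone"; break;
                case AudioDeviceInfo.TYPE_WIRED_HEADSET: kind = "wired headset mic";   break;
                case AudioDeviceInfo.TYPE_BLUETOOTH_SCO: kind = "Bluetooth SCO headset"; break;
                case AudioDeviceInfo.TYPE_USB_DEVICE:
                case AudioDeviceInfo.TYPE_USB_HEADSET:   kind = "USB audio device";    break;
                default: kind = "other (type " + device.getType() + ")";
            }
            System.out.println(device.getProductName() + ": " + kind);
        }
    }
}
```

The same call with AudioManager.GET_DEVICES_OUTPUTS lists the sinks instead.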

Audio Sinks

Audio sinks refer to the endpoints where audio is output on an Android device. There are several common audio sinks:


Built-in Speaker

The built-in speaker(s) on the device play media and other audio out loud. The speaker is usually the default audio sink. Android provides the AudioManager system service to control audio routing to the speaker.

Wired Headset

A wired headset plugged into the 3.5mm audio jack on the device routes audio playback privately through the headset. When a wired headset is connected, Android typically automatically switches the audio output to route to the headset instead of the speaker.

Bluetooth Headset

Android supports pairing and connecting with Bluetooth audio devices, like wireless headsets and speakers. The user can select a paired Bluetooth device in Settings to switch the audio output and route it wirelessly over Bluetooth.

USB Audio

Some Android devices allow audio output routing through a connected USB device that provides audio support, such as wired USB headphones or an external USB DAC. The audio can be routed through USB while also charging the phone using the same cable.
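As a sketch of steering playback toward a particular sink, the snippet below (assuming API 28+, where MediaPlayer.setPreferredDevice() is available) prefers a USB output if one is attached; the class name is illustrative:

```java
import android.content.Context;
import android.media.AudioDeviceInfo;
import android.media.AudioManager;
import android.media.MediaPlayer;

public class SinkSelector {
    // Tries to route this app's MediaPlayer output to a USB sink if present.
    // setPreferredDevice() is a hint; the system audio policy may still override it.
    public static void preferUsbOutput(Context context, MediaPlayer player) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        for (AudioDeviceInfo device : am.getDevices(AudioManager.GET_DEVICES_OUTPUTS)) {
            if (device.getType() == AudioDeviceInfo.TYPE_USB_DEVICE
                    || device.getType() == AudioDeviceInfo.TYPE_USB_HEADSET) {
                player.setPreferredDevice(device);
                return;
            }
        }
    }
}
```

Passing null to setPreferredDevice() returns routing control to the system.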

Audio Routing Policies

Android uses audio routing policies to determine how audio should be directed between various audio sources and sinks. The default routing policy sends audio to device speakers or wired headsets depending on what is connected. However, apps can override the default policy to direct audio to different outputs.

The default routing policy is configured in the audio_policy_configuration.xml file and specifies default routes for various use cases like media, alarms, and notifications. For example, it may specify that media audio should be routed to the wired headset if connected, otherwise to the built-in speakers (1).

Apps can override the default routing by calling methods such as setSpeakerphoneOn() or setBluetoothScoOn() (both deprecated on API 31+ in favor of setCommunicationDevice()) to direct audio to different outputs while the app holds focus. This lets apps control how their audio is presented based on context; for example, a video call app may route audio to a wired headset when one is available (2).

The audio policy system in Android also supports dynamic routing changes at runtime in response to device connections and other signals. For example, when a headset is unplugged the framework reroutes the stream to the built-in speaker and broadcasts ACTION_AUDIO_BECOMING_NOISY so that well-behaved apps can pause playback rather than let it blast out loud (3).

Overall, Android audio routing policies provide a flexible system for dynamically controlling audio between sources like media apps and sinks like speakers, headphones, and Bluetooth devices.
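One concrete piece of this dynamic behavior is the ACTION_AUDIO_BECOMING_NOISY broadcast, sent when the active output suddenly disappears (e.g. a headset is unplugged). A minimal sketch of a receiver that pauses playback in response (the MediaPlayer field stands in for the app's own player):

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.media.AudioManager;
import android.media.MediaPlayer;

public class BecomingNoisyReceiver extends BroadcastReceiver {
    private final MediaPlayer player; // placeholder for the app's player

    public BecomingNoisyReceiver(MediaPlayer player) {
        this.player = player;
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        if (AudioManager.ACTION_AUDIO_BECOMING_NOISY.equals(intent.getAction())
                && player.isPlaying()) {
            // Pause instead of letting audio jump to the loudspeaker.
            player.pause();
        }
    }

    // IntentFilter to pass to Context.registerReceiver().
    public static IntentFilter filter() {
        return new IntentFilter(AudioManager.ACTION_AUDIO_BECOMING_NOISY);
    }
}
```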

Audio Focus

Audio focus is the system that manages playback priority and access to the shared audio output in Android. It ensures that important audio streams are not disrupted by less critical ones. Different audio focus states (gain, loss, transient loss, transient loss with ducking) determine which streams have priority at any moment.
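The mapping from focus-change codes to player behavior can be sketched as plain logic. The integer values below mirror android.media.AudioManager's AUDIOFOCUS_* constants; the action names are illustrative, not part of any Android API:

```java
public class FocusPolicy {
    // Values mirror android.media.AudioManager's AUDIOFOCUS_* constants.
    public static final int AUDIOFOCUS_GAIN = 1;
    public static final int AUDIOFOCUS_LOSS = -1;
    public static final int AUDIOFOCUS_LOSS_TRANSIENT = -2;
    public static final int AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK = -3;

    // Returns the action a well-behaved player should take for a focus change.
    public static String actionFor(int focusChange) {
        switch (focusChange) {
            case AUDIOFOCUS_GAIN:
                return "resume-at-full-volume";
            case AUDIOFOCUS_LOSS:
                return "stop-and-release";  // focus lost for good, e.g. another music app
            case AUDIOFOCUS_LOSS_TRANSIENT:
                return "pause";             // short interruption, e.g. a phone call
            case AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
                return "duck";              // keep playing quietly, e.g. a navigation prompt
            default:
                return "ignore";
        }
    }

    public static void main(String[] args) {
        System.out.println(actionFor(AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK)); // duck
    }
}
```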

An app has to explicitly request audio focus from the AudioManager when it wants exclusive access to audio playback. This is done by creating an AudioFocusRequest object with details about the audio usage and passing it to requestAudioFocus(). The app will get callbacks like onAudioFocusChange() when focus is gained, lost or changed.

To release audio focus, the app calls abandonAudioFocusRequest() (or the older, deprecated abandonAudioFocus()). The callbacks received on focus changes let the app seamlessly pause, resume, duck, or mute playback based on priority.
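Putting the request flow together, here is a sketch using the API 26+ AudioFocusRequest path (the helper class and method names are illustrative):

```java
import android.media.AudioAttributes;
import android.media.AudioFocusRequest;
import android.media.AudioManager;

public class FocusHelper {
    // Requests audio focus for music playback; returns the request so the
    // caller can later pass it to abandonAudioFocusRequest(), or null if denied.
    public static AudioFocusRequest requestFocus(
            AudioManager am, AudioManager.OnAudioFocusChangeListener listener) {
        AudioAttributes attrs = new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build();
        AudioFocusRequest request = new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN)
                .setAudioAttributes(attrs)
                .setOnAudioFocusChangeListener(listener)
                .build();
        int result = am.requestAudioFocus(request);
        return result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED ? request : null;
    }
}
```

Keeping the returned AudioFocusRequest matters: the same instance must be handed to abandonAudioFocusRequest() when playback ends.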

Implementing Audio Routing

The AudioManager API is the primary way to control audio routing in Android. Some key methods include:

setMode() to set the overall audio mode, like normal, ringtone or in-call.

setSpeakerphoneOn() to route to speaker or earpiece for calls.

setBluetoothScoOn() to route to a Bluetooth headset (on API 31+, setCommunicationDevice() supersedes both of the above).

It’s important to properly handle routing changes like headset plug/unplug or Bluetooth connects/disconnects. Register an AudioDeviceCallback via AudioManager.registerAudioDeviceCallback(), or listen for broadcasts such as ACTION_HEADSET_PLUG and ACTION_AUDIO_BECOMING_NOISY, to be notified of these changes (an OnAudioFocusChangeListener only reports focus changes, not device changes).

Common routing scenarios include:

  • Media playback – allow switching between speaker, wired headset, and Bluetooth.
  • Phone calls – route audio to earpiece or speakerphone depending on user settings.
  • Navigation guidance – mix with music playback and allow switching output.
  • Accessibility – always allow routing media playback to accessibility aids.

With some logic to handle these cases, your app can robustly support audio routing across various devices and use cases.
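The plug/unplug handling described above can be sketched with an AudioDeviceCallback (API 23+); the class name and the reactions in the comments are illustrative:

```java
import android.media.AudioDeviceCallback;
import android.media.AudioDeviceInfo;
import android.media.AudioManager;

public class RoutingMonitor extends AudioDeviceCallback {
    @Override
    public void onAudioDevicesAdded(AudioDeviceInfo[] addedDevices) {
        for (AudioDeviceInfo d : addedDevices) {
            // e.g. offer to move playback to a newly attached headset
            System.out.println("Device connected: " + d.getProductName());
        }
    }

    @Override
    public void onAudioDevicesRemoved(AudioDeviceInfo[] removedDevices) {
        for (AudioDeviceInfo d : removedDevices) {
            // e.g. pause playback if the active output disappeared
            System.out.println("Device removed: " + d.getProductName());
        }
    }

    public static void install(AudioManager am, RoutingMonitor monitor) {
        // A null Handler delivers callbacks on the main thread.
        am.registerAudioDeviceCallback(monitor, null);
    }
}
```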

Automotive Audio Routing

Automotive audio routing refers to how audio is directed within an in-vehicle infotainment system. Modern cars have complex audio systems with features like entertainment screens, navigation, hands-free calling, and more. Android Automotive OS provides APIs and components for handling audio routing in this complex environment.

A key aspect of automotive audio routing is supporting multiple audio zones within the vehicle. For example, the driver may be using navigation voice prompts while rear seat passengers are listening to media. Android Automotive OS allows assigning audio streams to different zones via the AudioManager APIs [1]. Apps can specify the target audio zone for their streams.

Another important capability is routing audio to match what zone the user is interacting with. Android ties audio streams to the User ID associated with a zone. As users change zones, the audio will automatically switch. This provides a seamless audio experience as passengers move about the vehicle.

Specific APIs like CarAudioManager provide automotive-centric audio controls [2]. For example, adjusting volume per zone, managing focus, and designating priority streams. Android Automotive OS handles the underlying audio routing to deliver a robust, zoned audio system using these APIs.

In summary, Android Automotive OS provides specialized capabilities for audio routing within complex, multi-zone in-vehicle infotainment systems. The audio architecture and APIs allow seamless zoned audio aligned to users and customized for automotive needs.

Testing and Troubleshooting

Proper testing and troubleshooting are crucial when developing audio routing functionality for Android. Emulators provide a key tool for initial testing before deployment to physical devices. The Android Studio emulator includes an emulated audio output device that can be used to test basic playback and routing. More advanced emulators like Genymotion offer additional configurable audio inputs and outputs to test complex routing scenarios.

At runtime, audio sources and sinks can be monitored using dumpsys audio from adb shell or audio debugging apps like USB Audio Recorder PRO. This allows inspection of currently active streams, volumes, and routes. Monitoring these values over time is key to diagnosing issues.
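For example, the routing state can be inspected from a development machine over adb (the commands are standard Android tooling; the grep filter is just one way to narrow the output):

```shell
# Dump the full audio service state: active playback/record configurations,
# connected devices, volumes, and audio policy state.
adb shell dumpsys audio

# Narrow the dump to device-connection lines (filter pattern is illustrative).
adb shell dumpsys audio | grep -i "device"
```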

Common routing problems include improper audio focus handling, invalid configurations, faulty connections, sample rate mismatches, volume inconsistencies, and unintended rerouting. Each issue must be systematically ruled out. Check focus requests, examine routing policies, verify hardware/software compatibility, inspect sample rates, adjust volume controls, and trace routing changes step-by-step until the source of the problem is identified.

In depth troubleshooting may require analyzing AudioRecord/AudioTrack buffer contents, inserting debug callbacks, and monitoring AudioManager events. But methodically verifying correct audio focus, routing policies, configurations, and monitoring key stream properties will resolve many basic Android audio routing issues.


Conclusion

We have now seen the key audio routing concepts and why understanding the routing architecture in Android matters. Audio routing directs multiple audio streams to appropriate sinks based on configured policies. The main pieces are audio sources like media players and microphones, audio sinks like speakers and headphones, routing policies defined in the audio framework, and prioritization via audio focus.

Implementing proper audio routing is crucial for providing a seamless audio experience, especially for use cases like automotive where managing audio across different zones is critical. Testing and troubleshooting routing issues requires knowledge of the architecture and tools like dumpsys. As audio innovations continue, like capabilities for simultaneous streaming to multiple sinks, Android’s flexible routing architecture will adapt to enable these new functionalities.

In summary, comprehending audio routing internals is foundational for anyone working in Android audio development. We have covered the key concepts here, but there is always more to learn. The Android audio framework documentation provides further details for those looking to dive deeper. With this knowledge in hand, developers can build immersive audio experiences on Android and push the platform forward.
