How to Add Audio Resources in Android?

Adding audio capabilities is a common requirement for many Android apps. The Android platform provides various APIs and frameworks for working with audio. This allows developers to easily incorporate audio playback and recording features in their apps.

The core components for audio in Android include the MediaPlayer, SoundPool, and AudioTrack classes. These classes provide interfaces for audio resource loading, streaming, sound pooling and audio track creation. Android also provides AudioAttributes to configure audio behaviors and AudioFocus for managing audio focus between apps.

This article provides an overview of the key concepts and APIs used to integrate audio into Android apps. We will cover adding raw audio files, audio resource loading, audio focus, playback controls, recording and other important topics related to handling audio in Android.

Add Raw Audio Files

The simplest way to add audio files to an Android app is to place them in the /res/raw directory of your Android Studio project. This folder is designed to hold raw resource files, such as audio clips and other media, that are bundled with the app unmodified. To add an audio file:

  1. Open your Android Studio project and navigate to the /res folder.
  2. Right click on /res and select New > Android Resource Directory.
  3. Name the new folder “raw”.
  4. Copy your audio files (MP3, WAV, OGG etc.) into the /res/raw folder.

The audio files in /res/raw can now be referenced in code through the generated R.raw identifiers (for example, R.raw.my_sound for my_sound.mp3). This provides a simple way to bundle audio resources with your app.
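As a minimal sketch, a clip bundled at res/raw/clip.mp3 (the file name here is just an example) could be played like this:

```java
// Create a MediaPlayer for a bundled clip; R.raw.clip assumes a file
// named clip.mp3 (or .wav/.ogg) was copied into res/raw.
MediaPlayer player = MediaPlayer.create(context, R.raw.clip);
player.setOnCompletionListener(MediaPlayer::release); // free resources when done
player.start();
```

Note that MediaPlayer.create() calls prepare() internally, so the returned player can be started immediately.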

Some key advantages of storing audio in /res/raw:

  • Files are bundled with the app, so no external storage is needed
  • Simple to access files in code via R.raw
  • Works with most common audio formats, such as MP3, WAV, and OGG

The main downside is that these files are read-only at runtime. But for most audio playback needs, /res/raw provides an easy solution.

MediaPlayer Class

The MediaPlayer class can be used to play audio files in Android. It provides methods to control playback, such as start(), pause(), and stop(). MediaPlayer supports playback of raw audio files as well as audio streams.

To use MediaPlayer to play audio:

  1. Create a MediaPlayer instance
  2. Set the data source using setDataSource()
  3. Prepare the media player by calling prepare() or prepareAsync()
  4. Start playback using start()
  5. Control playback with pause(), seekTo(), and start() (to resume), etc.
  6. Release resources when done by calling release()
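The steps above might look like the following sketch (the stream URL is a placeholder):

```java
MediaPlayer mediaPlayer = new MediaPlayer();
try {
    // Step 2: point the player at a local file or remote stream.
    mediaPlayer.setDataSource("https://example.com/audio.mp3");
    // Steps 3-4: prepareAsync() avoids blocking the UI thread while
    // buffering; playback starts once preparation completes.
    mediaPlayer.setOnPreparedListener(MediaPlayer::start);
    mediaPlayer.prepareAsync();
} catch (IOException e) {
    // Step 6: always release the player on failure or when finished.
    mediaPlayer.release();
}
```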

Some key points about MediaPlayer:

  • It can play both local and network audio files
  • Supports common media formats like MP3, AAC, OGG, WAV
  • Audio focus is not requested automatically – your app should manage focus itself (see the Audio Focus section below)
  • Buffers and decodes audio in the background for smooth playback

Overall, MediaPlayer provides a full-featured way to add audio playback in your Android app. With its playback controls and focus management, it simplifies working with audio significantly.

SoundPool Class

The SoundPool class is used to play short, recurring audio clips and sound effects in Android. It allows you to load multiple audio samples into memory and play them back with low latency. SoundPool is optimized for playing short sounds like game sound effects, UI feedback beeps, etc.

Some key advantages of using SoundPool include:

  • Low latency audio playback – sounds can be played back quickly with minimal delay
  • Resource efficient – audio samples are loaded into memory and reused as needed
  • Support for concurrency – multiple sounds can be played at the same time
  • Volume control – the volume of each sound can be set independently

The SoundPool class manages all the sounds, allowing applications to play short sounds while minimizing memory usage and performance overhead. So for short sound effects or UI noises, SoundPool is recommended over solutions like MediaPlayer.
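A minimal SoundPool setup might look like this (the resource name beep is hypothetical):

```java
// Build a pool that allows up to four overlapping sounds.
SoundPool soundPool = new SoundPool.Builder()
        .setMaxStreams(4)
        .build();

// Load a short clip from res/raw; loading happens asynchronously.
int beepId = soundPool.load(context, R.raw.beep, 1);

soundPool.setOnLoadCompleteListener((pool, sampleId, status) -> {
    if (status == 0) { // 0 means the sample loaded successfully
        // left volume, right volume, priority, loop count (0 = no loop), rate
        pool.play(sampleId, 1f, 1f, 1, 0, 1f);
    }
});
```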

AudioTrack Class

The AudioTrack class provides low level audio playback capabilities in Android. It allows writing raw audio data to an audio sink for playback in real-time. This makes it useful for streaming audio, low latency audio applications like games, or music apps where you need precise timing control.

To use AudioTrack, you first initialize it with parameters like the sample rate, channel configuration, audio encoding, and buffer size. You then write blocks of audio data to it with the write() method, typically from a dedicated playback thread. The audio data usually comes from decoding compressed audio or generating synthesized audio.

Here is an example of playing a WAV file using AudioTrack, adapted from a StackOverflow answer:

First initialize AudioTrack:

int sampleRate = 44100;
int channelConfig = AudioFormat.CHANNEL_OUT_MONO; // CHANNEL_CONFIGURATION_MONO is deprecated
int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int bufferSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat);

AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
        sampleRate, channelConfig, audioFormat, bufferSize,
        AudioTrack.MODE_STREAM);
audioTrack.play();

Then in a playback loop, read chunks of audio data from the WAV file and write to AudioTrack:

byte[] buffer = new byte[chunkSize];

while (playing) {
    // wavStream is assumed to be an InputStream positioned past the WAV header
    int bytesRead = wavStream.read(buffer, 0, chunkSize);
    if (bytesRead <= 0) break; // end of file
    audioTrack.write(buffer, 0, bytesRead);
}
audioTrack.stop();
audioTrack.release();
This allows low latency playback directly from the audio samples. See the AudioTrackMp3Player demo for a full example playing MP3s.

Audio Attributes

Audio attributes allow you to configure audio playback behaviors in Android. This includes settings like audio stream type, usage, content type, and flags. Audio attributes are specified using the AudioAttributes class.

Some key uses of audio attributes are:

  • Control audio routing – Send audio to speaker, headphone etc based on usage
  • Volume handling – Tie volume controls to audio usage
  • Focus management – Manage audio focus for different playback types like music, alarms etc.

For example, setting the usage to USAGE_ALARM routes playback to the alarm stream, so it can sound even when the media volume is turned down, while USAGE_MEDIA ties playback to the regular media volume controls. Together, usage, content type, and flags let you control playback behavior.

Audio attributes are set on audio players like MediaPlayer and SoundPool using setAudioAttributes(). By configuring audio attributes, you can control how Android handles your app’s audio playback.
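For instance, attributes for ordinary music playback could be configured like this:

```java
// Describe the playback as media/music so it follows the media volume
// controls and standard media routing.
AudioAttributes attributes = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
        .build();

MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioAttributes(attributes);
```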

Audio Focus

Audio focus in Android allows multiple apps to share the audio output and handle interruptions gracefully.
Only one app should hold audio focus at a time: when another app requests focus, the current holder is expected to pause, stop, or lower (duck) its playback.
On Android 12 and later, the system itself fades out players belonging to apps that lose focus.

To manage audio interruptions, request audio focus before starting playback using requestAudioFocus(). Pass an AudioFocusRequest with the same attributes used for the audio stream.

Implement an OnAudioFocusChangeListener to handle focus changes. Pause playback when losing focus and resume when gaining focus again.
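A sketch of requesting focus on API 26+ (the mediaPlayer variable is assumed to be set up elsewhere):

```java
AudioManager audioManager =
        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

AudioFocusRequest focusRequest =
        new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN)
                .setOnAudioFocusChangeListener(focusChange -> {
                    if (focusChange == AudioManager.AUDIOFOCUS_LOSS) {
                        mediaPlayer.pause();  // lost focus: pause playback
                    } else if (focusChange == AudioManager.AUDIOFOCUS_GAIN) {
                        mediaPlayer.start();  // regained focus: resume
                    }
                })
                .build();

// Only start playing if the system grants focus.
if (audioManager.requestAudioFocus(focusRequest)
        == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    mediaPlayer.start();
}
```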

Properly handling audio focus lets apps share the audio output politely, avoiding jarring overlapping playback or abrupt interruptions.

Audio Recording

To record audio input on an Android device, you have a few different options. The simplest is to use the built-in recorder app that comes pre-installed on many Android devices. For example, Samsung devices have the Voice Recorder app. You can also download third-party recording apps like Easy Voice Recorder from the Google Play store.

To use the recorder app, simply open it, tap the record button, and start speaking into your phone’s microphone. When finished, hit stop and you’ll have an audio file saved on your device. Most recorder apps allow you to set audio quality, file formats, etc. in the settings.

Some advantages of using a recording app include simplicity, onboard editing tools like trimming, and easy sharing. But recorder apps may lack more advanced audio options. For full control over the audio recording, you can use the MediaRecorder or AudioRecord APIs in your own Android app development.
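A minimal MediaRecorder sketch (requires the RECORD_AUDIO permission; outputPath is a placeholder for a writable file path):

```java
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setOutputFile(outputPath); // e.g. a file under getExternalFilesDir()
recorder.prepare();
recorder.start();

// ... later, when recording should end:
recorder.stop();
recorder.release();
```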

Audio Playback Controls

Android provides built-in media playback controls that apps can integrate with to give users control over audio playback from the lock screen, notification shade, and headphone buttons. This allows for consistent and convenient media controls across apps.

The media playback system is built around the MediaSession API. To integrate with the media controls, your app needs to create a MediaSession when playback starts and call relevant methods like play(), pause(), etc. to send playback commands. The system’s MediaController then interacts with the session to handle user input and send actions.
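Using the AndroidX media compat library, wiring up a session might look like the following sketch (mediaPlayer is assumed to exist elsewhere):

```java
// Create a session and describe how to respond to transport controls
// coming from the lock screen, notifications, or headset buttons.
MediaSessionCompat mediaSession = new MediaSessionCompat(context, "AudioSession");
mediaSession.setCallback(new MediaSessionCompat.Callback() {
    @Override
    public void onPlay() {
        mediaPlayer.start();
    }

    @Override
    public void onPause() {
        mediaPlayer.pause();
    }
});
mediaSession.setActive(true); // session now receives media button events
```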

Some advantages of using Android’s built-in media controls include:

  • Works across different Android versions and devices consistently
  • Users can control playback without opening your app
  • Integrates with hardware media buttons on headphones/bluetooth devices
  • Allows another app to control your app’s playback if granted permission

Overall, implementing MediaSession playback controls provides a superior user experience by giving users easy, system-integrated access to your app's audio playback.


Conclusion

The Android platform provides various options for working with audio in an app, including playing raw audio files, using the MediaPlayer, SoundPool, and AudioTrack classes, managing audio focus, recording audio, and implementing playback controls.

Overall, Android offers a robust audio architecture and API to build advanced audio functionality. The key is choosing the right approach based on your specific needs – whether playing short sound effects, background music, streaming audio, or recording user input.

For more information, refer to the official Android audio documentation, as well as the reference pages for key classes like MediaPlayer, AudioManager, and AudioTrack.
