The Android platform offers built-in encoding/decoding for a variety of common media types, so that you can easily integrate audio, video, and images into your applications. Accessing the platform's media capabilities is fairly straightforward — you do so using the same intents and activities mechanism that the rest of Android uses.
Android lets you play audio and video from several types of data sources. You can play audio or video from media files stored in the application's resources (raw resources), from standalone files in the filesystem, or from a data stream arriving over a network connection. To play audio or video from your application, use the MediaPlayer class.
The platform also lets you record audio and video, where supported by the mobile device hardware. To record audio or video, use the MediaRecorder class. Note that the emulator doesn't have hardware to capture audio or video, but actual mobile devices are likely to provide these capabilities, accessible through the MediaRecorder class.
For a list of media formats for which Android offers built-in support, see the Android Media Formats appendix.
Media can be played from anywhere: from a raw resource, from a file in the filesystem, or from a stream available over the network (URL).
You can play back the audio data only to the standard output device; currently, that is the mobile device speaker or a Bluetooth headset. You cannot play sound files into the conversation audio of a phone call.
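For example (a minimal sketch, not part of the original text, with PATH_TO_FILE as a placeholder), you can make that routing explicit by directing the player to the media stream before preparing it; AudioManager.STREAM_MUSIC is the stream used for ordinary media output:

MediaPlayer mp = new MediaPlayer();
mp.setAudioStreamType(AudioManager.STREAM_MUSIC); // route output to the media stream, not the in-call stream
try {
    mp.setDataSource(PATH_TO_FILE);
    mp.prepare();
} catch (IOException e) {
    // handle a missing or unreadable source
}
mp.start();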
Perhaps the most common thing to want to do is play back media (notably sound) within your own applications. Doing this is easy:

1. Put the sound (or other media resource) file into the res/raw folder of your project, where the Eclipse plugin (or aapt) will find it and make it into a resource that can be referenced from your R class.
2. Create an instance of MediaPlayer, referencing that resource using MediaPlayer.create(), and then call start() on the instance:

MediaPlayer mp = MediaPlayer.create(context, R.raw.sound_file_1);
mp.start();
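If you only need to play the sound once, one option (a minimal sketch, not part of the original example) is to free the player's resources when playback completes, using MediaPlayer.setOnCompletionListener():

mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    public void onCompletion(MediaPlayer player) {
        // Playback finished; release the player's resources.
        player.release();
    }
});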
To stop playback, call stop(). If you wish to replay the media later, call prepare() on the MediaPlayer object again before calling start(). (create() calls prepare() for you the first time.) If you call reset(), the player returns to its idle, newly created state, and you must also set the data source again before calling prepare().
To pause playback, call pause(). Resume playback from where you paused with start().
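As a short sketch of the lifecycle described above (assuming mp is the player created earlier):

mp.pause();            // pause playback; the current position is kept
mp.start();            // resume playback from the paused position

mp.stop();             // stop playback
try {
    mp.prepare();      // after stop(), prepare the player again
} catch (IOException e) {
    // handle the error, for example by logging it
}
mp.start();            // playback restarts from the beginning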
You can play back media files from the filesystem or a web URL:

1. Create an instance of MediaPlayer using new.
2. Call setDataSource() with the path (or URL) of the file you want to play.
3. Call prepare() and then start() on the instance:

MediaPlayer mp = new MediaPlayer();
mp.setDataSource(PATH_TO_FILE);
mp.prepare();
mp.start();
stop() and pause() work the same as discussed above.
Note: It is possible for the player to be null when you use MediaPlayer.create(), since create() returns null if it fails, so good code should null-check the instance before using it.
Also, IllegalArgumentException and IOException either need to be caught or passed on when using setDataSource(), since the file you are referencing may not exist.
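For example, a minimal sketch of that error handling (PATH_TO_FILE is the same placeholder used above):

MediaPlayer mp = new MediaPlayer();
try {
    mp.setDataSource(PATH_TO_FILE);
    mp.prepare();
    mp.start();
} catch (IllegalArgumentException e) {
    // the path or URL was malformed
} catch (IOException e) {
    // the file does not exist or could not be read
}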
Note: If you're passing a URL to an online media file, the file must be capable of progressive download.
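When the data source is a network URL, prepare() blocks until enough of the stream has been buffered; a hedged alternative sketch uses prepareAsync() together with an OnPreparedListener (SOME_URL is a placeholder):

MediaPlayer mp = new MediaPlayer();
try {
    mp.setDataSource(SOME_URL);
} catch (IOException e) {
    // handle an unreachable or unreadable source
}
mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    public void onPrepared(MediaPlayer player) {
        // Enough data has been buffered; start playback.
        player.start();
    }
});
mp.prepareAsync();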
The Android platform includes a JET engine that lets you add interactive playback of JET audio content in your applications. You can create JET content for interactive playback using the JetCreator authoring application that ships with the SDK. To play and manage JET content from your application, use the JetPlayer class.
For a description of JET concepts and instructions on how to use the JetCreator authoring tool, see the JetCreator User Manual. The tool is fully featured on OS X and Windows; the Linux version supports all of the content creation features but not auditioning of the imported assets.
Here's an example of how to set up JET playback from a .jet file stored on the SD card:
JetPlayer myJet = JetPlayer.getJetPlayer();
myJet.loadJetFile("/sdcard/level1.jet");
byte segmentId = 0;

// queue segment 5, repeat once, use General MIDI, transpose by -1 octave
myJet.queueJetSegment(5, -1, 1, -1, 0, segmentId++);
// queue segment 2
myJet.queueJetSegment(2, -1, 0, 0, 0, segmentId++);

myJet.play();
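When the soundtrack is no longer needed, a minimal teardown sketch (using JetPlayer's pause(), clearQueue(), closeJetFile(), and release() methods) might look like this:

myJet.pause();         // stop rendering queued segments
myJet.clearQueue();    // drop any segments still waiting to play
myJet.closeJetFile();  // close the .jet content
myJet.release();       // free the native JET resources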
The SDK includes an example application — JetBoy — that shows how to use JetPlayer to create an interactive music soundtrack in your game. It also illustrates how to use JET events to synchronize music and game logic. The application is located at <sdk>/platforms/android-1.5/samples/JetBoy.
Audio capture from the device is a bit more complicated than audio/video playback, but still fairly simple:
1. Create a new instance of MediaRecorder using new.
2. Create a new instance of ContentValues and put in some standard properties like the TITLE, the DATE_ADDED timestamp, and the all-important MIME_TYPE.
3. Create a file path for the data to go to (you can use ContentResolver to create an entry in the content provider and have it assign a path automatically, as the example below illustrates).
4. Set the audio source using MediaRecorder.setAudioSource(). You will probably want to use MediaRecorder.AudioSource.MIC.
5. Set the output file format using MediaRecorder.setOutputFormat() and the audio encoder using MediaRecorder.setAudioEncoder().
6. Set the output file path with MediaRecorder.setOutputFile(), then call prepare() and start() on the MediaRecorder instance.

The example below illustrates how to set up and then start audio capture.
// Assumes this code runs in a Context (for example, inside an Activity) and that the
// application holds the RECORD_AUDIO permission.
MediaRecorder recorder = new MediaRecorder();

ContentValues values = new ContentValues(3);
values.put(MediaStore.MediaColumns.TITLE, SOME_NAME_HERE);
values.put(MediaStore.MediaColumns.DATE_ADDED, System.currentTimeMillis() / 1000); // DATE_ADDED is in seconds
values.put(MediaStore.MediaColumns.MIME_TYPE, "audio/3gpp"); // matches the 3GPP/AMR output chosen below

ContentResolver contentResolver = getContentResolver();
Uri base = MediaStore.Audio.Media.INTERNAL_CONTENT_URI;
Uri newUri = contentResolver.insert(base, values);
if (newUri == null) {
    // need to handle the failure here - we were not able to create a new content entry
}

// Ask the content provider for the file path it assigned to the new entry.
Cursor cursor = contentResolver.query(newUri,
        new String[] { MediaStore.MediaColumns.DATA }, null, null, null);
cursor.moveToFirst();
String path = cursor.getString(0);
cursor.close();

// could use setPreviewDisplay() to display a preview to a suitable View here (video only)
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(path);
recorder.prepare(); // prepare() throws IOException, which must be caught or declared
recorder.start();
Based on the example above, here's how you would stop audio capture.
recorder.stop();
recorder.release();