I currently have a loopback program for testing audio on Android devices.
It uses AudioRecord and AudioTrack to record PCM audio from the microphone and play it back through the headphones.
Here is the code:
```java
public class Record extends Thread {
    @Override
    public void run() {
        isRecording = true;
        android.os.Process.setThreadPriority(
                android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

        // CHANNEL_CONFIGURATION_MONO is deprecated; use CHANNEL_IN_MONO
        // for recording and CHANNEL_OUT_MONO for playback.
        int bufferSize = AudioRecord.getMinBufferSize(11025,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

        arec = new AudioRecord(MediaRecorder.AudioSource.MIC, 11025,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                bufferSize);
        atrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL, 11025,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                bufferSize, AudioTrack.MODE_STREAM);
        // setPlaybackRate(11025) is redundant: the track is already created at 11025 Hz.

        byte[] buffer = new byte[bufferSize];
        arec.startRecording();
        atrack.play();

        while (isRecording) {
            // read() returns how many bytes were actually read; write only that many,
            // not the full buffer length.
            int n = arec.read(buffer, 0, bufferSize);
            if (n > 0) {
                atrack.write(buffer, 0, n);
            }
        }
    }
}
```
As you can see, when creating the AudioTrack and AudioRecord, the encoding is specified via AudioFormat, which only allows 16-bit or 8-bit linear PCM.
Now I have my own implementation of the G.711 codec, and I want to encode the audio coming from the microphone and decode it again before playing it through the earpiece. The codec exposes encode(short[] lin, int offset, byte[] enc, int frames) and decode(byte[] enc, short[] lin, int frames), but I'm not sure how to use these methods to encode and decode the audio between AudioRecord and AudioTrack.
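I can't know the internals of your codec, but here is a self-contained sketch (plain Java, no Android dependencies) of how methods with those signatures are typically wired into the loop: read shorts from the mic, encode shorts to bytes, decode bytes back to shorts, write shorts to the track. The `G711Sketch` class name and the mu-law tables inside it are my own stand-ins, not your implementation; the key point is that the loop should read `short[]` (via `AudioRecord.read(short[], int, int)`) rather than `byte[]`, so the samples match `encode()`'s input type.

```java
// Illustrative sketch only: a minimal G.711 mu-law codec with the same method
// shapes as the question's encode(short[], int, byte[], int) and
// decode(byte[], short[], int), plus the buffer wiring around them.
public class G711Sketch {

    // Encode one 16-bit linear PCM sample to one 8-bit mu-law byte.
    static byte linearToUlaw(short sample) {
        final int BIAS = 0x84, CLIP = 32635;
        int pcm = sample;
        int sign = (pcm < 0) ? 0x80 : 0;
        if (sign != 0) pcm = -pcm;
        if (pcm > CLIP) pcm = CLIP;
        pcm += BIAS;
        int exponent = 7;
        for (int mask = 0x4000; (pcm & mask) == 0 && exponent > 0; mask >>= 1)
            exponent--;
        int mantissa = (pcm >> (exponent + 3)) & 0x0F;
        return (byte) ~(sign | (exponent << 4) | mantissa);
    }

    // Decode one 8-bit mu-law byte back to a 16-bit linear PCM sample.
    static short ulawToLinear(byte ulaw) {
        int u = ~ulaw & 0xFF;
        int sign = u & 0x80;
        int exponent = (u >> 4) & 0x07;
        int mantissa = u & 0x0F;
        int pcm = (((mantissa << 3) + 0x84) << exponent) - 0x84;
        return (short) (sign != 0 ? -pcm : pcm);
    }

    // Same shape as the question's encode(): PCM shorts in, G.711 bytes out.
    public static void encode(short[] lin, int offset, byte[] enc, int frames) {
        for (int i = 0; i < frames; i++) enc[i] = linearToUlaw(lin[offset + i]);
    }

    // Same shape as the question's decode(): G.711 bytes in, PCM shorts out.
    public static void decode(byte[] enc, short[] lin, int frames) {
        for (int i = 0; i < frames; i++) lin[i] = ulawToLinear(enc[i]);
    }

    public static void main(String[] args) {
        // In the real loop, pcmIn would come from arec.read(pcmIn, 0, n)
        // (the short[] overload) and pcmOut would go to atrack.write(pcmOut, 0, n).
        short[] pcmIn = { 0, 1000, -1000, 8000, -32000 };
        byte[] encoded = new byte[pcmIn.length];   // G.711 is 1 byte per sample
        short[] pcmOut = new short[pcmIn.length];

        encode(pcmIn, 0, encoded, pcmIn.length);   // mic PCM -> G.711 bytes
        decode(encoded, pcmOut, pcmIn.length);     // G.711 bytes -> playback PCM

        for (int i = 0; i < pcmIn.length; i++)
            System.out.println(pcmIn[i] + " -> " + pcmOut[i]);
    }
}
```

Note the round trip is lossy (mu-law is logarithmic 8-bit quantization), so the decoded shorts are close to, but not identical to, the originals; that is expected for G.711.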
Can someone help me or point me in the right direction?