Visualizing Android AudioTrack from ByteStream

I am currently using AudioTrack to play back recorded music, which involves reading the track data into an audio buffer. How can I use that audio stream to draw a waveform on the screen that represents the sound in real time? I have not done any graphics programming on Android before, so I am not sure how to get started. Should I use OpenGL, or can I subclass View? I also don't know how to transform the data into something useful for drawing.

+4

2 answers

Your first step is to call setPositionNotificationPeriod(periodInFrames). This determines how often your application receives an onPeriodicNotification callback. So, if you are doing an oscilloscope-type visualization and you want to show 50 milliseconds of audio at a time, you should use a periodInFrames value of 2205 (if your WAV file is mono, 16 bits per sample, with a 44,100 Hz sample rate: 44,100 frames per second × 0.05 s = 2,205 frames).
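
That calculation is the only arithmetic involved, so here it is as a minimal sketch (sampleRate and windowMs are illustrative names, not from the original answer):

    int sampleRate = 44100;  // Hz; mono, 16 bits per sample
    int windowMs = 50;       // how much audio each refresh should show
    int periodInFrames = sampleRate * windowMs / 1000;  // 44100 * 50 / 1000 = 2205
    audioTrack.setPositionNotificationPeriod(periodInFrames);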

Inside the notification callback, you can determine the current playback position of your AudioTrack object and then fetch the corresponding slice of data from the source file or array. You then render that slice using conventional 2D graphics (there is no need for OpenGL here). This answer contains a C# sample for drawing a slice of audio data that is easy to translate to Java.

So your code will look something like this:

    audioTrack.setPositionNotificationPeriod(2205);
    audioTrack.setPlaybackPositionUpdateListener(this);
    ...

    @Override
    public void onPeriodicNotification(AudioTrack track) {
        // Current playback position, in frames (one frame == one short for mono 16-bit)
        int pos = track.getPlaybackHeadPosition();
        int end = Math.min(pos + _sliceSize, _data.length);
        short[] slice = Arrays.copyOfRange(_data, pos, end); // needs java.util.Arrays
        // render the slice to the view
    }

    @Override
    public void onMarkerReached(AudioTrack track) {
        // unused, but required by OnPlaybackPositionUpdateListener
    }
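
For the "render the slice to the view" step, a plain custom View is enough. Here is a rough sketch under my own assumptions (the WaveformView class and its updateSamples method are illustrative, not part of the Android API); it scales each 16-bit sample into the view's height and connects the points with line segments:

    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.view.View;

    public class WaveformView extends View {
        private final Paint paint = new Paint();
        private short[] samples;

        public WaveformView(Context context) {
            super(context);
            paint.setColor(0xFF00CC00); // green oscilloscope-style trace
            paint.setStrokeWidth(2f);
        }

        // Call on the UI thread (e.g. via post()) with the latest slice.
        public void updateSamples(short[] slice) {
            samples = slice;
            invalidate(); // schedules onDraw
        }

        @Override
        protected void onDraw(Canvas canvas) {
            if (samples == null || samples.length < 2) return;
            float midY = getHeight() / 2f;
            float xStep = (float) getWidth() / (samples.length - 1);
            float lastX = 0f;
            float lastY = midY - (samples[0] / 32768f) * midY;
            for (int i = 1; i < samples.length; i++) {
                float x = i * xStep;
                // Map the sample range -32768..32767 onto the view height.
                float y = midY - (samples[i] / 32768f) * midY;
                canvas.drawLine(lastX, lastY, x, y, paint);
                lastX = x;
                lastY = y;
            }
        }
    }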
+3

Alternatively, I would suggest that you use the android.media.audiofx.Visualizer class, which handles capturing the waveform arrays (and frequency arrays, if you want them) for you. The only thing you then need to take care of is drawing the graphics.

This class has been available since API level 9.
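
A minimal sketch of hooking it up, assuming you already have a playing AudioTrack and hold the RECORD_AUDIO permission (which Visualizer requires); waveformView.updateBytes is a hypothetical method standing in for your own drawing code:

    import android.media.audiofx.Visualizer;

    // Attach a Visualizer to the AudioTrack's audio session.
    Visualizer visualizer = new Visualizer(audioTrack.getAudioSessionId());
    visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]); // largest supported size
    visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
        @Override
        public void onWaveFormDataCapture(Visualizer v, byte[] waveform, int samplingRate) {
            // Unsigned 8-bit samples centered on 128; hand them to your drawing code.
            waveformView.updateBytes(waveform); // hypothetical view method
        }

        @Override
        public void onFftDataCapture(Visualizer v, byte[] fft, int samplingRate) {
            // Not needed for an oscilloscope-style waveform display.
        }
    }, Visualizer.getMaxCaptureRate() / 2, true, false); // capture waveform, skip FFT
    visualizer.setEnabled(true);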

+2
