This class lets you play an audio signal at a given frequency and amplitude. It uses AudioQueues from AudioToolbox.framework. It is just a sketch and much still needs to be cleaned up, but the signal-generation mechanism works.
Usage is pretty simple once you look at the @interface.
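For orientation, here is a rough sketch, in plain C, of how such a class might set up its AudioQueue internally with the format described below (linear PCM, mono, float samples at 44.1 kHz). The names ToneInfo, createToneQueue and the handleBuffer signature are assumptions for illustration only, not the actual @interface:

#include <AudioToolbox/AudioToolbox.h>

// Hypothetical per-tone state; the real class keeps equivalent ivars.
typedef struct {
    double frequency;   // tone frequency in Hz
    double amplitude;   // linear amplitude in [0, 1]
} ToneInfo;

// Buffer-filling callback, sketched further below.
static void handleBuffer(void *inUserData, AudioQueueRef inAQ,
                         AudioQueueBufferRef inBuffer);

// Create an output AudioQueue for linear PCM, mono, 32-bit float @ 44.1 kHz.
static AudioQueueRef createToneQueue(ToneInfo *tone)
{
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
    format.mChannelsPerFrame = 1;              // mono
    format.mBitsPerChannel   = 32;             // float samples
    format.mBytesPerFrame    = sizeof(float);
    format.mFramesPerPacket  = 1;
    format.mBytesPerPacket   = sizeof(float);

    AudioQueueRef queue = NULL;
    AudioQueueNewOutput(&format, handleBuffer, tone, NULL, NULL, 0, &queue);
    return queue;
}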
The class generates a cosine waveform oscillating at a given integer frequency ( amplitude * cos(2π * frequency * t) ); all the work is done in void handleBuffer(...), using an AudioQueue with a linear PCM format, mono, float samples at 44.1 kHz. To change the waveform, you can simply change the line that computes each sample. For example, the following code produces a square wave:
float x = fmodf(unit * (CGFloat)(tone.frequency * i), 2 * M_PI);
data[i] = amplitude * (x > M_PI ? -1.0 : 1.0);
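To make the mechanism concrete, here is a hedged sketch of what the buffer-filling callback could look like, reusing the hypothetical ToneInfo and handleBuffer names from the setup sketch above. It is not the author's exact handleBuffer, just a minimal version that fills a mono float buffer with amplitude * cos(2π * frequency * t) and shows where the square-wave variant would go:

#include <AudioToolbox/AudioToolbox.h>
#include <math.h>

static const double kSampleRate = 44100.0;

// Minimal buffer callback (ToneInfo is the struct from the setup sketch above).
static void handleBuffer(void *inUserData, AudioQueueRef inAQ,
                         AudioQueueBufferRef inBuffer)
{
    ToneInfo *tone  = (ToneInfo *)inUserData;
    float    *data  = (float *)inBuffer->mAudioData;
    UInt32    count = inBuffer->mAudioDataBytesCapacity / sizeof(float);
    double    unit  = 2.0 * M_PI / kSampleRate;   // radians per sample for a 1 Hz tone

    for (UInt32 i = 0; i < count; i++) {
        // Cosine wave: amplitude * cos(2*pi * frequency * t), with t = i / kSampleRate.
        data[i] = (float)(tone->amplitude * cos(unit * tone->frequency * i));

        // Square wave instead: replace the line above with
        //   float x = fmodf(unit * tone->frequency * i, 2 * M_PI);
        //   data[i] = tone->amplitude * (x > M_PI ? -1.0f : 1.0f);
    }

    // NOTE: i restarts at 0 on every buffer, which is what causes the "tick"
    // described below when a buffer does not hold a whole number of periods.
    inBuffer->mAudioDataByteSize = count * sizeof(float);
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}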
For floating-point frequencies, keep in mind that one second of audio data does not necessarily contain a whole number of oscillations, so the generated signal is discontinuous at the junction between two buffers and produces a strange "tick". As one workaround, you can put slightly fewer samples into each buffer so that the junction falls at the end of a full period of the signal.
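For example, here is a hedged sketch of that adjustment, assuming you are free to choose how many samples go into each buffer (nominalCount and the function name are made up for illustration):

#include <math.h>

static unsigned samplesForWholePeriods(unsigned nominalCount,
                                       double sampleRate, double frequency)
{
    // One period of the tone, in samples (e.g. 44100 / 440.5 ≈ 100.11).
    double samplesPerPeriod = sampleRate / frequency;
    // Largest whole number of periods that fits into the nominal buffer size.
    double wholePeriods = floor(nominalCount / samplesPerPeriod);
    if (wholePeriods < 1.0) wholePeriods = 1.0;
    // Slightly fewer samples than nominalCount, ending near the end of a period.
    return (unsigned)round(wholePeriods * samplesPerPeriod);
}

The boundary error is then at most about half a sample per buffer, which softens the click considerably; carrying the phase across buffers instead of restarting it would be another way to avoid the discontinuity, but that is beyond this sketch.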
- As Paul P noted, you first have to calibrate against the hardware to get a reliable mapping between the value you set in your implementation and the sound your device actually produces. The floating-point samples generated by this code range from -1 to 1, so I simply converted the amplitude value to dB ( 20 * log10(amplitude) ); see the sketch after this list.
- Take a look at the comments in the implementation for further details and for the "known limitations" (all of which are TODOs). The APIs used are well documented by Apple in the linked documentation.
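The dB conversion mentioned above, as a small sketch (dB relative to full scale, since the samples live in [-1, 1]; the function names are illustrative):

#include <math.h>

// amplitude in (0, 1]  ->  level in dB relative to full scale
static double amplitudeToDecibels(double amplitude) {
    return 20.0 * log10(amplitude);        // e.g. 0.5 -> about -6.02 dB
}

// inverse mapping, e.g. -20 dB -> 0.1
static double decibelsToAmplitude(double decibels) {
    return pow(10.0, decibels / 20.0);
}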