Web Audio API - difference between PeriodicWave and a looped AudioBufferSourceNode for wavetable synthesis?

I'm using two methods to create a wavetable synthesizer sound:

1 - Loop an AudioBufferSourceNode that contains a single cycle of the waveform

    // Load a single-cycle short wave file, then:
    audioContext.decodeAudioData(audioData, function (buffer) {
      source.buffer = buffer;
      source.loop = true;
    });

2 - Create a PeriodicWave and give it Fourier coefficients (using coefficient tables found online, e.g. (0, 1) for a sine wave, (0, .1, .4, .6, ...) for more complex waves).

    var wave = audioContext.createPeriodicWave(real, imag);
    oscillator.setPeriodicWave(wave);
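
To make approach 2 concrete, a minimal self-contained sketch (the coefficient values are illustrative, not from any standard table). Index 0 of each array is the DC term and is ignored; each following index is the amplitude of one harmonic:

    // Build a PeriodicWave from Fourier coefficients and play it.
    var audioContext = new AudioContext();

    var real = new Float32Array([0, 0, 0, 0]);     // cosine terms
    var imag = new Float32Array([0, 1, 0.4, 0.2]); // sine terms: fundamental + two partials

    var wave = audioContext.createPeriodicWave(real, imag);

    var oscillator = audioContext.createOscillator();
    oscillator.setPeriodicWave(wave);
    oscillator.frequency.value = 440; // pitch is set directly, no resampling involved
    oscillator.connect(audioContext.destination);
    oscillator.start();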

What are the pros and cons of using one method over the other? Do these techniques produce very different sounds?

I have a demo that uses both approaches: http://davedave.us/wavetable-synth/

My work-in-progress code is here: https://github.com/looshi/wavetable-synth

