I'm looking to write a function that applies a five-second fade-in / fade-out to a WAV file.
I found this code on the MATLAB forums; the idea seems right, but the implementation looks slightly off. It was written for 300 ms .WAV files with a 10 ms fade-in / fade-out:
tenmssamples = length(soundfile)*10/300;
fade1 = linspace(0,1,tenmssamples);
fadedsound = soundfile .* ...
[fade1, ones(1,length(soundfile)-2*tenmssamples), fliplr(fade1)]
tenmssamples = length(soundfile)*10/300;
fade2 = sin(linspace(0,2*pi/4,tenmssamples));
fadedsound2 = soundfile .* ...
[fade2, ones(1,length(soundfile)-2*tenmssamples), fliplr(fade2)]
I see what they were trying to do: scale the first 10 ms of samples by the ramp that linspace generates, but I've tried tweaking it over and over and I can't get it to work for my five-second case.
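For reference, here is a minimal sketch of how I'm trying to adapt it to five seconds, sizing the fade from the sample rate returned by audioread instead of the hard-coded 10/300 ratio (the filenames are just placeholders, and I'm assuming a mono file that is longer than ten seconds):

% Read the audio and the sample rate fs (placeholder filename).
[soundfile, fs] = audioread('input.wav');
soundfile = soundfile(:).';                 % force a mono row vector

fadesamples = round(5 * fs);                % number of samples in 5 seconds
fadein = linspace(0, 1, fadesamples);       % linear ramp from silence to full level
sustain = ones(1, length(soundfile) - 2*fadesamples);   % unchanged middle section
envelope = [fadein, sustain, fliplr(fadein)];            % ramp up, hold, ramp down

fadedsound = soundfile .* envelope;         % apply the envelope sample by sample
audiowrite('faded.wav', fadedsound.', fs);  % write back as a column vector

The only real difference from the forum snippet is that the fade length comes from the sample rate rather than from the known clip duration, but I'm not confident the envelope construction is right.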
Does anyone have any suggestions? Thanks.