I have a UWP project in which I want to use the Windows.Media.Audio API to play a file. Instead of using a FileInputNode, I want to stream the file myself so that I can precisely control various timing properties.
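For context, the graph is set up roughly like this (a simplified sketch of my setup; the render category and the device output node are just the defaults I happen to use):

using Windows.Media.Audio;
using Windows.Media.Render;

// Build an AudioGraph with a frame input node instead of an AudioFileInputNode.
AudioGraphSettings settings = new AudioGraphSettings(AudioRenderCategory.Media);
CreateAudioGraphResult graphResult = await AudioGraph.CreateAsync(settings);
AudioGraph graph = graphResult.Graph;

CreateAudioDeviceOutputNodeResult outputResult = await graph.CreateDeviceOutputNodeAsync();
AudioDeviceOutputNode outputNode = outputResult.DeviceOutputNode;

// Use the graph's own encoding (float by default) for the frame input node.
AudioFrameInputNode frameInputNode = graph.CreateFrameInputNode(graph.EncodingProperties);
frameInputNode.AddOutgoingConnection(outputNode);
graph.Start();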
I found the MediaStreamSource API and wrote the following code in an attempt to decode a 16-bit PCM .wav file:
public async Task<Windows.Storage.Streams.Buffer> GetBuffer()
{
    // Check that the requested byte offset is within the file size.
    if (byteOffset + BufferSize <= mssStream.Size)
    {
        inputStream = mssStream.GetInputStreamAt(byteOffset);

        // Create the MediaStreamSample. (It could also be created with
        // MediaStreamSample.CreateFromBuffer(...).)
        MediaStreamSample sample = await MediaStreamSample.CreateFromStreamAsync(inputStream, BufferSize, timeOffset);
        sample.Duration = sampleDuration;
        sample.KeyFrame = true;

        // Advance the byte and time offsets for the next read.
        byteOffset += BufferSize;
        timeOffset = timeOffset.Add(sampleDuration);

        return sample.Buffer;
    }
    else
    {
        return null;
    }
}
Instead of using the MediaStreamSource event system (SampleRequested), I created a method that runs whenever my AudioFrameInputNode needs a new AudioFrame.
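That method wraps the returned buffer in an AudioFrame roughly as follows (a simplified sketch; SupplyFrameAsync and CreateFrameFromBuffer are hypothetical names, IMemoryBufferByteAccess is the documented COM interop interface for getting a raw pointer into an AudioBuffer, and the project is compiled with /unsafe):

using System;
using System.Runtime.InteropServices;
using System.Runtime.InteropServices.WindowsRuntime;
using System.Threading.Tasks;
using Windows.Foundation;
using Windows.Media;
using Windows.Media.Audio;

[ComImport]
[Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
unsafe interface IMemoryBufferByteAccess
{
    void GetBuffer(out byte* buffer, out uint capacity);
}

// Feed one buffer's worth of data to the frame input node.
private async Task SupplyFrameAsync(AudioFrameInputNode node)
{
    Windows.Storage.Streams.Buffer buffer = await GetBuffer();
    if (buffer != null)
    {
        node.AddFrame(CreateFrameFromBuffer(buffer));
    }
}

// Copy the raw bytes from the MediaStreamSample buffer into a new AudioFrame.
private unsafe AudioFrame CreateFrameFromBuffer(Windows.Storage.Streams.Buffer buffer)
{
    byte[] bytes = buffer.ToArray();
    AudioFrame frame = new AudioFrame(buffer.Length);
    using (AudioBuffer audioBuffer = frame.LockBuffer(AudioBufferAccessMode.Write))
    using (IMemoryBufferReference reference = audioBuffer.CreateReference())
    {
        byte* dataInBytes;
        uint capacityInBytes;
        ((IMemoryBufferByteAccess)reference).GetBuffer(out dataInBytes, out capacityInBytes);
        Marshal.Copy(bytes, 0, (IntPtr)dataInBytes, bytes.Length);
    }
    return frame;
}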
It turns out that the resulting byte array in the MediaStreamSample is exactly the same as what I get when I simply read my StorageFile with a DataReader.
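For comparison, the plain DataReader read looks roughly like this (a sketch; file is the same StorageFile, and byteOffset and BufferSize are the fields used in GetBuffer above):

using Windows.Storage;
using Windows.Storage.Streams;

// Read the same range of bytes directly from the file.
using (IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.Read))
using (DataReader reader = new DataReader(stream.GetInputStreamAt(byteOffset)))
{
    await reader.LoadAsync(BufferSize);
    byte[] rawBytes = new byte[BufferSize];
    reader.ReadBytes(rawBytes);
    // rawBytes matches sample.Buffer byte for byte: still the raw 16-bit PCM
    // from the .wav file, not decoded float samples.
}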
Does MediaStreamSample.CreateFromStreamAsync actually decode the audio file into an array of float samples? Or does that decoding happen inside MediaElement when it plays a sample back?
And if so, how can I decode the audio file myself so that I can hand the resulting AudioBuffer to my FrameInputNode?