Best way to send a web audio API stream to server-side Java processing

I currently have a website that can record audio using Matt Diamond's recorder.js together with getUserMedia (Web Audio API). After the client has finished recording, I upload the data (as a .wav) via an ajax POST to the file system. My Java server side then processes data.wav into other formats. It works great.

However, I am concerned about performance/bandwidth issues with the ajax message, and I want to explore the possibility of streaming the data directly to the Java code or the file system. Does anyone have suggestions on how to pass this client-side audio stream (opened with getUserMedia) to server-side Java? If it matters, we use the Spring framework.

Thanks.

EDIT:

Here is the requested code:

I added this to recorder.js

    this.upload = function (blob) {
        // blob is the WAV Blob handed back by exportWAV()
        var data = new FormData();
        data.append('file', blob);

        $j.ajax({
            url: "/Your_Path/To_PHP/action/UploadAudio/",
            type: 'POST',
            data: data,
            contentType: false, // let the browser set the multipart boundary
            processData: false, // send the FormData as-is
            success: function (data) {
                alert("Success");
            },
            error: function () {
                alert("uploadFail");
            }
        });
    };

And in the PHP code:

    if ($_GET['action'] == 'UploadAudio') {
        // assumes session_start() has already been called so $_SESSION is available
        $filePath = "/audioArchiveDestination/subDirectory/";

        if (!file_exists($filePath)) {
            mkdir($filePath, 0777, true);
        }

        $filename = $_SESSION['member_id'] . '_' . time();   // can be anything unique
        $fileExtension = "wav";
        $uploadSuccess = move_uploaded_file($_FILES['file']['tmp_name'], $filePath . $filename . "." . $fileExtension);

        if ($uploadSuccess) {
            echo json_encode(array("filename" => $filename));
        }
    }
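
If you want to cut out the PHP hop, the same multipart POST that recorder.js sends could be accepted directly by the Spring back end mentioned in the question. A minimal, untested sketch (the class name, mapping path, and file naming are assumptions; it presumes Spring MVC with multipart support enabled):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestParam;
    import org.springframework.web.bind.annotation.RestController;
    import org.springframework.web.multipart.MultipartFile;

    // Hypothetical controller that plays the role of the PHP handler above.
    @RestController
    public class AudioUploadController {

        // Placeholder destination, mirroring the directory in the PHP example.
        private static final Path ARCHIVE_DIR = Paths.get("/audioArchiveDestination/subDirectory");

        @PostMapping("/action/UploadAudio")
        public ResponseEntity<String> uploadAudio(@RequestParam("file") MultipartFile file) throws IOException {
            Files.createDirectories(ARCHIVE_DIR);

            // Timestamp-based name, as in the PHP version (member id omitted here).
            String filename = System.currentTimeMillis() + ".wav";
            Path target = ARCHIVE_DIR.resolve(filename);

            // Write the uploaded bytes to disk; the existing processing can pick the file up from here.
            Files.copy(file.getInputStream(), target);

            return ResponseEntity.ok("{\"filename\":\"" + filename + "\"}");
        }
    }

The client-side upload() above would not need to change beyond pointing its url at this mapping.
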
1 answer

You can try encoding the data on the client to reduce bandwidth, and you can also send the data as soon as you receive it. I think MediaRecorder (the standard API for media recording and encoding) is only available in Firefox at the moment, but I'm sure other browsers will implement it. In any case, here is how it works: you can send the contents of evt.data to the server each time ondataavailable fires, streaming the audio to your Java application.

    <audio controls></audio>
    <button>Stop</button>
    <script>
      var a = document.querySelector("audio");
      var b = document.querySelector("button");

      // Firefox-prefixed getUserMedia, matching the Firefox-only note above.
      window.navigator.mozGetUserMedia({ audio: true }, function (stream) {
        a.mozSrcObject = stream;
        a.play();

        var mediaRecorder = new MediaRecorder(stream);
        var chunks = [];

        // stop recording when the user clicks the button
        b.addEventListener("click", function () {
          mediaRecorder.stop();
        });

        mediaRecorder.ondataavailable = function (evt) {
          chunks.push(evt.data);
          // or send evt.data to the server with an XMLHttpRequest
        };

        mediaRecorder.onerror = function (evt) {
          console.log('onerror fired');
        };

        mediaRecorder.onstop = function (evt) {
          console.log('onstop fired');
          // make a blob out of all the chunks
          var blob = new Blob(chunks, { 'type': 'audio/ogg; codecs=opus' });
          // allow the user to download the opus file
          window.location.href = URL.createObjectURL(blob);
        };

        mediaRecorder.onwarning = function (evt) {
          console.log('onwarning fired');
        };

        a.addEventListener("loadedmetadata", function () {
          mediaRecorder.start();
          a.play();
        });

        a.addEventListener("ended", function () {
          mediaRecorder.stop();
        });
      }, function () {
        alert("gUM failure");
      });
    </script>
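
On the Java side, each evt.data chunk could be POSTed to a plain Spring endpoint that appends the raw bytes to a growing file, so the server can transcode once recording stops. A minimal, untested sketch (the mapping path, the recordingId convention, and the file layout are assumptions; readAllBytes needs Java 9+):

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    import javax.servlet.http.HttpServletRequest;

    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RestController;

    // Hypothetical endpoint: the browser POSTs each MediaRecorder chunk here.
    @RestController
    public class AudioChunkController {

        // Placeholder directory for in-progress recordings.
        private static final Path CHUNK_DIR = Paths.get("/audioArchiveDestination/chunks");

        @PostMapping("/action/UploadChunk/{recordingId}")
        public void uploadChunk(@PathVariable("recordingId") String recordingId,
                                HttpServletRequest request) throws IOException {
            Files.createDirectories(CHUNK_DIR);

            // One growing Ogg/Opus file per recording; recordingId should be validated in real code.
            Path target = CHUNK_DIR.resolve(recordingId + ".opus");

            // Append this chunk's bytes to the end of the file.
            try (InputStream in = request.getInputStream()) {
                Files.write(target, in.readAllBytes(),
                        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            }
        }
    }

In the ondataavailable handler above, each evt.data blob would then be sent to this URL with an XMLHttpRequest (or fetch) instead of being pushed into chunks.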
