Capture system or application sound using AudioKit

I am not a modern mobile developer, so please excuse me. I was talking with a friend and got curious. I have experience with audio, but not recent Swift + iOS experience.

Let's say I had a view with a built-in YouTube player. The Swift players I found (pods, etc.) don't seem to expose anything that looks like a sound channel, such as a stream or an audio object. For instance:

// from the Swift YouTube player pod
import YouTubePlayer

// Option A: connect a YouTubePlayerView from a storyboard
@IBOutlet var videoPlayer: YouTubePlayerView!

// Option B: create one programmatically (assume playerFrame is declared)
var videoPlayer = YouTubePlayerView(frame: playerFrame)

// Load a video from its YouTube ID
videoPlayer.loadVideoID("nfWlot6h_JM")

The player exposes functions to control playback, but nothing lower-level. Suppose you could get at the MPEG stream through another library, and had access to the audio stream via mpegStream.audioChannels().

How could you integrate this with AudioKit? I understand how to mix and capture with nodes from the documentation, but where would you connect the video player's stream?

Or is AVFoundation the right layer for this?
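To make the question concrete: AudioKit is built on top of AVAudioEngine, so I imagine the bridge would look something like the sketch below. This is only a guess at the shape of it; `onDecodedBuffer` and the idea that the mpeg library could hand over decoded `AVAudioPCMBuffer`s are assumptions, not a real API.

```swift
import AVFoundation

// Sketch only: the decoded-buffer callback is hypothetical -- it stands
// in for whatever PCM the mpeg library could deliver.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

engine.attach(player)
// Route the player into the engine's mixer; AudioKit's graph wraps
// this same AVAudioEngine, so its nodes and effects sit on the same path.
engine.connect(player, to: engine.mainMixerNode, format: nil)

try engine.start()
player.play()

// Hypothetical callback from the decoding library: each decoded PCM
// buffer is handed to the player node for playback through the graph.
func onDecodedBuffer(_ buffer: AVAudioPCMBuffer) {
    player.scheduleBuffer(buffer, completionHandler: nil)
}
```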

import AudioKit

// assume some library provides this
let mpeg = mpegStream.audioChannels()

// do AudioKit sampling here and capture the audio,
// like a karaoke machine / audio editor / sampler / sound board

// attach a node to a mixer ??
// What interface should I look for? An AVFoundation stream? Does AudioKit accept that?

// From here on I'm clear on the concepts: AudioKit lets you
// capture/route/modify audio in buffers, just like a DAW, or
// the way Soundflower works.
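The capture side, as I understand it, would be a tap on the mixer. A sketch using plain AVAudioEngine calls (which AudioKit wraps); the function name is just for illustration:

```swift
import AVFoundation

func startCapturing(on engine: AVAudioEngine) throws {
    let mixer = engine.mainMixerNode
    let format = mixer.outputFormat(forBus: 0)

    // Pull PCM buffers out of the graph as they pass through the mixer,
    // karaoke-recorder style.
    mixer.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        // `buffer` is an AVAudioPCMBuffer: write it to a file,
        // run it through effects, or analyze it here.
    }
    try engine.start()
}
```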

To summarize: how do I get the YouTube MPEG audio into AudioKit? Does AudioKit take the raw MPEG audio directly, or does it have to go through AVFoundation first (i.e. AVFoundation decodes the stream, then hands it to AudioKit)?
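My guess, which I'd like confirmed: AudioKit wants PCM, and AVFoundation's AVAudioConverter is the piece that would decode the compressed stream. A sketch, with the AudioStreamBasicDescription values made up for a 44.1 kHz stereo MP3-ish stream (a real demuxer would fill them in from the stream header):

```swift
import AVFoundation

// Assumed stream parameters; these are placeholders, not read from a
// real stream.
var asbd = AudioStreamBasicDescription(
    mSampleRate: 44100,
    mFormatID: kAudioFormatMPEGLayer3,
    mFormatFlags: 0,
    mBytesPerPacket: 0,        // variable packet size
    mFramesPerPacket: 1152,    // MP3 frame size
    mBytesPerFrame: 0,
    mChannelsPerFrame: 2,
    mBitsPerChannel: 0,
    mReserved: 0)

let compressed = AVAudioFormat(streamDescription: &asbd)!
let pcm = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)!

// AVAudioConverter decodes compressed packets into the PCM buffers that
// AVAudioEngine / AudioKit nodes consume
// (via converter.convert(to:error:withInputFrom:)).
let converter = AVAudioConverter(from: compressed, to: pcm)
```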



Source: https://habr.com/ru/post/1661516/

