Mac OS X equivalent for DirectShow, GraphEdit

I'm new to Mac OS X and familiar with Windows. Windows has DirectShow: a large number of built-in filters, COM programming, and GraphEdit for very quick prototyping and for inspecting the graphs you build in code.

Now I'm moving to a Mac to work with cameras, webcams, microphones, color spaces, files, splitting, synchronizing, rendering, reading files, saving files, and the many other things DirectShow provided for building live applications. On the Mac side, so far I have found ... nothing! Either I don't know where to look, or I'm having a hard time reconciling the Mac's reputation for handling media easily with how hard it seems to be to get in there and start working with the basic building blocks of media manipulation.

I've seen some weak suggestions to use gstreamer or some QT library, but I can't make myself believe that's Apple's way. And I came across some QuickTime docs, but I don't want to do transitions, sprites, translation, ...

With a brain trained on DirectShow, I don't even know how Apple thinks about providing DirectShow-like features. That means I don't know the right keywords and don't even know where to look. Books? I bought a few. By now I could write code to edit my sister's wedding video (if I can't make decent progress on this topic, I'll probably ask what that would cost you), but as for finding out which filters are available and how to connect them together ... nothing. Suggestions?

1 answer

Video processing on the Mac is currently in the middle of a huge transition. QuickTime is very old, but also large and powerful, so it has been undergoing a gradual replacement over the last five years or so.

QTKit, however, is the subset of QuickTime (capture, playback, format conversion, and basic video editing) that will be supported going forward. The legacy QuickTime APIs still exist for the moment and are likely to remain at least until their core functionality is available elsewhere, but they are 32-bit only. For some of the work involved, you may still need to drop down to them.

iOS is currently ahead of the Mac here, because it could start from scratch with AV Foundation. The future of Mac media is likely to be either AV Foundation directly (with QTKit as a thin layer on top of it) or a QTKit extension that looks very similar.

For audio, there is Core Audio, which exists on both Mac and iOS and is not going away any time soon. It is quite powerful but somewhat opaque. Fortunately, community support is very good; the mailing list is an essential resource.
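To give a flavor of Core Audio's C-style API, here is a minimal sketch (not from the original answer) that looks up the system's default output audio unit. It requires macOS and the AudioToolbox framework; everything here is standard Core Audio boilerplate, but treat it as an illustration rather than a complete program.

```swift
import AudioToolbox

// Describe the component we want: Apple's default output audio unit.
var description = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_DefaultOutput,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0
)

// Walk the component registry for the first match.
if let component = AudioComponentFindNext(nil, &description) {
    var unit: AudioComponentInstance?
    // Instantiate it; a real program would then set its stream format,
    // attach a render callback, and call AudioOutputUnitStart.
    let status = AudioComponentInstanceNew(component, &unit)
    print("instantiate status:", status)
    if let unit = unit {
        AudioComponentInstanceDispose(unit)
    }
}
```

This find-describe-instantiate pattern recurs throughout Core Audio, which is part of why the mailing list matters: the API is verbose and the error codes are terse.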

For filters and frame-level processing, you have Core Video, as mentioned above, as well as Core Image. For motion graphics there is Quartz Composer, which includes a graphical editor and a plugin architecture for adding your own patches. For programmatic procedural animation and easy mixing of rendering models (OpenGL, Quartz, video, etc.), there is Core Animation.
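As a small illustration of Core Image's filter model (my sketch, not the answer author's; the file path is a placeholder), applying a built-in filter to a frame looks like this on macOS:

```swift
import CoreImage
import Foundation

// Load a frame from disk. In a real pipeline this CIImage would more
// likely be wrapped around a CVPixelBuffer coming from Core Video.
guard let input = CIImage(contentsOf: URL(fileURLWithPath: "/tmp/frame.png")) else {
    fatalError("could not load input frame")
}

// Filters are looked up by name and configured with key-value coding,
// loosely analogous to inserting a DirectShow filter into a graph.
guard let sepia = CIFilter(name: "CISepiaTone") else {
    fatalError("CISepiaTone not available")
}
sepia.setValue(input, forKey: kCIInputImageKey)
sepia.setValue(0.8, forKey: kCIInputIntensityKey)

// outputImage is a recipe; rendering happens lazily via a CIContext.
if let output = sepia.outputImage {
    let context = CIContext()
    let rendered = context.createCGImage(output, from: output.extent)
    print("rendered:", rendered != nil)
}
```

Note that, unlike DirectShow, the "graph" here is implicit: each filter's output image is just handed to the next filter's input key, and nothing is evaluated until a context renders it.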

On top of all of these, of course, there is no reason you can't use open-source libraries where the built-in material doesn't do what you want.


In reply to your comment below:

In QuickTime (and QTKit), individual streams of media, such as audio and video, are represented as tracks. It may not be immediately obvious that QuickTime can open audio as well as video file formats. So a common way to combine audio and video is:

  • Create a QTMovie from your video file.
  • Create a QTMovie from your audio file.
  • Take the QTTrack object representing the audio and add it to the QTMovie with the video in it.
  • Flatten the movie, so that it doesn't merely contain a reference to the other movie but actually contains the audio data.
  • Write the movie to disk.
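QTKit is Objective-C only, but the same steps map almost one-to-one onto AV Foundation's composition API. Here is a rough Swift sketch of that mapping (my assumption, not code from the answer; file paths are placeholders, and error handling is minimal):

```swift
import AVFoundation

// Step 1 & 2: open the video and audio files as assets.
let videoAsset = AVURLAsset(url: URL(fileURLWithPath: "/tmp/video.mov"))
let audioAsset = AVURLAsset(url: URL(fileURLWithPath: "/tmp/audio.m4a"))

// Step 3: a composition plays the role of the combined QTMovie,
// with one mutable track per media type.
let composition = AVMutableComposition()
let range = CMTimeRange(start: .zero, duration: videoAsset.duration)

if let src = videoAsset.tracks(withMediaType: .video).first,
   let dst = composition.addMutableTrack(withMediaType: .video,
                                         preferredTrackID: kCMPersistentTrackID_Invalid) {
    try? dst.insertTimeRange(range, of: src, at: .zero)
}
if let src = audioAsset.tracks(withMediaType: .audio).first,
   let dst = composition.addMutableTrack(withMediaType: .audio,
                                         preferredTrackID: kCMPersistentTrackID_Invalid) {
    try? dst.insertTimeRange(range, of: src, at: .zero)
}

// Steps 4 & 5: exporting flattens the composition into a single
// self-contained file on disk, rather than a reference movie.
if let export = AVAssetExportSession(asset: composition,
                                     presetName: AVAssetExportPresetHighestQuality) {
    export.outputURL = URL(fileURLWithPath: "/tmp/combined.mov")
    export.outputFileType = .mov
    export.exportAsynchronously {
        print("export status:", export.status.rawValue)
    }
}
```

The insert/flatten split is the same idea as in QTKit: track insertion only records references, and it is the export (flatten) step that actually copies the audio samples into the output file.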

Here is an example from Blender. You can see how the A/V muxing is done in end_qt. It also uses Core Audio there (AudioConverter*). (There is some classic QuickTime export code in quicktime_export.c, but it doesn't seem to handle sound.)


Source: https://habr.com/ru/post/885528/
