Creating a music visualizer

So how does someone create a music visualizer? I searched Google, but I didn't find anything that covers the actual programming involved; mostly just links to plug-ins and finished applications.

I use iTunes, but I understand I would need Xcode for that (I am currently deployed in Iraq and can't download such a large file). So right now I'm mainly interested in learning the theory behind it, e.g. how the frequencies are processed and whatever else is required.

+45
visualization itunes music
Sep 30 '08 at 15:59
12 answers

When a visualizer plays a song file, it reads the audio data in very short time slices (usually less than 20 milliseconds). It performs a Fourier transform on each slice, extracting the frequency components, and updates the visual display using that frequency information.
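
For a concrete sense of that per-slice analysis, here is a minimal sketch in Python/NumPy (the 1024-sample slice size and the synthetic 440 Hz tone are just stand-ins for audio you would read from the player):

    import numpy as np

    SAMPLE_RATE = 44100
    SLICE_SIZE = 1024                              # ~23 ms per slice at 44.1 kHz

    # Stand-in for one decoded slice of audio: a pure 440 Hz tone.
    t = np.arange(SLICE_SIZE) / SAMPLE_RATE
    samples = np.sin(2 * np.pi * 440.0 * t)

    # Window the slice to reduce spectral leakage, then take the FFT.
    windowed = samples * np.hanning(SLICE_SIZE)
    spectrum = np.abs(np.fft.rfft(windowed))       # magnitude per frequency bin
    freqs = np.fft.rfftfreq(SLICE_SIZE, d=1.0 / SAMPLE_RATE)

    print(freqs[np.argmax(spectrum)])              # dominant frequency, ~430-440 Hz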

How the visual display is updated in response to the frequency information is up to the programmer. Generally, the graphics routines have to be extremely fast and lightweight so the visuals keep up with the music (and don't bog down the PC). In the early days (and still sometimes), visualizers often modified the Windows color palette directly to achieve some pretty cool effects.

One characteristic of purely frequency-based visualizers is that they often don't respond very well to the "beats" of the music (percussion hits, for example). More interesting and responsive visualizers can be written by combining the frequency-domain information with the recognition of "spikes" in the audio, which often correspond to percussion hits.
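
As a rough illustration of the "spike" idea, here is my own hedged sketch (not the answerer's code); the history length and threshold are guesses you would tune:

    import numpy as np

    def detect_spikes(slices, history=43, threshold=1.5):
        """slices: iterable of 1-D sample arrays. Yields True when a slice's
        energy jumps well above the recent average (a likely percussion hit)."""
        recent = []
        for s in slices:
            energy = float(np.mean(np.asarray(s, dtype=float) ** 2))
            average = float(np.mean(recent)) if recent else energy
            yield energy > threshold * average
            recent.append(energy)
            if len(recent) > history:          # keep roughly one second of history
                recent.pop(0)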

+23
Sep 30 '08 at 16:17

To create BeatHarness ( http://www.beatharness.com ) I "just" used an FFT to get the audio spectrum, then applied some filtering and edge/onset detectors.

About Fast Fourier Transform: http://en.wikipedia.org/wiki/Fast_Fourier_transform

If you're up for the math, you can read Paul Bourke's page: http://local.wasp.uwa.edu.au/~pbourke/miscellaneous/dft/

(Paul Bourke is a name you'll want to remember anyway; he has information on a huge range of topics you either want to know about right now or will probably want to in the next two years ;))

If you want to read up on beat/tempo detection, Google for Masataka Goto; he has written some interesting papers on it.

Edit:

His homepage: http://staff.aist.go.jp/m.goto/ Interesting reading: http://staff.aist.go.jp/m.goto/PROJ/bts.html

Once you have some values, for example for bass, mids, treble and volume (left and right), it's up to your imagination what to do with them. Display an image and multiply its size by the bass, for instance: you get an image that pulses with the music, etc.
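
A hedged sketch of that idea in Python/NumPy (the band boundaries are arbitrary assumptions, and draw_image() is a hypothetical placeholder for whatever drawing call your framework provides):

    import numpy as np

    def band_levels(spectrum, freqs):
        """Split an FFT magnitude spectrum into rough bass/mid/treble energies."""
        bass = spectrum[(freqs >= 20) & (freqs < 250)].sum()
        mids = spectrum[(freqs >= 250) & (freqs < 4000)].sum()
        treble = spectrum[freqs >= 4000].sum()
        return bass, mids, treble

    # bass, mids, treble = band_levels(spectrum, freqs)
    # zoom = 1.0 + 0.5 * bass / (bass + mids + treble + 1e-9)   # bass drives the zoom
    # draw_image(image, scale=zoom)                             # hypothetical drawing call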

+18
May 20 '09 at 1:56 pm

As a rule, you take a chunk of the audio data, run a frequency analysis over it, and use that data to modify some graphic that is being displayed over and over. The obvious way to do the frequency analysis is an FFT, but simple tone detection can work just as well, with lower computational overhead.

Say, for example, you write a routine that continually draws a series of shapes arranged in a circle. You could then use the dominant frequencies to determine the color of the shapes, and use the volume to set their size.
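
One possible reading of that example, sketched in Python (the 5 kHz hue mapping is an arbitrary choice, and draw_circle() is a hypothetical placeholder for your graphics API):

    import colorsys
    import numpy as np

    def circle_params(samples, sample_rate):
        """Return an RGB color from the dominant frequency and a radius from the volume."""
        windowed = samples * np.hanning(len(samples))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
        dominant = freqs[np.argmax(spectrum)]
        hue = min(dominant / 5000.0, 1.0)                  # 0-5 kHz spread over the color wheel
        color = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        radius = 10 + 200 * float(np.sqrt(np.mean(samples ** 2)))   # RMS volume
        return color, radius

    # color, radius = circle_params(samples, 44100)
    # draw_circle(color=color, radius=radius)              # hypothetical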

+13
Sep 30 '08 at 16:12

There are many ways to process the audio data, the simplest of which is just to display it as a rapidly changing waveform and apply some graphical effect to it. Similarly, things like volume can be calculated (and passed as a parameter to some graphics routine) without doing a Fourier transform to get the frequencies: just compute the average amplitude of the signal.
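
For instance, a no-FFT volume measure can be as simple as this sketch, where "average amplitude" is taken as RMS:

    import numpy as np

    def rms_volume(samples):
        """Average (RMS) amplitude of a slice of samples, no FFT required."""
        samples = np.asarray(samples, dtype=float)
        return float(np.sqrt(np.mean(samples ** 2)))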

Converting the data to the frequency domain using an FFT (or otherwise) allows for more complex effects, including spectrograms. It is deceptively tricky, though, to detect even fairly "obvious" things like the timing of drum beats or the pitch of notes directly from the FFT output.
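
A spectrogram is essentially just consecutive FFT frames stacked into a 2-D array; a minimal sketch (the slice and hop sizes are arbitrary choices):

    import numpy as np

    def spectrogram(samples, slice_size=1024, hop=512):
        """Return a (time x frequency) array of FFT magnitudes, drawable as an image."""
        window = np.hanning(slice_size)
        frames = []
        for start in range(0, len(samples) - slice_size + 1, hop):
            chunk = samples[start:start + slice_size] * window
            frames.append(np.abs(np.fft.rfft(chunk)))
        return np.array(frames)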

Reliable beat detection and pitch detection are hard problems, especially in real time. I am no expert, but this page walks through some simple example algorithms and their results.

+6
Sep 30 '08 at 16:35

If you are looking for a small download and a fairly portable toolset to play with (plus an enthusiastic community to draw on), I would suggest Processing ( http://www.processing.org ), specifically the ESS library listed at http://processing.org/learning/libraries/ . That should lead you down the rabbit hole. It won't make an iTunes visualizer, but it was used to prototype this: http://www.barbariangroup.com/software/magnetosphere , which became the default iTunes visualizer.

+5
Sep 30 '08 at 16:30
  • Develop an algorithm to draw something interesting on the screen based on a set of variables
  • Devise a way to convert the audio stream into that set of variables, analyzing things like beats per minute, different frequency ranges, pitch, etc.
  • Plug the variables into your algorithm and see what it draws.

A simple visualization would be one that changes the screen color every time the music goes above a certain frequency threshold. Or just write the BPM on the screen. Or just display an oscilloscope.
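
A toy version of that "change the color above a threshold" idea (the band, the threshold, and set_background() are all assumptions for illustration):

    import numpy as np

    def frame_color(spectrum, freqs, band=(60, 120), threshold=5.0):
        """Flash red when the energy in a low-frequency band exceeds a threshold."""
        energy = spectrum[(freqs >= band[0]) & (freqs < band[1])].sum()
        return (255, 0, 0) if energy > threshold else (0, 0, 0)

    # set_background(frame_color(spectrum, freqs))         # hypothetical drawing call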

Check out this Wikipedia article.

+2
Sep 30 '08 at 16:22

Like Processing, which @Pragmaticyankee suggested, this is a really interesting way to visualize your music. You can load your music into Ableton Live and use an EQ to filter out the high, mid and low frequencies. You can then use a VST plugin to convert the audio envelopes into MIDI CC messages, such as Mokafix Audio's Gatefish (works on Windows) or PizMidi's midiAudioToCC plugin (works on Mac). You can then send these MIDI CC messages to MIDI-capable light-emitting hardware, such as Percussa AudioCubes. You could use one cube for each frequency band you want to display and assign a color to each cube. Take a look at this post:

http://www.percussa.com/2012/08/18/how-do-i-generate-rgb-light-effects-using-audio-signals-featured-question/

+2
Sep 21 '12 at 17:58

We recently added DirectSound-based audio input routines to the LightningChart data visualization library. The LightningChart SDK is a set of components for Visual Studio .NET (WPF and WinForms); you may find it useful.

With the AudioInput component, you can get real-time waveform data samples from an audio device. You can play sound from any source, such as Spotify, WinAmp, CD / DVD player or use the microphone jack.

With the SpectrumCalculator component, you can get the power spectrum (computed with an FFT), which is handy for many visualizations.

With the LightningChartUltimate component, you can visualize data in many different forms, such as waveform diagrams, histograms, heat maps, spectrograms, 3D spectrograms, 3D lines, etc., and you can combine them. All rendering occurs through Direct3D acceleration.

Our own examples in the SDK take a scientific approach without much of an entertainment aspect, but the library can definitely be used for stunning entertainment visualizations as well.

We also have a configurable SignalGenerator (sweeps, multi-channel configurations, sine, square, triangle and noise signals, real-time WAV streaming) and DirectX audio output components for sending wave data out through the speakers or line out.

Waveform and 3D spectrogram

Gradient bars [I am the CTO of LightningChart components, I only do this because I like it :-)]

+2
May 30 '14 at 11:47

Lee Brimelow has a great video tutorial on this in Flash. It should point you in the right direction, even if you want to implement it in something other than Flash.

+1
Nov 06 '08 at 8:43

Heiko Wichmann's VizKit seems like a very good cross-platform (third-party) SDK and starting point (it uses the Visualizer API Apple released a while ago).

I just compiled it with Xcode, and it also includes a Visual Studio project. iTunes crashed on it once, but after that it ran fine. What I like so far: few dependencies (I only had to fix one framework path in my environment), lots of samples (equalizer, spectrum, album covers, histograms), a very transparent architecture, and an acceptable license.

It is also available on SourceForge.

+1
Jun 29 '13 at 6:25

Go to http://developer.apple.com/library/mac/#technotes/tn/tn2016.html . It provides information on the iTunes visualizer directly from Apple, and it mentions that iTunes can hand you the waveform data already run through an FFT, without any work on your part.

0
Oct 01 '10 at 1:09

This link does exactly what you want, and the source code can be downloaded. I found it useful: http://www.raywenderlich.com/36475/how-to-make-a-music-visualizer-in-ios

0
Jun 07 '13 at 7:57


