Who doesn't like the beautiful music visualizations you can find in your media player? This example shows, in the shortest amount of code, how to plot the waveform of the microphone input. The app dynamically draws the sound waves as they are recorded.
Before reading this, we highly recommend working through the Hello World example first. This example is included in AudioKit's Examples directory, with versions for iOS and OSX.
There are some major additions in this project. First, we create an instance of AKMicrophone(), which we will call "mic", to capture audio from the standard input device:
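A minimal sketch of that step (the property name `mic` follows the text above; exact initializer details may vary between AudioKit versions):

```swift
import AudioKit

class ViewController: UIViewController {

    // Microphone node that captures audio from the
    // standard input device.
    var mic = AKMicrophone()
}
```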
However, we can't simply route the microphone straight to the output. Because we want both the frequency and the amplitude of the input, we create an instance of AKFrequencyTracker(), which tracks the pitch of the signal between a lower and an upper frequency bound. We then create an instance of AKBooster(), which is AudioKit's version of Apple's Mixer Node; its gain parameter is an amplification factor (default: 1, minimum: 0).
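A sketch of that signal chain, assuming an AudioKit release whose AKFrequencyTracker initializer takes the lower and upper frequency bounds described above (later versions changed this initializer); the bound values 200 and 2000 and the names `tracker` and `silence` are illustrative:

```swift
import AudioKit

// Track the pitch of the mic signal between a lower and an
// upper frequency bound (values here are hypothetical).
let tracker = AKFrequencyTracker(mic,
                                 minimumFrequency: 200,
                                 maximumFrequency: 2000)

// Pass the tracked signal through a booster with gain 0,
// so the tracker keeps analyzing but nothing is played back
// (which would otherwise cause feedback from the speakers).
let silence = AKBooster(tracker, gain: 0)
```

Setting the gain to 0 here is a design choice: the tracker still needs to sit in an active signal chain to analyze the input, but we don't want to hear the microphone through the speakers.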
Then, in viewDidAppear(), we start the AudioKit engine.
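That step might look like this (`silence` is the AKBooster from the previous step; note that newer AudioKit versions make `AudioKit.start()` a throwing call, so adjust for your release):

```swift
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Route the (muted) end of the signal chain to the output
    // and start the engine.
    AudioKit.output = silence
    AudioKit.start()
}
```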
Next, we just need to set up a plot. We create an instance of AKNodeOutputPlot() to plot the output from the mic within the signal processing graph. It is built on EZAudioPlot, a cross-platform (iOS and OSX) class that plots an audio waveform using Core Graphics.
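A sketch of the plot setup, assuming a view outlet named `audioInputPlot` in the storyboard (that outlet name, and the styling values, are illustrative):

```swift
func setupPlot() {
    // Plot the mic node's output; the plot fills the
    // hypothetical audioInputPlot container view.
    let plot = AKNodeOutputPlot(mic, frame: audioInputPlot.bounds)
    plot.plotType = .rolling      // scroll the waveform over time
    plot.shouldFill = true
    plot.shouldMirror = true      // draw above and below the axis
    plot.color = UIColor.blue
    audioInputPlot.addSubview(plot)
}
```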
Finally, we use updateUI() to update the labels with the frequency and amplitude of the input, reading the properties of the tracker (our AKFrequencyTracker instance):
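A sketch of that update method; the label outlets, the 0.1 amplitude threshold, and the idea of driving it from a repeating Timer are assumptions, not spelled out in the text above:

```swift
@objc func updateUI() {
    // Only trust the pitch reading when the signal is loud
    // enough (hypothetical threshold).
    if tracker.amplitude > 0.1 {
        frequencyLabel.text = String(format: "%0.1f", tracker.frequency)
    }
    amplitudeLabel.text = String(format: "%0.2f", tracker.amplitude)
}

// Called once during setup, e.g. from viewDidLoad(), to refresh
// the labels periodically.
func startPolling() {
    Timer.scheduledTimer(timeInterval: 0.1,
                         target: self,
                         selector: #selector(updateUI),
                         userInfo: nil,
                         repeats: true)
}
```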
The OS X version differs from the iOS version of this example only in the user interface code.