Who doesn't like the beautiful music visualizations you can find in your media player? Essentially, this is the simplest and shortest piece of code that plots the waveform of the microphone input: the app dynamically draws the recorded sound waves.
Before reading this, we highly recommend working through the Hello World tutorial first. This example is included in AudioKit's Examples directory, with versions for iOS and OSX.
There are some major additions in this project. First, we create an instance of AKMicrophone(), which we will call "mic", to capture audio from the standard input device:
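A minimal sketch of that step (names follow the AudioKit 3 API as we understand it):

```swift
import AudioKit

// Capture audio from the standard (default) input device.
let mic = AKMicrophone()
```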
However, we can't just write:
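Presumably the tempting shortcut would be to route the mic straight to the output, something like the following (an assumed sketch, not the example's actual code):

```swift
// Naive approach: send the raw mic signal directly to the output.
// This would play the input back but give us no frequency or amplitude data.
AudioKit.output = mic
AudioKit.start()
```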
That's because we want to get both the frequency and the amplitude of the input. For that we create an instance of AKFrequencyTracker(), which tracks the pitch of the signal between a lower and an upper bound of frequency detection. We then create an instance of AKBooster(), AudioKit's version of Apple's Mixer Node; its gain parameter is an amplification factor (default: 1, minimum: 0).
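A sketch of those two nodes, assuming the AudioKit 3 API; the frequency bounds here are illustrative, not prescribed by the text:

```swift
// Track the pitch of the mic signal between a lower and an upper bound (in Hz).
let tracker = AKFrequencyTracker(mic, minimumFrequency: 200, maximumFrequency: 2000)

// A booster with gain 0 keeps the tracker processing audio
// without sending an audible (and feedback-prone) signal to the speakers.
let silence = AKBooster(tracker, gain: 0)
```

Setting the gain to 0 is the key trick: the tracker still sees every sample, but nothing is heard.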
Then, in viewDidAppear(), start the AudioKit engine.
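That step might look like this (a sketch; `silence` is the AKBooster instance from the previous step):

```swift
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Route the silenced tracker chain to the output and start the engine.
    AudioKit.output = silence
    AudioKit.start()
}
```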
Next, we need to set up a plot. We create an instance of AKNodeOutputPlot() to plot the output of the mic node in the signal processing graph. It builds on EZAudioPlot, a cross-platform (iOS and OSX) class that plots an audio waveform using Core Graphics.
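A sketch of the plot setup, assuming a UIView outlet named `audioInputPlot` to host the waveform (that name is hypothetical):

```swift
func setupPlot() {
    // Plot the mic node's output inside the hosting view.
    let plot = AKNodeOutputPlot(mic, frame: audioInputPlot.bounds)
    plot.plotType = .rolling      // scroll the waveform as new samples arrive
    plot.shouldFill = true
    plot.shouldMirror = true      // mirror around the horizontal axis
    plot.color = UIColor.blue
    audioInputPlot.addSubview(plot)
}
```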
Finally, we use updateUI() to update the labels, reading both the frequency and the amplitude of the input from the properties of the tracker object (the AKFrequencyTracker() instance we created):
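One way this could look, polling the tracker on a timer; the label outlets and the 0.1 amplitude threshold are assumptions for illustration:

```swift
// Call updateUI() periodically, e.g. from viewDidLoad():
// Timer.scheduledTimer(timeInterval: 0.1, target: self,
//                      selector: #selector(updateUI),
//                      userInfo: nil, repeats: true)

@objc func updateUI() {
    // Only show a pitch reading when the signal is loud enough to be meaningful.
    if tracker.amplitude > 0.1 {
        frequencyLabel.text = String(format: "%0.1f Hz", tracker.frequency)
    }
    amplitudeLabel.text = String(format: "%0.2f", tracker.amplitude)
}
```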
The OS X version differs from the iOS version of this example only in the user interface code.