Fonofone, built with AudioKit


Simple, intuitive, playful, powerful and innovative… We present fonofone, an application for teaching musical creation, developed for the school environment by renowned composers and pedagogues. Adaptable to any grade from preschool to high school, fonofone offers a complete path through the basic techniques of composition, from generating sounds to constructing a collective project and interpreting it.

Fonofone is a production of COSIMU, a non-profit based in Montreal, Canada. Our mission is to develop young people's sense of wonder, curiosity and imagination by conceiving innovative approaches and intuitive tools for creation. The application was designed by the composer and pedagogue Yves Daoust, in collaboration with the digital instrument-maker and artist Alexandre Burton.


Apple's WWDC in Review - the AudioKit takeaway

There are two parts of WWDC that I'm going to write about. The first covers the audio-related talks and the impact the new audio developments will have on AudioKit. The second is about the AudioKit people I met at the conference.

The Audio Talks

There were two audio-related talks. What's New in Audio was especially important because its new additions will have an immediate impact on AudioKit.

The first thing we're excited about is that AVAudioPlayerNode now has a proper completion-handler callback. Previously, the completion handler was called when an audio file finished loading, but now you can request a callback when a sound has finished playing! We had built this feature into our AKSamplePlayer node ourselves, but now we'll have it in AKAudioPlayer as well.
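As a sketch of how the new callback might be used, here is the iOS 11 AVAudioPlayerNode API with the `.dataPlayedBack` callback type; the file name and the force-tries (for brevity) are illustrative:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

// Hypothetical file; any readable audio file works here.
let file = try! AVAudioFile(forReading: URL(fileURLWithPath: "drumloop.wav"))
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
try! engine.start()

// .dataPlayedBack fires when the sound has actually finished playing,
// not merely when the scheduled data has been consumed.
player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    print("Playback finished")
}
player.play()
```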

The major new feature of AVAudioEngine is offline rendering. While AudioKit does use offline rendering already, for its test engine, we had to jump through a lot of hoops to make it work. It also only worked in iOS, not macOS, so our solution was limited. Now, it should be trivial for you to use AudioKit processing to affect files at blazingly fast speeds.
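A minimal sketch of the new offline rendering mode, following the pattern Apple showed at WWDC 2017 (the input file name is a placeholder, and error handling is elided with force-tries):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

let file = try! AVAudioFile(forReading: URL(fileURLWithPath: "input.caf"))
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

// Switch the engine to offline manual rendering before starting it.
let maxFrames: AVAudioFrameCount = 4096
try! engine.enableManualRenderingMode(.offline,
                                      format: file.processingFormat,
                                      maximumFrameCount: maxFrames)
try! engine.start()
player.scheduleFile(file, at: nil)
player.play()

// Pull rendered audio as fast as the CPU allows, block by block.
let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!
while engine.manualRenderingSampleTime < file.length {
    let remaining = AVAudioFrameCount(file.length - engine.manualRenderingSampleTime)
    let status = try! engine.renderOffline(min(remaining, maxFrames), to: buffer)
    if status == .success {
        // Write `buffer` to an output AVAudioFile here.
    }
}
```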

A related feature that at first seems almost uninteresting is realtime manual rendering. What that should allow us to do is take audio from other sources and process it with AudioKit. One exciting consequence is that we should be able to process streamed audio! We should also be able to incorporate other sources of audio through this mechanism, which should allow integration with any other audio libraries you may have. Perhaps a speech-synthesis engine, for example?
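A rough sketch of the realtime manual rendering setup: the host supplies input samples on demand and pulls rendered output from its own render loop. The format and frame count here are assumptions:

```swift
import AVFoundation

let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
try! engine.enableManualRenderingMode(.realtime,
                                      format: format,
                                      maximumFrameCount: 1024)

// Feed the engine's input node from an external source (a stream,
// another library, a speech synthesizer, ...).
let accepted = engine.inputNode.setManualRenderingInputPCMFormat(format) { frameCount in
    // Return a pointer to `frameCount` frames from your source,
    // or nil if no samples are available right now.
    return nil
}
try! engine.start()

// Capture this block once and call it from your own realtime render loop.
let renderBlock = engine.manualRenderingBlock
```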

The Designing Sound talk was definitely entertaining, and it was great to see an engaging talk about audio at the conference. Unfortunately, the talk still conveyed the sense that audio is too special to be handled by just anyone. The audio it highlighted consisted of recordings, much like Foley sounds. Someday, I hope Apple will give a similar talk about the expressiveness of generated, custom audio that sounds just as amazing but carries much more embedded information.

AudioKit People

Of course, the best part of any good conference is meeting people. AudioKit faithful gathered at Gordon Biersch, including Ryan McLeod, the creator of Blackbox; Marcus Hobbs, AudioKit's resident microtonality expert; Paul Batchelor, creator of Soundpipe and Sporth, which are central to AudioKit; Mark Jeschke, creator of DrumKick; and Yaron Karasik, creator of Jam Looper.


I want to specifically highlight Ryan. This year, Ryan won the coveted Apple Design Award for Blackbox. He and I have been collaborating on the most recent version of Blackbox for a few months, but this was the first time we met in person. I'm sure he would have won the award without using AudioKit, but it's nice to be part of his software arsenal.

I also want to mention that I didn't even know that Yaron Karasik had released an AudioKit-powered app called Jam Looper! This looper sets itself apart with its outstanding UI design, solid feel, and of course, AudioKit effects! Here's a link to download it from the App Store.

Jam Looper

In conclusion, it's a great time to be alive and to be bonded together over this little audio engine called AudioKit. I feel very blessed to have such awesome friends; in many ways, you're all part of my family, and I'm thankful for all of you. I hope to meet you all again in 2018, with even more AudioKit users at WWDC!

AudioKit 3.7 - Microtonality, Audiobus 3, and new Sample Player

This is a very big release with lots of bug fixes and improvements, but the main new features are:

  • Microtonality - developed by the amazing Marcus Hobbs
  • Simplified Microphone Tracker - for applications that don't need the full AudioKit signal chain
  • Audiobus 3 Support - compatibility with the great inter-app audio framework for iOS by Michael Tyson and friends
  • Sample Player - versatile, fast, and customizable player by Jeff Cooper
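For example, the simplified microphone tracker can be used without building a full AudioKit signal chain. A minimal sketch, assuming the AKMicrophoneTracker API from this release:

```swift
import AudioKit

// The tracker runs its own lightweight engine, so no AudioKit
// signal chain setup is needed.
let tracker = AKMicrophoneTracker()
tracker.start()

// Poll from a timer or display link in your UI code:
print("Pitch: \(tracker.frequency) Hz, amplitude: \(tracker.amplitude)")

tracker.stop()
```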

Blackbox, now powered by AudioKit


Blackbox is the #1 Indie game for the iPhone by Ryan McLeod, and it is now powered by AudioKit!

Below is what Ryan writes about the new audio update for the puzzle game, but I also wanted to add a personal note here: I've been working with Ryan on this update, and he's an absolutely amazing person to work with. A great programmer: persistent, meticulous, and super creative. It's been a real treat, and I'm just as proud of this release of Blackbox as Ryan is!

Anyway, here's what Ryan has to say:

So you think Blackbox is hard eh? Have you ever tried playing with your eyes closed?

This update brings sonic interfaces to Blackbox: an entirely new dimension of clues and dynamic feedback that infuses new life into all 71 challenges. But this is about more than replacing the previous few sounds, which I made with my mouth and laptop. These new sonic interfaces, paired with careful haptic feedback and countless accessibility enhancements, mean that Blackbox is now accessible to almost everyone, regardless of their needs. Did I mention the sounds are also just wicked cool?

So what’s a sonic interface? Well, for the last year, nearly 3.5 million people have played Blackbox solely using their eyes. For the over 258 million people worldwide with visual impairments, including blindness, the game was more or less a literal black box. The sonic interfaces, paired with VoiceOver and haptic feedback, create a complete alternative interface for Blackbox that doesn’t require vision at all. So now, whether you use dynamic type sizing to read small text, use text-to-speech to hear content, or are fully blind, you can enjoy the same loving suffering that is Blackbox.


AudioKit Talk at 360iDev Conference

AudioKit core team members Matt Fecher and Aurelius Prochazka will be giving a workshop at this year's 360iDev Conference in Denver, CO on August 13, 2017.

The workshop will be called "Build Your Own Custom Musical Instrument" but will be relevant to anyone working with audio.

Being able to add audio to your app is essential for creating a truly immersive multimedia experience. Even if you’re not interested in making music apps per se, audio is very important in games, alerts, and action feedback. At this workshop you can learn how easy it can be to create and manipulate audio using Swift.

We’ll start by using playgrounds to quickly design, prototype, and experiment with sound. We will not only generate new synthesized sounds but also use effect processors such as reverb and delay to manipulate them. Then we’ll take the code from the playgrounds into an iOS project with a multitouch keyboard, knobs, sliders, and other customizable UI components. Finally, we’ll make sure our app works well in Apple’s audio ecosystem by playing well with other audio apps such as Audiobus and GarageBand using Inter-App Audio.
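The playground portion can be sketched in a few lines with the AudioKit API of the time; the parameter values are illustrative, and in later AudioKit releases `AudioKit.start()` became a throwing call:

```swift
import AudioKit

// Generate a synthesized sound, then manipulate it with effects.
let oscillator = AKOscillator(waveform: AKTable(.sawtooth))
let delay = AKDelay(oscillator, time: 0.25, feedback: 0.5)
let reverb = AKReverb(delay, dryWetMix: 0.5)

// Wire the end of the chain to the output and start the engine.
AudioKit.output = reverb
AudioKit.start()

oscillator.frequency = 440
oscillator.start()
```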

During the second half of the day, the speaker and other audio developers will be on hand to help you with your own audio app. Or, you can form groups and collaborate with each other.

Register Today!