AKSettings

@objc open class AKSettings: NSObject

Global settings for AudioKit

  • Enum of available buffer lengths, from shortest (2 power 5 = 32 samples ≈ 0.7 ms @ 44,100 Hz) to longest (2 power 12 = 4096 samples ≈ 92.9 ms @ 44,100 Hz); see the sketch below the declaration

    See more

    Declaration

    Swift

    @objc public enum BufferLength: Int
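
    For illustration, a minimal sketch of turning a BufferLength into a sample count and duration. It assumes the enum’s raw value is the power-of-two exponent (shortest = 5 through longest = 12), which is not stated in the declaration itself.

        import AudioKit

        // Hypothetical mapping: rawValue assumed to be the power-of-two exponent.
        let length: AKSettings.BufferLength = .veryLong             // raw value assumed to be 10
        let samples = 1 << length.rawValue                          // 2^10 = 1024 samples
        let milliseconds = Double(samples) / AKSettings.sampleRate * 1_000.0
        print("\(samples) samples ≈ \(milliseconds) ms at \(AKSettings.sampleRate) Hz")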
  • The sample rate in Hertz

    Declaration

    Swift

    @objc open static var sampleRate: Double = 44_100
  • Number of audio channels: 2 for stereo, 1 for mono

    Declaration

    Swift

    @objc open static var numberOfChannels: UInt32 = 2
  • Whether we should be listening to audio input (microphone)

    Declaration

    Swift

    @objc open static var audioInputEnabled: Bool = false
  • Whether to allow audio playback to override the mute setting

    Declaration

    Swift

    @objc open static var playbackWhileMuted: Bool = false
  • Global audio format AudioKit will default to

    Declaration

    Swift

    @objc open static var audioFormat: AVAudioFormat
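
    A brief usage sketch: reading the default format. It presumably reflects the sampleRate and numberOfChannels settings above, though that relationship is an assumption here.

        import AudioKit

        let format = AKSettings.audioFormat
        print(format.sampleRate, format.channelCount)               // e.g. 44100.0 2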
  • Whether to output to the speaker (rather than receiver) when audio input is enabled

    Declaration

    Swift

    @objc open static var defaultToSpeaker: Bool = false
  • Whether to use bluetooth when audio input is enabled

    Declaration

    Swift

    @objc open static var useBluetooth: Bool = false
  • Additional control over the options to use for bluetooth

    Declaration

    Swift

    @objc open static var bluetoothOptions: AVAudioSessionCategoryOptions = []
  • Whether AirPlay is enabled when audio input is enabled

    Declaration

    Swift

    @objc open static var allowAirPlay: Bool = false
  • Global default rampTime value

    Declaration

    Swift

    @objc open static var rampTime: Double = 0.000_2
  • Allows AudioKit to send Notifications

    Declaration

    Swift

    @objc open static var notificationsEnabled: Bool = false
  • The AudioKit buffer length is set using AKSettings.BufferLength. The default is .veryLong, a buffer of 2 power 10 = 1024 samples (≈ 23.2 ms @ 44,100 Hz); see the usage sketch below the declaration

    Declaration

    Swift

    @objc open static var bufferLength: BufferLength = .veryLong
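
    A usage sketch, assuming the AudioKit 4-style throwing AudioKit.start() entry point. Smaller buffers reduce latency but increase the risk of dropouts.

        import AudioKit

        AKSettings.bufferLength = .medium    // assuming 2 power 8 = 256 samples ≈ 5.8 ms @ 44,100 Hz
        do {
            try AudioKit.start()
        } catch {
            print("AudioKit did not start: \(error)")
        }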
  • The hardware ioBufferDuration. Setting this will request the new value, getting will query the hardware.

    Declaration

    Swift

    @objc open static var ioBufferDuration: Double
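
    A short sketch of the request-then-query behavior described above; the value you read back may differ from the one you requested.

        import AudioKit

        AKSettings.ioBufferDuration = 0.005                         // request roughly 5 ms
        print("Hardware granted \(AKSettings.ioBufferDuration) s")  // may not match the request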
  • The AudioKit recording buffer length is set using AKSettings.BufferLength. The default is .veryLong, a buffer of 2 power 10 = 1024 samples (≈ 23.2 ms @ 44,100 Hz). Per Apple’s documentation, this is “the requested size of the incoming buffers; the implementation may choose another size,” so setting this value may have no effect, depending on the hardware.

    Declaration

    Swift

    @objc open static var recordingBufferLength: BufferLength = .veryLong
  • If set to true, recording will stop after a short delay to compensate for the latency between the time recording is stopped and the time the audio is written to file. If set to false (the default), stopping a recording is immediate, even if the last audio frames haven’t been written to file yet.

    Declaration

    Swift

    @objc open static var fixTruncatedRecordings = false
  • Disable AudioKit’s AVAudioSession category management (when set to true, AudioKit will not set the AVAudioSession category for you)

    Declaration

    Swift

    @objc open static var disableAVAudioSessionCategoryManagement: Bool = false
  • If set to false, AudioKit will not handle the AVAudioSession route change notification (AVAudioSessionRouteChange) and will not restart the AVAudioEngine instance when such notifications are posted. The developer can instead subscribe to these notifications and restart AudioKit after rebuilding their audio chain; see the sketch below the declaration.

    Declaration

    Swift

    @objc open static var enableRouteChangeHandling: Bool = true
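
    A sketch of handling route changes manually after opting out of the built-in handling. The pre-iOS 12 notification name is used to match the declarations above, the try AudioKit.start() call is the assumed AudioKit 4-style entry point, and the rebuild step is app specific.

        import AVFoundation
        import AudioKit

        AKSettings.enableRouteChangeHandling = false                // opt out of built-in handling

        NotificationCenter.default.addObserver(forName: .AVAudioSessionRouteChange,
                                               object: nil,
                                               queue: .main) { _ in
            // Rebuild the app's audio chain here, then restart the engine.
            do {
                try AudioKit.start()
            } catch {
                print("Restart after route change failed: \(error)")
            }
        }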
  • If set to false, AudioKit will not handle the AVAudioSession category change notification (AVAudioEngineConfigurationChange) and will not restart the AVAudioEngine instance when such notifications are posted. The developer can instead subscribe to these notifications and restart AudioKit after rebuilding their audio chain.

    Declaration

    Swift

    @objc open static var enableCategoryChangeHandling: Bool = true
  • Enable or disable AudioKit logging (logging is on by default)

    Declaration

    Swift

    @objc open static var enableLogging: Bool = true
  • Checks the application’s Info.plist to see if UIBackgroundModes includes audio. If background audio is supported, the system will allow the AVAudioEngine to start even if the app is in, or entering, a background state. This can help prevent a potential crash (AVAudioSessionErrorCodeCannotStartPlaying, a.k.a. error code 561015905) when a route/category change causes the AVAudioEngine to attempt to start while the app is not active and background audio is not supported.

    Declaration

    Swift

    @objc open static let appSupportsBackgroundAudio = (Bundle.main.infoDictionary?["UIBackgroundModes"] as? [String])?.contains("audio") ?? false
  • Shortcut for AVAudioSession.sharedInstance()

    Declaration

    Swift

    @objc open static let session = AVAudioSession.sharedInstance()
  • Convenience method accessible from Objective-C

    Declaration

    Swift

    @objc open static func setSession(category: SessionCategory, options: UInt) throws
  • Set the audio session category and options; see the usage sketch below the declaration

    Declaration

    Swift

    @objc open static func setSession(category: SessionCategory,
                                    with options: AVAudioSessionCategoryOptions = [.mixWithOthers]) throws
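
    A typical configuration sketch for simultaneous input and output. The .playAndRecord case name is assumed to be one of the SessionCategory cases documented below, and the options shown are just one plausible combination.

        import AudioKit

        do {
            try AKSettings.setSession(category: .playAndRecord,
                                      with: [.defaultToSpeaker, .allowBluetooth])
        } catch {
            print("Could not configure the audio session: \(error)")
        }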
  • Checks whether headphones are plugged in; returns true if headphones are plugged in, otherwise returns false

    Declaration

    Swift

    @objc static open var headPhonesPlugged: Bool
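
    A small usage sketch tying this flag to the defaultToSpeaker option above.

        import AudioKit

        // Route output to the device speaker only when no headphones are connected.
        AKSettings.defaultToSpeaker = !AKSettings.headPhonesPlugged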
  • Enum of available AVAudioSession Categories

    See more

    Declaration

    Swift

    @objc public enum SessionCategory: Int, CustomStringConvertible