Vlad Gorlov

Posted on • Edited on
Using SwiftUI and Metal in AudioUnit v3 Plug-In


Creating Plug-In scaffold

AudioUnit v3 plug-ins need to be implemented as an Application Extension, so we first need to create a host application.

Creating Host App

Creating Host App - Settings

Now we can add an AudioUnit extension to the host app.

Creating AU

Creating AU - Settings

Now we can run and debug our plug-in in an AUv3 host, for instance in JUCE AudioPluginHost.app or in GarageBand.app.

AU Build Schema

AU in GarageBand.app

Note ⚠️: If you are getting the error EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0), try enabling the Thread Sanitizer in the Run configuration.

AU Runtime Error

AU Enabling Tsan

While GarageBand.app is running, the plug-in temporarily remains registered in the system, so we can also check its presence with the auval tool.

```shell
$ auval -s aufx 2>/dev/null

    AU Validation Tool
    Version: 1.7.0
    Copyright 2003-2019, Apple Inc. All Rights Reserved.
    Specify -h (-help) for command options

aufx attr HOME  -  HOME: AttenuatorAU ⬅️
aufx bpas appl  -  Apple: AUBandpass
aufx dcmp appl  -  Apple: AUDynamicsProcessor
...
aufx tmpt appl  -  Apple: AUPitch
```

We can even validate the plug-in with the auval tool.

```shell
$ auval -v aufx attr HOME

    AU Validation Tool
    Version: 1.7.0
    Copyright 2003-2019, Apple Inc. All Rights Reserved.
    Specify -h (-help) for command options

--------------------------------------------------
VALIDATING AUDIO UNIT: 'aufx' - 'attr' - 'HOME'
--------------------------------------------------
Manufacturer String: HOME
AudioUnit Name: AttenuatorAU
Component Version: 1.6.0 (0x10600)
...
** PASS
--------------------------------------------------
AU VALIDATION SUCCEEDED.
--------------------------------------------------
```

Another way to check whether the plug-in is registered in the system is to use the pluginkit tool.

```shell
$ pluginkit -m
     com.apple.AppSSOKerberos.KerberosExtension(1.0)
     com.apple.diagnosticextensions.osx.timemachine(1.0)
!    abc.example.Attenuator.AttenuatorAU(1.0) ⬅️
+    com.apple.share.System.add-to-safari-reading-list(641.6)
+    com.apple.ncplugin.weather(1.0)
     com.apple.diagnosticextensions.osx.syslog(1.0)
     com.apple.RemoteManagement.PasscodeSettingsExtension(1.0)
     ...
```

Note ⚠️: Once we stop the debug session in GarageBand.app or in JUCE AudioPluginHost.app, the plug-in will be unregistered from the system.

```shell
$ auval -s aufx 2>/dev/null

    AU Validation Tool
    Version: 1.7.0
    Copyright 2003-2019, Apple Inc. All Rights Reserved.
    Specify -h (-help) for command options

aufx bpas appl  -  Apple: AUBandpass
aufx dcmp appl  -  Apple: AUDynamicsProcessor
...
aufx tmpt appl  -  Apple: AUPitch
```
```shell
$ pluginkit -m
     com.apple.AppSSOKerberos.KerberosExtension(1.0)
     com.apple.diagnosticextensions.osx.timemachine(1.0)
+    com.apple.share.System.add-to-safari-reading-list(641.6)
+    com.apple.ncplugin.weather(1.0)
     com.apple.diagnosticextensions.osx.syslog(1.0)
     com.apple.RemoteManagement.PasscodeSettingsExtension(1.0)
     ...
```

Here is how the plug-in works in AudioPluginHost from the JUCE SDK.

AU in Juce

I found the JUCE host better than GarageBand.app because it allows automating plug-in parameters, which is a significant benefit for testing.
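While the component is registered, it can also be instantiated programmatically by its component description, for example from the host app. Below is a minimal sketch, not part of the Xcode-generated scaffold: the four-char codes match the auval output above, `AVAudioUnit.instantiate` is a standard AVFoundation API, and the `fourCC` helper is purely illustrative.

```swift
import AVFoundation

// Illustrative helper to build a four-character OSType code.
func fourCC(_ string: String) -> OSType {
   return string.utf8.reduce(0) { ($0 << 8) | OSType($1) }
}

// Component codes as reported by `auval -s aufx`: type 'aufx', subtype 'attr', manufacturer 'HOME'.
let description = AudioComponentDescription(componentType: fourCC("aufx"),
                                            componentSubType: fourCC("attr"),
                                            componentManufacturer: fourCC("HOME"),
                                            componentFlags: 0,
                                            componentFlagsMask: 0)

// App extensions are loaded out of process.
AVAudioUnit.instantiate(with: description, options: .loadOutOfProcess) { avAudioUnit, error in
   guard let avAudioUnit = avAudioUnit else {
      print("Unable to instantiate AU: \(String(describing: error))")
      return
   }
   // `avAudioUnit.auAudioUnit` is our AttenuatorAudioUnit; it can be attached to an AVAudioEngine
   // or asked for its view controller via `requestViewController`.
   print("Instantiated: \(avAudioUnit.name)")
}
```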

The summary of this step is marked with the git tag 01-PlugIn-Scaffold.

Refactoring DSP and UI implementation

Xcode created a default implementation of the AudioUnit, the DSP processor, and helper classes. For our Attenuator plug-in we don't need the code related to MIDI event processing. We also want to use Swift as much as possible, and SwiftUI for the plug-in view.

After refactoring, the project structure will look like below.

AU Project after Refactoring

```objc
// AttenuatorAU-Bridging-Header.h
#import "AttenuatorDSP.h"
```
```objc
// AttenuatorDSP.h
#ifndef AttenuatorDSP_h
#define AttenuatorDSP_h

#import <AudioToolbox/AudioToolbox.h>

@interface AttenuatorDSP : NSObject

@property (nonatomic) float paramGain;
@property (nonatomic) bool isBypassed;
@property (nonatomic) uint numberOfChannels;

- (void)process:(AUAudioFrameCount)frameCount
    inBufferListPtr:(AudioBufferList *)inBufferListPtr
    outBufferListPtr:(AudioBufferList *)outBufferListPtr;

@end

#endif /* AttenuatorDSP_h */
```

The DSP does not do any work related to bus management. It just transforms input data into output data based on the current plug-in parameters.

```objc
// AttenuatorDSP.mm
#include "AttenuatorDSP.h"

@implementation AttenuatorDSP

- (instancetype)init {
   self = [super init];
   if (self) {
      self.paramGain = 1;
   }
   return self;
}

- (void)process:(AUAudioFrameCount)frameCount inBufferListPtr:(AudioBufferList *)inBufferListPtr outBufferListPtr:(AudioBufferList *)outBufferListPtr {
   for (int channel = 0; channel < _numberOfChannels; ++channel) {
      if (_isBypassed) {
         if (inBufferListPtr->mBuffers[channel].mData == outBufferListPtr->mBuffers[channel].mData) {
            continue;
         }
      }

      // Get pointer to immutable input buffer and mutable output buffer
      const float *inPtr = (float *)inBufferListPtr->mBuffers[channel].mData;
      float *outPtr = (float *)outBufferListPtr->mBuffers[channel].mData;

      // Perform per sample dsp on the incoming float `inPtr` before assigning it to `outPtr`
      for (int frameIndex = 0; frameIndex < frameCount; ++frameIndex) {
         if (_isBypassed) {
            outPtr[frameIndex] = inPtr[frameIndex];
         } else {
            outPtr[frameIndex] = _paramGain * inPtr[frameIndex];
         }
      }
   }
}

@end
```
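Because the DSP class is plain Objective-C exposed to Swift via the bridging header, it can be exercised in isolation. Here is a hypothetical check, assuming a target that can see AttenuatorDSP; buffer sizes and sample values are arbitrary.

```swift
import AVFoundation

// Deinterleaved float format, matching what the audio unit uses.
let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)!
let frames: AVAudioFrameCount = 4

let input = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)!
input.frameLength = frames
let output = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)!
output.frameLength = frames

// Fill both channels with a constant signal of 1.0.
if let channels = input.floatChannelData {
   for channel in 0 ..< Int(format.channelCount) {
      for frame in 0 ..< Int(frames) {
         channels[channel][frame] = 1.0
      }
   }
}

let dsp = AttenuatorDSP()
dsp.numberOfChannels = format.channelCount
dsp.paramGain = 0.5
dsp.process(frames, inBufferListPtr: input.mutableAudioBufferList, outBufferListPtr: output.mutableAudioBufferList)

// Every output sample is expected to be attenuated to 0.5.
print(output.floatChannelData?[0][0] ?? 0) // 0.5
```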
```swift
// AttenuatorParameter.swift
import Foundation
import AudioUnit

enum AttenuatorParameter: UInt64 {

   case gain = 1000

   static func fromRawValue(_ rawValue: UInt64) -> AttenuatorParameter {
      if let value = AttenuatorParameter(rawValue: rawValue) {
         return value
      }
      fatalError()
   }

   var parameterID: String {
      let prefix = "paramID:"
      switch self {
      case .gain: return prefix + "Gain"
      }
   }

   var name: String {
      switch self {
      case .gain: return "Gain"
      }
   }

   var min: AUValue {
      switch self {
      case .gain: return 0
      }
   }

   var max: AUValue {
      switch self {
      case .gain: return 1
      }
   }

   var defaultValue: AUValue {
      switch self {
      case .gain: return 1
      }
   }

   func stringFromValue(value: AUValue) -> String {
      switch self {
      case .gain: return "\(value)"
      }
   }
}
```

The AudioUnit subclass performs all the work related to bus management and buffer allocation.

```swift
// AttenuatorAudioUnit.swift
import AudioUnit
import AVFoundation

class AttenuatorAudioUnit: AUAudioUnit {

   public enum Error: Swift.Error {
      case statusError(OSStatus)
      case unableToInitialize(String)
   }

   private let maxNumberOfChannels: UInt32 = 8
   private let maxFramesToRender: UInt32 = 512

   private var _parameterTree: AUParameterTree!
   private(set) var parameterGain: AUParameter!

   private let dsp = AttenuatorDSP()

   private var inputBus: AUAudioUnitBus
   private var outputBus: AUAudioUnitBus
   private var outPCMBuffer: AVAudioPCMBuffer
   private var _inputBusses: AUAudioUnitBusArray!
   private var _outputBusses: AUAudioUnitBusArray!

   override init(componentDescription: AudioComponentDescription, options: AudioComponentInstantiationOptions) throws {
      guard let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2) else {
         throw Error.unableToInitialize(String(describing: AVAudioFormat.self))
      }
      inputBus = try AUAudioUnitBus(format: format)
      inputBus.maximumChannelCount = maxNumberOfChannels
      outputBus = try AUAudioUnitBus(format: format)
      outputBus.maximumChannelCount = maxNumberOfChannels

      guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: maxFramesToRender) else {
         throw Error.unableToInitialize(String(describing: AVAudioPCMBuffer.self))
      }
      pcmBuffer.frameLength = maxFramesToRender
      outPCMBuffer = pcmBuffer

      dsp.numberOfChannels = format.channelCount
      dsp.paramGain = AttenuatorParameter.gain.defaultValue

      try super.init(componentDescription: componentDescription, options: options)

      self.maximumFramesToRender = maxFramesToRender
      _parameterTree = setUpParametersTree()
      _inputBusses = AUAudioUnitBusArray(audioUnit: self, busType: AUAudioUnitBusType.input, busses: [inputBus])
      _outputBusses = AUAudioUnitBusArray(audioUnit: self, busType: AUAudioUnitBusType.output, busses: [outputBus])
   }

   override var parameterTree: AUParameterTree? {
      get {
         return _parameterTree
      } set {
         fatalError()
      }
   }

   override var shouldBypassEffect: Bool {
      get {
         return dsp.isBypassed
      } set {
         dsp.isBypassed = newValue
      }
   }

   public override var inputBusses: AUAudioUnitBusArray {
      return _inputBusses
   }

   public override var outputBusses: AUAudioUnitBusArray {
      return _outputBusses
   }

   override func allocateRenderResources() throws {
      // Should be equal as we created it with the same format.
      if outputBus.format.channelCount != inputBus.format.channelCount {
         setRenderResourcesAllocated(false)
         throw Error.statusError(kAudioUnitErr_FailedInitialization)
      }
      try super.allocateRenderResources()

      guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: inputBus.format, frameCapacity: maximumFramesToRender) else {
         throw Error.unableToInitialize(String(describing: AVAudioPCMBuffer.self))
      }
      pcmBuffer.frameLength = maxFramesToRender
      outPCMBuffer = pcmBuffer

      dsp.numberOfChannels = outputBus.format.channelCount
   }

   override var internalRenderBlock: AUInternalRenderBlock {
      return { [weak self] _, timestamp, frameCount, outputBusNumber, outputData, _, pullInputBlock in
         guard let this = self else {
            return kAudioUnitErr_NoConnection
         }
         if frameCount > this.maximumFramesToRender {
            return kAudioUnitErr_TooManyFramesToProcess
         }
         guard let pullInputBlock = pullInputBlock else {
            return kAudioUnitErr_NoConnection
         }

         var pullFlags: AudioUnitRenderActionFlags = []
         let inputData = this.outPCMBuffer.mutableAudioBufferList
         // Instead of `inputBusNumber` we can also pass `0`
         let status = pullInputBlock(&pullFlags, timestamp, frameCount, outputBusNumber, inputData)
         if status != noErr {
            return status
         }

         /*
          Important:
          If the caller passed non-null output pointers (outputData->mBuffers[x].mData), use those.

          If the caller passed null output buffer pointers, process in memory owned by the Audio Unit
          and modify the (outputData->mBuffers[x].mData) pointers to point to this owned memory.
          The Audio Unit is responsible for preserving the validity of this memory until the next call to render,
          or deallocateRenderResources is called.

          If your algorithm cannot process in-place, you will need to preallocate an output buffer
          and use it here.

          See the description of the canProcessInPlace property.
          */
         let inListPointer = UnsafeMutableAudioBufferListPointer(inputData)
         let outListPointer = UnsafeMutableAudioBufferListPointer(outputData)
         for indexOfBuffer in 0 ..< outListPointer.count {
            // Should be equal by default.
            outListPointer[indexOfBuffer].mNumberChannels = inListPointer[indexOfBuffer].mNumberChannels
            outListPointer[indexOfBuffer].mDataByteSize = inListPointer[indexOfBuffer].mDataByteSize
            if outListPointer[indexOfBuffer].mData == nil {
               outListPointer[indexOfBuffer].mData = inListPointer[indexOfBuffer].mData
            }
         }

         this.dsp.process(frameCount, inBufferListPtr: inputData, outBufferListPtr: outputData)

         return status
      }
   }

   // MARK: - Private

   private func setUpParametersTree() -> AUParameterTree {
      let pGain = AttenuatorParameter.gain
      parameterGain = AUParameterTree.createParameter(withIdentifier: pGain.parameterID,
                                                      name: pGain.name,
                                                      address: pGain.rawValue,
                                                      min: pGain.min, max: pGain.max,
                                                      unit: AudioUnitParameterUnit.linearGain,
                                                      unitName: nil, flags: [],
                                                      valueStrings: nil, dependentParameters: nil)
      parameterGain.value = pGain.defaultValue
      let tree = AUParameterTree.createTree(withChildren: [parameterGain])
      tree.implementorStringFromValueCallback = { param, value in
         guard let paramValue = value?.pointee else {
            return "-"
         }
         let param = AttenuatorParameter.fromRawValue(param.address)
         return param.stringFromValue(value: paramValue)
      }
      tree.implementorValueObserver = { [weak self] param, value in
         let param = AttenuatorParameter.fromRawValue(param.address)
         switch param {
         case .gain: self?.dsp.paramGain = value
         }
      }
      tree.implementorValueProvider = { [weak self] param in
         guard let s = self else { return AUValue() }
         let param = AttenuatorParameter.fromRawValue(param.address)
         switch param {
         case .gain: return s.dsp.paramGain
         }
      }
      return tree
   }
}
```

The view controller acts as a factory and as glue between the UI and the AudioUnit.

```swift
// AudioUnitViewController.swift
import CoreAudioKit

public class AudioUnitViewController: AUViewController, AUAudioUnitFactory {

   private lazy var auView = MainView()
   var audioUnit: AttenuatorAudioUnit?
   private var parameterObserverToken: AUParameterObserverToken?
   private var isConfigured = false

   public override func loadView() {
      view = auView
      preferredContentSize = NSSize(width: 200, height: 150)
   }

   public override var preferredMaximumSize: NSSize {
      return NSSize(width: 800, height: 600)
   }

   public override var preferredMinimumSize: NSSize {
      return NSSize(width: 200, height: 150)
   }

   public override func viewDidLoad() {
      super.viewDidLoad()
      setupViewIfNeeded()
   }

   public func createAudioUnit(with componentDescription: AudioComponentDescription) throws -> AUAudioUnit {
      let au = try AttenuatorAudioUnit(componentDescription: componentDescription, options: [])
      audioUnit = au
      DispatchQueue.main.async {
         self.setupViewIfNeeded()
      }
      return au
   }

   private func setupViewIfNeeded() {
      if !isConfigured, let au = audioUnit {
         isConfigured = true
         setupUI(au: au)
      }
   }

   private func setupUI(au: AttenuatorAudioUnit) {
      auView.setGain(au.parameterGain.value)
      parameterObserverToken = au.parameterTree?.token(byAddingParameterObserver: { address, value in
         DispatchQueue.main.async { [weak self] in
            let paramType = AttenuatorParameter.fromRawValue(address)
            switch paramType {
            case .gain: self?.auView.setGain(value)
            }
         }
      })
      auView.onDidChange = { [weak self] value in
         if let token = self?.parameterObserverToken {
            self?.audioUnit?.parameterGain?.setValue(value, originator: token)
         }
      }
   }
}
```
```swift
// MainView.swift
import Foundation
import SwiftUI

final class SliderData: ObservableObject {
   @Published var gain: Float = 100
}

class MainView: NSView {

   private let sliderData = SliderData()
   var onDidChange: ((Float) -> Void)?

   override init(frame frameRect: NSRect) {
      super.init(frame: frameRect)
      wantsLayer = true
      layer?.backgroundColor = NSColor.lightGray.cgColor
      let view = NSHostingView(rootView: MainUI { [weak self] in
         let value = $0 / 100
         print("MainView> Value to Host: \(value)")
         self?.onDidChange?(value)
      }.environmentObject(sliderData))
      view.translatesAutoresizingMaskIntoConstraints = false
      addSubview(view)
      leadingAnchor.constraint(equalTo: view.leadingAnchor).isActive = true
      trailingAnchor.constraint(equalTo: view.trailingAnchor).isActive = true
      topAnchor.constraint(equalTo: view.topAnchor).isActive = true
      bottomAnchor.constraint(equalTo: view.bottomAnchor).isActive = true
   }

   required dynamic init?(coder aDecoder: NSCoder) {
      fatalError()
   }

   func setGain(_ value: Float) {
      print("MainView> Value from Host: \(value)")
      sliderData.gain = 100 * value
   }
}
```

The view contains a Slider to control the value of the gain parameter.

```swift
// MainUI.swift
import Foundation
import Combine
import SwiftUI

struct MainUI: View {

   @EnvironmentObject var sliderData: SliderData
   @State var gain: Float = 100

   private var onChanged: (Float) -> Void

   init(onChanged: @escaping (Float) -> Void) {
      self.onChanged = onChanged
   }

   var body: some View {
      VStack {
         Slider(value: Binding<Float>(get: {
            self.gain
         }, set: {
            self.gain = $0
            self.onChanged($0)
         }), in: 0 ... 100, step: 2)
         Text("Gain: \(Int(gain))")
      }.onReceive(sliderData.$gain, perform: {
         self.gain = $0
      })
   }
}
```
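While iterating on the SwiftUI layer it can be handy to preview MainUI outside of the AU host. A possible preview, assuming the target supports SwiftUI previews (this PreviewProvider is not part of the original project):

```swift
// MainUI+Preview.swift — hypothetical preview for quick iteration on the SwiftUI view.
import SwiftUI

struct MainUI_Previews: PreviewProvider {
   static var previews: some View {
      MainUI { value in
         // The closure receives the raw slider value (0...100).
         print("Gain from slider: \(value)")
      }
      .environmentObject(SliderData())
      .frame(width: 200, height: 150)
   }
}
```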

Here is how the refactored plug-in looks in JUCE AudioPluginHost.app.

AU in Juce

The summary of this step is marked with the git tag 02-Refactored-PlugIn-Code.

Adding VU meter backed by Metal

Now we have a simple Attenuator plug-in. Let's add a VU meter which will show the level of the incoming signal.

First, on the DSP side, we need to calculate the maximum magnitude value.

```objc
// AttenuatorDSP.h
#ifndef AttenuatorDSP_h
#define AttenuatorDSP_h

#import <AudioToolbox/AudioToolbox.h>

@interface AttenuatorDSP : NSObject

@property (nonatomic) float paramGain;
@property (nonatomic) bool isBypassed;
@property (nonatomic) uint numberOfChannels;

// Used by VU meter on UI side 1️⃣.
@property (nonatomic) float maximumMagnitude;

- (void)process:(AUAudioFrameCount)frameCount inBufferListPtr:(AudioBufferList *)inBufferListPtr outBufferListPtr:(AudioBufferList *)outBufferListPtr;

@end

#endif /* AttenuatorDSP_h */
```
```objc
// AttenuatorDSP.mm
#include "AttenuatorDSP.h"

@implementation AttenuatorDSP

// ..

- (void)process:(AUAudioFrameCount)frameCount inBufferListPtr:(AudioBufferList *)inBufferListPtr outBufferListPtr:(AudioBufferList *)outBufferListPtr {
   _maximumMagnitude = 0;
   for (int channel = 0; channel < _numberOfChannels; ++channel) {
      // Get pointer to immutable input buffer and mutable output buffer
      const float *inPtr = (float *)inBufferListPtr->mBuffers[channel].mData;
      float *outPtr = (float *)outBufferListPtr->mBuffers[channel].mData;

      // Perform per sample dsp on the incoming float `inPtr` before assigning it to `outPtr`
      for (int frameIndex = 0; frameIndex < frameCount; ++frameIndex) {
         float value = inPtr[frameIndex];
         if (!_isBypassed) {
            value *= _paramGain;
         }
         outPtr[frameIndex] = value;
         _maximumMagnitude = fmax(_maximumMagnitude, value); // 2️⃣ Saving max magnitude.
      }
   }
}

@end
```

Then we need to create a Metal view which will render the VU level.

```swift
// VUView.swift
import Foundation
import MetalKit

class VUView: MTKView {

   public enum Error: Swift.Error {
      case unableToInitialize(Any.Type)
   }

   private(set) var viewportSize = vector_float2(100, 100)

   private var metalDevice: MTLDevice!
   private var library: MTLLibrary!
   private var commandQueue: MTLCommandQueue!
   private var pipelineState: MTLRenderPipelineState!

   private var colorData = vector_float4(0, 0, 1, 1)
   private var verticesData = [vector_float2]()
   private var level: Float = 0

   var onRender: (() -> Float)?

   init(thisIsNeededToMakeSwiftCompilerHapy: Bool = true) throws {
      let device = MTLCreateSystemDefaultDevice()
      super.init(frame: .zero, device: device)
      // Clear color. See: https://forums.developer.apple.com/thread/26461
      clearColor = MTLClearColorMake(0, 0, 0, 0)
      if let device = device {
         metalDevice = device
         colorPixelFormat = MTLPixelFormat.bgra8Unorm // Actually it is default value
         delegate = self
      } else {
         throw Error.unableToInitialize(MTLDevice.self)
      }
      guard let url = Bundle(for: type(of: self)).url(forResource: "default", withExtension: "metallib") else {
         throw Error.unableToInitialize(URL.self)
      }
      library = try metalDevice.makeLibrary(filepath: url.path)
      guard let commandQueue = metalDevice.makeCommandQueue() else {
         throw Error.unableToInitialize(MTLCommandQueue.self)
      }
      self.commandQueue = commandQueue
      guard let vertexProgram = library.makeFunction(name: "vertex_line") else {
         throw Error.unableToInitialize(MTLFunction.self)
      }
      guard let fragmentProgram = library.makeFunction(name: "fragment_line") else {
         throw Error.unableToInitialize(MTLFunction.self)
      }
      let pipelineStateDescriptor = MTLRenderPipelineDescriptor()
      pipelineStateDescriptor.vertexFunction = vertexProgram
      pipelineStateDescriptor.fragmentFunction = fragmentProgram
      // Alternatively can be set from drawable.texture.pixelFormat
      pipelineStateDescriptor.colorAttachments[0].pixelFormat = colorPixelFormat
      pipelineState = try metalDevice.makeRenderPipelineState(descriptor: pipelineStateDescriptor)
   }

   required init(coder: NSCoder) {
      fatalError()
   }
}

extension VUView: MTKViewDelegate {

   func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
      viewportSize.x = Float(size.width)
      viewportSize.y = Float(size.height)
   }

   func draw(in view: MTKView) {
      if inLiveResize {
         return
      }
      if let drawable = currentDrawable, let descriptor = currentRenderPassDescriptor {
         autoreleasepool {
            do {
               try render(drawable: drawable, renderPassDescriptor: descriptor)
            } catch {
               print(String(describing: error))
               assertionFailure(String(describing: error))
            }
         }
      }
   }
}

extension VUView {

   func render(drawable: CAMetalDrawable, renderPassDescriptor: MTLRenderPassDescriptor) throws {
      guard let commandBuffer = commandQueue.makeCommandBuffer() else {
         throw Error.unableToInitialize(MTLCommandBuffer.self)
      }
      // Transparent Metal background. See: https://forums.developer.apple.com/thread/26461
      renderPassDescriptor.colorAttachments[0].loadAction = .clear
      guard let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
         throw Error.unableToInitialize(MTLRenderCommandEncoder.self)
      }
      do {
         renderEncoder.setRenderPipelineState(pipelineState)
         let width = Double(viewportSize.x)
         let height = Double(viewportSize.y)
         let viewPort = MTLViewport(originX: 0, originY: 0, width: width, height: height, znear: 0, zfar: 1)
         renderEncoder.setViewport(viewPort)
         try prepareEncoder(encoder: renderEncoder)
         renderEncoder.endEncoding()
         commandBuffer.present(drawable)
         commandBuffer.commit()
      } catch {
         renderEncoder.endEncoding()
         throw error
      }
   }

   func prepareEncoder(encoder: MTLRenderCommandEncoder) throws {
      verticesData.removeAll(keepingCapacity: true)
      level = onRender?() ?? 0
      if level <= 0 {
         return
      }
      let x = max(Float(viewportSize.x * level), 1)
      let vertices = makeRectangle(xMin: 0, xMax: x, yMin: 0, yMax: viewportSize.y)
      verticesData += vertices
      encoder.setVertexBytes(&verticesData, length: verticesData.count * MemoryLayout<vector_float2>.stride, index: 0)
      encoder.setVertexBytes(&colorData, length: MemoryLayout<vector_float4>.stride, index: 1)
      encoder.setVertexBytes(&viewportSize, length: MemoryLayout<vector_float2>.stride, index: 2)
      encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: verticesData.count)
   }

   func makeRectangle(xMin: Float, xMax: Float, yMin: Float, yMax: Float) -> [vector_float2] {
      // Adding 2 triangles to represent rectangle.
      return [vector_float2(xMin, yMin), vector_float2(xMin, yMax), vector_float2(xMax, yMax),
              vector_float2(xMin, yMin), vector_float2(xMax, yMax), vector_float2(xMax, yMin)]
   }
}
```

And of course we need to create the Metal shaders.

```cpp
// VUView.metal
#include <metal_stdlib>
using namespace metal;

struct ColoredVertex {
   float4 position [[position]];
   float4 color;
};

vertex ColoredVertex vertex_line(uint vid [[vertex_id]],
                                 constant vector_float2 *positions [[buffer(0)]],
                                 constant vector_float4 *color [[buffer(1)]],
                                 constant vector_float2 *viewportSizePointer [[buffer(2)]]) {
   vector_float2 viewportSize = *viewportSizePointer;
   vector_float2 pixelSpacePosition = positions[vid].xy;
   ColoredVertex vert;
   vert.position = vector_float4(0.0, 0.0, 0.0, 1.0);
   vert.position.xy = (pixelSpacePosition / (viewportSize / 2.0)) - 1.0;
   vert.color = *color;
   return vert;
}

fragment float4 fragment_line(ColoredVertex vert [[stage_in]]) {
   return vert.color;
}
```
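The vertex function maps pixel-space coordinates into Metal's clip space, where both axes run from -1 to 1 and (-1, -1) is the bottom-left corner. A quick Swift illustration of the same formula, using an assumed 100×100 viewport:

```swift
import simd

// Same mapping as in `vertex_line`: pixel-space position -> normalized device coordinates.
func toClipSpace(_ position: SIMD2<Float>, viewport: SIMD2<Float>) -> SIMD2<Float> {
   return position / (viewport * 0.5) - SIMD2<Float>(repeating: 1)
}

let viewport = SIMD2<Float>(100, 100)
print(toClipSpace(SIMD2<Float>(0, 0), viewport: viewport))     // (-1.0, -1.0) — bottom-left corner
print(toClipSpace(SIMD2<Float>(50, 50), viewport: viewport))   // (0.0, 0.0)   — center
print(toClipSpace(SIMD2<Float>(100, 100), viewport: viewport)) // (1.0, 1.0)   — top-right corner
```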

The drawing model and the maximum magnitude are wired together in the view controller via a callback.

```swift
// AudioUnitViewController.swift
// ...
private func setupUI(au: AttenuatorAudioUnit) {
   auView.setGain(au.parameterGain.value)
   parameterObserverToken = au.parameterTree?.token(byAddingParameterObserver: { address, value in
      DispatchQueue.main.async { [weak self] in
         let paramType = AttenuatorParameter.fromRawValue(address)
         switch paramType {
         case .gain: self?.auView.setGain(value)
         }
      }
   })
   auView.onDidChange = { [weak self] value in
      if let token = self?.parameterObserverToken {
         self?.audioUnit?.parameterGain?.setValue(value, originator: token)
      }
   }
   // 1️⃣ Connecting UI and DSP.
   auView.onRender = { [weak self] in
      self?.audioUnit?.maximumMagnitude ?? 0
   }
}
```
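The excerpt above relies on two small pieces of glue that are not shown in the post: MainView forwards onRender to the embedded VUView, and the audio unit exposes the DSP's maximumMagnitude. A minimal sketch of how that wiring might look; the exact code lives in the repository, and the names below (the extensions, the vuView property) are assumptions:

```swift
// Hypothetical glue code — the exact implementation lives in the repository.

// In AttenuatorAudioUnit.swift (same file, so the private `dsp` property is accessible):
extension AttenuatorAudioUnit {
   var maximumMagnitude: AUValue {
      return dsp.maximumMagnitude
   }
}

// In MainView.swift, assuming a `vuView: VUView?` instance added to the view hierarchy:
extension MainView {
   var onRender: (() -> Float)? {
      get { return vuView?.onRender }
      set { vuView?.onRender = newValue }
   }
}
```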

Finally we have a plug-in with visual feedback which shows the volume level of the incoming signal.

AU with VU in Juce

The summary of this step is marked with the git tag 03-Created-VU-Meter.

Happy coding! 🙃

The sources of the plug-in can be found at GitHub.
