iOS: Audio Unit RemoteIO not working on iPhone
Published: 2022-04-27 · Source: 大佬教程 (code.js-code.com)

I am trying to build my own custom sound-effect audio unit driven by microphone input. The app routes audio from the microphone to the speaker simultaneously. Applying the effect works in the simulator, but when I test on an actual iPhone I hear no sound at all. I'll paste my code in case anyone can help me.

- (id) init {
    self = [super init];

    OSStatus status;

    // Describe audio component
    AudioComponentDescription desc;
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentFlags = 0;
    desc.componentFlagsMask = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);

    // Get audio unit (kInputBus/kOutputBus are #defined elsewhere as 1 and 0)
    status = AudioComponentInstanceNew(inputComponent, &audioUnit);
    checkStatus(status);

    // Enable IO for recording
    UInt32 flag = 1;
    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, kInputBus, &flag, sizeof(flag));
    checkStatus(status);

    // Enable IO for playback
    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &flag, sizeof(flag));
    checkStatus(status);

    // Describe format: 16-bit mono linear PCM at 44.1 kHz
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate       = 44100.00;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;
    audioFormat.mChannelsPerFrame = 1;
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 2;
    audioFormat.mBytesPerFrame    = 2;

    // Apply format to the input bus (output scope) and the output bus (input scope)
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, kInputBus, &audioFormat, sizeof(audioFormat));
    checkStatus(status);
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, kOutputBus, &audioFormat, sizeof(audioFormat));
    checkStatus(status);

    // Set input callback
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = recordingCallback;
    callbackStruct.inputProcRefCon = self;
    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_SetInputCallback, kAudioUnitScope_Global, kInputBus, &callbackStruct, sizeof(callbackStruct));
    checkStatus(status);

    // Set output callback
    callbackStruct.inputProc = playbackCallback;
    callbackStruct.inputProcRefCon = self;
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, kOutputBus, &callbackStruct, sizeof(callbackStruct));
    checkStatus(status);

    // Allocate our own buffer (1 channel, 16 bits per sample, thus 2 bytes per frame).
    // In practice the buffers used contain 512 frames; if this changes it is handled in processAudio.
    tempBuffer.mNumberChannels = 1;
    tempBuffer.mDataByteSize = 512 * 2;
    tempBuffer.mData = malloc(512 * 2);

    // Initialise
    status = AudioUnitInitialize(audioUnit);
    checkStatus(status);

    return self;
}

This callback should be invoked whenever new audio data is available from the microphone, but when I test on the iPhone it is never entered:

static OSStatus recordingCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData) {
    AudioBuffer buffer;

    buffer.mNumberChannels = 1;
    buffer.mDataByteSize = inNumberFrames * 2;
    buffer.mData = malloc(inNumberFrames * 2);

    // Put buffer in an AudioBufferList
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0] = buffer;

    // Obtain the recorded samples
    OSStatus status;
    status = AudioUnitRender([iosAudio audioUnit], ioActionFlags, inTimeStamp, inBusNumber, inNumberFrames, &bufferList);
    checkStatus(status);

    // Now the samples we just read are sitting in bufferList;
    // process the new data
    [iosAudio processAudio:&bufferList];

    // Release the malloc'ed data in the buffer we created earlier
    free(bufferList.mBuffers[0].mData);

    return noErr;
}

Solution

Solved my problem: I simply needed to initialize the AudioSession before playing/recording. I did that with the following code:

OSStatus status;

AudioSessionInitialize(NULL, NULL, NULL, self);
UInt32 sessionCategory = kAudioSessionCategory_PlayAndRecord;
status = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);

if (status != kAudioSessionNoError)
{
    if (status == kAudioServicesUnsupportedPropertyError) {
        NSLog(@"AudioSessionSetProperty failed: unsupportedPropertyError");
    } else if (status == kAudioServicesBadPropertySizeError) {
        NSLog(@"AudioSessionSetProperty failed: badPropertySizeError");
    } else if (status == kAudioServicesBadSpecifierSizeError) {
        NSLog(@"AudioSessionSetProperty failed: badSpecifierSizeError");
    } else if (status == kAudioServicesSystemSoundUnspecifiedError) {
        NSLog(@"AudioSessionSetProperty failed: systemSoundUnspecifiedError");
    } else if (status == kAudioServicesSystemSoundClientTimedOutError) {
        NSLog(@"AudioSessionSetProperty failed: systemSoundClientTimedOutError");
    } else {
        NSLog(@"AudioSessionSetProperty failed! %ld", (long)status);
    }
}

AudioSessionSetActive(true);
