Attention! For Japanese readers, my Japanese blog might help.

Jul 31, 2016

Recognizing my own Audio Unit App Extension from 3rd party host applications

NOTE: This is the English version of my Japanese blog.
Note that this article is an installation memo for my own Audio Unit App Extension. Because I figured things out through trial and error, there may be some mistakes in this article.

Details

I created an Audio Unit App Extension, which can be added from "Add Target..." in Xcode 7.3.
Despite copying the AU to ~/Library/Audio/Plug-Ins/Components/, it was recognized by neither 3rd party host applications nor auvaltool.

After some investigation, I found that, like a Today App Extension, you need to build an embedding application that contains the extension and launch that application. (That is to say, there is no need to copy the AU to ~/Library/Audio/Plug-Ins/Components/ as with conventional AUs.)

With regard to the solution, the following discussion was very helpful.

Audio Units v3 OS x: Instantiating custom audio... | Apple Developer Forums

After a successful installation, the following log message was shown in syslog:
Aug  1 10:23:41 MBP13R pkd[308] <Warning>: INSTALLED:com.shakeyama-mi.MyAUHost.Vibrato com.shakeyama-mi.MyAUHost.Vibrato(1.5) <__NSConcreteUUID 0x7fc9b2c19020> C3C2F430-0567-4651-800A-B8D0EE39A8E9 /Users/shakeyama/Library/Developer/Xcode/DerivedData/AU-auuapdgoorecwwhayzytocylavpn/Build/Products/Release/MyAUHost.app/Contents/PlugIns/Vibrato.appex

Of course, the message is also available in Console.app.
In my environment, the message was shown when the embedding application was launched from Finder, not from Xcode.
When launching from Xcode, the fact that Xcode attaches to the application's process may be the problem.
When launched from Xcode, the following message was shown in syslog.
<Notice>: <rdar://problem/11489077> A sandboxed application with pid 1277, "MyAUHost" checked in with appleeventsd, but its code signature could not be read and validated by appleeventsd, and so it cannot receive AppleEvents targeted by name, bundle id, or signature. Install the application in /Applications/ or some other world readable location to resolve this issue. Error=ERROR: #100013  { "NSDescription"="SecCodeCopySigningInformation() returned 100013, -." }  (handleMessage()/appleEventsD.cp #2098) com.apple.root.default-qos
But I don't know whether the message is related to the problem.

After the installation, my AU was successfully recognized by 3rd party host applications and auvaltool even after the embedding application was terminated.

However, after I launched and terminated the embedding app repeatedly, my AU stopped being recognized.
Syslog said:
Aug  1 11:21:27 MBP13R pkd[308] <Warning>: UNINSTALLED:com.shakeyama-mi.MyAUHost.Vibrato com.shakeyama-mi.MyAUHost.Vibrato(1.5) C0F51268-6DA2-41E2-9805-03D7119241ED /Users/shakeyama/Desktop/MyAUHost 0028-08-01 11-19-53/MyAUHost.app/Contents/PlugIns/Vibrato.appex

It seemed that every time the app was launched, my AU was first uninstalled and then installed again.
But at some point my AU was somehow never re-installed. I don't know the reason yet.

Another AU topic

It seems that the bundle identifier of the AU needs to begin with that of the embedding application.
If this condition is not satisfied, the AU cannot be built, failing with the following error:
error: Embedded binary's bundle identifier is not prefixed with the parent app's bundle identifier.

Embedded Binary Bundle Identifier: com.shakeyama-mi.AU.Vibrato
Parent App Bundle Identifier: com.shakeyama-mi.MyAUHost

By the way, when you newly add an AU app extension target to a project that already contains an application, the build settings of the embedding application are automatically edited to embed the extension as a plug-in.

However, when you newly add an embedding application target to a project that already contains an extension, you need to change the build settings manually.
To do so:
1: Select the added application target
2: Click the "+" button at the top of the "Build Phases" tab
3: Select "New Copy Files Phase"
4: Select "PlugIns" from "Destination"
5: Click the "+" button at the bottom of the phase
6: Select the existing AU app extension
If you add your extension to a "Copy Bundle Resources" phase instead, although the extension can be recognized from your embedding app, it probably cannot be recognized from 3rd party host applications or auvaltool.

Jul 30, 2016

How to handle non-interleaved audio data

I think it is quite difficult to handle non-interleaved audio data in an app.
After some trial and error, I compiled the points that seem important:
  1. How to set up AudioBufferList
  2. How to set up AudioStreamBasicDescription(ASBD)
  3. How to convert frame <-> bytes

1. How to set up AudioBufferList

In the case of non-interleaved data, you need to allocate additional space for the extra AudioBuffers.
For example, the following code may work:
    AudioBufferList *list = calloc(sizeof(AudioBufferList) + sizeof(AudioBuffer) * (numCh - 1), 1);

Reference: Core Audio その1 AudioBufferとAudioBufferList | Objective-Audio
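
For example, here is a minimal sketch that allocates and fills a non-interleaved AudioBufferList (CreateNonInterleavedABL is a hypothetical helper name, and Float32 samples are assumed):

    #include <AudioToolbox/AudioToolbox.h>
    #include <stdlib.h>

    // A sketch: allocate an AudioBufferList with one AudioBuffer per channel,
    // then attach a separate mono data buffer to each of them.
    AudioBufferList *CreateNonInterleavedABL(UInt32 numCh, UInt32 numFrames) {
        AudioBufferList *list = calloc(sizeof(AudioBufferList) +
                                       sizeof(AudioBuffer) * (numCh - 1), 1);
        list->mNumberBuffers = numCh;
        for (UInt32 i = 0; i < numCh; i++) {
            list->mBuffers[i].mNumberChannels = 1;  // each buffer is mono
            list->mBuffers[i].mDataByteSize = numFrames * sizeof(Float32);
            list->mBuffers[i].mData = calloc(numFrames, sizeof(Float32));
        }
        return list;
    }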

2. How to set up AudioStreamBasicDescription(ASBD)

In the case of non-interleaved data, you should set ASBD.mBytesPerFrame to the size of ONE channel.
This is described in CoreAudioTypes.h as follows:

    However, when an ASBD has the kAudioFormatFlagIsNonInterleaved flag, the
    AudioBufferList has a different structure and semantic. In this case, the ASBD
    fields will describe the format of ONE of the AudioBuffers that are contained in
    the list, AND each AudioBuffer in the list is determined to have a single (mono)
    channel of audio data. Then, the ASBD's mChannelsPerFrame will indicate the
    total number of AudioBuffers that are contained within the AudioBufferList -
    where each buffer contains one channel. This is used primarily with the
    AudioUnit (and AudioConverter) representation of this list - and won't be found
    in the AudioHardware usage of this structure.
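
For example, a minimal sketch of a 2ch, 32-bit float, non-interleaved ASBD (the concrete values are illustrative assumptions):

    AudioStreamBasicDescription desc = {0};
    desc.mSampleRate       = 44100.0;
    desc.mFormatID         = kAudioFormatLinearPCM;
    desc.mFormatFlags      = kAudioFormatFlagIsFloat |
                             kAudioFormatFlagIsPacked |
                             kAudioFormatFlagIsNonInterleaved;
    desc.mBytesPerPacket   = sizeof(Float32);  // ONE channel only
    desc.mFramesPerPacket  = 1;
    desc.mBytesPerFrame    = sizeof(Float32);  // ONE channel only
    desc.mChannelsPerFrame = 2;                // = the number of AudioBuffers
    desc.mBitsPerChannel   = 32;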
3. How to convert frame <-> bytes

As a result, in the case of non-interleaved data, the formula

mBytesPerFrame == (mChannelsPerFrame * mBitsPerChannel / 8)

is NOT satisfied.

So, for code that uses mBytesPerFrame, you need a manual conversion such as:
bytesPerFrame = mChannelsPerFrame * mBitsPerChannel / 8

I use code like the following:

+ (UInt32)BPFofASBD:(AudioStreamBasicDescription)desc {
    if ((desc.mFormatFlags & kAudioFormatFlagIsNonInterleaved) == kAudioFormatFlagIsNonInterleaved) {
        // For non-interleaved formats, mBytesPerFrame describes only ONE
        // channel, so compute the total size from the channel count.
        return desc.mChannelsPerFrame * desc.mBitsPerChannel / 8;
    }
    // For interleaved formats, mBytesPerFrame already covers all channels.
    return desc.mBytesPerFrame;
}
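
For example, computing a buffer size for a given number of frames (SomeClass and numFrames are hypothetical names):

    // Total bytes for numFrames frames across all channels, whether the
    // format is interleaved or not.
    UInt32 totalBytes = [SomeClass BPFofASBD:desc] * numFrames;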

Try it!

Jul 22, 2016

AUSampler does not appear on AU Lab 2.3

Somehow, on AU Lab 2.3, AUSampler cannot be selected from the pop-up button in the sheet that is displayed when the "Edit" - "Add Audio Unit Instrument" menu is selected.

Although I confirmed that AU Lab 2.3 could load a .trak file that contained AUSampler and was created on 2.2.2, I decided to keep using 2.2.2 after all.

Jul 21, 2016

AudioComponentInstanceNew() returns kAudioUnitErr_CannotDoInCurrentContext for v3 AudioUnits.

NOTE: This is the English version of my Japanese article.

I was looking for the reason that AudioComponentInstanceNew() returned -10863 (kAudioUnitErr_CannotDoInCurrentContext) for an AudioUnit that was created using the target template in Xcode 7.3.

The AudioUnit also failed validation with auvaltool -v.

$ auvaltool -v aufx vibr Symi

    AU Validation Tool
    Version: 1.6.1a1 
    Copyright 2003-2013, Apple Inc. All Rights Reserved.
    Specify -h (-help) for command options

--------------------------------------------------
VALIDATING AUDIO UNIT: 'aufx' - 'vibr' - 'Symi'
--------------------------------------------------
Manufacturer String: Shakeyama
AudioUnit Name: VibratoUnit
Component Version: 1.6.0 (0x10600)

* * PASS
--------------------------------------------------
TESTING OPEN TIMES:
COLD:
Time to open AudioUnit:         333.540 ms
WARM:
Time to open AudioUnit:         101.261  ms
This AudioUnit is a version 3 implementation.
FIRST TIME:
FATAL ERROR: Initialize: result: -50

From the message above, it turned out that the AudioUnit was a version 3 implementation.
A version 3 AudioUnit can be instantiated using AVAudioUnit.instantiateWithComponentDescription() in Swift, for example.
Reference: Shared/SimplePlayEngine.swift

After further investigation, I confirmed that the AU can also be instantiated using +[AUAudioUnit instantiateWithComponentDescription:options:completionHandler:] in Obj-C, or AudioComponentInstantiate() in C.

Note that these initializations are performed asynchronously.
So, you will need to modify your host code so that post-initialization work is done inside the completion block.
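
For example, a minimal sketch using AudioComponentInstantiate() (the component description values follow the auval example above; adjust them for your own unit):

    #include <AudioToolbox/AudioToolbox.h>

    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Effect,  // 'aufx'
        .componentSubType      = 'vibr',
        .componentManufacturer = 'Symi',
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);

    // Unlike AudioComponentInstanceNew(), the completion handler is called
    // asynchronously; do your post-initialization work inside the block.
    AudioComponentInstantiate(comp,
                              kAudioComponentInstantiation_LoadOutOfProcess,
                              ^(AudioComponentInstance instance, OSStatus err) {
        if (err != noErr || instance == NULL) return;
        AudioUnitInitialize(instance);
    });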

By the way, if you want to make a version 2 AudioUnit, Audio Unit Examples (AudioUnit Effect, Generator, Instrument, MIDI Processor and Offline) may help.
I think this way will be easier...

Get display names without using CGDisplayIOServicePort()

NOTE: This article is the English version of my Japanese article.

Because CGDisplayIOServicePort() has been deprecated in 10.9, I was looking for alternatives to get display names.

There was the same question in Stack Overflow:
objective c - CGDisplayIOServicePort is deprecated in OS X >= 10.9, how to replace? - Stack Overflow

According to a comment on the question above, there seemed to be another implementation in a repository on GitHub.

To see how the functionality is implemented, it may be a good idea to look at the diff:
Replace CGDisplayIOServicePort with a workaround implementation · 8101d7a · glfw/glfw

Concretely, a function IOServicePortFromCGDisplayID() was added and used in getDisplayName() instead of CGDisplayIOServicePort().

You can use the functionality by copying IOServicePortFromCGDisplayID() and getDisplayName() from the newest source code into your own code.
(You will also need to add IOKit.framework and include IOGraphicsLib.h.)

But you may need to customize the code for your environment, for example to convert the name to an NSString or to specify a locale.
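
For example, a minimal sketch of the name-reading part (this assumes you already have the io_service_t returned by the copied IOServicePortFromCGDisplayID()):

    #import <Cocoa/Cocoa.h>
    #import <IOKit/graphics/IOGraphicsLib.h>

    // A sketch: read a localized product name of a display service.
    static NSString *DisplayNameForService(io_service_t service) {
        NSDictionary *info = CFBridgingRelease(
            IODisplayCreateInfoDictionary(service, kIODisplayOnlyPreferredName));
        // kDisplayProductName maps locale identifiers to localized names;
        // here we simply take an arbitrary one.
        NSDictionary *names = info[@kDisplayProductName];
        return names.allValues.firstObject;
    }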

Jul 17, 2016

I can't open .webarchive files made by my own app.

As a temporary solution, you can open the files by setting "Allow apps downloaded from" to "Anywhere" in "Security & Privacy" in System Preferences.

That is to say, the files seem to be treated like apps that were not code-signed.
I don't know any other way to open the files.

By the way, when I tried to open the files by double-clicking, the following alert was displayed:
“hogehoge.webarchive” is damaged and can’t be opened. You should move it to the Trash.
This file was created by “fugafuga” today at 11:21.

Although I could not open the files by double-clicking,
I could open them by choosing the file from the Open menu in Safari, without changing System Preferences.

For your information, I found the following error logged in Console.app:
2015/07/27 11:20:16.244 CoreServicesUIAgent[15194]: Error SecAssessmentCreate: The operation couldn't be completed. (OSStatus error -67026)
Here, -67026 turned out to be the error named errSecCSFileHardQuarantined, according to OSStatus.com:
http://www.osstatus.com/search/results?platform=all&framework=all&search=-67026

Moreover, files written by Safari can be opened by double-clicking.
So, I think a file can probably be opened by the app that created it.



Write contents in a WebView as a .webarchive file.

An easy way is to convert the contents to NSData with [[[[webView mainFrame] DOMDocument] webArchive] data] and write it out using methods such as the writeTo* methods.

For example,

    NSData *data = [[[[self.webView mainFrame] DOMDocument] webArchive] data];
    BOOL success = [data writeToFile:path atomically:YES];

There are other approaches.
Reference: How do you save to WebArchive webView editable content?
Reference: webArchive | raizan2ame

If you can't read .webarchive files written by your own app, my article "I can't open .webarchive files made by my own app." may help.

Disable drag & drop onto a WebView

You should implement webView:dragDestinationActionMaskForDraggingInfo: on the class that is set as the UIDelegate (WebUIDelegate) of the WebView.
To disable drag & drop, you should return WebDragDestinationActionNone from that method.
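
For example, a minimal sketch on the delegate class:

    // Refuse every drag destination action so that nothing can be
    // dropped onto the WebView.
    - (NSUInteger)webView:(WebView *)webView
        dragDestinationActionMaskForDraggingInfo:(id<NSDraggingInfo>)draggingInfo {
        return WebDragDestinationActionNone;
    }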

Reference: WebUIDelegate Protocol Reference

Disable executing JavaScript of a WebView

To disable JavaScript, set the javaScriptEnabled property of WebPreferences, which is available through the WebView's preferences property, to NO.

For example,

self.webView.preferences.javaScriptEnabled = NO;

Reference: WebPreferences Class Reference - developer.apple.com

Also, if you edit the WebView in a .xib file, you can disable JavaScript by clearing the checkbox named "Enable JavaScript".
When you create a WebView in source code, the property will be YES by default.

Jul 15, 2016

Reading MP3 files using ExtAudioFile functions

This article is the English version of my Japanese article.

ExtAudioFile functions provide an easier way to convert non-LPCM data into LPCM than AudioFile functions do.

The following code converts MP3 or M4A files into LPCM data (big-endian).

#include <CoreServices/CoreServices.h>
#include <AudioToolbox/AudioToolbox.h>
#include <dispatch/dispatch.h>

#define _ERR_RETURN(err)  {if(noErr != err){printf("%d - err:%d\n", __LINE__, err); return err;}}

typedef AudioStreamBasicDescription ASBD;
typedef AudioBufferList ABL;

OSStatus DecodeExtFileAtPath(const char *inputFilePath, const char *outputFilePath);
void SetStandardDescription(AudioStreamBasicDescription *descPtr);
static inline ABL MakeABL(UInt32 ch, UInt32 bytes, void *buf);

int main(int argc, char* argv[]) {
    if (argc != 3) {
        printf("usage: ./Mp3Decoder inFile outFile\n");
        return 0;
    }
    
    dispatch_group_t group = dispatch_group_create();
    
    dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            OSStatus err = DecodeExtFileAtPath(argv[1], argv[2]);
            printf("done. err:%d\n", err);
    });
    
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    
    return 0;
}

/*
Reads the audio file specified by inputFilePath and writes it to outputFilePath as CDDA-quality binary data (big-endian).
Although this function is designed for non-LPCM files such as MP3 and AAC, .wav and .aiff files are also acceptable.
On failure, the OSStatus of the failed function is returned.
 */
OSStatus DecodeExtFileAtPath(const char *inputFilePath, const char *outputFilePath) {
    CFURLRef url = CFURLCreateWithBytes(NULL,
                                        (const UInt8 *)inputFilePath,
                                        strlen(inputFilePath),
                                        kCFStringEncodingUTF8,
                                        NULL);
    
    ExtAudioFileRef file;
    OSStatus err = ExtAudioFileOpenURL(url,
                                       &file);
    _ERR_RETURN(err);
    
    CFRelease(url);
    
    ASBD clientDesc;
    SetStandardDescription(&clientDesc);
    UInt32 size = sizeof(clientDesc);
    
    err = ExtAudioFileSetProperty(file,
                                  kExtAudioFileProperty_ClientDataFormat,
                                  size,
                                  &clientDesc);
    _ERR_RETURN(err);
    
    SInt64 fileFrameLength;
    size = sizeof(fileFrameLength);
    err = ExtAudioFileGetProperty(file,
                                  kExtAudioFileProperty_FileLengthFrames,
                                  &size,
                                  &fileFrameLength);
    _ERR_RETURN(err);
    
    const UInt32 numFramesToReadInACycle = 1024*1024;
    const UInt32 bufferSize = clientDesc.mBytesPerFrame * numFramesToReadInACycle;
    void *buffer = malloc(bufferSize);
    FILE *fp = fopen(outputFilePath, "w");
    SInt64 frameOffset = 0;
    while (frameOffset != fileFrameLength) {
        UInt32 numFramesToRead = numFramesToReadInACycle;
        AudioBufferList list = MakeABL(clientDesc.mChannelsPerFrame,
                                       bufferSize,
                                       buffer);
        err = ExtAudioFileRead(file,
                               &numFramesToRead,
                               &list);
        _ERR_RETURN(err);  
         // 0 if end-of-file
        if (numFramesToRead == 0) {
            break;
        }
        fwrite(list.mBuffers[0].mData,
               list.mBuffers[0].mDataByteSize,
               1,
               fp);
        
        frameOffset += numFramesToRead;
    }
    fclose(fp);
    free(buffer);
    
    err = ExtAudioFileDispose(file);
    return err;
}

/*
 Sets a CDDA-quality ASBD to descPtr.
 Note that this function sets a big-endian format, even on little-endian processors.
 */
void SetStandardDescription(AudioStreamBasicDescription *descPtr) {
    descPtr->mSampleRate = 44100.0;
    descPtr->mFormatID = kAudioFormatLinearPCM;
    descPtr->mFormatFlags = kAudioFormatFlagIsBigEndian |
                            kAudioFormatFlagIsSignedInteger |
                            kAudioFormatFlagIsPacked;
    descPtr->mBytesPerPacket = 4;
    descPtr->mBytesPerFrame = 4;
    descPtr->mFramesPerPacket = 1;
    descPtr->mChannelsPerFrame = 2;
    descPtr->mBitsPerChannel = 16;
}

static inline ABL MakeABL(UInt32 ch, UInt32 bytes, void *buf) {
    ABL list;
    list.mNumberBuffers = 1;
    list.mBuffers[0].mNumberChannels = ch;
    list.mBuffers[0].mDataByteSize = bytes;
    list.mBuffers[0].mData = buf;
    return list;
}

Compared to AudioFile functions, ExtAudioFile functions are more comfortable because no AudioConverter is necessary.
(Annoying frame conversion calculations and the data input callback are NOT needed!)

Moreover, the total number of frames is available via the kExtAudioFileProperty_FileLengthFrames property.
This property helps us roughly estimate the total byte size of the LPCM data before conversion.

However, note that there are cases where the kExtAudioFileProperty_FileLengthFrames property does not return correct values.
So, you should check the number of frames returned by ExtAudioFileRead() in the loop, because the function returns 0 frames when it reaches EOF.

Oh, by the way, I have not tried VBR files yet, but they will probably also be OK.