No AFDXDetector delegate callbacks on OSX



Installed 3.2 via CocoaPods for use inside an OSX (not iOS) app. I’m not getting any delegate callbacks to the AFDXDetectorDelegate method:
- (void)detector:(AFDXDetector *)detector hasResults:(NSMutableDictionary *)faces forImage:(UIImage *)image atTime:(NSTimeInterval)time;

The callback skips straight to detectorDidFinishProcessing:. I’m running this on the sample video provided in the Affectiva iOS app. The same code and video work just fine in an iOS app I’ve built. It appears to be an issue with the OSX build.

Any advice appreciated.





Are you sure you’re using the correct delegate method? The one you quoted is only available on iOS (notice the UIImage parameter).

The correct one for macOS is:
- (void)detector:(AFDXDetector *)detector hasResults:(NSMutableDictionary *)faces forImage:(NSImage *)image atTime:(NSTimeInterval)time;
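For reference, a minimal implementation of that macOS callback might look like the sketch below. The nil-check reflects the SDK convention that unprocessed frames arrive with a nil faces dictionary, and the `expressions.smile` property name is an assumption based on the SDK’s iOS headers:

```objc
- (void)detector:(AFDXDetector *)detector
      hasResults:(NSMutableDictionary *)faces
        forImage:(NSImage *)image
          atTime:(NSTimeInterval)time {
    if (faces == nil) {
        // Unprocessed frame; no results to read yet.
        return;
    }
    for (AFDXFace *face in [faces allValues]) {
        // Log one metric per detected face (property name assumed).
        NSLog(@"t=%.2f smile=%f", time, face.expressions.smile);
    }
}
```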


Yes, I meant to say the NSImage method. With UIImage, it won’t even compile for Cocoa.


Can you share your code for creating and initializing the detector?

self.detector = [[AFDXDetector alloc] initWithDelegate:self usingFile:_filePath maximumFaces:1 faceMode:LARGE_FACES];
[_detector enableAnalytics];
_detector.maxProcessRate = 15;

[_detector setDetectAllEmotions:YES];
[_detector setDetectAllExpressions:YES];

NSError *error = [_detector start];
if (error) {
    NSLog(@"Error: %@", [error localizedDescription]);
}

This looks good; I’m not sure what might be wrong…
Does it happen with a specific video?
Can you share a minimal project that reproduces the issue so I can debug it myself and see what happens?


I’m using the video of the Asian girl from the AffdexMe app. It obviously works on iOS and the simulator, etc… You can find a minimal OSX version that demonstrates the issue at the following link.


The problem was that you were trying to load a file from outside your app’s sandbox. You can read more about App Sandbox here:


So to load a file you need to either have it in your app bundle and then get its URL using
NSURL *file = [[NSBundle mainBundle] URLForResource:@"face1" withExtension:@"m4v"];
and use this URL to initialize your detector.

Or use NSOpenPanel to let the user choose a file and then use its URL to initialize your detector:

    NSOpenPanel *openDlg = [NSOpenPanel openPanel];

    [openDlg setCanChooseFiles:YES];
    [openDlg setAllowsMultipleSelection:NO];
    [openDlg setAllowedFileTypes:@[@"mp4", @"mpeg", @"mov", @"qt", @"m4v"]];
    [openDlg setCanChooseDirectories:NO];

    [openDlg beginWithCompletionHandler:^(NSModalResponse result) {
        if (result == NSModalResponseOK) {
            NSURL *file = openDlg.URL;
            dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
                // initialize the detector using the obtained URL
            });
        }
    }];

Cool, thanks. I’ll give it a try!