Audio fingerprinting on mobile with react-native (long tutorial)

18/04/2020    


Audio fingerprinting is fun. Back in 2018, while working on one of my side projects (Nadyy), I stumbled upon a nice library for audio fingerprinting (dejavu). I liked how fairly easy it was to understand for such a not-so-easy task, and the accuracy was good enough for me. So I decided to start building on top of dejavu.

Ultimately, my objective was to build a mobile app that detects and gives information about user audio recordings, much like what the Shazam app does. The tricky part is that audio fingerprinting consumes resources, so fingerprinting user-submitted queries (their audio recordings) on the server side is perhaps not a wise decision. Moreover, modern mobile phones are now rocket fast at audio analysis. The dejavu audio fingerprinting algorithm was written in Python, so I had to port it to whatever language/technology works on both Android and iOS. I went ahead and wrote the port in C++ (open sourced).

The rest of this post is a long tutorial that walks through the steps necessary (tested on Ubuntu Linux) to include this audio fingerprinting C++ function in a react-native project. I've also open sourced a sample project with all of those steps already done, along with a suggested method for recording raw audio.

Disclaimer: I prepared this tutorial for someone who asked for it. Eventually, I decided to publish it publicly instead of letting it sit buried on my PC.

Part One: Android

  1. In a new folder, create a react-native project. Let’s call it ‘mobileAfp’. You can use the following command: npx react-native init mobileAfp
  2. Make sure the app in its default configuration runs with no problems; sometimes react-native has its own bugs. Also make sure you have the record audio and internet permissions in your Android manifest file.
  3. OpenCV is a necessary dependency. Download the OpenCV SDK for Android. Make sure you download version 3.4.6 to guarantee compatibility; you can experiment with different versions afterwards.
  4. Make sure you have the Android Native Development Kit (NDK); C++ code won’t compile without it. Again, to guarantee compatibility the required version is 17. You can download it from Android Studio or manually. Check the following for extra tutorials: here and here. To add it manually, you basically place the files inside the folder at /Android/Sdk/ndk-bundle (on Linux).
  5. C++ can only be called from within Java (or Kotlin). To get to this point, one should first manage to call Java code from JS (react-native). Facebook has provided a tutorial for this. I will quickly run over it here again:
    • Create a Java class (we will call it RecAndFp) inside the /src/main/java/com/mobileafp/ folder. The implementation should follow react-native’s standards (please check the sample code or the Facebook tutorial for more info). We will have three main functions that we should be able to call from JS:
      public void start_recording()
      public void fp(Promise promise)
      public void stop_recording()
    • Create a package file for RecAndFp.java (we will call it RecAndFpPackage). This file should implement the createNativeModules method and register our first Java class.
    • Lastly, import the package into MainApplication.java, and register it inside the getPackages() method.
  6. Make sure you can now call your Java code from JS. You could, for example, print messages inside the start_recording() function and test it out. You can call it like so:
    import {NativeModules} from "react-native";
    NativeModules.RecAndFp.start_recording();
    NativeModules.RecAndFp.stop_recording();
    const result_string = await NativeModules.RecAndFp.fp();
  7. Now the C++ part. Google provides a tutorial on including basic C++ code in Android. I will quickly run over it here again:
    • Create a folder with the name cpp inside src/main/
    • Create a new .cpp file inside that folder; it will contain the C++ code. Let’s call it fingerprint.cpp, so it will live at src/main/cpp/fingerprint.cpp.
    • Create a CMakeLists.txt file inside the /cpp folder. If you follow Google’s guide, Android Studio can create it for you; however, I’m providing one in the sample code.
    • Link gradle to CMake: edit the file android/app/build.gradle and include specific instructions inside the correct blocks (the file is provided in the sample code; adjust paths accordingly if you run into problems).
    • Calls to C++ from Java should follow the Java Native Interface (JNI) specification. Basically, we will have the following method as the communication medium from Java to C++:
      public native String passingDataToJni(float[] audio_data, int array_length);
      This function should be defined in c++ like the following:
      extern "C" JNIEXPORT jstring JNICALL
      Java_com_mobileafp_RecAndFp_passingDataToJni(JNIEnv *env, jobject instance, jfloatArray floatarray, jint intValue) {}
    • Finally, the RecAndFp.java file should be edited to load the C++ library:
      static {
          System.loadLibrary("fingerprint");
      }
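Before wiring in the real fingerprint function, it is worth pausing on what passingDataToJni actually receives. Android's AudioRecord typically delivers signed 16-bit PCM samples, while the native method takes a float array, so a conversion step is needed on the Java side. Here is a minimal sketch (the class name and the [-1, 1) normalization are my assumptions, not something dictated by the fingerprint code; check the sample code for the exact convention used there):

```java
public class PcmToFloat {
    // Convert signed 16-bit PCM samples (as read from AudioRecord)
    // into normalized floats, the shape passingDataToJni(float[], int) expects.
    public static float[] toFloatArray(short[] pcm) {
        float[] out = new float[pcm.length];
        for (int i = 0; i < pcm.length; i++) {
            // 32768 = 2^15, the magnitude of the most negative 16-bit sample
            out[i] = pcm[i] / 32768.0f;
        }
        return out;
    }

    public static void main(String[] args) {
        short[] samples = {0, 16384, -32768, 32767};
        for (float f : toFloatArray(samples)) {
            System.out.println(f);
        }
    }
}
```

The resulting array and its length are then exactly the two arguments you pass across JNI.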
  8. At this point, what remains is to include OpenCV and the C++ fingerprint function. But before that, make sure you can call C++ successfully from Java. In addition, if you check the C++ function, you will find that the boost library is a dependency used to produce the SHA-1 hashes. Including the boost library on Android is a bit complicated, and there is no reason to spend too much time on it; you could do the hashing using Java, JS, or even server side. So I’ve changed the C++ function in the provided sample code a bit to drop the boost dependency.
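If you do the hashing on the Java side as suggested, no extra dependency is needed: both the JVM and Android ship SHA-1 in java.security. A minimal sketch (the class name and the hex formatting are mine, for illustration; the input would be whatever string the fingerprint step produces):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Sha1Hash {
    // SHA-1 hex digest of an input string, replacing what boost produced in C++.
    public static String sha1Hex(String input) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            // SHA-1 is a mandatory algorithm on both the JVM and Android
            throw new AssertionError(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(sha1Hex("hello")); // prints "aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d"
    }
}
```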
  9. Now to include OpenCV. This stackoverflow answer sums up all the steps. I will run over it here again, quoting “” from the same answer:
    • Open the project in Android Studio, then “Create a new module by selecting File>New Module. Select "Android Library", and then enter the details:  Library name: OpenCV, Module name: opencv Package name: org.opencv”. You should now have an opencv folder at android/opencv.
    • “Copy the contents of path_to_opencv_sdk/sdk/java/src directory into path_to_your_project/opencv/src/main/java”.
    • “Under main, create the following directory path: aidl/org/opencv/engine and move /java/org/opencv/engine/OpenCVEngineInterface.aidl into it”.
    • “Copy the contents of path_to_opencv_sdk/sdk/java/res into path_to_your_project/opencv/src/main/res.”
    • “Create sdk folder inside path_to_your_project/opencv/src/ and copy path_to_opencv_sdk/sdk/native folder into it.”
    • Create a CMakeLists.txt file in the opencv/ folder. Add the following:
      cmake_minimum_required(VERSION 3.4.1)
      set(OpenCV_DIR "src/sdk/native/jni")
      find_package(OpenCV REQUIRED)
      message(STATUS "OpenCV libraries: ${OpenCV_LIBS}")
      include_directories(${OpenCV_INCLUDE_DIRS})
    • Edit the opencv/build.gradle file. Add instructions to the specific blocks (file provided in the sample code). Also, in the same build file, make sure your target and min SDK versions match your project’s:
      minSdkVersion rootProject.ext.minSdkVersion
      targetSdkVersion rootProject.ext.targetSdkVersion
    • Edit the android/cpp/CMakeLists.txt file to include OpenCV (adjust the relative paths if your CMakeLists.txt lives elsewhere, e.g. under src/main/cpp/):
      set(OpenCV_DIR "../opencv/src/sdk/native/jni")
      find_package(OpenCV REQUIRED)
      message(STATUS "OpenCV libraries: ${OpenCV_LIBS}")
      target_link_libraries(YOUR_TARGET_LIB ${OpenCV_LIBS})
    • Include OpenCV in your app’s build.gradle file, under the dependencies block:
      implementation project(':opencv') 
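For orientation on the “link gradle to CMake” step above, the relevant piece of android/app/build.gradle usually boils down to a block like the following (the path here is an assumption based on where we created CMakeLists.txt; the sample code has the authoritative file):

```groovy
android {
    // ... existing android config ...

    // Tell gradle where the native build script lives
    externalNativeBuild {
        cmake {
            path "src/main/cpp/CMakeLists.txt"
        }
    }
}
```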

Part Two: iOS

  1. You can continue working inside the ios folder of the react-native project created in part one (it can be recreated if necessary).
  2. Make sure the app in its default configuration runs with no problems. You will probably need to run pod install inside the ios folder, and then open the file mobileAfp.xcworkspace with Xcode to start working. In addition, I strongly recommend that Xcode and iOS be updated to the latest versions; at the time of preparing this tutorial, the latest versions were 11.3 for Xcode and 13.3 for iOS. Also, make sure you have the record audio permission in your plist file.
  3. OpenCV is a necessary dependency. Download the OpenCV pack for iOS. Make sure you download version 3.4.6 to guarantee compatibility; you can experiment with different versions afterwards.
  4. C++ can only be called from within Objective-C or Swift. To call C++ from JS, one should first manage to call Swift from JS. Facebook has provided a tutorial for this (the exporting Swift section), and here is another tutorial as well. I will quickly run over the steps here again:
    • Create an Objective-C class (we will call it RecAndFp.m) inside the ios root folder. Use the Xcode interface to avoid potential linking problems: go to File -> New -> choose Objective-C file, and choose the default group so it is placed in the ios root folder. The implementation should follow react-native’s standards (check the sample code or the Facebook tutorial for more info). We will export three methods like the following:
      #import "React/RCTBridgeModule.h"
      @interface RCT_EXTERN_MODULE(RecAndFp, NSObject)
      RCT_EXTERN_METHOD(start_recording)
      RCT_EXTERN_METHOD(stop_recording)
      RCT_EXTERN_METHOD(fp:(RCTPromiseResolveBlock)resolve rejecter:(RCTPromiseRejectBlock)reject)
      @end
    • In the same ios directory, create a Swift file, RecAndFp.swift. In Xcode go to File -> New -> choose Swift file, then choose the default group. This file should contain our implementation of the three functions listed above. (Note: for now you may want to test the linking using just print messages.)
    • Whenever you use Objective-C and Swift in the same iOS project, a bridging header is needed. In the same ios directory, a file named mobileAfp-Bridging-Header.h should be created; you should be prompted by Xcode to do so. The name here is critical. For now, the bridge should include the following:
      #import <Foundation/Foundation.h>
      #import <React/RCTBridgeModule.h>
       
  5. Make sure you can now call Swift from JS. You could, for example, print messages inside the start_recording() function and test it out.
  6. Now the C++ part. There are different ways to include C++ code in an iOS project, some of which are listed here. I will walk through one option:
    • Create an Objective-C++ class in the ios root folder; call it fingerprint.mm. Notice the .mm extension, which marks the file as Objective-C++. In Xcode go to File -> New -> choose Objective-C file, and choose the default group. Don’t forget to double-check the file name.
    • Create a header file for fingerprint.mm. Go to File -> New -> choose Header file and name it fingerprint.h.
    • Update the bridging header by including the header file we've just created:
      #import "fingerprint.h"
    • Our C++ code requires C++14. To enable it, in your Xcode project go to your target -> Build Settings -> Apple Clang - Language - C++ -> C++ Language Dialect -> choose GNU++14.
  7. At this point, what remains is to include OpenCV and the C++ fingerprint function. But before that, make sure you can call C++ successfully from Swift. In addition, we will drop the boost library dependency, just like in the Android part and for the same reason.
  8. Now to include OpenCV. This article explains how to include the framework. I will run over what worked for me, quoting “” from the same article:
    • “Click on your project in the left navigator [choose your target, then] go to Build Phases -> Link Binary With Libraries. Click the “+”. Then “Add Other …”. [Then Choose add files] Direct to opencv2.framework and add it. This should automatically copy the framework file into your project directory.”
    • In the same linking section, add the AssetsLibrary.framework framework. Make sure it is included before opencv (notice the order).
    • Add $(PROJECT_DIR)/ as a framework path under your target settings. Go to your target, then “Under Build Settings -> Framework Search Paths, add the path: $(PROJECT_DIR)/”.



That’s pretty much it. Remember to follow the steps carefully and make sure everything works before moving to the next step. This is a non-trivial task; it took me some time to set things up.

Final Note

If you run into problems, always try to do the following first:


Sample code: what's included

The project will not be ready to run out of the box. At minimum, you have to include OpenCV and make sure you have set up the NDK for Android.




If you like the post, please share it and let me know. Additionally, if you believe there is a mistake/misconception in either this article or the sample code, please contact me, or use the Github issues.