iOS

Register an account for free if you don’t have one.

Identify Music or TV with iOS SDK

This demo shows how to identify music (songs) or detect live TV channels from recorded sound with the ACRCloud iOS SDK. Contact us at support@acrcloud.com if you have any questions or special requirements about the SDK.

Preparation

  • The newest ACRCloud iOS SDK, which contains both Objective-C and Swift demo projects.
  • If you want to recognize music, you need an Audio Recognition project. ( See How to Recognize Music )
  • If you want to detect TV channels, you need a Live Channel Detection project. ( See How to Detect Live TV Channels )
  • Save the “host”, “access_key” and “access_secret” values of your project.
  • Make sure you have Xcode installed.

Quick Trial

If you are already familiar with iOS development:

  • Download the ACRCloud iOS SDK package and unzip it.
Open either ACRCloudDemo or ACRCloudDemo_Swift.
  • Update accessKey, host and accessSecret in ViewController with the information of your project.
  • Run the demo project to test recognizing contents in the buckets of your project.

Step-by-Step Tutorial

Step 1

Download the ACRCloud iOS SDK package and unzip it.


Step 2

Open Xcode and create a new Single View iOS Application project. Click “Next” and choose the directory in which to place the project folder.


Step 3

Copy libACRCloud_IOS_SDK.a and the two header files ACRCloudConfig.h and ACRCloudRecognition.h to the directory of your project.


Add the three files above to your project via “Add Files to…” in the “File” menu.


Step 4

Open the configuration page of your project by selecting it, then click the “+” button in the “Linked Frameworks and Libraries” section to search for and add the following system frameworks and libraries:

Security.framework
libc++.dylib
AVFoundation.framework
AudioToolbox.framework


Step 5

Replace the default empty Main.storyboard, ViewController.h and ViewController.m with the corresponding files from the “ACRCloudDemo” we provide to get a quick overview of how our SDK works.

Step 6

Update accessKey, host and accessSecret in ViewController with the information of your project.

Then you can run the demo project to test recognizing contents in the buckets of your project.
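In the demo’s ViewController, these credentials end up on an ACRCloudConfig object before the recognizer is created. A minimal Swift sketch, assuming the property and initializer names used by the demo (verify them against ACRCloudConfig.h and ACRCloudRecognition.h):

```swift
// Sketch only: fill in the values saved from your project's console.
let config = ACRCloudConfig()
config.accessKey    = "your_access_key"
config.accessSecret = "your_access_secret"
config.host         = "your_project_host"
config.recMode      = rec_mode_remote   // recognize against your project's buckets online

let recognizer = ACRCloudRecognition(config: config)
```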


Recognition Mode

config.recMode depends on the type of your project: rec_mode_remote is for Audio & Video Recognition, Live Channel Detection and Hybrid Recognition, and rec_mode_local is for Offline Recognition.

For Offline Recognition, put the offline database (such as “acrcloud_local_db”) into your app project’s workspace.
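Assuming the ACRCloudConfig instance from the demo is called config, switching modes is a one-line change:

```swift
// Online projects (Audio & Video Recognition, Live Channel Detection, Hybrid):
config.recMode = rec_mode_remote

// Offline Recognition against the bundled "acrcloud_local_db" database:
// config.recMode = rec_mode_local
```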

One-time Recognition and Loop Recognition

One-time Recognition:

Click the “Start” button and the demo app will begin to record and recognize. When it detects a result, the demo stops and displays it. While it is recording and detecting, you can stop the recognition at any time by clicking the “Stop” button.
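Wired to the two buttons, the one-time flow can be sketched like this in Swift, assuming a recognizer property set up as in the demo and the startRecordRec/stopRecordRec method names from ACRCloudRecognition.h:

```swift
// "Start": begin recording from the microphone and recognizing.
@IBAction func startPressed(_ sender: UIButton) {
    recognizer?.startRecordRec()
}

// "Stop": cancel an in-progress recognition at any time.
@IBAction func stopPressed(_ sender: UIButton) {
    recognizer?.stopRecordRec()
}
```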

Loop Recognition:


In the demo, the handleResult block stops recording once a result is returned. If you remove those stop calls, the demo app will keep recording and detecting until you click the “Stop” button: each time it detects a result or one recognition loop finishes, you get the result in the handleResult block, and the app then continues detecting.
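The string handed to the handleResult block is JSON; for music projects it contains a status object and a metadata.music array. A self-contained Swift sketch of pulling the first title out of such a result (the sample string below is illustrative, not real SDK output):

```swift
import Foundation

// Hypothetical result string; real fields come from your project's metadata.
let resultJSON = """
{"status":{"msg":"Success","code":0},"metadata":{"music":[{"title":"Example Song","artists":[{"name":"Example Artist"}]}]}}
"""

// Extract the first recognized title, or nil if the status code is non-zero
// or the payload does not have the expected shape.
func parseTitle(from json: String) -> String? {
    guard let data = json.data(using: .utf8),
          let root = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
          let status = root["status"] as? [String: Any],
          (status["code"] as? Int) == 0,
          let metadata = root["metadata"] as? [String: Any],
          let music = metadata["music"] as? [[String: Any]],
          let first = music.first
    else { return nil }
    return first["title"] as? String
}

print(parseTitle(from: resultJSON) ?? "no match")
```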

Prerecording Recognition

Enabling prerecording makes recognition much faster.

If you want to enable this feature, call -(void)startPreRecord:(NSInteger)recordTime
The parameter recordTime is the prerecording time; the recommended value is 3000-4000.
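In Swift the call looks like the following sketch, assuming a recognizer property set up as in the demo:

```swift
// Keep a short prerecorded buffer so recognition can start on audio
// captured just before the user taps "Start".
recognizer?.startPreRecord(3500)   // recommended range: 3000-4000
```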

What’s Next

See iOS SDK Reference to start integration.