Identify Music or TV with iOS SDK
This demo shows how to identify music (songs) or detect live TV channels from recorded sound with the ACRCloud iOS SDK. Contact us if you have any questions or special requirements about the SDK: firstname.lastname@example.org
- Register an account for free if you don’t have one.
- Download the newest ACRCloud iOS SDK, which contains both Objective-C and Swift demo projects.
- If you want to recognize music, you need an Audio Recognition project. (See How to Recognize Music)
- If you want to detect TV channels, you need a Live Channel Detection project. (See How to Detect Live TV Channels)
- Save the “host”, “access_key” and “access_secret” of your project.
- Make sure you have Xcode installed.
If you are familiar with iOS development:
- Download the ACRCloud iOS SDK package and unzip it.
- Open either the ACRCloudDemo or the ACRCloudDemo_Swift project.
- Update accessKey, host and accessSecret in ViewController with the information of your project.
- Run the demo project to test recognizing contents in the buckets of your project.
Download the ACRCloud iOS SDK package and unzip it.
Open Xcode and create a new Single View iOS Application project. Click “Next” and choose the directory for the project folder.
Copy libACRCloud_IOS_SDK.a and the two header files ACRCloudConfig.h and ACRCloudRecognition.h to the directory of your project.
Add the three files above to your project by using the “Add Files to…“ function of the “File” menu.
Open your project’s configuration page by selecting the project, then click the “+” button in the “Linked Frameworks and Libraries” section to search for and add the system frameworks and libraries below:
Just replace your default empty Main.storyboard, ViewController.h and ViewController.m with the corresponding files from the “ACRCloudDemo” project we provide, to get a quick overview of how our SDK works.
Update accessKey, host and accessSecret in ViewController with the information of your project.
Then you can run the demo project to test recognizing contents in the buckets of your project.
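As a rough Swift sketch (not the demo’s exact code), setting the credentials on the SDK’s config object might look like this; the placeholder strings are assumptions you must replace with the values from your own project:

```swift
// Sketch: wiring your ACRCloud project credentials into the SDK.
// All three values come from your project page in the ACRCloud console;
// the placeholders below are NOT real credentials.
let config = ACRCloudConfig()
config.accessKey = "your_access_key"
config.accessSecret = "your_access_secret"
config.host = "your_project_host"

// The recognizer is then created from this config
// (initializer name bridged from the Objective-C initWithConfig:).
let recognizer = ACRCloudRecognition(config: config)
```

This fragment cannot run outside an app that links the ACRCloud SDK; treat it as a configuration outline only.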
config.recMode depends on the type of your project:
- rec_mode_remote is for Audio & Video Recognition, Live Channel Detection and Hybrid Recognition; it performs online recognition.
- rec_mode_local is for Offline Recognition; put the offline database (such as “acrcloud_local_db”) into your app project’s workspace.
- rec_mode_both supports both online and offline recognition; it searches the local database first, then the cloud database.
- rec_mode_advance_remote is almost the same as rec_mode_remote, except that you can get the fingerprint data when the network is unavailable. For this, you should set resultFpBlock on the ACRCloudConfig.
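A minimal Swift sketch of choosing the mode, assuming the ACRCloudConfig class from the SDK (the constant names mirror the list above; check your SDK headers for the exact spelling):

```swift
// Sketch: selecting the recognition mode on ACRCloudConfig.
let config = ACRCloudConfig()
config.recMode = rec_mode_remote   // online recognition (Audio & Video
                                   // Recognition, Live Channel Detection,
                                   // Hybrid Recognition)

// For rec_mode_advance_remote, also set resultFpBlock so you can
// capture fingerprint data while the network is unavailable and
// recognize it later. (Exact block signature: see ACRCloudConfig.h.)
```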
One-time Recording Session Recognition and Loop Recognition
Click the “Start” button and the demo app will begin to record and recognize. When it detects a result, the demo app stops and displays it. During recording and detection, you can stop the recognition at any time by clicking the “Stop” button.
If you remove the two lines of code above, the demo app will not stop recording and detecting until you click the “Stop” button. Each time the app detects a result or one loop finishes, you can get the result from the “handleResult” block, and the app then continues the detection process.
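The start/stop flow above could be sketched in Swift like this; the method names (startRecordRec, stopRecordRec) and the resultBlock property follow the demo’s Objective-C headers, and the wiring to UI buttons is an assumption:

```swift
// Sketch: one recognition session driven by Start/Stop buttons.
let config = ACRCloudConfig()
// ... set accessKey, accessSecret, host and recMode first ...

// Called when a result (or the end of one loop) is ready -- this is
// where the demo's "handleResult" logic runs. In one-shot mode the
// demo also stops the session here; remove that stop to keep looping.
config.resultBlock = { result, resType in
    print("result: \(result ?? "")")
}

let recognizer = ACRCloudRecognition(config: config)

// "Start" button: begin recording and recognizing.
recognizer?.startRecordRec()

// "Stop" button: end the session at any time.
recognizer?.stopRecordRec()
```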
Enable Prerecording Recognition
Enabling prerecording makes the recognition much faster.
If you want to enable this feature, call -(void)startPreRecord:(NSInteger)recordTime.
The parameter recordTime is the prerecording time in milliseconds; the recommended value is 3000-4000.
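For example, from Swift (3500 is just an illustrative value inside the recommended range; the method is the one named above, bridged from Objective-C):

```swift
// Sketch: enable prerecording with a 3500 ms buffer
// (recommended range: 3000-4000).
recognizer?.startPreRecord(3500)
```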
If you recognize audio data directly, the audio format should be RIFF (little-endian), WAVE audio, Microsoft PCM, 16 bit, mono, 8000 Hz. You can also use the resample function to convert the audio data to the format we need.
Convert your audio to RIFF (little-endian), WAVE audio, Microsoft PCM, 16 bit, mono, 8000 Hz.
See iOS SDK Reference to start integration.