Emgu CV for iOS
Prerequisite
You will need the Xamarin iOS Business version for iOS development. Click here to purchase a license or download an evaluation copy.
Getting Emgu CV for iOS
Emgu CV for iOS is available as a commercial product. You can purchase the Emgu CV iOS Commercial License 3.0 from our web store. Once the license is purchased, the receipt, download URL and login credentials will be emailed to you.
Differences between Emgu CV for iOS and the Desktop version
- The System.Drawing namespace is provided by the OpenTK package on MonoTouch.
- MonoTouch does not contain an implementation of the "Bitmap" class. As a result, the Image<,> class has no constructor that accepts a Bitmap, and it contains neither the Bitmap property nor the ToBitmap() function. Instead, the Image<,> class provides a constructor that accepts a "CGImage", converting a CGImage to an Image<,> object, as well as a ToUIImage() function that converts the Image<,> object to a native UIImage object (see the sketch after this list).
- The Emgu.CV.GPU namespace is only provided for compiling shared code, developed for desktop systems, that optionally uses GPU computation. The function GpuInvoke.HasCuda will always return false on iOS. If you are starting a fresh development for iOS you can skip this reference.
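For example, a round-trip between the native iOS image types and Image<,> looks roughly like the sketch below, written inside a method that has the usual Emgu.CV, Emgu.CV.Structure and MonoTouch.UIKit using statements. The file name "MyImage.jpg" and the variable names are placeholders for illustration only.
// Load a native UIImage and convert its CGImage into an Emgu CV image.
UIImage uiImage = UIImage.FromFile("MyImage.jpg");
using (Image<Bgr, Byte> image = new Image<Bgr, Byte>(uiImage.CGImage))
{
   // ... process the image with Emgu CV here ...

   // Convert back to a native UIImage for display, e.g. in a UIImageView.
   UIImage result = image.ToUIImage();
}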
Creating a MonoTouch Project
We will show you how to create a new MonoTouch Project for iOS.
- First, download Emgu CV for iOS and extract it to a local drive. On Mac OS, open the file "Solution/MonoTouch/Emgu.CV.MonoTouch.sln". You will find a MonoTouch solution with some sample projects. In this case we will walk you through creating a new face detection sample.
- Add a new MonoTouch universal project (for both iPhone and iPad) and name the project FaceDetection.
- Add references to the project. The following references are needed for FaceDetection:
Emgu.Util.MonoTouch, Emgu.CV.MonoTouch, Emgu.CV.GPU.MonoTouch, OpenTK, MonoTouch.Dialog-1
We need Emgu.CV.GPU.MonoTouch here because we are going to share the engine source code with Windows and Linux, where it can take advantage of the GPU if a CUDA-compatible one is present (see the sketch below). If you do not plan to reuse source code that takes advantage of the GPU on desktop/server platforms, you can skip this reference. We also need to reference OpenTK because it provides the System.Drawing namespace on MonoTouch. We add MonoTouch.Dialog-1 because we want to use it for GUI development.
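Shared engine code of this kind typically guards its GPU path with a runtime check, so the same source compiles and runs on iOS where no CUDA device exists. A minimal sketch of that pattern follows; the SharedEngine class name and the placeholder processing are illustrative only, not part of the Emgu CV samples.
using System;
using Emgu.CV;
using Emgu.CV.GPU;
using Emgu.CV.Structure;

public static class SharedEngine
{
   public static Image<Gray, Byte> Process(Image<Gray, Byte> input)
   {
      if (GpuInvoke.HasCuda)
      {
         // Desktop/server with a CUDA-compatible GPU: run the GPU-accelerated path.
         // ... Emgu.CV.GPU based processing would go here ...
         return input.Clone();
      }
      else
      {
         // On iOS GpuInvoke.HasCuda always returns false, so the CPU path runs.
         // ... CPU based processing would go here ...
         return input.Clone();
      }
   }
}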
- Now go into the project configuration and make sure the Linker behaviour is set to "Don't Link" for both debug and release, for both the iPhone simulator and the device. Otherwise you may get compilation errors.
- Go to the Advanced tab of the configuration; for supported architectures, you may want to select ARMv6+ARMv7 for both debug and release, for both the iPhone simulator and the device. This allows the compiled program to run on all versions of iPhone and iPad. Enabling ARMv7 also means that if the iOS device contains an ARMv7 CPU, the program can take advantage of the ARMv7 CPU architecture. For release mode on the iPhone device, you may want to enable "Use LLVM optimizing compiler" for best performance.
- Next we will need to add the data files to the project. We will add the image "lena.jpg" and the haarcascade files "haarcascade_eye.xml" and "haarcascade_frontalface_default.xml" as project content. We will also add the "DetectFace.cs" file from our Windows Face Detection example as a linked file, so that the computational routine is shared by the desktop version and the mobile version of the Face Detection program (its general shape is sketched below).
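The exact content of DetectFace.cs is defined by the Windows example; for the call we make later in this walkthrough, the shared routine is assumed to have roughly the following shape. This is a hypothetical outline, not the actual file.
using System;
using System.Diagnostics;
using System.Drawing;
using Emgu.CV;
using Emgu.CV.Structure;

public static class DetectFace
{
   // Hypothetical outline only: detect faces with the haarcascade file added as
   // project content, draw them on the image and report the time taken.
   public static void DetectAndDraw(Image<Bgr, Byte> image, out long processingTime)
   {
      Stopwatch watch = Stopwatch.StartNew();
      using (CascadeClassifier face = new CascadeClassifier("haarcascade_frontalface_default.xml"))
      using (Image<Gray, Byte> gray = image.Convert<Gray, Byte>())
      {
         foreach (Rectangle faceRegion in face.DetectMultiScale(gray, 1.1, 10, new Size(20, 20), Size.Empty))
            image.Draw(faceRegion, new Bgr(0, 0, 255), 2);
         // ... eye detection with "haarcascade_eye.xml" follows the same pattern ...
      }
      watch.Stop();
      processingTime = watch.ElapsedMilliseconds;
   }
}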
- Now go to the AppDelegate.cs file, where we will implement the GUI components. Since we have added MonoTouch.Dialog-1 as a reference to the project, we will use it for the Face Detection GUI. At the top of the source file, add the following using statements:
using Emgu.CV;
using Emgu.CV.Structure;
using Emgu.Util;
using System.Drawing;
using MonoTouch.Dialog;
Add the following code in the body of the FinishedLaunching function:
// Build the MonoTouch.Dialog UI: a "Process" button, a row that reports
// the processing time, and an image view that shows the detection result.
RootElement root = new RootElement("");
UIImageView imageView = new UIImageView(window.Frame);
StringElement messageElement = new StringElement("");

root.Add(new Section()
{ new StyledStringElement("Process", delegate {
     long processingTime;
     using (Image<Bgr, Byte> image = new Image<Bgr, Byte>("lena.jpg"))
     {
        // Run the shared face detection routine, then resize the annotated
        // image to fit the window.
        DetectFace.DetectAndDraw(image, out processingTime);
        using (Image<Bgr, Byte> resized = image.Resize((int)window.Frame.Width, (int)window.Frame.Height, Emgu.CV.CvEnum.INTER.CV_INTER_NN, true))
        {
           imageView.Frame = new RectangleF(PointF.Empty, resized.Size);
           imageView.Image = resized.ToUIImage();
        }
     }
     // Report the processing time and refresh the affected views.
     messageElement.Value = String.Format("Processing Time: {0} milliseconds.", processingTime);
     messageElement.GetImmediateRootElement().Reload(messageElement, UITableViewRowAnimation.Automatic);
     imageView.SetNeedsDisplay();
  }
)});
root.Add(new Section() {messageElement});
root.Add(new Section() {imageView});
DialogViewController viewController = new DialogViewController(root);
window.RootViewController = viewController;
The code above creates a "Process" button; once it is clicked, it loads "lena.jpg", performs face and eye detection, and displays the result on iOS. The DetectFace.DetectAndDraw function is the same function provided in the Windows Face Detection example, which we re-use for mobile development.
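The rest of FinishedLaunching can stay as generated by the project template; assuming the default universal template, it ends by making the window visible and returning:
// Template boilerplate that should remain after the code above.
window.MakeKeyAndVisible();
return true;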
- We are ready to test the face detection program on the simulator. Just click the "Run" button in MonoTouch and the program will be built. The simulator will open, install the program and run it. Just click the "Process" button in the simulator and the result of the face detection will be displayed.
The simulator runs the application on a Mac mini with a dual-core 2.3GHz i5 CPU, so the computation is relatively fast compared to a physical device.
- At last, we will run it on an iPad 2 in release mode. You will need to register your device for development under Apple's developer program before going to the next step. If you have already done so, just connect the iPad to your Mac, select "Release|iPhone" in MonoTouch, and rebuild the "Emgu.CV.MonoTouch" project and the "FaceDetection" project. What is left is to "Run" the project. For more information, view here.