Developing a minimal application


So far, we have set up a development environment including Xcode, the iOS SDK, and OpenCV. Now, let's use these tools and libraries to develop our first iOS application. The app will have the following flow of execution:

  1. When the application starts:

    1. Load an image from a file that is bundled with the app.

    2. If the image is in color (not grayscale), automatically adjust its white balance.

    3. Display the image in fullscreen mode.

  2. Every two seconds:

    1. Create an updated image by applying a random tint to the original image.

    2. Display the updated image.

Note that the application will not use a camera or any user input at all. However, the user will see an image that appears to be backlit with a colorful, changing light. This is not really a demo of computer vision, but it is a demo of image processing and of integration between the iOS SDK and OpenCV. Moreover, it is decorative, festive, and, best of all, it has a theme: cool pigs. Our app's name will be CoolPig, and it will display a cool picture of a pig. Consider the following example of a black-and-white photo of a piglet (left), along with three tinted variants:

Note

In this book's print version, all images appear in grayscale. To see them in color, download them from Packt Publishing's website at https://www.packtpub.com/sites/default/files/downloads/iOSApplicationDevelopmentwithOpenCV3_ColorImages.pdf, or read the eBook.

The original image is the work of Gustav Heurlin (1862-1939), a Swedish photographer who documented rural life in the early 20th century. He was an early adopter of the autochrome color photography process, and National Geographic published many of his photographs during 1919-1931.

When our users see a pig in a beautiful series of pop-art colors, they will question their preconceptions and realize it is a really cool animal.

Note

To obtain the completed projects for this book, go to the author's GitHub repository at https://github.com/JoeHowse/iOSWithOpenCV, or log in to your account on Packt Publishing's site at https://www.packtpub.com/.

Creating the project

Open Xcode. Click on the Create new Xcode project button or select the File | New | Project… menu item. Now, a dialog asks you to choose a project template. Select iOS | Application | Single View Application, as shown in the following screenshot:

Single View Application is the simplest template as it just creates an empty GUI with no special navigational structure. Click on the Next button to confirm the selection. Now, a dialog asks you to pick a few project settings. Fill out the form as shown in the following screenshot:

Let's review the items in the form:

  • Product Name: This is the application's name, such as CoolPig.

  • Organization Name: This is the name of the application's vendor, such as Nummist Media Corporation Limited.

  • Organization Identifier: This is the vendor's unique identifier. The identifier should use reverse domain name notation, such as com.nummist.

  • Bundle Identifier: This is the application's unique identifier, which is generated based on the Product Name and Organization Identifier. This field is non-editable.

  • Language: This is the project's high-level programming language, either Objective-C or Swift. This book uses Objective-C, which is a strict superset of C and is largely interoperable with C++ (as Objective-C++). Swift is not interoperable with C++. OpenCV's core language is C++, so Objective-C's interoperability makes it an obvious choice.

  • Devices: This is the supported hardware, which may be Universal (all iOS devices), iPhone (including iPod Touch), or iPad. This book's projects are Universal.

  • Use Core Data: If this is enabled, the project will contain a database using Apple's Core Data framework. For this book's projects, disable it.

  • Include Unit Tests: If this is enabled, the project will contain a set of unit tests based on Apple's XCTest framework. For this book's projects, disable it.

  • Include UI Tests: If this is enabled, the project will contain a set of UI tests, also based on the XCTest framework. Disable it for this book's projects.

Click on the Next button to confirm the project options. Now, a file chooser dialog asks you to pick a folder for the project. Pick any location, which we will refer to as <app_project_path>.

Optionally, you may enable the Create Git repository checkbox if you want to put the project under version control using Git. Click on the Create button. Now, Xcode creates and opens the project.

Adding files to the project

Use Finder or Terminal to copy files to the following locations:

  • <app_project_path>/opencv2.framework: This framework contains the standard OpenCV modules. We downloaded or built it previously, as described in the Getting the prebuilt framework with standard modules or Building the framework from source with extra modules sections.

  • <app_project_path>/CoolPig/Piggy.png: This may be any cool picture of a pig in grayscale or color. Any species of pig is acceptable, be it a swine, boar, Muppet, or other variety.

Go back to Xcode to view the project. Navigate to the File | Add Files to "CoolPig"… menu item. Now, Xcode opens a file chooser dialog. Select opencv2.framework and click on the Add button. Repeat the same steps for CoolPig/Piggy.png. Note that these files appear in the project navigator pane, which is the leftmost section of the Xcode window. In this pane, drag Piggy.png to the CoolPig | Supporting Files group. When you are finished, the navigator pane should look similar to the following screenshot:

Configuring the project

First, let's configure our app to run in fullscreen mode with no status bar. Select the CoolPig project file at the top of the navigator pane. Now, select the General tab in the editor area, which is the central part of the Xcode window. Find the Deployment Info group, and enable the Hide status bar and Requires full screen checkboxes, as shown in the following screenshot:

The status bar and fullscreen settings are stored in the app's Info.plist file. Select CoolPig | CoolPig | Info.plist in the navigator pane. Now, in the editor area, note that the Requires full screen (UIRequiresFullScreen) and Status bar is initially hidden (UIStatusBarHidden) properties are both set to YES. However, we still need to add another property to ensure that the status bar never appears. Hover over the last item in the list, and click on the + button to insert a new property. Enter View controller-based status bar appearance (UIViewControllerBasedStatusBarAppearance) as the property's key and set its value to NO, as shown in the following screenshot:
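
If you prefer to inspect these settings as raw XML, you can right-click on Info.plist and select Open As | Source Code. The relevant entries should look roughly like the following sketch; the rest of the <dict> contents will vary by project:

<key>UIStatusBarHidden</key>
<true/>
<key>UIRequiresFullScreen</key>
<true/>
<key>UIViewControllerBasedStatusBarAppearance</key>
<false/>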

Next, let's link the project with additional frameworks. OpenCV depends on two of Apple's frameworks called CoreGraphics.framework and UIKit.framework. Optionally, for optimizations, OpenCV can also use a third Apple framework called Accelerate.framework.

Note

The Accelerate framework contains Apple's hardware-accelerated implementation of industry-standard APIs for vector mathematics. Notably, it implements standards called Basic Linear Algebra Subprograms (BLAS) and Linear Algebra Package (LAPACK). OpenCV is designed to leverage these standards on various platforms including iOS.

Select the CoolPig project file in the navigator pane and then select the Build Phases tab in the editor area. Find the Link Binary With Libraries group. Click on the + button, select Accelerate.framework from the dialog, and click on the Add button. Repeat these steps for CoreGraphics.framework and UIKit.framework. Now, the editor area should look similar to the following screenshot:

Now, the linker will be able to find OpenCV's dependencies. However, we need to change another setting to ensure that the compiler will understand the C++ code in OpenCV's header files. Open the Build Settings tab in the editor area and find the Apple LLVM 7.0 - Language group. Set the value of the Compile Sources As item to Objective-C++, as seen in the following screenshot:

Note

Alternatively, we could leave the Compile Sources As item at its default value, which is According to File Type. Then, we would need to rename our source files to give them the extension .mm, which Xcode associates with Objective-C++.

We have just one more thing to configure in the Build Settings tab. Remember that we consider the opencv2_contrib modules to be an optional dependency of our projects, as described earlier in the Making the extra modules optional in our code section. If we built opencv2.framework with these modules and want to use their functionality, we need to define a preprocessor symbol, WITH_OPENCV_CONTRIB. Find the Apple LLVM 7.0 - Preprocessing group. Edit Preprocessor Macros | Debug and Preprocessor Macros | Release to add the WITH_OPENCV_CONTRIB text. Now, the settings should look like the following screenshot:

As a final, optional step in the configuration, you may want to set the app's icon. Select CoolPig | CoolPig | Assets.xcassets in the project navigator pane. Assets.xcassets is a bundle, which may contain several variants of the icon for different devices and different contexts (the Home screen, Spotlight searches, and the Settings menu).

Click on the AppIcon list item in the editor area and then drag and drop an image file into each square of the AppIcon grid. If the image's size is incorrect, Xcode will notify you so that you may resize the image and try again. Once you have added your images, the editor area might look similar to the following screenshot:

Laying out an interface

Now, our project is fully configured and we are ready to design its graphical user interface (GUI). Xcode comes with a built-in tool called Interface Builder, which enables us to arrange GUI elements, connect them to variables and events in our code, and even define the transitions between scenes (or, informally, screens). Remember that CoolPig's GUI is just a fullscreen image. However, even our simple GUI has a transition between a static loading screen (where the image does not change color) and a dynamic main screen (where the image changes color every two seconds). Let's first configure the loading screen and then the main screen.

Select CoolPig | CoolPig | LaunchScreen.storyboard in the navigator pane. This file is a storyboard, which stores the configuration of a set of scenes (or a single scene in this case). A scene hierarchy appears in the editor area. Navigate to View Controller Scene | View Controller | View. A blank view appears on the right-hand side of the editor area, as seen in the following screenshot:

Let's add an image view inside the empty view. Notice the list of available GUI widgets in the lower-right corner of the Xcode window. This area is called the library pane. Scroll through the library pane's contents. Find the Image View item and drag it to the empty view. Now, the editor area should look like this:

Drag the corners of the highlighted rectangle to make the image view fill its parent view. The result should look like this:

We still need to take a further step to ensure that the image view scales up or down to match the screen size on all devices. Click on the Pin button in the toolbar at the bottom of the editor area. The button's icon looks like a rectangle pinned between two lines. Now, a pop-up menu appears with the title Add New Constraints. Constraints define a widget's position and size relative to other widgets.

Specifically, we want to define the image view's margins relative to its parent view. To define a margin on every side, click on the four I-shaped lines that surround the square. They turn red. Now, enter 0 for the top and bottom values and -20 for the left and right values. Some iOS devices have built-in horizontal margins, and our negative values ensure that the image extends to the screen's edge even on these devices. The following screenshot shows the settings:

Click on the Add 4 Constraints button to confirm these parameters.
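
For the curious, the following is a rough sketch of what equivalent constraints would look like if created in code with the iOS layout anchor API. CoolPig defines its constraints in Interface Builder, so this is only an illustration of the concept; it pins the image view flush to a hypothetical parentView instead of using the negative side margins described above, and the imageView and parentView variables are assumptions:

imageView.translatesAutoresizingMaskIntoConstraints = NO;
[NSLayoutConstraint activateConstraints:@[
    // Pin all four edges of the image view to its parent view.
    [imageView.topAnchor constraintEqualToAnchor:parentView.topAnchor],
    [imageView.bottomAnchor constraintEqualToAnchor:parentView.bottomAnchor],
    [imageView.leadingAnchor constraintEqualToAnchor:parentView.leadingAnchor],
    [imageView.trailingAnchor constraintEqualToAnchor:parentView.trailingAnchor]
]];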

Finally, we want to show an image! Look at the inspector pane, which is in the top-right area of the Xcode window. Here, we can configure the currently selected widget. Select the Attributes tab. Its icon looks like a slider. From the Image drop-down list, select Piggy.png. From the Mode drop-down list, select Aspect Fill. This mode ensures that the image will fill the image view in both dimensions, without appearing stretched. The image may appear cropped in one dimension. Now, the editor area and inspector pane should look similar to the following screenshot:

So far, we have completed the loading screen's layout. Now, let's turn our attention to the main screen. Select CoolPig | CoolPig | Main.storyboard in the project navigator. This storyboard, too, has a single scene. Select its view. Add an image view and configure it in exactly the same way as the loading screen's image view. Later, in the Connecting an interface element to the code section, we will connect this new image view to a variable in our code.

Writing the code

As part of the Single View Application project template, Xcode has already created the following code files for us:

  • AppDelegate.h: This defines the public interface of an AppDelegate class. This class is responsible for managing the application's life cycle.

  • AppDelegate.m: This contains the private interface and implementation of the AppDelegate class.

  • ViewController.h: This defines the public interface of a ViewController class. This class is responsible for managing the application's main scene, which we saw in Main.storyboard.

  • ViewController.m: This contains the private interface and implementation of the ViewController class.

For CoolPig, we simply need to modify ViewController.m. Select CoolPig | CoolPig | ViewController.m in the project navigator. The code appears in the editor area. At the beginning of the code, let's add more #import statements to include the header files for several OpenCV modules, as seen in the following code:

#import <opencv2/core.hpp>
#import <opencv2/imgcodecs/ios.h>
#import <opencv2/imgproc.hpp>

#ifdef WITH_OPENCV_CONTRIB
#import <opencv2/xphoto.hpp>
#endif

#import "ViewController.h"

We will need to generate random numbers to create the image's random tint. For convenience, let's define the following macro, which generates a 64-bit floating-point number in the range of 0 to 1:

#define RAND_0_1() ((double)arc4random() / 0x100000000)

Note

The arc4random() function returns a random 32-bit integer in the range of 0 to 2^32 - 1 (0xFFFFFFFF). Dividing by 0x100000000 (2^32) therefore yields a floating-point value in the range of 0 (inclusive) to 1 (exclusive). The first time it is called, arc4random() automatically seeds the random number generator.

The remainder of ViewController.m deals with the private interface and implementation of the ViewController class. Elsewhere, in ViewController.h, the class is declared as follows:

@interface ViewController : UIViewController
@end

Note that ViewController is a subclass of UIViewController, which is an important class in the iOS SDK. UIViewController manages the life cycle of a set of views, providing reasonable default behaviors as well as many methods that we may override to customize these defaults. If we develop applications according to the model-view-controller (MVC) pattern, then UIViewController is the controller or coordinator, which encourages a clean separation between the platform-specific view (or GUI) and the platform-independent model (or "business logic").

Let's turn our attention back to the private interface of ViewController in ViewController.m. The class keeps the original image and updated image as member variables. They are instances of OpenCV's cv::Mat class, which can represent any kind of image or other multidimensional data. ViewController also has a reference to the image view where we will display the image. Another of the class's properties is an NSTimer object, which will fire a callback every two seconds. Finally, the class has a method, updateImage, which will be responsible for displaying a new random variation of the image. Here is the code for ViewController's private interface:

@interface ViewController () {
  cv::Mat originalMat;
  cv::Mat updatedMat;
}

@property IBOutlet UIImageView *imageView;
@property NSTimer *timer;

- (void)updateImage;

@end

Now, let's implement the methods of the ViewController class. It inherits many methods from its parent class, UIViewController, and we could override any of these. First, we want to override the viewDidLoad method, which runs when the scene is loaded from its storyboard. Typically, this is an appropriate time to initialize the view controller's member variables. Our implementation of viewDidLoad will begin by loading Piggy.png from file and converting it to OpenCV's RGB format. If the image was not originally grayscale and OpenCV's extra photo module is available, we will use a function from this module to adjust the white balance. Finally, we will start a timer to invoke our updateImage method every two seconds. Here is our code for viewDidLoad:

@implementation ViewController

- (void)viewDidLoad {
  [super viewDidLoad];
  
  // Load a UIImage from a resource file.
  UIImage *originalImage =
      [UIImage imageNamed:@"Piggy.png"];
  
  // Convert the UIImage to a cv::Mat.
  UIImageToMat(originalImage, originalMat);
  
  switch (originalMat.type()) {
    case CV_8UC1:
      // The cv::Mat is in grayscale format.
      // Convert it to RGB format.
      cv::cvtColor(originalMat, originalMat,
          cv::COLOR_GRAY2RGB);
      break;
    case CV_8UC4:
      // The cv::Mat is in RGBA format.
      // Convert it to RGB format.
      cv::cvtColor(originalMat, originalMat,
          cv::COLOR_RGBA2RGB);
#ifdef WITH_OPENCV_CONTRIB
      // Adjust the white balance.
      cv::xphoto::autowbGrayworld(originalMat,
          originalMat);
#endif
      break;
    case CV_8UC3:
      // The cv::Mat is in RGB format.
#ifdef WITH_OPENCV_CONTRIB
      // Adjust the white balance.
      cv::xphoto::autowbGrayworld(originalMat, originalMat);
#endif
      break;
    default:
      break;
  }
  
  // Call an update method every 2 seconds.
  self.timer = [NSTimer scheduledTimerWithTimeInterval:2.0
      target:self selector:@selector(updateImage)
      userInfo:nil repeats:YES];
}

Note

NSTimer only fires callbacks when the app is in the foreground. This behavior is convenient for our purposes because we only want to update the image when it is visible.
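
If you ever want to stop and restart the updates explicitly (for example, when the view disappears and reappears), you could add overrides like the following to ViewController.m. This is only a sketch of the relevant UIViewController callbacks, not part of the book's code:

- (void)viewWillDisappear:(BOOL)animated {
  [super viewWillDisappear:animated];
  
  // Stop the periodic updates while the view is off-screen.
  [self.timer invalidate];
  self.timer = nil;
}

- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];
  
  // Resume the periodic updates, unless the timer already exists.
  if (self.timer == nil) {
    self.timer = [NSTimer scheduledTimerWithTimeInterval:2.0
        target:self selector:@selector(updateImage)
        userInfo:nil repeats:YES];
  }
}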

Now, let's implement the updateImage helper method. It will multiply each color channel by a random floating-point number. The following table describes the effects of multiplying various channels by a coefficient, k:

Value of k | Effect of multiplying the red channel by k | Effect of multiplying the green channel by k | Effect of multiplying the blue channel by k
0 <= k < 1 | Image becomes darker, with a cyan tint | Image becomes darker, with a magenta tint | Image becomes darker, with a yellow tint
k == 1 | No change | No change | No change
k > 1 | Image becomes brighter, with a red tint | Image becomes brighter, with a green tint | Image becomes brighter, with a blue tint

The following code generates a random color, multiplies it with the original image channel by channel, and displays the result in the image view. For example, multiplying a medium-gray RGB pixel, (128, 128, 128), by the color (1.2, 0.8, 1.0) yields (154, 102, 128), which gives the gray a reddish tint:

- (void)updateImage {
  // Generate a random color.
  double r = 0.5 + RAND_0_1() * 1.0;
  double g = 0.6 + RAND_0_1() * 0.8;
  double b = 0.4 + RAND_0_1() * 1.2;
  cv::Scalar randomColor(r, g, b);
  
  // Create an updated, tinted cv::Mat by multiplying the
  // original cv::Mat and the random color.
  cv::multiply(originalMat, randomColor, updatedMat);
  
  // Convert the updated cv::Mat to a UIImage and display
  // it in the UIImageView.
  self.imageView.image = MatToUIImage(updatedMat);
}

@end

Tip

Feel free to adjust the range of each random color coefficient to your taste. OpenCV clamps the result of the multiplication so that a color channel's value cannot overflow the 8-bit range of 0 to 255.
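
For instance, a small helper macro like the following hypothetical RAND_RANGE (not part of the book's code) makes the ranges explicit and easy to tweak:

// A random double in the range [min, max).
#define RAND_RANGE(min, max) ((min) + RAND_0_1() * ((max) - (min)))

// Equivalent to the coefficients in updateImage:
// double r = RAND_RANGE(0.5, 1.5);
// double g = RAND_RANGE(0.6, 1.4);
// double b = RAND_RANGE(0.4, 1.6);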

We have implemented all the custom logic of CoolPig in just 50 lines of code! The project template, storyboard, iOS SDK, and OpenCV provide many useful abstractions and thus enable us to focus on writing concise, application-specific code.

Connecting an interface element to the code

Let's connect the image view in Main.storyboard to the imageView property in ViewController.m. Open Main.storyboard in the project navigator, and Control-click (or right-click) on View Controller in the scene hierarchy. A dialog with a dark background appears. Right-click on the Piggy.png image view in the scene hierarchy and drag it to the circle beside Outlets | imageView in the dark dialog box, as shown in the following screenshot:

Release the mouse button to complete the connection. Close the dark dialog box.

Building and running the application

We are ready to build the app and run it in an iOS simulator or on an iOS device. First, if you want to use an iOS device, connect it to the Mac via a USB cable. The first time you connect a device, Xcode's top toolbar might show a progress bar and message, Processing symbol files. Wait for the message to disappear. Now, click on the CoolPig drop-down menu in Xcode's top toolbar and select the device or simulator that you want to use, such as Devices | Joseph's iPad or iOS Simulators | iPad Pro. Click on the Run button. Its icon is the standard triangular play symbol. Xcode builds the app, copies it to the device or simulator, and then launches it. Watch the pig change colors! For example, the app might look like this on an iPad Mini device:

Tip

If you are using a simulator, you might find that its screen is too large to fit on your Mac's screen. To scale down the simulator's screen, go to the simulator's menu and select Window | Scale | 50% or another value.

Congratulations! We have built and run our first iOS application, including OpenCV for image processing and a pig for artistic reasons.