iPhone User Interface Cookbook

By: Cameron Banga
Overview of this book

The incredible growth rates for the iPhone, iPod touch, and iPad have pushed consumers to a new “App” economy, with developers racing to the platform. Mobile touch-centric interfaces vary greatly from traditional computing platforms, and programmers as well as designers must learn to adapt to the new form factor.

The iPhone User Interface Cookbook offers a complete breakdown of standard interface design on the iPhone, iPod touch, and iPad. You will learn the tools behind the trade, how to properly utilize standard interface elements, and custom UI tricks that will help your work stand out on the App Store.

The book is designed to be a complete overview of interface design on all iOS platforms, offering insight and an inside look into app design. A variety of topics are covered, starting with an overview of tools for the app interface designer, touching upon popular interface components such as the Tab Bar, and offering suggestions for complex game interfaces. Whether you’re new to the platform or a seasoned developer with numerous applications in the App Store, this book strives to teach everyone simple and easy-to-implement tips for iOS interface design. Regardless of skill level, the iPhone User Interface Cookbook offers a detailed breakdown of all things interface design.

Appendix A. The Importance of Direct Manipulation

Without tactile feedback for a button press, users will be looking to other cues for confirmation that an interaction has occurred correctly. As user interface designers, it is our job to make sure that it is clear to the user when they’ve properly navigated through our application.

This feedback can occur in a variety of situations, be it when pressing upon the Tab Bar, when manipulating a photo, or when flicking through a list. Because our applications run on a touch screen, it’s important that we provide proper visual feedback nearly every time the user makes contact with the screen.

Visual feedback is the most common form, but auditory responses are possible in our applications as well. Let’s look at how to use both to provide proper feedback and indicate to the user that the application is working correctly. By creating clarity for our users, we’ll create a better application that minimizes confusion and earns higher reviews.

Giving the user feedback

We’ve spent time discussing different possible gestures in iOS, such as the pinch or swipe, which will help us determine which interaction is best suited for the activity at hand when developing our application. However, it is not enough to implement only the correct gesture type; we should also provide adequate visual feedback that confirms the action for the user.

When working in a traditional computer environment, visual feedback is still important; however, the user has a tactile input device to fall back upon. If we’re typing in a word processor and the program does not respond instantaneously, we still have the physical press of the key to reinforce that our action was performed properly.

A delay or pause between the press of a key and the display of the character on screen can frustrate the user, but it is unlikely to cause confusion. In tapping the keyboard, we reaffirm that we’ve hit the key correctly, so a delay or failure to display the pressed character indicates either misbehaving software or a hardware error. There is little uncertainty, and with this knowledge the user has some insight into why the problem has occurred.

However, once we move to the touch screen, visual feedback becomes absolutely vital, as it serves as the primary method for showing the user that an interface element was touched correctly. Much like a word processor that doesn’t register keystrokes, a touch interface that doesn’t provide some sort of visual feedback can be a frustrating experience. But whereas our keyboard provides tactile confirmation upon pressing the key, we have no such luxury when working with a piece of glass on the iPhone.

If we fail to provide some sort of feedback, users will have little to fall back on when trying to explain the error. Did the user miss the button with their finger? Is the application in some sort of processing state that doesn’t allow the button to be pressed? Is the element the user is attempting to tap actually a button? These are just a few of the questions that arise when improper visual feedback is given.
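As a concrete sketch of the simplest kind of acknowledgement, the snippet below dims a button while a finger rests on it, mimicking the shaded overlay iOS applies to its own icons. The class name, alpha values, and helper function are illustrative choices of ours, not values from Apple; the UIKit portion is guarded so the sketch also compiles where UIKit is unavailable.

```swift
import Foundation

// Alpha values used to acknowledge a touch; these are illustrative
// choices for the sketch, not Apple-documented constants.
let normalAlpha = 1.0
let dimmedAlpha = 0.5

// Pure helper: which alpha a control should show for a given highlight state.
func feedbackAlpha(forHighlighted highlighted: Bool) -> Double {
    return highlighted ? dimmedAlpha : normalAlpha
}

#if canImport(UIKit)
import UIKit

// UIKit flips `isHighlighted` as the finger touches down and lifts, so
// overriding it is enough to give immediate visual confirmation of a tap.
final class FeedbackButton: UIButton {
    override var isHighlighted: Bool {
        didSet { alpha = CGFloat(feedbackAlpha(forHighlighted: isHighlighted)) }
    }
}
#endif
```

Because the state change rides on `isHighlighted`, the dimming appears the instant the touch lands, with no extra gesture plumbing required.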

Providing a visual response to a tap or shake isn’t enough on its own, though; the reaction must also occur without any sense of delay to the user. In applications like Apple’s Photos, where images can be zoomed in or out with a pinch, it is essential that changes appear instantly so that users can follow the action and manipulate the image precisely. Take an iPhone or iPad in hand and open the Photos or Safari app. A few pinches in and out make the need for immediate visual reaction from our application easy to understand.

If an iPhone were to take even an extra second after the pinch to properly proportion the newly sized image, the delay would cause frustration, as the user would need to wait for the photo to reappear on screen before knowing if the zoom level is acceptable. If the image is pinched in a bit too far, the user would then have to adjust the photo, wait for the picture to appear resized, and repeat this process again until the image is satisfactory.
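To keep that reaction immediate in our own code, we can apply the pinch scale on every gesture update rather than waiting for the gesture to end. The handler below is a hypothetical sketch: the clamping range and names are our own, and the UIKit part is guarded so the core helper compiles anywhere.

```swift
import Foundation

// Clamp a requested zoom scale into a sane range. The 1.0...4.0 range is an
// arbitrary choice for this sketch, not a value from the book or from Apple.
func clampedZoom(_ scale: Double, between lower: Double = 1.0, and upper: Double = 4.0) -> Double {
    return min(max(scale, lower), upper)
}

#if canImport(UIKit)
import UIKit

// Hypothetical handler wired to a UIPinchGestureRecognizer on an image view.
final class PinchHandler: NSObject {
    @objc func handlePinch(_ pinch: UIPinchGestureRecognizer) {
        guard let view = pinch.view else { return }
        // Apply the transform on every .changed callback so the image tracks
        // the fingers with no perceptible delay.
        let scale = CGFloat(clampedZoom(Double(pinch.scale)))
        view.transform = CGAffineTransform(scaleX: scale, y: scale)
    }
}
#endif
```

The important design choice is that nothing is deferred: the image resizes continuously under the fingers, so the user never has to wait to judge the zoom level.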

Apple has solved a good portion of these problems for us with the standard system interface elements, which helps to make our job easier. If we implement the standard Tab Bar, Navigation Bar, or Scroll View, our interface will already perform the basic, expected actions that visually aid a user. If we’re uncomfortable with producing acceptable feedback ourselves, we should rely upon Apple’s standard interface elements as much as possible. So long as we stay with established standards like those found in Interface Builder, we will provide at least the minimum required level of visual affirmation.

Visual feedback becomes much more difficult once we begin to design custom application elements. When creating non-standard buttons, sliders, or navigation elements, we should proceed with extreme caution: a custom navigation or interaction element can create a great deal of confusion and difficulty for our user, largely because its visual feedback will not respond as the user expects.

If we do decide to step outside Apple’s provided interface elements, we should look around iOS for inspiration on how our applications should respond to taps or swipes. Tapping an icon or button produces a quick shaded overlay that acknowledges the press. When sliding the volume control in the iPod app, the slider keeps up with our finger. Rotating an iPad from portrait to landscape provides a smooth animation that helps our brain comprehend the shift in perspective. In many applications, a loading wheel spins to signify that a task is processing and temporarily holding up the device. These responses are a mere sampling of how iOS responds to various interactions; when implementing a custom interface design, we should look for guidance to the iOS features that most closely mimic what we’re attempting to build.

Audio cues can aid the user just as effectively, yet they are often neglected when building an interface. For a few examples of sound response in iOS, tapping the pause or play controls in the iPod application immediately starts or stops a song, and a quick chime accompanies a new e-mail. Likewise, games often use short sound effects to signal that a character has jumped or a weapon has fired.

When looking to implement sound in our applications, we should refrain from excessively loud or startling effects. Our auditory cues should be soft and subtle, providing adequate confirmation without intruding upon the user.
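One way to enforce that restraint in code is to cap the playback volume of interface sounds. The ceiling value and the sound file name below are hypothetical choices of ours for the sketch, not values from the book or from Apple; the AVFoundation portion is guarded so the helper compiles anywhere.

```swift
import Foundation

// Cap interface-sound volume well below maximum so cues stay subtle.
// The 0.3 ceiling is our own judgment call, not an Apple guideline.
func subtleVolume(_ requested: Float) -> Float {
    return min(max(requested, 0.0), 0.3)
}

#if canImport(AVFoundation) && canImport(UIKit)
import AVFoundation

// Retain the player so playback isn't cut short when it leaves scope.
private var activePlayer: AVAudioPlayer?

// Sketch: play a short confirmation chime at a subdued volume.
// "tap.caf" is a hypothetical sound file bundled with the app.
func playConfirmationChime() {
    guard let url = Bundle.main.url(forResource: "tap", withExtension: "caf"),
          let player = try? AVAudioPlayer(contentsOf: url) else { return }
    player.volume = subtleVolume(1.0)
    activePlayer = player
    player.play()
}
#endif
```

However loud a caller asks for, the cue never exceeds the subtle ceiling, which keeps audio confirmation from becoming an intrusion.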

Visual and auditory cues are greatly important to the success of our app; because the touch screen offers no tactile feedback of its own, users will rely upon these notifications.

Many influential and inspiring books have been written on human interface design and on how the human brain psychologically breaks down and interprets information presented on a computer screen. Here are a few suggested books for more information on the subject.

  • Written by the father of the Macintosh, Jef Raskin, The Humane Interface (ISBN-13: 978-0201379372) is a masterpiece on user experience and interface design. Raskin explores a variety of meaningful topics at great length and simplifies complex areas of cognitive research into straightforward lessons on creating better applications for the end user.

  • How to Think Like a Great Graphic Designer by Debbie Millman (ISBN-13: 978-1581154962) is a collection of 20 interviews with renowned designers, focusing on problems similar to those we’ll be fighting every day. It’s a great source of insight into the lives of the best designers in the business.