When the iPhone was first announced, its multi-touch screen was the focal point of its high-tech appeal. But the iPhone has always owed much of its fantastic user experience to the many other sensors built in, smoothing out every aspect of interacting with the device. The accelerometer detected when you tilted the device and rotated web pages automatically. A proximity sensor detected when the phone was held up to your face and turned off the screen. An ambient light sensor automatically adjusted the backlight to match the room you were in.
Since those early days, each new iteration of the iPhone hardware has introduced more and more sensors, allowing new generations of apps to provide even better experiences. Earlier in the book, we looked at common input methods like multi-touch gestures, as well as positioning sensors like GPS via Core Location. In this chapter, we're going to take a look at some of the more advanced sensors, covering:
Device information through...