
Android Studio Arctic Fox Essentials - Kotlin Edition

By: Neil Smyth

Overview of this book

Android Studio is an Integrated Development Environment based on JetBrains' IntelliJ IDEA. It provides developers with a single platform, together with a range of developer tools, on which to design and develop Android apps. Fully updated for Android Studio Arctic Fox, the goal of this book is to teach the skills necessary to develop Android-based applications using the Kotlin programming language.

The book begins with an outline of the steps necessary to set up an Android development and testing environment, followed by an introduction to programming in Kotlin covering data types, control flow, functions, lambdas, and object-oriented programming. An overview of Android Studio covers areas such as tool windows, the code editor, and the layout editor tool. An introduction to the architecture of Android is followed by an in-depth look at the design of Android applications and user interfaces using the Android Studio environment.

Early chapters detail Android architecture components such as view models, lifecycle management, Room database access, the Database Inspector, app navigation, live data, and data binding. More advanced topics such as intents are also covered, as are touch screen handling, gesture recognition, and the recording and playback of audio. This edition of the book also covers printing, transitions, cloud-based file storage, and foldable device support.

The concepts of material design are discussed in detail, including the use of floating action buttons, Snackbars, tabbed interfaces, card views, navigation drawers, and collapsing toolbars. Other key features of Android Studio Arctic Fox and Android taught in this book include the Layout Editor, the ConstraintLayout and ConstraintSet classes, the MotionLayout editor, view binding, constraint chains, barriers, and direct reply notifications. Later chapters explore more advanced features of Android Studio such as app links, dynamic delivery, Gradle build configuration, and submitting apps to the Google Play developer console.

35. Detecting Common Gestures Using the Android Gesture Detector Class

The term “gesture” is used to define a contiguous sequence of interactions between the touch screen and the user. A typical gesture begins at the point that the screen is first touched and ends when the last finger or pointing device leaves the display surface. When correctly harnessed, gestures can be implemented as a form of communication between user and application. Swiping motions to turn the pages of an eBook, or a pinching movement involving two touches to zoom in or out of an image are prime examples of the ways in which gestures can be used to interact with an application.

The Android SDK provides mechanisms for the detection of both common and custom gestures within an application. Common gestures involve interactions such as a tap, double tap, long press, or a swiping motion in either a horizontal or vertical direction (referred to in Android nomenclature as a fling).
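
By way of illustration, the following is a minimal sketch, not the example project constructed in this chapter, showing one way the Android GestureDetector class might be configured within an activity to report some of these common gestures. The activity name, the log tag, and the choice of an anonymous SimpleOnGestureListener (rather than implementing the listener interfaces directly on the activity) are assumptions made purely for this illustration:

import android.os.Bundle
import android.util.Log
import android.view.GestureDetector
import android.view.MotionEvent
import androidx.appcompat.app.AppCompatActivity

// Hypothetical activity used only for illustration.
class MainActivity : AppCompatActivity() {

    private lateinit var gestureDetector: GestureDetector

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // SimpleOnGestureListener provides empty defaults for every callback,
        // so only the gestures of interest need to be overridden.
        gestureDetector = GestureDetector(this,
            object : GestureDetector.SimpleOnGestureListener() {

                override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
                    Log.i("Gesture", "Single tap")
                    return true
                }

                override fun onDoubleTap(e: MotionEvent): Boolean {
                    Log.i("Gesture", "Double tap")
                    return true
                }

                override fun onLongPress(e: MotionEvent) {
                    Log.i("Gesture", "Long press")
                }

                override fun onFling(
                    e1: MotionEvent?, e2: MotionEvent,
                    velocityX: Float, velocityY: Float
                ): Boolean {
                    Log.i("Gesture", "Fling: vX=$velocityX vY=$velocityY")
                    return true
                }
            })
    }

    // Route the activity's touch events through the detector so it can
    // classify them into the gesture callbacks above.
    override fun onTouchEvent(event: MotionEvent): Boolean {
        return gestureDetector.onTouchEvent(event) || super.onTouchEvent(event)
    }
}

Returning true from a callback indicates that the gesture has been consumed, while the onTouchEvent() override ensures that every touch event reaching the activity is passed to the detector for analysis.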

The goal of...