Android Studio 4.0 Development Essentials - Java Edition

By: Neil Smyth

Overview of this book

Android rolls out frequent updates to meet the demands of the dynamic mobile market and to enable its developer community to lead advancements in application development. This book focuses on the updated features of Android Studio, Google’s fully integrated development environment, to help you build reliable Android applications using Java. The book starts by outlining the steps necessary to set up an Android development and testing environment. You’ll then learn how to create user interfaces using the Android Studio Layout Editor, XML files, and Java code. The book introduces you to Android architecture components and advanced topics such as intents, touchscreen handling, gesture recognition, multi-window support integration, and biometric authentication. You’ll also explore key features of Android Studio 4.0, including the layout editor, direct reply notifications, and dynamic delivery, and cover Android Jetpack in detail, creating a sample app project using the ViewModel component. Finally, you’ll upload your app to the Google Play Console and handle the build process with Gradle. By the end of this book, you’ll have gained the skills necessary to develop applications using Android Studio 4.0 and Java.

29.4 Identifying Specific Gestures

When a gesture is detected, the onGesturePerformed callback method is called and passed two arguments: a reference to the GestureOverlayView object on which the gesture was detected and a Gesture object containing information about the gesture.
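
The following is a minimal sketch of an activity that registers for and receives these callbacks. The package name, layout resource, and view ID used here are hypothetical placeholders rather than names from a specific example project:

package com.example.gestures; // hypothetical package name

import android.gesture.Gesture;
import android.gesture.GestureOverlayView;
import android.os.Bundle;

import androidx.appcompat.app.AppCompatActivity;

public class GestureActivity extends AppCompatActivity
        implements GestureOverlayView.OnGesturePerformedListener {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_gesture); // hypothetical layout resource

        // Register this activity to be notified whenever the user
        // completes a gesture on the overlay view in the layout
        GestureOverlayView overlay = findViewById(R.id.gestureOverlay); // hypothetical ID
        overlay.addOnGesturePerformedListener(this);
    }

    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // Called each time a gesture is completed on the overlay;
        // identifying the gesture is covered below
    }
}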

With access to the Gesture object, the GestureLibrary can then be used to compare the detected gesture to those contained in the gestures file previously loaded into the application. The GestureLibrary reports how closely the gesture performed by the user matches an entry in the gestures file by calculating a prediction score for each gesture. A prediction score of 1.0 or greater is generally accepted to indicate a good match between a gesture stored in the file and the one performed by the user on the device display.
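
Continuing the sketch above, the callback body might be implemented along the following lines, assuming a GestureLibrary field named gestureLibrary has already been loaded (for example, via GestureLibraries.fromRawResource()). The recognize() method returns the predictions sorted in descending order of score:

// Additional imports assumed: java.util.ArrayList,
// android.gesture.Prediction, android.widget.Toast

@Override
public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    // Compare the detected gesture with the entries previously loaded
    // into the GestureLibrary; the returned list is sorted by
    // descending prediction score
    ArrayList<Prediction> predictions = gestureLibrary.recognize(gesture);

    if (!predictions.isEmpty()) {
        Prediction best = predictions.get(0);

        // Treat a score of 1.0 or greater as a good match and
        // report the name of the matched gesture to the user
        if (best.score >= 1.0) {
            Toast.makeText(this, best.name, Toast.LENGTH_SHORT).show();
        }
    }
}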