Android Studio 4.1 Development Essentials – Kotlin Edition

By: Neil Smyth

Overview of this book

Android 11 introduces a wide range of new capabilities, with three areas of focus: a people-centric approach to communication, controls that let users quickly access and manage all of their smart devices, and privacy features that give users more ways to control how data on their devices is shared. This book starts with the steps necessary to set up an Android development and testing environment, followed by an introduction to programming in Kotlin. An overview of Android Studio and its architecture is provided, followed by an in-depth look at the design of Android applications and user interfaces using the Android Studio environment. You will also learn about the Android architecture components, along with advanced topics such as touch screen handling, gesture recognition, the recording and playback of audio, app links, dynamic delivery, the Android Studio profiler, Gradle build configuration, and submitting apps to the Google Play Developer Console. The concepts of material design are also covered in detail. This edition of the book also covers printing, transitions, and cloud-based file storage, with foldable device support rounding out the coverage. By the end of this book, you will be able to develop Android 11 apps using Android Studio 4.1, Kotlin, and Android Jetpack. The code files for the book can be found here: https://www.ebookfrenzy.com/retail/as41kotlin/index.php

34.3 Understanding Touch Actions

An important aspect of touch event handling involves being able to identify the type of action performed by the user. The type of action associated with an event can be obtained by calling the getActionMasked() method of the MotionEvent object that was passed to the onTouch() callback method. When the first touch on a view occurs, the MotionEvent object will contain an action type of ACTION_DOWN together with the coordinates of the touch. When that touch is lifted from the screen, an ACTION_UP event is generated. Any motion of the touch between the ACTION_DOWN and ACTION_UP events will be represented by ACTION_MOVE events.
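The following is a minimal Kotlin sketch, not taken from the book's example project, showing how this might look in practice. It assumes a hypothetical view named myLayout and a helper function named attachTouchListener; note that in Kotlin the getActionMasked() method is accessed through the actionMasked property:

import android.view.MotionEvent
import android.view.View

// Minimal sketch: attach a touch listener to a hypothetical view (myLayout)
// and branch on the masked action type of each MotionEvent.
fun attachTouchListener(myLayout: View) {
    myLayout.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                // First touch has made contact; coordinates are in event.x / event.y
            }
            MotionEvent.ACTION_MOVE -> {
                // The touch is moving across the view between down and up
            }
            MotionEvent.ACTION_UP -> {
                // The touch has been lifted from the screen
            }
        }
        true // indicate that the event has been consumed
    }
}

Returning true from the listener indicates that the touch event has been handled and should not be passed on for further processing.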

When more than one touch is performed simultaneously on a view, the touches are referred to as pointers. In a multi-touch scenario, pointers begin and end with event actions of type ACTION_POINTER_DOWN and ACTION_POINTER_UP respectively. In order to identify the index of the pointer that triggered the event, the...