Android Studio 4.1 Development Essentials – Java Edition

By: Neil Smyth

Overview of this book

For developers, Android 11 has a ton of new capabilities. The goal of this book is to teach the skills necessary to develop Android-based applications using the Java programming language. The book begins with the steps necessary to set up an Android development and testing environment. An overview of Android Studio and the architecture of Android is covered next, followed by an in-depth look at the design of Android applications and user interfaces using the Android Studio environment. You will also learn about the Android architecture components, along with advanced topics such as touch screen handling, gesture recognition, the recording and playback of audio, app links, dynamic delivery, the Android Studio profiler, Gradle build configuration, and submitting apps to the Google Play Developer Console. The concepts of material design, including the use of floating action buttons, Snackbars, tabbed interfaces, card views, navigation drawers, and collapsing toolbars, are a highlight of this book. This edition also covers printing, transitions, and cloud-based file storage; foldable device support is the cherry on the cake. By the end of this course, you will be able to develop Android 11 apps using Android Studio 4.1, Java, and Android Jetpack. The code files for the book can be found here: https://www.ebookfrenzy.com/retail/androidstudio41/index.php

28.6 Summary

Any physical contact between the user and the touch screen display of a device can be considered a “gesture”. Lacking the physical keyboard and mouse pointer of a traditional computer system, gestures are widely used as a method of interaction between the user and the application. While a gesture can consist of just about any sequence of motions, there is a widely used set of gestures with which users of touch screen devices have become familiar. A number of these so-called “common gestures” can be easily detected within an application by making use of the Android Gesture Detector classes. In this chapter, the use of this technique has been outlined both in theory and through the implementation of an example project.
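As a brief illustration of the technique summarized above, the following minimal sketch (not the chapter's example project, and assuming a hypothetical activity_main layout resource) shows an Activity that implements the GestureDetector.OnGestureListener interface and forwards touch events to a GestureDetectorCompat instance so that common gestures can be detected:

import android.os.Bundle;
import android.view.GestureDetector;
import android.view.MotionEvent;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.view.GestureDetectorCompat;

public class MainActivity extends AppCompatActivity
        implements GestureDetector.OnGestureListener {

    private GestureDetectorCompat gestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main); // hypothetical layout resource

        // Create the detector, registering this Activity as the gesture listener
        gestureDetector = new GestureDetectorCompat(this, this);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Forward each touch event to the detector for gesture analysis
        gestureDetector.onTouchEvent(event);
        return super.onTouchEvent(event);
    }

    // Callbacks invoked by the detector when common gestures are recognized
    @Override
    public boolean onDown(MotionEvent e) { return true; }

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
                           float velocityX, float velocityY) {
        // Called when the user performs a fling (swipe) gesture
        return true;
    }

    @Override
    public void onLongPress(MotionEvent e) { }

    @Override
    public boolean onScroll(MotionEvent e1, MotionEvent e2,
                            float distanceX, float distanceY) { return true; }

    @Override
    public void onShowPress(MotionEvent e) { }

    @Override
    public boolean onSingleTapUp(MotionEvent e) { return true; }
}

In this arrangement, the detector analyzes each incoming MotionEvent and invokes the corresponding callback method (for example, onFling() for a swipe) whenever one of the common gestures is recognized.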

Having covered common gestures in this chapter, the next chapter will look at detecting a wider range of gesture types, including the ability to both design and detect your own custom gestures.