Augmented Reality for Android Application Development

Overview of this book

Augmented Reality (AR) blends the physical world with the virtual one, bringing applications off the screen and into your hands. AR is redefining advertising, gaming, and education, and it is fast becoming a technology that mobile application developers will need to master.

Augmented Reality for Android Application Development shows you how to implement both sensor-based and computer vision-based AR applications on Android devices. You will learn the theoretical foundations and the practical details behind working AR applications, and hands-on examples will enable you to quickly develop and deploy your own AR apps.

The book covers the basics of building mobile AR browsers, integrating and animating 3D objects with the JMonkeyEngine, harnessing computer vision-based AR with the Vuforia AR SDK, and using popular interaction metaphors. Along the way you will work with the AR engine, Android layouts and overlays, and ARToolkit. By the end, you will be able to apply this knowledge to build a polished AR application of your own.
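To give a flavour of the hands-on examples, the following is a minimal, illustrative JMonkeyEngine sketch. It is not taken from the book, and it is written as a plain desktop SimpleApplication rather than the Android-hosted setup used in the chapters; the HelloCube class name and the rotating cube are assumptions made purely for illustration.

    import com.jme3.app.SimpleApplication;
    import com.jme3.material.Material;
    import com.jme3.math.ColorRGBA;
    import com.jme3.scene.Geometry;
    import com.jme3.scene.shape.Box;

    public class HelloCube extends SimpleApplication {

        private Geometry cube;

        public static void main(String[] args) {
            new HelloCube().start();
        }

        @Override
        public void simpleInitApp() {
            // Build a simple box mesh, wrap it in a Geometry, and give it an unshaded material.
            Box mesh = new Box(1, 1, 1);
            cube = new Geometry("Cube", mesh);
            Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
            mat.setColor("Color", ColorRGBA.Blue);
            cube.setMaterial(mat);
            rootNode.attachChild(cube);
        }

        @Override
        public void simpleUpdate(float tpf) {
            // Rotate the cube a little each frame; tpf is the time per frame in seconds.
            cube.rotate(0, tpf, 0);
        }
    }

The same pattern, setting up geometry in simpleInitApp() and animating it in simpleUpdate(), is the one the book's camera-backed AR scenes build on.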
Table of Contents (14 chapters)
Augmented Reality for Android Application Development
Credits
About the Authors
About the Reviewers
www.PacktPub.com
Preface
1. Augmented Reality Concepts and Tools
Index

Index

A

  • accelerometers
    • about / Understanding sensors
    • used, for simple gesture recognition / Simple gesture recognition using accelerometers
  • ADT
    • installing / Installing the Android Developer Tools Bundle and the Android NDK
  • advanced interaction techniques
    • about / Advanced interaction techniques
  • Android Debug Bridge (adb) / Installing JMonkeyEngine
  • Android devices
    • selecting / Which Android devices should you use?
  • Android Native Development Kit (NDK) / What you need for this book, System requirements for development and deployment
  • Android NDK
    • installing / Installing the Android Developer Tools Bundle and the Android NDK
  • AR
    • overview / A quick overview of AR concepts
    • sensory augmentation / Sensory augmentation
    • aspects / Registration in 3D
    • sensor-based AR / Sensor-based AR
    • computer vision-based AR / Computer vision-based AR
    • architecture concepts / AR architecture concepts
  • AR browser
    • content, obtaining from / Getting content for your AR browser – the Google Places API
  • architecture, Vuforia / Vuforia™ architecture
  • AR control flow
    • about / AR control flow
    • display, managing / AR control flow
    • objects / AR control flow
  • AR main loop / AR control flow
  • aspect ratio / Camera characteristics

B

  • Buffer control setting / Camera characteristics

C

  • C++ integration
    • Vuforia, integrating with JME / The C++ integration
  • calculateAccMagOrientation function / Sensor fusion in JME
  • calculatedFusedOrientationTask function / Sensor fusion in JME
  • camera
    • about / Understanding the camera
    • characteristics / Camera characteristics
    • versus screen characteristics / Camera versus screen characteristics
    • accessing, in Android / Accessing the camera in Android
  • camera accessing, in Android
    • Eclipse project, creating / Creating an Eclipse project
    • permissions / Permissions in the Android manifest
    • camera parameters, setting / Setting camera parameters
    • SurfaceView, creating / Creating SurfaceView
  • CameraAccessJMEActivity class / Creating the JME activity
  • camera characteristics
    • about / Camera characteristics
    • resolution / Camera characteristics
    • frame rate / Camera characteristics
    • white balance / Camera characteristics
    • focus / Camera characteristics
    • pixel format / Camera characteristics
    • playback control setting / Camera characteristics
    • buffer control setting / Camera characteristics
    • configuring, points / Camera characteristics
  • Camera Coordinate System
    • about / The building blocks of 3D rendering
  • cloud recognition
    • about / Cloud recognition
  • computer vision-based AR / Computer vision-based AR
  • computer vision-based tracking
    • about / Introduction to computer vision-based tracking and Vuforia™
  • content
    • obtaining, for AR browser / Getting content for your AR browser – the Google Places API
    • managing / Managing your content
  • content management technique
    • cloud recognition / Cloud recognition
  • content management techniques
    • about / Managing your content
    • multi-targets / Multi-targets
  • coordinate system
    • creating / Multi-targets
  • coordinate systems
    • about / The building blocks of 3D rendering
    • World Coordinate System / The building blocks of 3D rendering
    • Camera Coordinate System / The building blocks of 3D rendering
    • Local Coordinate System(s) / The building blocks of 3D rendering
  • Coriolis Effect
    • about / Understanding sensors

D

  • 3D registration
    • in AR / Registration in 3D
  • 3D rendering
    • about / The building blocks of 3D rendering
  • 3D selection
    • performing, ray picking used / Pick the stick – 3D selection using ray picking
  • 6 degrees of freedom (6DOF) tracking / Sensor fusion in JME
  • Dalvik Debug Monitor Server view (DDMS) / Installing JMonkeyEngine
  • depth-only rendering
    • about / Improving recognition and tracking
  • detectShake() method / Simple gesture recognition using accelerometers
  • device
    • location, tracking / JME and GPS – tracking the location of your device
  • device tracking
    • versus user tracking / GPS and GNSS
  • Display module / AR software components
  • displays / Sensory augmentation

E

  • ECEF (Earth-Centered, Earth-Fixed) format
    • about / JME and GPS – tracking the location of your device
  • ENU (East-North-Up) coordinate system
    • about / JME and GPS – tracking the location of your device
  • Euclidean geometry
    • about / The building blocks of 3D rendering

F

  • Fiducial markers
    • about / Understanding frame markers
  • Fiducials
    • about / Choosing physical objects
  • field of view (FOV) / Camera parameters (intrinsic orientation)
  • frame markers
    • about / Understanding frame markers
  • Frames Per Second (FPS) / Camera characteristics

G

  • g-force acceleration
    • about / Understanding sensors
  • gesture recognition
    • accelerometers used / Simple gesture recognition using accelerometers
  • getCameraInstance() method / Creating an activity that displays the camera
  • getParameters() method / Creating the JME activity
  • getRotationMatrixFromOrientation function / Sensor fusion in JME
  • getRotationVectorFromGyro function / Sensor fusion in JME
  • GNSS
    • about / GPS and GNSS
  • GNSS (Global Navigation Satellite System) / Sensor-based AR
  • Google Places API
    • about / Getting content for your AR browser – the Google Places API, Querying for POIs around your current location
  • Google Places results
    • parsing / Parsing the Google Places APIs results
  • GPS
    • handling / Knowing where you are – handling GPS
    • about / GPS and GNSS
  • gyroFunction function / Sensor fusion in JME
  • gyroscopes
    • about / Understanding sensors

H

  • head-up display (HUD) / Registration in 3D

I

  • inertial measurement unit (IMU) / Understanding sensors
  • inertial sensors
    • handling / Knowing where you look – handling inertial sensors
  • initializeCameraParameters() method / Setting camera parameters
  • initVideoBackground method / Creating the JME application
  • Integrated Development Environment (IDE) / System requirements for development and deployment
  • intrinsic parameters, virtual camera
    • focal length, of lens / Camera parameters (intrinsic orientation)
    • image center / Camera parameters (intrinsic orientation)
    • skew factor / Camera parameters (intrinsic orientation)

J

  • Java integration
    • Vuforia, integrating with JME / The Java integration
  • JME
    • Vuforia, integrating with / Putting it together – Vuforia™ with JME
  • JMonkeyEngine (JME)
    • about / System requirements for development and deployment
    • installing / Installing JMonkeyEngine
    • live camera view / Live camera view in JME
    • activity, creating / Creating the JME activity
    • application, creating / Creating the JME application

L

  • lag / Sensor-based AR
  • listener
    • about / Pick the stick – 3D selection using ray picking
  • Local Coordinate System(s)
    • about / The building blocks of 3D rendering
  • location
    • tracking, of device / JME and GPS – tracking the location of your device
  • Location Manager service / JME and GPS – tracking the location of your device

M

  • magnetometers
    • about / Understanding sensors
  • matrixMultiplication function / Sensor fusion in JME
  • mCamera.getParameters() method / Setting camera parameters
  • MEMS
    • about / Understanding sensors
  • mid-air interaction
    • about / Advanced interaction techniques
  • mobile AR / Choose your style – sensor-based and computer vision-based AR
  • motion sensors
    • TYPE_LINEAR_ACCELERATION / Sensors in JME
    • TYPE_ROTATION_VECTOR / Sensors in JME
    • TYPE_GYROSCOPE / Sensors in JME
    • TYPE_GRAVITY / Sensors in JME
    • TYPE_ACCELEROMETER / Sensors in JME
    • about / Sensors in JME
  • multi-targets
    • about / Multi-targets

N

  • natural feature tracking targets
    • about / Understanding natural feature tracking targets

O

  • object recognition
    • Vuforia, configuring for / Configuring Vuforia™ to recognize objects
  • onPause() method / Creating an activity that displays the camera
  • onResume() method / Creating an activity that displays the camera, Creating the JME activity
  • onShake() method / Simple gesture recognition using accelerometers
  • onSurfaceChanged() method / Creating the JME activity
  • OpenGL ES (OpenGL for Embedded Systems) / System requirements for development and deployment
  • optical see-through (OST) technology / Displays
  • orientation tracking
    • improving / Improving orientation tracking – handling sensor fusion
  • overlay
    • improving / Improving the overlay

P

  • pattern checking step
    • about / Understanding frame markers
  • phantom object
    • about / Improving recognition and tracking
  • photorealism rendering
    • about / Improving recognition and tracking
  • physical objects
    • selecting / Choosing physical objects
  • Playback control setting / Camera characteristics
  • Points of Interest (POIs) / Sensor-based AR
  • pose estimation step
    • about / Understanding frame markers
  • preparePreviewCallbackBuffer() method / Creating the JME activity
  • proximity-based interaction
    • about / Proximity-based interaction
  • proximity technique
    • about / Proximity-based interaction

Q

  • Qualcomm chipsets
    • about / Vuforia™ architecture
  • query, for POIs
    • around current location / Querying for POIs around your current location

R

  • ray picking
    • used, for 3D selection / Pick the stick – 3D selection using ray picking
    • about / Pick the stick – 3D selection using ray picking
  • real camera / Real camera and virtual camera
  • recognition
    • improving / Improving recognition and tracking
  • rectangle detection
    • about / Understanding frame markers
  • releaseCamera() method / Creating an activity that displays the camera

S

  • scenegraph
    • used, for overlaying a 3D model onto the camera view / Using the scenegraph to overlay a 3D model onto the camera view
  • sensor-based AR / Sensor-based AR
  • sensor fusion
    • handling / Improving orientation tracking – handling sensor fusion
    • overview / Sensor fusion in a nutshell
    • in JME / Sensor fusion in JME
  • sensors
    • accelerometers / Understanding sensors
    • magnetometers / Understanding sensors
    • gyroscopes / Understanding sensors
    • in JME / Sensors in JME
  • sensory augmentation
    • about / Sensory augmentation
    • displays / Displays
    • 3D registration / Registration in 3D
    • environment interactions / Interaction with the environment
  • setTexture method / Creating the JME activity
  • simpleUpdate() method / Creating the JME application
  • software components, AR
    • application layer / AR software components
    • OS/Third Party layer / AR software components
    • AR layer / AR software components
  • surfaceChanged method / Creating SurfaceView

T

  • Tangible User Interface (TUI)
    • about / Advanced interaction techniques
  • tracking
    • improving / Improving recognition and tracking
  • transformations
    • about / The building blocks of 3D rendering

U

  • user tracking
    • versus device tracking / GPS and GNSS

V

  • video see-through (VST) technology / Displays
  • virtual camera
    • about / The building blocks of 3D rendering, Real camera and virtual camera
    • intrinsic parameters / Camera parameters (intrinsic orientation)
    • extrinsic parameters / Camera parameters (intrinsic orientation)
  • virtual control pad
    • about / Advanced interaction techniques
  • Vuforia
    • installing / Installing Vuforia™
    • about / Vuforia™ architecture
    • architecture / Vuforia™ architecture
    • configuring, for object recognition / Configuring Vuforia™ to recognize objects
    • URL, for developer website / Configuring Vuforia™ to recognize objects
    • integrating, with JME / Putting it together – Vuforia™ with JME
  • Vuforia Augmented Reality Tools (VART) / What you need for this book, System requirements for development and deployment

W

  • Wiimote / Simple gesture recognition using accelerometers
  • World Coordinate System
    • about / The building blocks of 3D rendering