Introduction
Code for rendering complex scenes can quickly become difficult to organize. To address this, we will build a simple rendering framework. This framework will take care of low-level device and swap chain management, assist with the lifecycle management of device resources, and allow the application to focus on the elements of the scene instead.
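To make the idea concrete, the following is a minimal C++ sketch of how such a framework might separate these concerns. The class and method names here (D3DApplicationBase, CreateDeviceDependentResources, and so on) are hypothetical and only illustrate the shape of the lifecycle hooks; they are not the framework built later in this book.

// Minimal sketch of a rendering-framework base class (hypothetical names).
// The base class owns the device and swap chain and exposes lifecycle hooks
// so that derived renderers only deal with their own scene resources.
#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

class D3DApplicationBase
{
public:
    virtual ~D3DApplicationBase() = default;

    // Called once after the device is created (and again after a device loss).
    virtual void CreateDeviceDependentResources(ID3D11Device* device) = 0;

    // Called whenever the swap chain is created or resized.
    virtual void CreateSizeDependentResources(UINT width, UINT height) = 0;

    // Called every frame with the immediate context.
    virtual void Render(ID3D11DeviceContext* context) = 0;

protected:
    ComPtr<ID3D11Device>        m_device;    // created and owned by the framework
    ComPtr<ID3D11DeviceContext> m_context;   // immediate rendering context
    ComPtr<IDXGISwapChain>      m_swapChain; // back-buffer presentation
};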
All 3D objects ultimately consist of one or more vertices that together form one of the core primitive shapes: points, lines, or triangles. As we discussed in the previous chapter, vertices can include information such as position, color or texture coordinates, and a normal vector. In this chapter, we will learn how to define these vertex structures in shaders and describe them to the Input Assembler (IA) fixed-function pipeline stage.
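As a preview, here is a sketch in C++ of what such a vertex structure and its matching Input Assembler layout description might look like with Direct3D 11. The structure name and the exact set of elements are illustrative assumptions; the vertex formats used later in the chapter may differ.

// Hypothetical vertex structure: position, normal, and color per vertex.
#include <d3d11.h>
#include <DirectXMath.h>

struct VertexPositionNormalColor
{
    DirectX::XMFLOAT3 Position; // matches the POSITION semantic in the vertex shader
    DirectX::XMFLOAT3 Normal;   // matches the NORMAL semantic
    DirectX::XMFLOAT4 Color;    // matches the COLOR semantic
};

// Input layout telling the Input Assembler how to interpret each vertex.
static const D3D11_INPUT_ELEMENT_DESC vertexLayout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT,    0, 0,
      D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT,    0, D3D11_APPEND_ALIGNED_ELEMENT,
      D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT,
      D3D11_INPUT_PER_VERTEX_DATA, 0 },
};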
A vital component of any 3D scene is the camera and projection setup. We will learn how to initialize each of these, and how vertices are transformed from local object (or model) space through the World/View/Projection (WVP) transformations ...
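As a rough illustration of this transform chain, the following C++ sketch builds the three matrices with DirectXMath and combines them into a single WVP matrix. The camera position, field of view, and clipping planes are assumed values, not necessarily those used later in this chapter.

// Sketch: build World, View, and Projection matrices and combine them into
// one WVP matrix (left-handed coordinates, as is conventional in Direct3D).
#include <DirectXMath.h>
using namespace DirectX;

XMMATRIX BuildWorldViewProjection(float aspectRatio)
{
    // World: place the object at the origin (identity for this example).
    XMMATRIX world = XMMatrixIdentity();

    // View: a camera behind and above the origin, looking at it.
    XMMATRIX view = XMMatrixLookAtLH(
        XMVectorSet(0.0f, 1.0f, -5.0f, 1.0f),   // eye position (assumed)
        XMVectorSet(0.0f, 0.0f,  0.0f, 1.0f),   // focus point
        XMVectorSet(0.0f, 1.0f,  0.0f, 0.0f));  // up direction

    // Projection: 45-degree vertical field of view, near plane 0.1, far plane 100.
    XMMATRIX projection = XMMatrixPerspectiveFovLH(
        XM_PIDIV4, aspectRatio, 0.1f, 100.0f);

    // Each vertex is carried from local space -> world -> view -> projection space.
    return world * view * projection;
}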