So far, we've seen how textures can be used to sample image data in a fragment shader, but we've only used them in a limited context. Some interesting issues arise when you start to look at texture use in more robust situations.
For example, if you were to zoom in on the cube from the previous demo, you would see that the texture begins to alias pretty severely.
As we zoom in, jagged edges develop around the WebGL logo. Similar problems appear when the texture is rendered very small on screen. Isolated to a single object, such artifacts are easy to overlook, but they can become very distracting in complex scenes.
So why do we see these artifacts in the first place?
Recall from the previous chapter that vertex colors are interpolated across the primitive, so that the fragment shader receives a smooth gradient of color. Texture coordinates are interpolated in exactly the same way; the resulting coordinates are provided to the fragment shader and used to sample color values...
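To make the mechanism concrete, here is a minimal sketch (in Python, purely for illustration; this is not how the book's shaders are written) of what happens per fragment: a texture coordinate is linearly interpolated between two vertices, and the interpolated (u, v) pair is then used to look up a texel. The `sample_nearest` helper and the tiny 2x2 texture are invented for this example; it uses "nearest" sampling, the simplest lookup strategy.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b, exactly as the
    rasterizer interpolates per-vertex attributes."""
    return a + (b - a) * t

def sample_nearest(texture, u, v):
    """Map (u, v) in [0, 1] to the closest texel, clamping at the edges.
    A stand-in for texture2D() with NEAREST filtering."""
    h = len(texture)
    w = len(texture[0])
    x = min(w - 1, max(0, int(u * (w - 1) + 0.5)))
    y = min(h - 1, max(0, int(v * (h - 1) + 0.5)))
    return texture[y][x]

# A tiny 2x2 "texture": each entry is an (r, g, b) color.
texture = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]

# Texture coordinates assigned to two vertices of an edge.
uv0, uv1 = (0.0, 0.0), (1.0, 1.0)

# "Rasterize" five fragments along that edge: each fragment receives an
# interpolated coordinate, just as vertex colors would be interpolated,
# and uses it to fetch a color from the texture.
for i in range(5):
    t = i / 4
    u, v = lerp(uv0[0], uv1[0], t), lerp(uv0[1], uv1[1], t)
    print((u, v), sample_nearest(texture, u, v))
```

The aliasing described above comes from exactly this step: when the interpolated coordinates jump across many texels between adjacent fragments (zoomed out) or many fragments share a single texel (zoomed in), a naive per-fragment lookup like this produces jagged or shimmering results.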