It's important to be clear about what we mean by parallelization: what it is and what it isn't. The minimum definition of parallelization is performing calculations simultaneously. I generally use the terms multithreading and parallelization interchangeably, but as we'll see, this "simultaneously" becomes somewhat gray on modern processors and operating systems.
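As a minimal sketch of that definition, the snippet below splits one calculation into chunks and hands them to a thread pool (the chunking scheme and function names here are illustrative, not from the text). Note that whether the threads truly execute simultaneously depends on the runtime and hardware, which is exactly the gray area mentioned above; in CPython, for example, the global interpreter lock serializes pure-Python computation.

```python
# A minimal sketch of parallelization: several partial sums
# computed concurrently on a thread pool, then combined.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    # Sum one chunk of the overall range.
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    # Split [0, 1_000_000) into four independent chunks.
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        # map() dispatches each chunk to a worker thread and
        # yields results in order; sum() combines them.
        total = sum(pool.map(partial_sum, chunks))
    print(total)
```

The result matches the single-threaded `sum(range(1_000_000))`, illustrating that parallelization changes how the work is scheduled, not what is computed.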
An application's ability to take advantage of processing resources is termed scalability. When we talk about the resources within a single computer, we categorize that as vertical scalability: the ability to scale up within the physical confines of a single node. This type of scalability is often easier to achieve than its counterpart, horizontal scalability: the ability of an application or system to scale out and take advantage of other nodes in a network. Scaling horizontally generally means the system must be architected to be distributable. Attaining...