Parallel programming is easy to understand and use when the units of work are completely independent. It is the interaction between concurrent tasks that is challenging, and that interaction demands a plan for managing what the tasks share. The seemingly most fundamental form of sharing, data in shared memory, gives rise to more challenges in concurrent programming than anything else. Every parallel computer design struggles to offer some relief, with solutions ranging from simple to exotic, but in all cases the best results come from reduced sharing and the worst from unnecessary, frequent, fine-grained sharing of data.
Nothing is more fundamental to parallel programming than understanding sharing, scaling, and the relationship between them. Understanding sharing and how to manage it is the key to parallel programming: less is better.