Parallel Programming vs. Concurrent Programming

Sanjay Kumar
2 min read · Feb 20, 2019

--

These two terms are best distinguished by understanding the difference between "executing simultaneously" and "in progress at the same time."

For instance, The Art of Concurrency defines the difference as follows:

A system is said to be concurrent if it can support two or more actions in progress at the same time. A system is said to be parallel if it can support two or more actions executing simultaneously. The key concept and difference between these definitions is the phrase “in progress.”

Even though this definition is precise, it is not immediately intuitive what "in progress" actually means.

In parallel programming, programs are written to use parallel hardware so that computation executes more quickly. The main concern is efficiency.

More concretely, parallel programming requires us to think about:

  • How do we decompose the original large problem into smaller sub-problems, so that multiple actions can execute at the same time and improve efficiency?
  • How do we make optimal use of the parallel hardware?

Applications such as matrix multiplication, data analysis, 3D rendering, and particle simulation are some examples of the parallel paradigm.

Using multiple processing resources (such as CPUs or cores) at once to solve a problem faster

A classic example in the programming world is a sorting algorithm in which several threads each sort part of an array.
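As a rough sketch of that idea (in Java; the class name, the two-way split, and the sample data are illustrative assumptions, not taken from the article), two threads each sort one half of an array in parallel, and the main thread merges the sorted halves:

```java
import java.util.Arrays;

public class ParallelSortSketch {
    public static void main(String[] args) throws InterruptedException {
        int[] data = {9, 4, 7, 1, 8, 3, 6, 2, 5, 0};
        int mid = data.length / 2;

        // Each thread sorts its own half of the array at the same time.
        Thread left  = new Thread(() -> Arrays.sort(data, 0, mid));
        Thread right = new Thread(() -> Arrays.sort(data, mid, data.length));
        left.start();
        right.start();
        left.join();   // wait until both halves are sorted
        right.join();

        // Merge the two sorted halves into the final result.
        int[] merged = new int[data.length];
        int i = 0, j = mid, k = 0;
        while (i < mid && j < data.length) {
            merged[k++] = (data[i] <= data[j]) ? data[i++] : data[j++];
        }
        while (i < mid) merged[k++] = data[i++];
        while (j < data.length) merged[k++] = data[j++];

        System.out.println(Arrays.toString(merged));
    }
}
```

On hardware with at least two free cores, the two sorting threads can genuinely run simultaneously, which is exactly what makes this parallel rather than merely concurrent.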

In contrast to the parallel paradigm, the concurrent paradigm does not require multiple actions to execute simultaneously; what matters is how the program manages actions that are in progress at the same time.

Concurrency cares about more than efficiency; its main concerns are modularity, responsiveness, and maintainability. In order to program concurrently, one must ask the following questions:

  • When can an execution start?
  • How can information exchange occur?
  • How does code manage access to shared resources?

Practical applications such as web servers, user interfaces, and databases are typical implementations of the concurrent paradigm. Even if some degree of parallelism is lost, the manageability of the system matters more in concurrent programming.

Multiple execution flows (e.g. threads) accessing a shared resource at the same time

A classic example in the programming world is multiple threads making changes to the same data structure.
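A minimal sketch of that situation (again in Java; the class name and the simple append workload are assumptions for illustration): two threads append to the same list, and a synchronized block manages access to the shared resource:

```java
import java.util.ArrayList;
import java.util.List;

public class SharedListSketch {
    // A plain ArrayList is not thread-safe, so access is guarded with a lock.
    private static final List<Integer> shared = new ArrayList<>();
    private static final Object lock = new Object();

    public static void main(String[] args) throws InterruptedException {
        Runnable writer = () -> {
            for (int i = 0; i < 1000; i++) {
                synchronized (lock) {   // manage access to the shared resource
                    shared.add(i);
                }
            }
        };

        Thread t1 = new Thread(writer);
        Thread t2 = new Thread(writer);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Without the synchronized block this would be a data race, and the
        // final size (or even program stability) would be unpredictable.
        System.out.println("Elements added: " + shared.size()); // 2000
    }
}
```

Note that this program is concurrent whether or not the two threads ever run on separate cores: the point is coordinating access to shared state, not speeding up the work.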

It is important to distinguish parallelism from concurrency when seeking an appropriate solution to a large-scale problem, even though the two terms are often used interchangeably in practice.
