When Should I Use Parallel Programming? Gustafson’s Law Explained

An optimistic perspective on Amdahl’s Law

Kelvin Muchiri
4 min read · Apr 18, 2023

In our previous article, we looked at how we can use Amdahl’s Law to determine when it is appropriate to use parallel programming.

Kindly read When Should I Use Parallel Programming? Amdahl’s Law Explained before reading this post.

Amdahl’s Law assumes that the size of our problem is fixed: given a faster machine, our goal is to run the same problem faster.

However, in the majority of cases, this is not entirely true. When given a faster machine, we would want to increase the size of the problem or increase the accuracy of the solution.

For example, if a task runs in 5 seconds with 3 units of work in its parallelizable part, then with twice as many processors we could double that parallel work from 3 units to 6 and the task would still finish in 5 seconds.

In Gustafson’s Law, the constant is not the size of the problem, but rather the time that we want to wait for the result. Gustafson observed that the parallel part of the problem scales with the problem size while the sequential part hardly does.

A real-life example would be game development and graphic design. Given a faster machine, programmers will include more graphical or user-friendly features.

Gustafson’s Law

If we apply N processors to a task with serial fraction s, and scale the workload so that the run time stays the same as before, the parallel part now performs N times as much work as a single processor would in that time. The scaled speedup is therefore:

scaled speedup = s + p x N
= s + (1 − s) x N
= N + (1 − N)s
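As a quick sanity check, the formula can be expressed directly in code (a Python sketch; the function name is my own):

```python
def gustafson_speedup(s, n):
    """Scaled speedup for serial fraction s on n processors: N + (1 - N)s."""
    return n + (1 - n) * s

# 30% serial work on 10 processors
print(round(gustafson_speedup(0.3, 10), 1))  # 7.3
```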

Gustafson’s Law in Action

Suppose we have a program that is 70% parallel and 30% sequential, and we have 10 processors.

Scaled speedup = N + (1 − N)s
= 10 + (1 − 10) x 0.3
= 7.3

Now suppose the same program runs on 1000 processors.

Scaled speedup = N + (1 − N)s
= 1000 + (1 − 1000) x 0.3
= 700.3

By Amdahl’s Law, the speedup over the same range of processors would only increase from about 2.7 to 3.3.
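To see the contrast side by side, here is a small sketch (Python; helper names are mine) computing both laws for this example:

```python
def gustafson_speedup(s, n):
    # Scaled workload: run time stays fixed, the problem grows with n.
    return n + (1 - n) * s

def amdahl_speedup(s, n):
    # Fixed workload: only the parallel fraction (1 - s) is sped up.
    return 1 / (s + (1 - s) / n)

s = 0.3  # 30% sequential
for n in (10, 1000):
    print(f"N={n}: Gustafson {gustafson_speedup(s, n):.1f}, "
          f"Amdahl {amdahl_speedup(s, n):.1f}")
```

With 100 times more processors, Gustafson predicts roughly 96 times more speedup, while Amdahl predicts barely 1.2 times more.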

Gustafson’s Law Caveat

For many programs, it is feasible to instrument and measure the fraction of time s spent in serial execution: place timers around the serial sections of the program and compare their elapsed time to the total run time.
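A minimal sketch of that instrumentation (Python; `serial_setup` and `parallel_work` are hypothetical stand-ins for real program phases):

```python
import time

def serial_setup():
    time.sleep(0.03)  # stand-in for inherently sequential work

def parallel_work():
    time.sleep(0.07)  # stand-in for the parallelizable portion

start = time.perf_counter()
serial_setup()
serial_elapsed = time.perf_counter() - start

parallel_work()
total_elapsed = time.perf_counter() - start

s = serial_elapsed / total_elapsed  # estimated serial fraction
print(f"estimated serial fraction s = {s:.2f}")
```

The resulting estimate of s plugs directly into either law’s formula.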

Based on this fraction, Amdahl’s Law and Gustafson’s Law can then be used to estimate speedup. However, neither of these laws considers communication costs or intermediate levels of parallelism.

The scaled speedup predicted by Gustafson’s Law is still limited as the number of processors increases, because communication costs eventually become so high that they negate any benefit gained from increasing the workload size.

In fact, both laws may be inaccurate when applied to modern parallel systems, since communication costs have a large effect on speedup.

Conclusion

Amdahl’s Law is pessimistic while Gustafson’s Law is optimistic. Looking at parallelism from Amdahl’s point of view can be discouraging. Gustafson showed that when we increase the amount of work done by the parallel parts, the bottleneck imposed by the sequential parts becomes less severe.

Amdahl and Gustafson emerged from different eras: Amdahl presented his arguments in 1967, while Gustafson challenged them in 1988.

Gustafson maintained that Amdahl’s Law was not flawed, as some journalists had presented it; rather, it was the correct answer to the wrong question: “How much can parallel processing reduce the run time of a current workload?”

Coming from a different generation, Gustafson offered a different way of thinking about the same problem.

This makes me wonder how many theories out there could be improved if we re-reviewed them from the perspective of today’s problems.

Thank you so much for reading. If you liked this article and would like to support my writing, you can buy me a coffee at https://ko-fi.com/kelvinmuchiri

Resources

[1]: John L. Gustafson. 1988. Reevaluating Amdahl’s law. Commun. ACM 31, 5 (May 1988), 532–533. https://doi.org/10.1145/42411.42415

[2]: Wikipedia. Gustafson’s Law https://en.wikipedia.org/wiki/Gustafson%27s_law

[3]: Gustafson, John. (2011). Gustafson’s Law. https://doi.org/10.1007/978-0-387-09766-4_78

