Embarrassingly parallel

From Wikipedia, the free encyclopedia
Problem easily divisible into parallel tasks

In parallel computing, an embarrassingly parallel workload or problem (also called embarrassingly parallelizable, perfectly parallel, delightfully parallel or pleasingly parallel) is one where little or no effort is needed to split the problem into a number of parallel tasks.[1] This is due to minimal or no dependency upon communication between the parallel tasks, or for results between them.[2]

These differ from distributed computing problems, which need communication between tasks, especially communication of intermediate results. They are easier to perform on server farms which lack the special infrastructure used in a true supercomputer cluster. They are well-suited to large, Internet-based volunteer computing platforms such as BOINC, and suffer less from parallel slowdown. The opposite of embarrassingly parallel problems are inherently serial problems, which cannot be parallelized at all.

A common example of an embarrassingly parallel problem is 3D video rendering handled by a graphics processing unit, where each frame (forward method) or pixel (ray tracing method) can be handled with no interdependency.[3] Some forms of password cracking are another embarrassingly parallel task that is easily distributed on central processing units, CPU cores, or clusters.
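
The per-pixel case can be sketched in a few lines. The following is an illustration only, with a made-up shade() function and an arbitrary image size, showing how each pixel depends solely on its own coordinates and therefore maps trivially onto a pool of workers:

    # Per-pixel independence: every pixel is computed from its own coordinates
    # alone, so the work splits across worker processes with no communication.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 640, 480

    def shade(pixel):
        # Placeholder "shading" that uses only this pixel's own coordinates.
        x, y = pixel
        return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 128)

    if __name__ == "__main__":
        coords = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
        with Pool() as pool:
            image = pool.map(shade, coords)    # each pixel handled independently
        print(len(image), "pixels rendered")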

Etymology


"Embarrassingly" is used here to refer to parallelization problems which are "embarrassingly easy".[4] The term may imply embarrassment on the part of developers or compilers: "Because so many important problems remain unsolved mainly due to their intrinsic computational complexity, it would be embarrassing not to develop parallel implementations of polynomialhomotopy continuation methods."[5] The term is first found in the literature in a 1986 book on multiprocessors byMATLAB's creatorCleve Moler,[6] who claims to have invented the term.[7]

An alternative term, pleasingly parallel, has gained some use, perhaps to avoid the negative connotations of embarrassment in favor of a positive reflection on the parallelizability of the problems: "Of course, there is nothing embarrassing about these programs at all."[8]

Examples


A trivial example involves serving static data. It would take very little effort to have many processing units produce the same set of bits. Indeed, the famous Hello World problem could easily be parallelized with few programming considerations or computational costs.
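
As a toy sketch of that point (an illustration only, not drawn from any particular system), several processes can each emit the same static output with no coordination or shared state at all:

    # Trivial case: four workers produce the same static output independently.
    from multiprocessing import Process

    def hello():
        print("Hello, World!")

    if __name__ == "__main__":
        workers = [Process(target=hello) for _ in range(4)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()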

Some examples of embarrassingly parallel problems include:

  • Monte Carlo analysis and other statistical simulations[9][10]
  • Brute-force searches in cryptography, such as testing many candidate passwords against a password-based key derivation function[11]
  • BLAST searches in bioinformatics run against split sequence databases[12]
  • Large-scale face recognition systems[13]
  • Genetic algorithms and other evolutionary computation, including massively parallel implementations on GPGPUs[14]
  • Parallel search in constraint programming[15]
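
A minimal sketch of the Monte Carlo case above, using only the Python standard library (the batch sizes and number of tasks are arbitrary choices for illustration): each task draws its own random samples independently, and the only coordination is summing the per-task counts at the end.

    # Monte Carlo estimation of pi: a classic embarrassingly parallel workload.
    # Each task uses its own random stream; results are combined only once.
    import random
    from multiprocessing import Pool

    def count_hits(task):
        seed, n_samples = task
        rng = random.Random(seed)              # independent random stream per task
        hits = 0
        for _ in range(n_samples):
            x, y = rng.random(), rng.random()
            if x * x + y * y <= 1.0:           # point falls inside the quarter circle
                hits += 1
        return hits

    if __name__ == "__main__":
        tasks = [(seed, 250_000) for seed in range(8)]   # eight independent batches
        with Pool() as pool:
            total = sum(pool.map(count_hits, tasks))     # no inter-task communication
        print("pi is approximately", 4 * total / sum(n for _, n in tasks))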

Implementations

  • In R (programming language) – The Simple Network of Workstations (SNOW) package implements a simple mechanism for using a set of workstations or a Beowulf cluster for embarrassingly parallel computations.[16] Similar R packages include "future", "parallel" and others.
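
For comparison only (this is not part of the R packages above), the same "apply a function over a pool of workers" pattern can be sketched with Python's standard concurrent.futures module; the simulate() task is a made-up placeholder for one independent unit of work:

    # The worker-pool pattern: each task runs on its own worker and never
    # interacts with the others; results are simply collected at the end.
    from concurrent.futures import ProcessPoolExecutor

    def simulate(seed):
        # Placeholder for one independent unit of work (e.g. one replicate).
        return seed * seed

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=4) as executor:
            results = list(executor.map(simulate, range(100)))
        print(sum(results))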

See also


References

  1. ^ Herlihy, Maurice; Shavit, Nir (2012). The Art of Multiprocessor Programming, Revised Reprint (revised ed.). Elsevier. p. 14. ISBN 9780123977953. Retrieved 28 February 2016. "Some computational problems are 'embarrassingly parallel': they can easily be divided into components that can be executed concurrently."
  2. ^ Section 1.4.4 of: Foster, Ian (1995). Designing and Building Parallel Programs. Addison–Wesley. ISBN 9780201575941. Archived from the original on 2011-03-01.
  3. ^ Alan Chalmers; Erik Reinhard; Tim Davis (21 March 2011). Practical Parallel Rendering. CRC Press. ISBN 978-1-4398-6380-0.
  4. ^ Matloff, Norman (2011). The Art of R Programming: A Tour of Statistical Software Design, p. 347. No Starch. ISBN 9781593274108.
  5. ^ Leykin, Anton; Verschelde, Jan; Zhuang, Yan (2006). "Parallel Homotopy Algorithms to Solve Polynomial Systems". Mathematical Software – ICMS 2006. Lecture Notes in Computer Science. Vol. 4151. pp. 225–234. doi:10.1007/11832225_22. ISBN 978-3-540-38084-9.
  6. ^ Moler, Cleve (1986). "Matrix Computation on Distributed Memory Multiprocessors". In Heath, Michael T. (ed.). Hypercube Multiprocessors. Society for Industrial and Applied Mathematics, Philadelphia. ISBN 978-0898712094.
  7. ^ The Intel hypercube, part 2, reposted on Cleve's Corner blog on The MathWorks website.
  8. ^ Kepner, Jeremy (2009). Parallel MATLAB for Multicore and Multinode Computers, p. 12. SIAM. ISBN 9780898716733.
  9. ^ Erricos John Kontoghiorghes (21 December 2005). Handbook of Parallel Computing and Statistics. CRC Press. ISBN 978-1-4200-2868-3.
  10. ^ Yuefan Deng (2013). Applied Parallel Computing. World Scientific. ISBN 978-981-4307-60-4.
  11. ^ Josefsson, Simon; Percival, Colin (August 2016). "The scrypt Password-Based Key Derivation Function". tools.ietf.org. doi:10.17487/RFC7914. Retrieved 2016-12-12.
  12. ^ Mathog, DR (22 September 2003). "Parallel BLAST on split databases". Bioinformatics. 19 (14): 1865–6. doi:10.1093/bioinformatics/btg250. PMID 14512366.
  13. ^ How we made our face recognizer 25 times faster (developer blog post).
  14. ^ Shigeyoshi Tsutsui; Pierre Collet (5 December 2013). Massively Parallel Evolutionary Computation on GPGPUs. Springer Science & Business Media. ISBN 978-3-642-37959-8.
  15. ^ Youssef Hamadi; Lakhdar Sais (5 April 2018). Handbook of Parallel Constraint Reasoning. Springer. ISBN 978-3-319-63516-3.
  16. ^ Simple Network of Workstations (SNOW) package.

External links

Look up embarrassingly parallel in Wiktionary, the free dictionary.