Abstract
We present work assisting, a novel scheduling strategy for mixing data parallelism (loop parallelism) with task parallelism, where threads publish their current data-parallel activity in a shared array so that other threads can assist. In contrast to most existing work in this space, our algorithm aims to preserve the structure of data parallelism instead of implementing all parallelism as task parallelism. This enables the use of self-scheduling for data parallelism, as required by certain data-parallel algorithms, and exploits data parallelism only when task parallelism is insufficient. It provides full flexibility: neither the number of threads for a data-parallel loop nor the distribution of iterations over threads needs to be fixed before the loop starts. We present benchmarks demonstrating that, depending on the problem, our scheduling algorithm behaves similarly to, or outperforms, schedulers based purely on task parallelism.
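To make the idea concrete, the following is a minimal sketch (not the paper's implementation) of the mechanism the abstract describes: each thread publishes the data-parallel loop it is working on in a shared array, iterations are handed out by self-scheduling, and threads without a task of their own scan the array and assist. All names (`SharedLoop`, `activities`, `worker`) are illustrative assumptions, and a lock stands in for the atomic operations a real scheduler would use.

```python
import threading

class SharedLoop:
    """A self-scheduled data-parallel loop: any thread, owner or
    assistant, claims the next iteration from a shared counter."""
    def __init__(self, n, body):
        self.n, self.body = n, body
        self._next = 0
        self._lock = threading.Lock()  # stand-in for an atomic fetch-and-add

    def claim_and_run(self):
        """Claim and execute one iteration; False once the loop is exhausted."""
        with self._lock:
            if self._next >= self.n:
                return False
            i = self._next
            self._next += 1
        self.body(i)
        return True

NUM_THREADS = 4
# Shared array: slot t holds the loop thread t is currently working on.
activities = [None] * NUM_THREADS

def worker(tid, loop):
    if loop is not None:
        activities[tid] = loop           # publish own data-parallel activity
        while loop.claim_and_run():
            pass
        activities[tid] = None
    else:
        # No task of our own: scan published activities and assist.
        # An assistant that finds no published loop simply does nothing;
        # the owner still completes the loop on its own.
        for slot in range(NUM_THREADS):
            other = activities[slot]
            if other is not None:
                while other.claim_and_run():
                    pass

# Usage: thread 0 owns a loop computing squares; the rest assist.
results = [0] * 1000
loop = SharedLoop(1000, lambda i: results.__setitem__(i, i * i))
threads = [threading.Thread(target=worker, args=(0, loop))]
threads += [threading.Thread(target=worker, args=(t, None))
            for t in range(1, NUM_THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Note how neither the number of assisting threads nor the distribution of iterations is fixed before the loop starts: each iteration goes to whichever thread claims it next, which is the flexibility the abstract refers to.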
| Original language | English |
|---|---|
| Title of host publication | ARRAY 2024: Proceedings of the 10th ACM SIGPLAN International Workshop on Libraries, Languages and Compilers for Array Programming |
| Publisher | Association for Computing Machinery |
| Pages | 13-24 |
| Number of pages | 12 |
| ISBN (Print) | 979-8-4007-0620-2 |
| DOIs | |
| Publication status | Published - 20 Jun 2024 |
| Event | 10th ACM SIGPLAN International Workshop on Libraries, Languages and Compilers for Array Programming, ARRAY 2024, co-located with PLDI 2024 - Copenhagen, Denmark Duration: 25 Jun 2024 → … |
Conference
| Conference | 10th ACM SIGPLAN International Workshop on Libraries, Languages and Compilers for Array Programming, ARRAY 2024, co-located with PLDI 2024 |
|---|---|
| Country/Territory | Denmark |
| City | Copenhagen |
| Period | 25/06/24 → … |
Bibliographical note
Publisher Copyright: © 2024 Owner/Author.
Keywords
- Data parallelism
- Parallel computing
- Scheduling