Work Assisting: Linking Task-Parallel Work Stealing with Data-Parallel Self Scheduling

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

We present work assisting, a novel scheduling strategy for mixing data parallelism (loop parallelism) with task parallelism, in which threads share their current data-parallel activity in a shared array so that other threads can assist. In contrast to most existing work in this space, our algorithm aims to preserve the structure of data parallelism instead of implementing all parallelism as task parallelism. This enables the use of self-scheduling for data parallelism, as required by certain data-parallel algorithms, and exploits data parallelism only when task parallelism is insufficient. It provides full flexibility: neither the number of threads for a data-parallel loop nor the distribution of work over threads needs to be fixed before the loop starts. We present benchmarks demonstrating that, depending on the problem, our scheduling algorithm behaves similarly to, or outperforms, schedulers based purely on task parallelism.
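The core idea in the abstract — threads publish their current data-parallel loop in a shared array, and idle threads self-schedule chunks of it — can be sketched as follows. This is a minimal illustration under assumed names (`SharedLoop`, `activity`, `assist` are hypothetical and not the paper's actual API), using a lock in place of the atomics a real scheduler would use:

```python
import threading

class SharedLoop:
    """A data-parallel loop published so other threads can assist.
    Iterations are handed out by self-scheduling: each thread claims
    the next unclaimed chunk of indices. (Illustrative sketch only.)"""
    def __init__(self, body, size):
        self.body = body        # function applied to each index
        self.size = size
        self._next = 0          # next unclaimed index
        self._lock = threading.Lock()  # stand-in for an atomic counter

    def claim(self, chunk):
        """Claim up to `chunk` iterations; returns a half-open range."""
        with self._lock:
            start = self._next
            end = min(start + chunk, self.size)
            self._next = end
        return start, end

def assist(loop, chunk=4):
    """Repeatedly claim and execute chunks until the loop is exhausted.
    Both the loop's owner and assisting threads run this same routine."""
    while True:
        start, end = loop.claim(chunk)
        if start >= end:
            return
        for i in range(start, end):
            loop.body(i)

# Shared array of current activities: slot i holds thread i's active loop.
activity = [None]

# Example: square 16 numbers. The owner publishes its loop in the shared
# array; a second thread finds it there and assists with the same loop.
out = [0] * 16
loop = SharedLoop(lambda i: out.__setitem__(i, i * i), len(out))
activity[0] = loop                       # owner publishes its activity
owner = threading.Thread(target=assist, args=(loop,))
helper = threading.Thread(target=assist, args=(activity[0],))
owner.start(); helper.start()
owner.join(); helper.join()
activity[0] = None                       # loop finished, clear the slot
```

Because every index is claimed exactly once, the result is deterministic regardless of how the two threads interleave; only the distribution of iterations over threads varies, which is precisely the flexibility the abstract describes.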
Original language: English
Title of host publication: ARRAY 2024: Proceedings of the 10th ACM SIGPLAN International Workshop on Libraries, Languages and Compilers for Array Programming
Publisher: Association for Computing Machinery
Pages: 13-24
Number of pages: 12
ISBN (Print): 979-8-4007-0620-2
DOIs
Publication status: Published - 20 Jun 2024
Event: 10th ACM SIGPLAN International Workshop on Libraries, Languages and Compilers for Array Programming, ARRAY 2024, co-located with PLDI 2024 - Copenhagen, Denmark
Duration: 25 Jun 2024 → …

Conference

Conference: 10th ACM SIGPLAN International Workshop on Libraries, Languages and Compilers for Array Programming, ARRAY 2024, co-located with PLDI 2024
Country/Territory: Denmark
City: Copenhagen
Period: 25/06/24 → …

Bibliographical note

Publisher Copyright:
© 2024 Owner/Author.

Keywords

  • Data Parallelism
  • Parallel computing
  • Scheduling
