Title: VDS: A Variant of Δ-stepping Algorithm for Parallel SSSP Problem
Abstract: Δ-stepping is a well-known parallel algorithm for the single-source shortest path problem. It requires a tuning parameter (delta) to achieve a good trade-off between parallelism and work efficiency. The performance of Δ-stepping changes drastically with the value of delta, and a poor choice of delta leads to an inefficient algorithm. For large graphs, finding the best-performing value of delta is difficult. This paper proposes a variant of the Δ-stepping algorithm (VDS). We have evaluated the proposed algorithm on Graph500 data sets. Our results show that the proposed algorithm is as work-efficient and scalable as Δ-stepping, while its performance remains almost stable as delta changes. Relative to the best-performing value of delta, VDS's performance across different deltas varies by up to 136%, whereas Δ-stepping's performance varies by up to 430%. For the best-performing value of delta, the proposed algorithm is competitive with, or slightly more efficient than, Δ-stepping; for the most inefficient delta, the proposed algorithm is 2.8–3.6x faster than Δ-stepping.
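To make the role of the delta parameter concrete, the following is a minimal sequential sketch of the classic Δ-stepping algorithm (Meyer and Sanders), not the VDS variant proposed in the paper. Vertices are grouped into buckets of width delta; edges of weight at most delta ("light" edges) are relaxed repeatedly within a bucket, while heavier edges are relaxed once per bucket. The graph representation and function names here are illustrative assumptions.

```python
import math
from collections import defaultdict

def delta_stepping(graph, source, delta):
    """Sequential sketch of classic Delta-stepping.
    graph: adjacency dict {u: [(v, weight), ...]} with nonnegative weights."""
    dist = defaultdict(lambda: math.inf)
    dist[source] = 0.0
    buckets = defaultdict(set)   # bucket index -> set of vertices
    buckets[0].add(source)

    def relax(v, d):
        # Move v into the bucket corresponding to its new tentative distance.
        if d < dist[v]:
            if dist[v] < math.inf:
                buckets[int(dist[v] // delta)].discard(v)
            dist[v] = d
            buckets[int(d // delta)].add(v)

    while True:
        nonempty = [i for i, b in buckets.items() if b]
        if not nonempty:
            break
        i = min(nonempty)        # process the lowest non-empty bucket
        settled = set()
        while buckets[i]:
            frontier, buckets[i] = buckets[i], set()
            settled |= frontier
            for u in frontier:
                for v, w in graph.get(u, []):
                    if w <= delta:           # light edge: may re-enter bucket i
                        relax(v, dist[u] + w)
        for u in settled:
            for v, w in graph.get(u, []):
                if w > delta:                # heavy edge: relaxed once per bucket
                    relax(v, dist[u] + w)
    return dict(dist)
```

A small delta makes the algorithm behave like Dijkstra's (work-efficient, little parallelism per bucket); a large delta approaches Bellman–Ford (ample parallelism, more wasted re-relaxations). This sensitivity is the trade-off the paper's VDS variant aims to flatten.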
Publication Year: 2022
Publication Date: 2022-07-29
Language: en
Type: article
Indexed In: Crossref