Lower bounds for sparse matrix vector multiplication on hypercubic networks
Main Author:
Format: Article
Language: English
Published: Discrete Mathematics & Theoretical Computer Science, 1998-12-01
Series: Discrete Mathematics & Theoretical Computer Science
Online Access: http://www.dmtcs.org/dmtcs-ojs/index.php/dmtcs/article/view/91
Summary: In this paper we consider the problem of computing, on a local memory machine, the product y = Ax, where A is a random n×n sparse matrix with Θ(n) nonzero elements. To study the average-case communication cost of this problem, we introduce four different probability measures on the set of sparse matrices. We prove that on most local memory machines with p processors, this computation requires Ω((n/p) log p) time on the average. We prove that the same lower bound also holds, in the worst case, for matrices with only 2n or 3n nonzero elements.
ISSN: 1462-7264, 1365-8050
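For context, the kernel whose parallel communication cost the paper bounds is the sparse matrix-vector product y = Ax with Θ(n) nonzero entries. Below is a minimal sequential sketch of that kernel; the compressed sparse row (CSR) storage format and the toy 4×4 matrix are illustrative assumptions, not taken from the paper, which analyzes how the nonzeros and the vector entries must be distributed and communicated among the p processors of a local memory machine.

```c
/* Minimal sketch of y = A*x for a sparse A stored in CSR form.
 * The CSR layout and the example matrix are assumptions for
 * illustration only; this sequential code does not model the
 * parallel distribution analyzed in the paper. */
#include <stdio.h>

/* Compute y = A*x, with A given by CSR arrays (row_ptr, col_idx, val). */
static void spmv_csr(int n, const int *row_ptr, const int *col_idx,
                     const double *val, const double *x, double *y)
{
    for (int i = 0; i < n; i++) {
        double sum = 0.0;
        /* Accumulate only the nonzero entries of row i. */
        for (int k = row_ptr[i]; k < row_ptr[i + 1]; k++)
            sum += val[k] * x[col_idx[k]];
        y[i] = sum;
    }
}

int main(void)
{
    /* A 4x4 sparse matrix with 8 nonzeros, i.e. Theta(n) entries. */
    int row_ptr[]  = {0, 2, 4, 6, 8};
    int col_idx[]  = {0, 1, 1, 2, 0, 3, 2, 3};
    double val[]   = {4, -1, 3, 2, -2, 5, 1, 6};
    double x[]     = {1, 2, 3, 4};
    double y[4];

    spmv_csr(4, row_ptr, col_idx, val, x, y);
    for (int i = 0; i < 4; i++)
        printf("y[%d] = %g\n", i, y[i]);
    return 0;
}
```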