Basic Linear Algebra Subprograms
Basic Linear Algebra Subprograms (BLAS) are routines that perform basic linear algebra operations such as vector and matrix multiplication. They were first published in 1979 and are used to build larger packages such as LAPACK. Hardware and software vendors, particularly in high-performance and supercomputing markets, tune BLAS routines to run extremely fast across a variety of problem sizes. The LINPACK benchmark relies heavily on DGEMM, a BLAS subroutine, for its performance.
Functionality
The BLAS functionality is divided into three levels: 1, 2 and 3.
Level 1
This level contains vector operations of the form
y ← αx + y
as well as scalar dot products and vector norms, among other things.
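To make these operations concrete, the following minimal sketch calls the double-precision Level 1 routines DAXPY (y ← αx + y), DDOT (dot product) and DNRM2 (Euclidean norm) through the C interface to the BLAS (CBLAS). It assumes a cblas.h header such as the one provided by the reference implementation or ATLAS; the exact link line depends on which implementation listed below is installed.

    #include <stdio.h>
    #include <cblas.h>   /* C interface to the BLAS (CBLAS) */

    int main(void)
    {
        double x[3] = {1.0, 2.0, 3.0};
        double y[3] = {4.0, 5.0, 6.0};

        /* Level 1: y <- alpha*x + y (DAXPY), stride 1 in both vectors */
        cblas_daxpy(3, 2.0, x, 1, y, 1);

        /* Level 1: dot product x.y (DDOT) and Euclidean norm of x (DNRM2) */
        double dot  = cblas_ddot(3, x, 1, y, 1);
        double norm = cblas_dnrm2(3, x, 1);

        printf("y = [%g %g %g], dot = %g, ||x|| = %g\n",
               y[0], y[1], y[2], dot, norm);
        return 0;
    }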
Level 2
This level contains matrix-vector operations of the form
y ← αAx + βy
as well as solving Tx = y for x, with T being triangular, among other things.
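As an illustration, the sketch below calls the double-precision Level 2 routines DGEMV (y ← αAx + βy) and DTRSV (triangular solve) through the CBLAS interface. The 2×2 matrix and right-hand side are made-up example data, and a cblas.h header from any of the implementations listed below is assumed.

    #include <stdio.h>
    #include <cblas.h>

    int main(void)
    {
        /* 2x2 matrix A stored in row-major order */
        double A[4] = {1.0, 2.0,
                       3.0, 4.0};
        double x[2] = {1.0, 1.0};
        double y[2] = {0.0, 0.0};

        /* Level 2: y <- alpha*A*x + beta*y (DGEMV) */
        cblas_dgemv(CblasRowMajor, CblasNoTrans, 2, 2,
                    1.0, A, 2, x, 1, 0.0, y, 1);

        /* Level 2: solve T*z = y in place (DTRSV); the upper triangle of A
           is used as T and y is overwritten with the solution z */
        cblas_dtrsv(CblasRowMajor, CblasUpper, CblasNoTrans, CblasNonUnit,
                    2, A, 2, y, 1);

        printf("solution = [%g %g]\n", y[0], y[1]);
        return 0;
    }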
Level 3
This level contains matrix-matrix operations of the form
C ← αAB + βC
as well as solving B ← αT⁻¹B for triangular matrices T, among other things. This level contains the widely used General Matrix Multiply (GEMM) operation.
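The sketch below calls DGEMM, the double-precision General Matrix Multiply mentioned above, through the CBLAS interface on small made-up 2×2 matrices; it assumes a cblas.h header from one of the implementations listed in the next section.

    #include <stdio.h>
    #include <cblas.h>

    int main(void)
    {
        /* Level 3: C <- alpha*A*B + beta*C with 2x2 row-major matrices (DGEMM) */
        double A[4] = {1.0, 2.0,
                       3.0, 4.0};
        double B[4] = {5.0, 6.0,
                       7.0, 8.0};
        double C[4] = {0.0, 0.0,
                       0.0, 0.0};

        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    2, 2, 2,          /* M, N, K        */
                    1.0, A, 2,        /* alpha, A, lda  */
                    B, 2,             /* B, ldb         */
                    0.0, C, 2);       /* beta, C, ldc   */

        printf("C = [%g %g; %g %g]\n", C[0], C[1], C[2], C[3]);
        return 0;
    }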
Implementations
- refblas: the official reference implementation from Netlib, available in C and Fortran 77 versions.[1]
- ACML: the AMD Core Math Library, supporting the AMD Athlon and Opteron CPUs under Linux and Windows.[2]
- ATLAS: Automatically Tuned Linear Algebra Software, an open-source implementation of the BLAS APIs for C and Fortran 77.[3]
- ESSL: IBM's Engineering and Scientific Subroutine Library, supporting the PowerPC architecture under AIX and Linux.[4]
- Goto BLAS: Kazushige Goto's implementation of BLAS.[5]
- HP MLIB: HP's math library, supporting the IA-64, PA-RISC, x86 and Opteron architectures under HP-UX and Linux.
- IMKL: the Intel Math Kernel Library, supporting the Intel Pentium and Itanium CPUs under Linux and Windows.[6]
- Sun Performance Library: optimized BLAS and LAPACK for the SPARC and AMD64 architectures under Solaris 8, 9, and 10.[7]
- MathKeisan: NEC's math library, supporting the NEC SX architecture under SUPER-UX, and Itanium under Linux.[8]
- uBLAS: a generic C++ template class library providing BLAS functionality; part of the Boost library.[9]
- CUDA SDK: the NVIDIA CUDA SDK includes BLAS functionality for writing C programs that run on GeForce 8 Series graphics cards.
The Sparse BLAS
Sparse extensions to the dense BLAS described above also exist.
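To show the kind of operation such sparse routines provide, here is a hand-written sketch of a sparse matrix-vector product y = Ax with A stored in compressed sparse row (CSR) form. It only illustrates the computation; it does not use the standardized Sparse BLAS interface itself.

    #include <stdio.h>

    /* Hand-rolled sparse matrix-vector product y = A*x with A in
       compressed sparse row (CSR) form; illustrative only, not the
       standardized Sparse BLAS interface. */
    static void csr_matvec(int n, const int *row_ptr, const int *col_idx,
                           const double *val, const double *x, double *y)
    {
        for (int i = 0; i < n; i++) {
            double sum = 0.0;
            for (int k = row_ptr[i]; k < row_ptr[i + 1]; k++)
                sum += val[k] * x[col_idx[k]];
            y[i] = sum;
        }
    }

    int main(void)
    {
        /* 3x3 matrix [[4,0,1],[0,3,0],[2,0,5]] stored in CSR form */
        int    row_ptr[] = {0, 2, 3, 5};
        int    col_idx[] = {0, 2, 1, 0, 2};
        double val[]     = {4.0, 1.0, 3.0, 2.0, 5.0};
        double x[]       = {1.0, 2.0, 3.0};
        double y[3];

        csr_matvec(3, row_ptr, col_idx, val, x, y);
        printf("y = [%g %g %g]\n", y[0], y[1], y[2]);
        return 0;
    }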
See also
- ATLAS (Automatically Tuned Linear Algebra Software), an automatically tuned implementation of BLAS
- Numerical linear algebra, the type of problem BLAS solves
External links
- BLAS homepage on Netlib.org
- BLAS FAQ
- BLAS operations from the GNU Scientific Library reference manual
- BLAS Quick Reference Guide from LAPACK Users' Guide
- Lawson Oral History: One of the original authors of the BLAS discusses its creation in an oral history interview. Charles L. Lawson, oral history interview by Thomas Haigh, 6 and 7 November 2004, San Clemente, California. Society for Industrial and Applied Mathematics, Philadelphia, PA.
- Dongarra Oral History: In an oral history interview, Jack Dongarra explores the early relationship of BLAS to LINPACK, the creation of higher-level BLAS versions for new architectures, and his later work on the ATLAS system to automatically optimize BLAS for particular machines. Jack Dongarra, oral history interview by Thomas Haigh, 26 April 2005, University of Tennessee, Knoxville, TN. Society for Industrial and Applied Mathematics, Philadelphia, PA.
- An Overview of the Sparse Basic Linear Algebra Subprograms: The New Standard from the BLAS Technical Forum[10]