Matrix multiplication

From Wikipedia, the free encyclopedia

This article gives an overview of the various ways to perform matrix multiplication.

Ordinary matrix product

By far the most important way to multiply matrices is the usual matrix multiplication. It is defined between two matrices only if the number of columns of the first matrix is the same as the number of rows of the second matrix. If A is an m-by-n matrix and B is an n-by-p matrix, then their product is an m-by-p matrix denoted by AB (or sometimes A · B). The product is given by

(AB)_{ij} = \sum_{r=1}^n a_{ir}b_{rj} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj}

for each pair i and j with 1 ≤ i ≤ m and 1 ≤ j ≤ p. The algebraic system of "matrix units" summarises the abstract properties of this kind of multiplication.
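The definition translates directly into code. A minimal sketch with plain nested lists (`matmul` is an illustrative name, not from any library):

```python
def matmul(A, B):
    """Multiply an m-by-n matrix A by an n-by-p matrix B (lists of rows)."""
    m, n, p = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "columns of A must equal rows of B"
    # (AB)[i][j] = sum over r of A[i][r] * B[r][j]
    return [[sum(A[i][r] * B[r][j] for r in range(n)) for j in range(p)]
            for i in range(m)]
```

Applied to the 2×3 and 3×2 matrices used in an example later in this article, `matmul([[1, 0, 2], [-1, 3, 1]], [[3, 1], [2, 1], [1, 0]])` yields `[[5, 1], [4, 2]]`.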

Calculating directly from the definition

To calculate, for example, the (1,2) element and the (3,3) element of AB when A is a 4×2 matrix and B is a 2×3 matrix: the elements of the relevant row of A are paired off with the elements of the relevant column of B, each pair is multiplied, and the products are added. The location of the resulting number in AB corresponds to the row and column that were considered.

(AB)_{1,2} = \sum_{r=1}^2 a_{1,r}b_{r,2} = a_{1,1}b_{1,2}+a_{1,2}b_{2,2}
(AB)_{3,3} = \sum_{r=1}^2 a_{3,r}b_{r,3} = a_{3,1}b_{1,3}+a_{3,2}b_{2,3}
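A single entry can be computed straight from the definition. The 4×2 and 2×3 matrices below are hypothetical values chosen for illustration (the article's original figure is not reproduced here):

```python
A = [[1, 2],
     [3, 4],
     [5, 6],
     [7, 8]]
B = [[1, 0, 2],
     [0, 1, 3]]

def entry(A, B, i, j):
    """(AB)_{i,j} with 1-based indices, straight from the definition."""
    return sum(A[i - 1][r] * B[r][j - 1] for r in range(len(B)))

ab_12 = entry(A, B, 1, 2)  # a_{1,1}b_{1,2} + a_{1,2}b_{2,2} = 1*0 + 2*1 = 2
ab_33 = entry(A, B, 3, 3)  # a_{3,1}b_{1,3} + a_{3,2}b_{2,3} = 5*2 + 6*3 = 28
```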

The coefficients-vectors method

This matrix multiplication can also be considered from a slightly different viewpoint: it adds the rows of the second matrix together after each has been multiplied by a coefficient taken from the first. If A and B are matrices given by:

\mathbf{A} =   \begin{bmatrix}    a_{1,1} & a_{1,2} & \dots \\    a_{2,1} & a_{2,2} & \dots \\    \vdots & \vdots & \ddots \end{bmatrix} and \mathbf{B} =   \begin{bmatrix}    b_{1,1} & b_{1,2} & \dots \\    b_{2,1} & b_{2,2} & \dots \\    \vdots & \vdots & \ddots \end{bmatrix}

then

\mathbf{AB} =  \begin{bmatrix}    a_{1,1} \begin{bmatrix} b_{1,1} & b_{1,2} & \dots \end{bmatrix} + a_{1,2} \begin{bmatrix} b_{2,1} & b_{2,2} & \dots \end{bmatrix} + \cdots \\\\    a_{2,1} \begin{bmatrix} b_{1,1} & b_{1,2} & \dots \end{bmatrix} + a_{2,2} \begin{bmatrix} b_{2,1} & b_{2,2} & \dots \end{bmatrix} + \cdots \\    \vdots \end{bmatrix}

For example:

\begin{bmatrix}      1 & 0 & 2 \\       -1 & 3 & 1   \end{bmatrix} \cdot   \begin{bmatrix}      3 & 1 \\      2 & 1 \\      1 & 0   \end{bmatrix} = \begin{bmatrix}    1 \begin{bmatrix} 3 & 1 \end{bmatrix} + 0 \begin{bmatrix} 2 & 1 \end{bmatrix} + 2 \begin{bmatrix} 1 & 0 \end{bmatrix} \\    -1 \begin{bmatrix} 3 & 1 \end{bmatrix} + 3 \begin{bmatrix} 2 & 1 \end{bmatrix} + 1 \begin{bmatrix} 1 & 0 \end{bmatrix} \end{bmatrix} = \begin{bmatrix}    \begin{bmatrix} 3 & 1 \end{bmatrix} +   \begin{bmatrix} 0 & 0 \end{bmatrix} +   \begin{bmatrix} 2 & 0 \end{bmatrix} \\    \begin{bmatrix} -3 & -1 \end{bmatrix} + \begin{bmatrix} 6 & 3 \end{bmatrix} +   \begin{bmatrix} 1 & 0 \end{bmatrix} \end{bmatrix}
= \begin{bmatrix}     5 & 1 \\     4 & 2 \end{bmatrix}

Each row of the matrix on the left is a list of coefficients, and the matrix on the right is a list of vectors. In the example, the first row is [1 0 2], so we take 1 times the first vector, 0 times the second vector, and 2 times the third vector.
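This row-combination view can be sketched as follows (the helper names are illustrative):

```python
def matmul_rows(A, B):
    """Each row of AB is a linear combination of the rows of B,
    with coefficients taken from the corresponding row of A."""
    def scale(c, v):           # coefficient times vector
        return [c * x for x in v]
    def add(u, v):             # vector plus vector
        return [x + y for x, y in zip(u, v)]
    result = []
    for coeffs in A:
        row = [0] * len(B[0])
        for c, b_row in zip(coeffs, B):
            row = add(row, scale(c, b_row))
        result.append(row)
    return result
```

On the example above, `matmul_rows([[1, 0, 2], [-1, 3, 1]], [[3, 1], [2, 1], [1, 0]])` reproduces `[[5, 1], [4, 2]]`.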

Vector-lists method

The ordinary matrix product can be thought of as a dot product of a column-list of vectors and a row-list of vectors. If A and B are matrices given by:

\mathbf{A} =   \begin{bmatrix}    a_{1,1} & a_{1,2} & a_{1,3} & \dots \\    a_{2,1} & a_{2,2} & a_{2,3} & \dots \\    a_{3,1} & a_{3,2} & a_{3,3} & \dots \\    \vdots & \vdots & \vdots & \ddots \end{bmatrix} = \begin{bmatrix}    A_1 \\    A_2 \\    A_3 \\    \vdots \end{bmatrix} and \mathbf{B} =   \begin{bmatrix}    b_{1,1} & b_{1,2} & b_{1,3} & \dots \\    b_{2,1} & b_{2,2} & b_{2,3} & \dots \\    b_{3,1} & b_{3,2} & b_{3,3} & \dots \\    \vdots & \vdots & \vdots & \ddots \end{bmatrix} = \begin{bmatrix} B_1 & B_2 & B_3 & \dots \end{bmatrix}

where

A1 is the vector of all elements of the form a_{1,x}, A2 is the vector of all elements of the form a_{2,x}, etc.,
and B1 is the vector of all elements of the form b_{x,1}, B2 is the vector of all elements of the form b_{x,2}, etc.,

then

\mathbf{AB} =   \begin{bmatrix}    A_1 \\    A_2 \\    A_3 \\    \vdots \end{bmatrix} * \begin{bmatrix} B_1 & B_2 & B_3 & \dots \end{bmatrix} =  \begin{bmatrix} (A_1 \cdot B_1) & (A_1 \cdot B_2) & (A_1 \cdot B_3) & \dots \\ (A_2 \cdot B_1) & (A_2 \cdot B_2) & (A_2 \cdot B_3) & \dots \\ (A_3 \cdot B_1) & (A_3 \cdot B_2) & (A_3 \cdot B_3) & \dots \\ \vdots & \vdots & \vdots & \ddots  \end{bmatrix}
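As a sketch, this view computes each entry as the dot product of a row of A with a column of B:

```python
def matmul_dots(A, B):
    """(AB)[i][j] is the dot product of row i of A with column j of B."""
    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))
    cols_of_B = list(zip(*B))   # transpose: the columns of B as tuples
    return [[dot(row, col) for col in cols_of_B] for row in A]
```

It gives the same result as the other formulations, e.g. `matmul_dots([[1, 0, 2], [-1, 3, 1]], [[3, 1], [2, 1], [1, 0]])` is `[[5, 1], [4, 2]]`.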

Properties

Matrix multiplication is not commutative (that is, AB ≠ BA), except in special cases. It is easy to see why: you cannot expect to swap the roles of the coefficient matrix and the vector matrix and get the same result. The shapes also show why the order of the factors matters: the number of columns in the coefficient matrix has to equal the number of rows in the vector matrix, because both have to count the same number of vectors.

Although matrix multiplication is not commutative, the determinants of AB and BA are always equal (if A and B are square matrices of the same size). See the article on determinants for an explanation.

This notion of multiplication is important because if A and B are interpreted as linear transformations (which is almost universally done), then the matrix product AB corresponds to the composition of the two linear transformations, with B being applied first.

Additionally, all notions of matrix multiplication described here share a set of common properties, described below.

Algorithms

The complexity of matrix multiplication, if carried out naively, is O(n^3), but more efficient algorithms do exist. Strassen's algorithm, devised by Volker Strassen in 1969 and often referred to as "fast matrix multiplication", is based on a clever way of multiplying two 2 × 2 matrices which requires only 7 multiplications (instead of the usual 8). Applying this trick recursively gives an algorithm with a cost of O(n^{\log_{2}7}) \approx O(n^{2.807}). In practice, though, it is rarely used, since it is awkward to implement and it lacks numerical stability. The constant factor implied in the big O notation is about 4.695.
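For illustration, the seven products of Strassen's 2 × 2 base case can be written out directly (these are the standard formulas; `strassen_2x2` is a hypothetical name, and a full implementation would apply this recursively to matrix blocks, with padding for odd sizes):

```python
def strassen_2x2(A, B):
    """Strassen's scheme for 2x2 matrices: 7 multiplications instead of 8."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # Recombine the seven products into the four entries of AB.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]
```

For example, `strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]])` gives `[[19, 22], [43, 50]]`, matching the ordinary product.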

The algorithm with the lowest known exponent, presented by Don Coppersmith and Shmuel Winograd in 1990, has an asymptotic complexity of O(n^{2.376}). It is similar to Strassen's algorithm: a clever way is devised for multiplying two k × k matrices with fewer than k^3 multiplications, and this technique is applied recursively. It improves on the constant factor in Strassen's algorithm, reducing it to 4.537. However, the constant term implied in the O(n^{2.376}) result is so large that the Coppersmith–Winograd algorithm is only worthwhile for matrices that are too big to handle on present-day computers.

Since any algorithm for multiplying two n × n matrices has to process all 2n^2 entries, there is an asymptotic lower bound of Ω(n^2) operations. Most researchers believe that an optimal algorithm will run in essentially O(n^2) time (Robinson, 2005). Raz proves a lower bound of Ω(m^2 log m) for bounded coefficient arithmetic circuits over the real or complex numbers.[1]

Scalar multiplication

The scalar multiplication of a matrix A = (aij) and a scalar r gives a product rA of the same size as A. The entries of rA are given by

(rA)_{ij} = r \cdot a_{ij}. \,

If we are concerned with matrices over a ring, then the above multiplication is sometimes called the left multiplication while the right multiplication is defined to be

(Ar)_{ij} = a_{ij} \cdot r. \,

When the underlying ring is commutative, for example, the real or complex number field, the two multiplications are the same. However, if the ring is not commutative, such as the quaternions, they may be different. For example

i\begin{bmatrix}      i & 0 \\      0 & j \\    \end{bmatrix} = \begin{bmatrix}     -1 & 0 \\      0 & k \\   \end{bmatrix} \ne \begin{bmatrix}     -1 & 0 \\     0 & -k \\   \end{bmatrix} = \begin{bmatrix}     i & 0 \\     0 & j \\   \end{bmatrix}i
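Over a commutative ring of scalars (ordinary numbers, as in the sketch below) the two multiplications agree; over the quaternions, as the example above shows, they can differ:

```python
def scalar_left(r, A):
    """Left scalar multiplication: (rA)_{ij} = r * a_{ij}."""
    return [[r * a for a in row] for row in A]

def scalar_right(A, r):
    """Right scalar multiplication: (Ar)_{ij} = a_{ij} * r."""
    return [[a * r for a in row] for row in A]

M = [[1, 2], [3, 4]]
# For commutative scalars the two results coincide.
assert scalar_left(2, M) == scalar_right(M, 2) == [[2, 4], [6, 8]]
```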

Hadamard product

For two matrices of the same dimensions, we have the Hadamard product or entrywise product. The Hadamard product of two m-by-n matrices A and B, denoted by A • B, is an m-by-n matrix given by (A • B)_{ij} = a_{ij}b_{ij}. For instance

\begin{bmatrix}     1 & 3 & 2 \\      1 & 0 & 0 \\      1 & 2 & 2   \end{bmatrix} \bullet   \begin{bmatrix}      0 & 0 & 2 \\      7 & 5 & 0 \\      2 & 1 & 1   \end{bmatrix} =   \begin{bmatrix}      1 \cdot 0 & 3 \cdot 0 & 2 \cdot 2 \\      1 \cdot 7 & 0 \cdot 5 & 0 \cdot 0 \\      1 \cdot 2 & 2 \cdot 1 & 2 \cdot 1   \end{bmatrix} =   \begin{bmatrix}      0 & 0 & 4 \\      7 & 0 & 0 \\      2 & 2 & 2   \end{bmatrix}
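A one-line sketch of the entrywise product:

```python
def hadamard(A, B):
    """Entrywise (Hadamard) product of two same-sized matrices."""
    return [[a * b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
```

On the 3 × 3 example above, `hadamard` reproduces `[[0, 0, 4], [7, 0, 0], [2, 2, 2]]`.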

Note that the Hadamard product is a submatrix of the Kronecker product (see below). The Hadamard product is studied by matrix theorists, but it is virtually untouched by linear algebraists. It is discussed in (Horn & Johnson, 1994, Ch. 5).

Kronecker product

Main article: Kronecker product.

For any two arbitrary matrices A and B, we have the direct product or Kronecker product A ⊗ B defined as

\begin{bmatrix}      a_{11}B & a_{12}B & \cdots & a_{1n}B \\      \vdots & \vdots & \ddots & \vdots \\      a_{m1}B & a_{m2}B & \cdots & a_{mn}B   \end{bmatrix}.

Note that if A is m-by-n and B is p-by-r then A ⊗ B is an mp-by-nr matrix. Again this multiplication is not commutative.

For example

\begin{bmatrix}      1 & 2 \\      3 & 1 \\    \end{bmatrix} \otimes   \begin{bmatrix}      0 & 3 \\      2 & 1 \\    \end{bmatrix} =   \begin{bmatrix}      1\cdot 0 & 1\cdot 3 & 2\cdot 0 & 2\cdot 3 \\      1\cdot 2 & 1\cdot 1 & 2\cdot 2 & 2\cdot 1 \\      3\cdot 0 & 3\cdot 3 & 1\cdot 0 & 1\cdot 3 \\      3\cdot 2 & 3\cdot 1 & 1\cdot 2 & 1\cdot 1 \\    \end{bmatrix}  =   \begin{bmatrix}      0 & 3 & 0 & 6 \\      2 & 1 & 4 & 2 \\     0 & 9 & 0 & 3 \\     6 & 3 & 2 & 1   \end{bmatrix}.
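A sketch of the Kronecker product for matrices stored as lists of rows; row (i, s) and column (j, t) of the result hold a_{ij}·b_{st}:

```python
def kronecker(A, B):
    """Kronecker product: block (i, j) of the result is a_{ij} * B."""
    return [[A[i][j] * B[s][t]
             for j in range(len(A[0])) for t in range(len(B[0]))]
            for i in range(len(A)) for s in range(len(B))]
```

On the example above, `kronecker([[1, 2], [3, 1]], [[0, 3], [2, 1]])` reproduces `[[0, 3, 0, 6], [2, 1, 4, 2], [0, 9, 0, 3], [6, 3, 2, 1]]`.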

If A and B represent linear transformations V1 → W1 and V2 → W2, respectively, then A ⊗ B represents the tensor product of the two maps, V1 ⊗ V2 → W1 ⊗ W2.

Common properties

All three notions of matrix multiplication are associative:

A(BC) = (AB)C

and distributive:

A(B + C) = AB + AC, and
(A + B)C = AC + BC.

and compatible with scalar multiplication:

c(AB) = (cA)B,
(Ac)B = A(cB), and
(AB)c = A(Bc).

Note that these three expressions will be equal to each other only if multiplication and addition on the scalar field are commutative, i.e. the scalar field is a commutative ring. See Scalar multiplication above for a counterexample, such as the quaternions.
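These identities can be spot-checked numerically. A sketch with arbitrary example matrices (not taken from the text):

```python
def matmul(A, B):
    """Ordinary matrix product (lists of rows)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def madd(A, B):
    """Entrywise matrix sum."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [1, 1]]

# Associativity and both distributive laws for the ordinary product.
assert matmul(A, matmul(B, C)) == matmul(matmul(A, B), C)
assert matmul(A, madd(B, C)) == madd(matmul(A, B), matmul(A, C))
assert matmul(madd(A, B), C) == madd(matmul(A, C), matmul(B, C))
```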

Frobenius inner product

The Frobenius inner product, sometimes denoted A:B, is the component-wise inner product of two matrices regarded as vectors; in other words, it is the sum of the entries of the Hadamard product. That is,

A:B=\sum_i\sum_j A_{ij} B_{ij} = \operatorname{trace}(A^T B) = \operatorname{trace}(A B^T).

This inner product induces the Frobenius norm.
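A short sketch of the Frobenius inner product and the norm it induces:

```python
def frobenius_inner(A, B):
    """A:B = sum over i, j of A[i][j] * B[i][j]."""
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

def frobenius_norm(A):
    """The norm induced by the Frobenius inner product: sqrt(A:A)."""
    return frobenius_inner(A, A) ** 0.5
```

For instance, `frobenius_norm([[3, 4], [0, 0]])` is `sqrt(9 + 16) = 5.0`.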

References

  • Strassen, Volker (1969). "Gaussian Elimination is not Optimal". Numerische Mathematik 13, pp. 354–356.
  • Coppersmith, Don; Winograd, Shmuel (1990). "Matrix multiplication via arithmetic progressions". Journal of Symbolic Computation 9, pp. 251–280.
  • Horn, Roger; Johnson, Charles (1994). Topics in Matrix Analysis. Cambridge University Press.
  • Robinson, Sara (2005). "Toward an Optimal Algorithm for Matrix Multiplication". SIAM News 38(9), November 2005.
  • Drineas, Petros; Kannan, Ravi (2001). "Fast Monte-Carlo Algorithms for Approximate Matrix Multiplication". FOCS 2001.
  • Drineas, Petros; Kannan, Ravi; Mahoney, Michael (2006). "Fast Monte Carlo Algorithms for Matrices I: Approximating Matrix Multiplication". SIAM Journal on Computing.
  • Cohen, Edith; Lewis, David D. (1997). "Approximating Matrix Multiplication for Pattern Recognition Tasks". SODA 1997.
