Statistical independence
In probability theory, to say that two events are independent reflects the simple intuition that knowing whether one of the events has occurred gives no information about the probability of the other. For example, getting a "1" on the first throw of a die and getting a "1" again on the second throw are independent events.
Similarly, when we say that two random variables are independent, we intuitively mean that knowing something about the value of one of them does not yield any information about the value of the other. For example, the number appearing on the upward face of a die the first time it is thrown and the number appearing the second time are independent.
Independent events
If two events A and B are independent, then the conditional probability of A given B is the same as the "unconditional" (or "marginal") probability of A, i.e.,
- P(A | B) = P(A).
There are at least two reasons why this statement is not taken to be the definition of independence: (1) the two events A and B do not play symmetrical roles in this statement, and (2) problems arise with this statement when events of probability 0 are involved.
When one recalls that the conditional probability P(A | B) is given by
- P(A | B) = P(A ∩ B) / P(B),
one sees that the statement above is equivalent to
- P(A ∩ B) = P(A) P(B).
Here A ∩ B is the intersection of A and B, i.e., it is the event that both events A and B occur. Thus the standard definition says:
- Two events A and B are independent if and only if P(A ∩ B) = P(A) P(B).
More generally, any collection of events -- possibly more than just two of them -- are mutually independent precisely if for any finite subset A1, ..., An of the collection we have
- P(A1 ∩ ... ∩ An) = P(A1) · ... · P(An).
This is called the multiplication rule for independent events.
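As a concrete check of this rule, the following sketch enumerates the 36 equally likely outcomes of two throws of a fair die (the dice events from the introduction) and verifies that P(A ∩ B) = P(A) P(B) holds exactly. The helper `prob` and the event names are only illustrative, and Python is used merely as a convenient calculator.

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of two throws of a fair die.
omega = set(product(range(1, 7), repeat=2))

def prob(event):
    # Probability of an event (a subset of omega) under the uniform measure.
    return Fraction(len(event & omega), len(omega))

A = {w for w in omega if w[0] == 1}   # first throw shows "1"
B = {w for w in omega if w[1] == 1}   # second throw shows "1"

# Multiplication rule holds exactly: 1/36 = 1/6 * 1/6.
assert prob(A & B) == prob(A) * prob(B)
print(prob(A & B), prob(A) * prob(B))   # 1/36 1/36
```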
If every pair in a collection of events or random variables is independent, the whole collection may nonetheless fail to be mutually independent; this weaker property is called pairwise independence. A standard example is sketched below.
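The sketch uses the classic XOR construction: two fair coin flips together with their exclusive or. Enumerating the four equally likely outcomes shows that every pair of the three events satisfies the multiplication rule while the triple does not. The event names A1, A2, A3 and the helper `prob` are only illustrative.

```python
from itertools import product
from fractions import Fraction

# Two fair coin flips, plus a third bit equal to their XOR.
omega = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

def prob(pred):
    return Fraction(sum(1 for w in omega if pred(w)), len(omega))

A1 = lambda w: w[0] == 1
A2 = lambda w: w[1] == 1
A3 = lambda w: w[2] == 1

# Every pair of events satisfies the multiplication rule ...
assert prob(lambda w: A1(w) and A2(w)) == prob(A1) * prob(A2)
assert prob(lambda w: A1(w) and A3(w)) == prob(A1) * prob(A3)
assert prob(lambda w: A2(w) and A3(w)) == prob(A2) * prob(A3)

# ... but the triple does not: P(A1 ∩ A2 ∩ A3) = 0, while the product is 1/8.
print(prob(lambda w: A1(w) and A2(w) and A3(w)))   # 0
print(prob(A1) * prob(A2) * prob(A3))              # 1/8
```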
Independent random variables
Two random variables X and Y are independent iff for any numbers a and b the events [X ≤ a] and [Y ≤ b] are independent events as defined above. Similarly an arbitrary collection of random variables -- possibly more than just two of them -- is independent precisely if for any finite collection X1, ..., Xn and any finite set of numbers a1, ..., an, the events [X1 ≤ a1], ..., [Xn ≤ an] are independent events as defined above.
The measure-theoretically inclined may prefer to substitute events [X ∈ A] for events [X ≤ a] in the above definition, where A is any Borel set. That definition is exactly equivalent to the one above when the values of the random variables are real numbers. It has the advantage of working also for complex-valued random variables or for random variables taking values in any topological space.
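As a quick numerical illustration of this definition, one can draw independent samples and compare the empirical probability of the joint event [X ≤ a] ∩ [Y ≤ b] with the product of the marginal probabilities. The normal and exponential distributions and the thresholds a, b below are arbitrary choices made only for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X and Y are sampled independently (distributions chosen only for illustration).
x = rng.normal(size=n)
y = rng.exponential(size=n)
a, b = 0.5, 1.0

p_x = np.mean(x <= a)                 # empirical P[X <= a]
p_y = np.mean(y <= b)                 # empirical P[Y <= b]
p_xy = np.mean((x <= a) & (y <= b))   # empirical P[X <= a and Y <= b]

print(p_xy, p_x * p_y)   # the two values agree up to Monte Carlo error
```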
If X and Y are independent, then the expectation operator E has the nice property
- E[X · Y] = E[X] · E[Y]
and for the variance we have
- var(X + Y) = var(X) + var(Y).
If X and Y are independent, the covariance cov(X, Y) is zero; in general one has
- var(X + Y) = var(X) + var(Y) + 2 cov(X, Y).
(The converse, that X and Y must be independent whenever their covariance is zero, is not true. See uncorrelated.)
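These identities are easy to check numerically. The sketch below draws independent samples for X and Y (the normal and uniform distributions are chosen purely for illustration) and compares both sides of each identity; agreement is only up to Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Independent samples for X and Y (distributions chosen only for illustration).
x = rng.normal(loc=2.0, scale=1.0, size=n)
y = rng.uniform(0.0, 3.0, size=n)

print(np.mean(x * y), np.mean(x) * np.mean(y))    # E[X·Y] vs E[X]·E[Y]
print(np.var(x + y), np.var(x) + np.var(y))       # var(X+Y) vs var(X)+var(Y)
print(np.cov(x, y)[0, 1])                         # cov(X, Y), close to 0
```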
Furthermore, if X and Y are independent and have probability densities fX(x) and fY(y), then the combined random variable (X,Y) has a joint density
- fX,Y(x, y) = fX(x) · fY(y).
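For instance, if X and Y are taken to be independent Gaussians (an arbitrary choice for illustration), the joint density of the pair (X, Y) evaluated at any point equals the product of the two marginal densities, which can be checked with scipy:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Independent X ~ N(0, 1) and Y ~ N(0, 4); with a diagonal covariance matrix
# the pair (X, Y) is a bivariate normal whose components are independent.
f_x = norm(loc=0.0, scale=1.0)
f_y = norm(loc=0.0, scale=2.0)
joint = multivariate_normal(mean=[0.0, 0.0], cov=np.diag([1.0, 4.0]))

x, y = 0.3, -1.2
print(joint.pdf([x, y]))          # joint density at (x, y)
print(f_x.pdf(x) * f_y.pdf(y))    # product of the marginal densities, same value
```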
Conditionally independent random variables
We define random variables X and Y to be conditionally independent given random variable Z if
- P[(X ∈ A) & (Y ∈ B) | Z ∈ C] = P[X ∈ A | Z ∈ C] · P[Y ∈ B | Z ∈ C]
for any Borel subsets A, B and C of the real numbers.
If X and Y are conditionally independent given Z, then
- P[(X ∈ A) | (Y ∈ B) & (Z ∈ C)] = P[(X ∈ A) | (Z ∈ C)]
for any Borel subsets A, B and C of the real numbers. That is, given Z, the value of Y does not add any additional information about the value of X.
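A common way such conditional independence arises is through a common cause: if X and Y each equal Z plus independent noise, they are dependent marginally but conditionally independent given Z. The sketch below illustrates this numerically; the Gaussian choices are arbitrary, and conditioning on a narrow interval for Z only approximates conditioning on Z itself.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

# Z is a common cause; X and Y each add independent noise to Z.
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = z + rng.normal(size=n)

# A narrow interval for Z stands in (approximately) for conditioning on Z.
in_c = (z >= 0.0) & (z <= 0.1)
a = x <= 0.0   # event [X <= 0]
b = y <= 0.0   # event [Y <= 0]

# Conditionally on Z, the product rule holds (up to Monte Carlo error) ...
print(np.mean((a & b)[in_c]), np.mean(a[in_c]) * np.mean(b[in_c]))
# ... while unconditionally X and Y are clearly dependent:
print(np.mean(a & b), np.mean(a) * np.mean(b))
```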
Independence can be seen as a special kind of conditional independence, since probability can be seen as a kind of conditional probability given no events.