Bayesian inference in phylogeny

From Wikipedia, the free encyclopedia

Bayesian inference in phylogeny generates a posterior distribution for a parameter, composed of a phylogenetic tree and a model of evolution, based on the prior for that parameter and the likelihood of the data, which is typically a multiple sequence alignment. The Bayesian approach has become more popular due to advances in computing power, especially Markov chain Monte Carlo (MCMC) algorithms. Bayesian inference has a number of applications in molecular phylogenetics, for example, estimation of species phylogeny and of species divergence times.

Basic Bayesian Theory

Recall that for Bayesian inference:

p(\theta|D) = \frac{p(D|\theta)\,p(\theta)}{p(D)}

The denominator p(D) is the marginal probability of the data, averaged over all possible parameter values weighted by their prior distribution. Formally,

p(D) = \int_{\Theta} p(D|\theta)\,p(\theta)\,d\theta

where \Theta is the parameter space for \theta.

In the original Metropolis algorithm, given a current \theta-value x and a proposed \theta-value y, the new value is accepted with probability \min\left(1, h(y)/h(x)\right), where

h(y)/h(x) = \frac{p(D|y)\,p(y)}{p(D|x)\,p(x)}

Note that the intractable marginal probability p(D) cancels from this ratio, which is what makes MCMC sampling from the posterior feasible.
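
As a concrete illustration, here is a minimal sketch of one Metropolis update in Python, assuming we can evaluate the unnormalized posterior h(x) = p(D|x)p(x); the names log_h and propose are hypothetical placeholders for a user-supplied log-density and a symmetric proposal.

import math
import random

def metropolis_step(x, log_h, propose):
    # One Metropolis update: propose y, accept with probability min(1, h(y)/h(x)).
    y = propose(x)                   # candidate from a symmetric proposal
    log_ratio = log_h(y) - log_h(x)  # log of h(y)/h(x); note that p(D) cancels
    if math.log(random.random()) < log_ratio:
        return y                     # accept the candidate
    return x                         # reject: keep the current value

Working on the log scale avoids numerical underflow when the likelihood is a product over many sites.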

The LOCAL algorithm of Larget and Simon

The LOCAL algorithm begins by selecting an internal branch of the tree at random. The nodes at the ends of this branch are each connected to two other branches. One of each pair is chosen at random. Imagine taking these three selected edges and stringing them like a clothesline from left to right, where the direction (left/right) is also selected at random. The two endpoints of the first branch selected will have a sub-tree hanging like a piece of clothing strung to the line. The algorithm proceeds by multiplying the three selected branches by a common random amount, akin to stretching or shrinking the clothesline. Finally the leftmost of the two hanging sub-trees is disconnected and reattached to the clothesline at a location selected uniformly at random. This is the candidate tree.

Suppose we began by selecting the internal branch with length t_8 (in Figure (a) (to be added)) that separates taxa A and B from the rest. Suppose also that we have (randomly) selected branches with lengths t_1 and t_9 from each side, and that we oriented these branches as shown in Figure (b). Let m = t_1 + t_8 + t_9 be the current length of the clothesline. We select the new length to be m^{\star} = m\exp(\lambda(U_1 - 0.5)), where U_1 is a uniform random variable on (0,1). Then for the LOCAL algorithm, the acceptance probability can be computed to be:

\frac{h(y)}{h(x)} \times \left(\frac{m^{\star}}{m}\right)^3
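
The following Python sketch shows the numerical core of one LOCAL update, assuming the three selected branch lengths t1, t8, t9 (notation from the example above) and a tuning constant lam have already been chosen; the tree bookkeeping (detaching and reattaching the sub-tree) is omitted.

import math
import random

def local_rescale(t1, t8, t9, lam):
    m = t1 + t8 + t9                         # current clothesline length
    u1 = random.random()
    m_star = m * math.exp(lam * (u1 - 0.5))  # stretch or shrink the line
    scale = m_star / m                       # all three branches share this factor
    new_lengths = (t1 * scale, t8 * scale, t9 * scale)
    attach_point = random.random() * m_star  # uniform reattachment location
    hastings = (m_star / m) ** 3             # factor multiplying h(y)/h(x) above
    return new_lengths, attach_point, hastings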

Assessing Convergence

Suppose we want to estimate the branch length of a 2-taxon tree under the Jukes-Cantor (JC) model, in which n_1 sites are unvaried and n_2 are variable. Assume an exponential prior distribution with rate \lambda. The prior density is p(t) = \lambda e^{-\lambda t}. The probabilities of the possible site patterns are:

\frac{1}{4}\left(\frac{1}{4} + \frac{3}{4}e^{-4t/3}\right)

for unvaried sites, and

\frac{1}{4}\left(\frac{1}{4} - \frac{1}{4}e^{-4t/3}\right)

for variable sites.

Thus the unnormalized posterior distribution is:

h(t) = \left(\frac{1}{4}\right)^{n_1+n_2} \left(\frac{1}{4} + \frac{3}{4}e^{-4t/3}\right)^{n_1} \left(\frac{1}{4} - \frac{1}{4}e^{-4t/3}\right)^{n_2} \lambda e^{-\lambda t}

We update the branch length by choosing a new value uniformly at random from a window of half-width w centered at the current value:

t^{\star} = |t + U|

where U is uniformly distributed between -w and w. The acceptance probability is:

\min\left(1, h(t^{\star})/h(t)\right)

Example: n_1 = 70, n_2 = 30. We will compare results for two values of w: w = 0.1 and w = 0.5. In each case, we will begin with an initial length of 5 and update the length 2000 times. (See Figure 3.2 (to be added) for results.)
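
A runnable Python sketch of this experiment follows; the prior rate lam = 1.0 is an assumption, as the text does not specify it.

import math
import random

def log_h(t, n1=70, n2=30, lam=1.0):
    # Log of the unnormalized posterior h(t); the constant (1/4)^(n1+n2) is dropped.
    same = 0.25 + 0.75 * math.exp(-4.0 * t / 3.0)  # unvaried-site pattern term
    diff = 0.25 - 0.25 * math.exp(-4.0 * t / 3.0)  # variable-site pattern term
    return n1 * math.log(same) + n2 * math.log(diff) - lam * t

def run_chain(w, t=5.0, n_updates=2000):
    samples = []
    for _ in range(n_updates):
        t_star = abs(t + random.uniform(-w, w))    # sliding window, reflected at 0
        if math.log(random.random()) < log_h(t_star) - log_h(t):
            t = t_star                             # accept the proposal
        samples.append(t)
    return samples

trace_small = run_chain(w=0.1)  # small window: high acceptance, slow exploration
trace_large = run_chain(w=0.5)  # larger window: moves away from t = 5 faster

Comparing the two traces illustrates the usual tuning trade-off: the small window accepts proposals often but drifts slowly away from the starting value, while the larger window mixes faster.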

Metropolis-coupled MCMC (Geyer)

If the target distribution has multiple peaks separated by low valleys, the Markov chain may have difficulty moving from one peak to another. As a result, the chain may get stuck on one peak and the resulting samples will not approximate the posterior density correctly. This is a serious practical concern for phylogeny reconstruction, as multiple local peaks are known to exist in tree space during heuristic tree search under maximum parsimony (MP), maximum likelihood (ML), and minimum evolution (ME) criteria, and the same can be expected for stochastic tree search using MCMC. Many strategies have been proposed to improve the mixing of Markov chains in the presence of multiple local peaks in the posterior density. One of the most successful algorithms is Metropolis-coupled MCMC (or \mathrm{MC}^3).

In this algorithm, m chains are run in parallel, with different stationary distributions \pi_j(\cdot), j = 1, 2, \ldots, m, where the first one, \pi_1 = \pi, is the target density, while \pi_j, j = 2, 3, \ldots, m, are chosen to improve mixing. For example, one can choose incremental heating of the form:

\pi_j(\theta) = \pi(\theta)^{1/[1+\lambda(j-1)]}, \ \ \lambda > 0,

so that the first chain is the cold chain with the correct target density, while chains 2, 3, \ldots, m are heated chains. Note that raising the density \pi(\cdot) to the power 1/T with T > 1 has the effect of flattening out the distribution, similar to heating a metal. In such a distribution, it is easier to traverse between peaks (separated by valleys) than in the original distribution. After each iteration, a swap of states between two randomly chosen chains is proposed through a Metropolis-type step. Let \theta^{(j)} be the current state in chain j, j = 1, 2, \ldots, m. A swap between the states of chains i and j is accepted with probability:

\alpha = \min\left(1, \frac{\pi_i(\theta^{(j)})\,\pi_j(\theta^{(i)})}{\pi_i(\theta^{(i)})\,\pi_j(\theta^{(j)})}\right)

At the end of the run, output from only the cold chain is used, while the samples from the hot chains are discarded. Heuristically, the hot chains will visit the local peaks rather easily, and swapping states between chains will let the cold chain occasionally jump valleys, leading to better mixing. However, if the ratio \pi_i(\theta)/\pi_j(\theta) is typically far from 1 (that is, adjacent chains differ too much), proposed swaps will seldom be accepted. This is the reason for using several chains which differ only incrementally. (See Figure 3.3 (to be added).)

An obvious disadvantage of the algorithm is that m chains are run and only one chain is used for inference. For this reason, \mathrm{MC}^3 is ideally suited for implementation on parallel machines, since each chain will in general require the same amount of computation per iteration.
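
Here is a minimal Python sketch of \mathrm{MC}^3 with incremental heating, reusing a log-density like log_h from the example above; the number of chains, heating parameter, and window width are illustrative choices, not values prescribed by the text.

import math
import random

def mc3(log_h, x0, m=4, lam=0.5, n_iter=10000, w=0.5):
    # Inverse temperatures beta_j = 1/[1 + lam(j-1)] for j = 1, ..., m (beta_1 = 1).
    betas = [1.0 / (1.0 + lam * j) for j in range(m)]
    states = [x0] * m
    cold_samples = []
    for _ in range(n_iter):
        # One within-chain Metropolis update per chain, on the heated density.
        for j in range(m):
            y = abs(states[j] + random.uniform(-w, w))
            if math.log(random.random()) < betas[j] * (log_h(y) - log_h(states[j])):
                states[j] = y
        # Propose swapping the states of two randomly chosen chains i and j;
        # with pi_j = pi^beta_j, the log of the swap ratio simplifies as below.
        i, j = random.sample(range(m), 2)
        log_alpha = (betas[i] - betas[j]) * (log_h(states[j]) - log_h(states[i]))
        if math.log(random.random()) < log_alpha:
            states[i], states[j] = states[j], states[i]
        cold_samples.append(states[0])  # only the cold chain is used for inference
    return cold_samples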


References

  • Geyer, C.J. (1991) Markov chain Monte Carlo maximum likelihood. In Computing Science and Statistics: Proceedings of the 23rd Symposium of the Interface (ed. E.M. Keramidas), pp. 156-163. Interface Foundation, Fairfax Station, VA.
  • Yang, Z. and B. Rannala. (1997) Bayesian phylogenetic inference using DNA sequences: A Markov chain Monte Carlo method. Molecular Biology and Evolution, 14, 717-724.
  • Larget, B. and D.L. Simon. (1999) Markov chain Monte Carlo algorithms for the Bayesian analysis of phylogenetic trees. Molecular Biology and Evolution, 16, 750-759.
  • Huelsenbeck, J.P. and F. Ronquist. (2001) MRBAYES: Bayesian inference of phylogenetic trees. Bioinformatics, 17, 754-755.
  • Ronquist, F. and J.P. Huelsenbeck. (2003) MrBayes 3: Bayesian phylogenetic inference under mixed models. Bioinformatics, 19, 1572-1574.
  • Rannala, B. and Z. Yang. (2003) Bayes estimation of species divergence times and ancestral population sizes using DNA sequences from multiple loci. Genetics, 164, 1645-1656.