Item Details

Mixtures: Estimation and Applications

edited by Kerrie L. Mengersen, Christian P. Robert, D. Michael Titterington
Format: Book
Published: Hoboken, N.J. : Wiley, c2011.
Language: English
Series: Wiley Series in Probability and Statistics
ISBN: 9781119993896 (cloth), 111999389X (cloth)
Contents
  • Machine generated contents note: 1. EM algorithm, variational approximations and expectation propagation for mixtures / D. Michael Titterington
  • 1.1. Preamble
  • 1.2. EM algorithm
  • 1.2.1. Introduction to the algorithm
  • 1.2.2. E-step and the M-step for the mixing weights
  • 1.2.3. M-step for mixtures of univariate Gaussian distributions
  • 1.2.4. M-step for mixtures of regular exponential family distributions formulated in terms of the natural parameters
  • 1.2.5. Application to other mixtures
  • 1.2.6. EM as a double expectation
  • 1.3. Variational approximations
  • 1.3.1. Preamble
  • 1.3.2. Introduction to variational approximations
  • 1.3.3. Application of variational Bayes to mixture problems
  • 1.3.4. Application to other mixture problems
  • 1.3.5. Recursive variational approximations
  • 1.3.6. Asymptotic results
  • 1.4. Expectation-propagation
  • 1.4.1. Introduction
  • 1.4.2. Overview of the recursive approach to be adopted
  • 1.4.3. Finite Gaussian mixtures with an unknown mean parameter
  • 1.4.4. Mixture of two known distributions
  • 1.4.5. Discussion
  • Acknowledgements
  • References
  • 2. Online expectation maximisation / Olivier Cappé
  • 2.1. Introduction
  • 2.2. Model and assumptions
  • 2.3. EM algorithm and the limiting EM recursion
  • 2.3.1. Batch EM algorithm
  • 2.3.2. Limiting EM recursion
  • 2.3.3. Limitations of batch EM for long data records
  • 2.4. Online expectation maximisation
  • 2.4.1. Algorithm
  • 2.4.2. Convergence properties
  • 2.4.3. Application to finite mixtures
  • 2.4.4. Use for batch maximum-likelihood estimation
  • 2.5. Discussion
  • 3. Limiting distribution of the EM test of the order of a finite mixture / Pengfei Li
  • 3.1. Introduction
  • 3.2. Method and theory of the EM test
  • 3.2.1. Definition of the EM test statistic
  • 3.2.2. Limiting distribution of the EM test statistic
  • 3.3. Proofs
  • 3.4. Discussion
  • 4. Comparing Wald and likelihood regions applied to locally identifiable mixture models / Bruce G. Lindsay
  • 4.1. Introduction
  • 4.2. Background on likelihood confidence regions
  • 4.2.1. Likelihood regions
  • 4.2.2. Profile likelihood regions
  • 4.2.3. Alternative methods
  • 4.3. Background on simulation and visualisation of the likelihood regions
  • 4.3.1. Modal simulation method
  • 4.3.2. Illustrative example
  • 4.4. Comparison between the likelihood regions and the Wald regions
  • 4.4.1. Volume/volume error of the confidence regions
  • 4.4.2. Differences in univariate intervals via worst case analysis
  • 4.4.3. Illustrative example (revisited)
  • 4.5. Application to a finite mixture model
  • 4.5.1. Nonidentifiabilities and likelihood regions for the mixture parameters
  • 4.5.2. Mixture likelihood region simulation and visualisation
  • 4.5.3. Adequacy of using the Wald confidence region
  • 4.6. Data analysis
  • 4.7. Discussion
  • 5. Mixture of experts modelling with social science applications / Thomas Brendan Murphy
  • 5.1. Introduction
  • 5.2. Motivating examples
  • 5.2.1. Voting blocs
  • 5.2.2. Social and organisational structure
  • 5.3. Mixture models
  • 5.4. Mixture of experts models
  • 5.5. A mixture of experts model for ranked preference data
  • 5.5.1. Examining the clustering structure
  • 5.6. A mixture of experts latent position cluster model
  • 5.7. Discussion
  • 6. Modelling conditional densities using finite smooth mixtures / Robert Kohn
  • 6.1. Introduction
  • 6.2. Model and prior
  • 6.2.1. Smooth mixtures
  • 6.2.2. Component models
  • 6.2.3. Prior
  • 6.3. Inference methodology
  • 6.3.1. General MCMC scheme
  • 6.3.2. Updating β and I using variable-dimension finite-step Newton proposals
  • 6.3.3. Model comparison
  • 6.4. Applications
  • 6.4.1. A small simulation study
  • 6.4.2. LIDAR data
  • 6.4.3. Electricity expenditure data
  • 6.5. Conclusions
  • Appendix: Implementation details for the gamma and log-normal models
  • 7. Nonparametric mixed membership modelling using the IBP compound Dirichlet process / David M. Blei
  • 7.1. Introduction
  • 7.2. Mixed membership models
  • 7.2.1. Latent Dirichlet allocation
  • 7.2.2. Nonparametric mixed membership models
  • 7.3. Motivation
  • 7.4. Decorrelating prevalence and proportion
  • 7.4.1. Indian buffet process
  • 7.4.2. IBP compound Dirichlet process
  • 7.4.3. An application of the ICD: focused topic models
  • 7.4.4. Inference
  • 7.5. Related models
  • 7.6. Empirical studies
  • 7.7. Discussion
  • 8. Discovering nonbinary hierarchical structures with Bayesian rose trees / Katherine A. Heller
  • 8.1. Introduction
  • 8.2. Prior work
  • 8.3. Rose trees, partitions and mixtures
  • 8.4. Avoiding needless cascades
  • 8.4.1. Cluster models
  • 8.5. Greedy construction of Bayesian rose tree mixtures
  • 8.5.1. Prediction
  • 8.5.2. Hyperparameter optimisation
  • 8.6. Bayesian hierarchical clustering, Dirichlet process models and product partition models
  • 8.6.1. Mixture models and product partition models
  • 8.6.2. PCluster and Bayesian hierarchical clustering
  • 8.7. Results
  • 8.7.1. Optimality of tree structure
  • 8.7.2. Hierarchy likelihoods
  • 8.7.3. Partially observed data
  • 8.7.4. Psychological hierarchies
  • 8.7.5. Hierarchies of Gaussian process experts
  • 8.8. Discussion
  • 9. Mixtures of factor analysers for the analysis of high-dimensional data / Suren I. Rathnayake
  • 9.1. Introduction
  • 9.2. Single-factor analysis model
  • 9.3. Mixtures of factor analysers
  • 9.4. Mixtures of common factor analysers (MCFA)
  • 9.5. Some related approaches
  • 9.6. Fitting of factor-analytic models
  • 9.7. Choice of the number of factors q
  • 9.8. Example
  • 9.9. Low-dimensional plots via MCFA approach
  • 9.10. Multivariate t-factor analysers
  • 9.11. Discussion
  • Appendix
  • 10. Dealing with label switching under model uncertainty / Sylvia Frühwirth-Schnatter
  • 10.1. Introduction
  • 10.2. Labelling through clustering in the point-process representation
  • 10.2.1. Point-process representation of a finite mixture model
  • 10.2.2. Identification through clustering in the point-process representation
  • 10.3. Identifying mixtures when the number of components is unknown
  • 10.3.1. Role of Dirichlet priors in overfitting mixtures
  • 10.3.2. Meaning of K for overfitting mixtures
  • 10.3.3. Point-process representation of overfitting mixtures
  • 10.3.4. Examples
  • 10.4. Overfitting heterogeneity of component-specific parameters
  • 10.4.1. Overfitting heterogeneity
  • 10.4.2. Using shrinkage priors on the component-specific location parameters
  • 10.5. Concluding remarks
  • 11. Exact Bayesian analysis of mixtures / Kerrie L. Mengersen
  • 11.1. Introduction
  • 11.2. Formal derivation of the posterior distribution
  • 11.2.1. Locally conjugate priors
  • 11.2.2. True posterior distributions
  • 11.2.3. Poisson mixture
  • 11.2.4. Multinomial mixtures
  • 11.2.5. Normal mixtures
  • 12. Manifold MCMC for mixtures / Mark Girolami
  • 12.1. Introduction
  • 12.2. Markov chain Monte Carlo methods
  • 12.2.1. Metropolis-Hastings
  • 12.2.2. Gibbs sampling
  • 12.2.3. Manifold Metropolis adjusted Langevin algorithm
  • 12.2.4. Manifold Hamiltonian Monte Carlo
  • 12.3. Finite Gaussian mixture models
  • 12.3.1. Gibbs sampler for mixtures of univariate Gaussians
  • 12.3.2. Manifold MCMC for mixtures of univariate Gaussians
  • 12.3.3. Metric tensor
  • 12.3.4. An illustrative example
  • 12.4. Experiments
  • 12.5. Discussion
  • 13. How many components in a finite mixture? / Murray Aitkin
  • 13.1. Introduction
  • 13.2. Galaxy data
  • 13.3. Normal mixture model
  • 13.4. Bayesian analyses
  • 13.4.1. Escobar and West
  • 13.4.2. Phillips and Smith
  • 13.4.3. Roeder and Wasserman
  • 13.4.4. Richardson and Green
  • 13.4.5. Stephens
  • 13.5. Posterior distributions for K (for flat prior)
  • 13.6. Conclusions from the Bayesian analyses
  • 13.7. Posterior distributions of the model deviances
  • 13.8. Asymptotic distributions
  • 13.9. Posterior deviances for the galaxy data
  • 13.10. Conclusions
  • 14. Bayesian mixture models: a blood-free dissection of a sheep / Graham E. Gardner
  • 14.1. Introduction
  • 14.2. Mixture models
  • 14.2.1. Hierarchical normal mixture
  • 14.3. Altering dimensions of the mixture model
  • 14.4. Bayesian mixture model incorporating spatial information
  • 14.4.1. Results
  • 14.5. Volume calculation
  • 14.6. Discussion
  • References.
Description: xviii, 311 p. : ill. ; 24 cm.
Notes: Includes bibliographical references and index.
Technical Details
  • Staff View

    LEADER 10202cam a2200445 a 4500
    001 u5394923
    003 SIRSI
    005 20110803145434.0
    008 110125s2011 njua b 001 0 eng
    010     a| 2010053469
    020     a| 9781119993896 (cloth)
    020     a| 111999389X (cloth)
    035     a| (OCoLC)698450396
    040     a| DLC c| DLC d| YDX d| YDXCP d| BWX d| INU
    042     a| pcc
    050 00  a| QA273.6 b| .M59 2011
    082 00  a| 519.2/4 2| 22
    245 00  a| Mixtures : b| estimation and applications / c| edited by Kerrie L. Mengersen, Christian P. Robert, D. Michael Titterington.
    260     a| Hoboken, N.J. : b| Wiley, c| c2011.
    300     a| xviii, 311 p. : b| ill. ; c| 24 cm.
    490 1   a| Wiley series in probability and statistics
    504     a| Includes bibliographical references and index.
    505 00
    g| Machine generated contents note: g| 1. t| EM algorithm, variational approximations and expectation propagation for mixtures / r| D. Michael Titterington -- g| 1.1. t| Preamble -- g| 1.2. t| EM algorithm -- g| 1.2.1. t| Introduction to the algorithm -- g| 1.2.2. t| E-step and the M-step for the mixing weights -- g| 1.2.3. t| M-step for mixtures of univariate Gaussian distributions -- g| 1.2.4. t| M-step for mixtures of regular exponential family distributions formulated in terms of the natural parameters -- g| 1.2.5. t| Application to other mixtures -- g| 1.2.6. t| EM as a double expectation -- g| 1.3. t| Variational approximations -- g| 1.3.1. t| Preamble -- g| 1.3.2. t| Introduction to variational approximations -- g| 1.3.3. t| Application of variational Bayes to mixture problems -- g| 1.3.4. t| Application to other mixture problems -- g| 1.3.5. t| Recursive variational approximations -- g| 1.3.6. t| Asymptotic results -- g| 1.4. t| Expectation-propagation -- g| 1.4.1. t| Introduction -- g| 1.4.2. t| Overview of the recursive approach to be adopted
    505 00
    g| 1.4.3. t| Finite Gaussian mixtures with an unknown mean parameter -- g| 1.4.4. t| Mixture of two known distributions -- g| 1.4.5. t| Discussion -- t| Acknowledgements -- t| References -- g| 2. t| Online expectation maximisation / r| Olivier Cappe -- g| 2.1. t| Introduction -- g| 2.2. t| Model and assumptions -- g| 2.3. t| EM algorithm and the limiting EM recursion -- g| 2.3.1. t| Batch EM algorithm -- g| 2.3.2. t| Limiting EM recursion -- g| 2.3.3. t| Limitations of batch EM for long data records -- g| 2.4. t| Online expectation maximisation -- g| 2.4.1. t| Algorithm -- g| 2.4.2. t| Convergence properties -- g| 2.4.3. t| Application to finite mixtures -- g| 2.4.4. t| Use for batch maximum-likelihood estimation -- g| 2.5. t| Discussion -- t| References -- g| 3. t| Limiting distribution of the EM test of the order of a finite mixture / r| Pengfei Li -- g| 3.1. t| Introduction -- g| 3.2. t| Method and theory of the EM test -- g| 3.2.1. t| Definition of the EM test statistic -- g| 3.2.2. t| Limiting distribution of the EM test statistic -- g| 3.3. t| Proofs
    505 00
    g| 3.4. t| Discussion -- t| References -- g| 4. t| Comparing Wald and likelihood regions applied to locally identifiable mixture models / r| Bruce G. Lindsay -- g| 4.1. t| Introduction -- g| 4.2. t| Background on likelihood confidence regions -- g| 4.2.1. t| Likelihood regions -- g| 4.2.2. t| Profile likelihood regions -- g| 4.2.3. t| Alternative methods -- g| 4.3. t| Background on simulation and visualisation of the likelihood regions -- g| 4.3.1. t| Modal simulation method -- g| 4.3.2. t| Illustrative example -- g| 4.4. t| Comparison between the likelihood regions and the Wald regions -- g| 4.4.1. t| Volume/volume error of the confidence regions -- g| 4.4.2. t| Differences in univariate intervals via worst case analysis -- g| 4.4.3. t| Illustrative example (revisited) -- g| 4.5. t| Application to a finite mixture model -- g| 4.5.1. t| Nonidentifiabilities and likelihood regions for the mixture parameters -- g| 4.5.2. t| Mixture likelihood region simulation and visualisation -- g| 4.5.3. t| Adequacy of using the Wald confidence region
    505 00
    g| 4.6. t| Data analysis -- g| 4.7. t| Discussion -- t| References -- g| 5. t| Mixture of experts modelling with social science applications / r| Thomas Brendan Murphy -- g| 5.1. t| Introduction -- g| 5.2. t| Motivating examples -- g| 5.2.1. t| Voting blocs -- g| 5.2.2. t| Social and organisational structure -- g| 5.3. t| Mixture models -- g| 5.4. t| Mixture of experts models -- g| 5.5. t| A mixture of experts model for ranked preference data -- g| 5.5.1. t| Examining the clustering structure -- g| 5.6. t| A mixture of experts latent position cluster model -- g| 5.7. t| Discussion -- t| Acknowledgements -- t| References -- g| 6. t| Modelling conditional densities using finite smooth mixtures / r| Robert Kohn -- g| 6.1. t| Introduction -- g| 6.2. t| Model and prior -- g| 6.2.1. t| Smooth mixtures -- g| 6.2.2. t| Component models -- g| 6.2.3. t| Prior -- g| 6.3. t| Inference methodology -- g| 6.3.1. t| General MCMC scheme -- g| 6.3.2. t| Updating β and I using variable-dimension finite-step Newton proposals -- g| 6.3.3. t| Model comparison -- g| 6.4. t| Applications -- g| 6.4.1. t| A small simulation study
    505 00
    g| 6.4.2. t| LIDAR data -- g| 6.4.3. t| Electricity expenditure data -- g| 6.5. t| Conclusions -- t| Acknowledgements -- t| Appendix: Implementation details for the gamma and log-normal models -- t| References -- g| 7. t| Nonparametric mixed membership modelling using the IBP compound Dirichlet process / r| David M. Blei -- g| 7.1. t| Introduction -- g| 7.2. t| Mixed membership models -- g| 7.2.1. t| Latent Dirichlet allocation -- g| 7.2.2. t| Nonparametric mixed membership models -- g| 7.3. t| Motivation -- g| 7.4. t| Decorrelating prevalence and proportion -- g| 7.4.1. t| Indian buffet process -- g| 7.4.2. t| IBP compound Dirichlet process -- g| 7.4.3. t| An application of the ICD: focused topic models -- g| 7.4.4. t| Inference -- g| 7.5. t| Related models -- g| 7.6. t| Empirical studies -- g| 7.7. t| Discussion -- t| References -- g| 8. t| Discovering nonbinary hierarchical structures with Bayesian rose trees / r| Katherine A. Heller -- g| 8.1. t| Introduction -- g| 8.2. t| Prior work -- g| 8.3. t| Rose trees, partitions and mixtures -- g| 8.4. t| Avoiding needless cascades -- g| 8.4.1. t| Cluster models
    505 00
    g| 8.5. t| Greedy construction of Bayesian rose tree mixtures -- g| 8.5.1. t| Prediction -- g| 8.5.2. t| Hyperparameter optimisation -- g| 8.6. t| Bayesian hierarchical clustering, Dirichlet process models and product partition models -- g| 8.6.1. t| Mixture models and product partition models -- g| 8.6.2. t| PCluster and Bayesian hierarchical clustering -- g| 8.7. t| Results -- g| 8.7.1. t| Optimality of tree structure -- g| 8.7.2. t| Hierarchy likelihoods -- g| 8.7.3. t| Partially observed data -- g| 8.7.4. t| Psychological hierarchies -- g| 8.7.5. t| Hierarchies of Gaussian process experts -- g| 8.8. t| Discussion -- t| References -- g| 9. t| Mixtures of factor analysers for the analysis of high-dimensional data / r| Suren I. Rathnayake -- g| 9.1. t| Introduction -- g| 9.2. t| Single-factor analysis model -- g| 9.3. t| Mixtures of factor analysers -- g| 9.4. t| Mixtures of common factor analysers (MCFA) -- g| 9.5. t| Some related approaches -- g| 9.6. t| Fitting of factor-analytic models -- g| 9.7. t| Choice of the number of factors q -- g| 9.8. t| Example -- g| 9.9. t| Low-dimensional plots via MCFA approach
    505 00
    g| 9.10. t| Multivariate t-factor analysers -- g| 9.11. t| Discussion -- t| Appendix -- t| References -- g| 10. t| Dealing with label switching under model uncertainty / r| Sylvia Fruhwirth-Schnatter -- g| 10.1. t| Introduction -- g| 10.2. t| Labelling through clustering in the point-process representation -- g| 10.2.1. t| Point-process representation of a finite mixture model -- g| 10.2.2. t| Identification through clustering in the point-process representation -- g| 10.3. t| Identifying mixtures when the number of components is unknown -- g| 10.3.1. t| Role of Dirichlet priors in overfitting mixtures -- g| 10.3.2. t| Meaning of K for overfitting mixtures -- g| 10.3.3. t| Point-process representation of overfitting mixtures -- g| 10.3.4. t| Examples -- g| 10.4. t| Overfitting heterogeneity of component-specific parameters -- g| 10.4.1. t| Overfitting heterogeneity -- g| 10.4.2. t| Using shrinkage priors on the component-specific location parameters -- g| 10.5. t| Concluding remarks -- t| References -- g| 11. t| Exact Bayesian analysis of mixtures / r| Kerrie L. Mengersen
    505 00
    g| 11.1. t| Introduction -- g| 11.2. t| Formal derivation of the posterior distribution -- g| 11.2.1. t| Locally conjugate priors -- g| 11.2.2. t| True posterior distributions -- g| 11.2.3. t| Poisson mixture -- g| 11.2.4. t| Multinomial mixtures -- g| 11.2.5. t| Normal mixtures -- t| References -- g| 12. t| Manifold MCMC for mixtures / r| Mark Girolami -- g| 12.1. t| Introduction -- g| 12.2. t| Markov chain Monte Carlo Methods -- g| 12.2.1. t| Metropolis-Hastings -- g| 12.2.2. t| Gibbs sampling -- g| 12.2.3. t| Manifold Metropolis adjusted Langevin algorithm -- g| 12.2.4. t| Manifold Hamiltonian Monte Carlo -- g| 12.3. t| Finite Gaussian mixture models -- g| 12.3.1. t| Gibbs sampler for mixtures of univariate Gaussians -- g| 12.3.2. t| Manifold MCMC for mixtures of univariate Gaussians -- g| 12.3.3. t| Metric tensor -- g| 12.3.4. t| An illustrative example -- g| 12.4. t| Experiments -- g| 12.5. t| Discussion -- t| Acknowledgements -- t| Appendix -- t| References -- g| 13. t| How many components in a finite mixture? / r| Murray Aitkin -- g| 13.1. t| Introduction -- g| 13.2. t| Galaxy data -- g| 13.3. t| Normal mixture model
    505 00
    g| 13.4. t| Bayesian analyses -- g| 13.4.1. t| Escobar and West -- g| 13.4.2. t| Phillips and Smith -- g| 13.4.3. t| Roeder and Wasserman -- g| 13.4.4. t| Richardson and Green -- g| 13.4.5. t| Stephens -- g| 13.5. t| Posterior distributions for K (for flat prior) -- g| 13.6. t| Conclusions from the Bayesian analyses -- g| 13.7. t| Posterior distributions of the model deviances -- g| 13.8. t| Asymptotic distributions -- g| 13.9. t| Posterior deviances for the galaxy data -- g| 13.10. t| Conclusions -- t| References -- g| 14. t| Bayesian mixture models: a blood-free dissection of a sheep / r| Graham E. Gardner -- g| 14.1. t| Introduction -- g| 14.2. t| Mixture models -- g| 14.2.1. t| Hierarchical normal mixture -- g| 14.3. t| Altering dimensions of the mixture model -- g| 14.4. t| Bayesian mixture model incorporating spatial information -- g| 14.4.1. t| Results -- g| 14.5. t| Volume calculation -- g| 14.6. t| Discussion -- t| References.
    650  0  a| Mixture distributions (Probability theory)
    700 1   a| Mengersen, Kerrie L.
    700 1   a| Robert, Christian P., d| 1961-
    700 1   a| Titterington, D. M.
    830  0  a| Wiley series in probability and statistics.
    994     a| Z0 b| VA@
    596     a| 5
    999     a| QA273.6 .M59 2011 w| LC i| X030957782 k| CHECKEDOUT l| STACKS m| SCI-ENG t| BOOK

Availability

Library                         Location      Map   Availability   Call Number
Brown Science and Engineering   CHECKED OUT   N/A   Unavailable