Maximum-likelihood estimation of the random-clumped multinomial model as a prototype problem for large-scale statistical computing

Date

2012-05-08

Citation of Original Publication

Andrew M. Raim, Matthias K. Gobbert, Nagaraj K. Neerchal & Jorge G. Morel (2013) Maximum-likelihood estimation of the random-clumped multinomial model as a prototype problem for large-scale statistical computing, Journal of Statistical Computation and Simulation, 83:12, 2178-2194, DOI: 10.1080/00949655.2012.684095

Rights

This item is likely protected under Title 17 of the U.S. Copyright Law. Unless the item carries a Creative Commons license, contact the copyright holder or the author for uses protected by copyright law.
This is the submitted manuscript of an article published by Taylor & Francis in the Journal of Statistical Computation and Simulation on 2012-05-08, available online: http://www.tandfonline.com/10.1080/00949655.2012.684095.

Abstract

Numerical methods are needed to obtain maximum-likelihood estimates (MLEs) in many problems. Even with modern computing power, computation time can be an issue for some likelihoods. We consider one such problem, in which the assumed model is a random-clumped multinomial distribution. We compute MLEs for this model in parallel using the Toolkit for Advanced Optimization (TAO) software library. The computations are performed on a distributed-memory cluster with a low-latency interconnect. We demonstrate that, for larger problems, scaling the number of processes significantly improves wall-clock time. An illustrative example shows how parallel MLE computation can be useful in a large data analysis. Our experience with a direct numerical approach indicates that more substantial gains may be obtained by exploiting the specific structure of the random-clumped model.
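
To fix ideas, the sketch below shows what "numerical MLE" means in the simplest terms: minimizing a negative log-likelihood with a general-purpose optimizer. It is a hypothetical, serial illustration only, using a toy two-component multinomial mixture as a stand-in for the random-clumped multinomial and SciPy's BFGS routine in place of the paper's parallel TAO implementation; the model, data, and function names are illustrative, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, logsumexp

def log_multinomial(x, logp):
    """Multinomial log-density for one vector of counts x."""
    n = x.sum()
    return gammaln(n + 1) - gammaln(x + 1).sum() + (x * logp).sum()

def neg_loglik(theta, X):
    """Negative log-likelihood of a two-component multinomial mixture
    (toy stand-in for the random-clumped multinomial)."""
    k = X.shape[1]
    # Softmax reparameterization keeps each probability vector on the simplex.
    logp1 = theta[:k] - logsumexp(theta[:k])
    logp2 = theta[k:2 * k] - logsumexp(theta[k:2 * k])
    w = 1.0 / (1.0 + np.exp(-theta[-1]))  # mixing weight in (0, 1)
    ll = 0.0
    for x in X:
        ll += logsumexp([np.log(w) + log_multinomial(x, logp1),
                         np.log1p(-w) + log_multinomial(x, logp2)])
    return -ll

rng = np.random.default_rng(0)
X = rng.multinomial(20, [0.5, 0.3, 0.2], size=200)   # toy count data
theta0 = 0.1 * rng.standard_normal(2 * X.shape[1] + 1)  # jitter breaks symmetry
fit = minimize(neg_loglik, theta0, args=(X,), method="BFGS")
print(fit.success, fit.fun)  # convergence flag and minimized negative log-likelihood
```

In the paper's setting, the likelihood evaluation itself is the expensive step for large data sets, which is what motivates distributing it across many processes; the serial loop over observations above is exactly the kind of work that would be partitioned in a parallel implementation.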