Efficient Estimation Techniques for Model Parameters Under Bayesian and Non-Bayesian Frameworks
Conference
Category: Young Statisticians
Proposal Description
Efficient estimation techniques for model parameters differ substantially between Bayesian and non-Bayesian frameworks. In non-Bayesian (frequentist) approaches, methods such as Maximum Likelihood Estimation (MLE), the Method of Moments (MoM), the Generalized Method of Moments (GMM), and Least Squares Estimation (LSE) are commonly employed. MLE maximizes the likelihood function and yields point estimates, with uncertainty assessed separately through asymptotic approximations rather than a full distribution. MoM and GMM estimate parameters by matching sample moments to theoretical moment conditions, with GMM offering greater flexibility when the number of conditions exceeds the number of parameters. LSE minimizes the sum of squared errors and is the standard approach in linear regression.

In the Bayesian framework, techniques such as Markov Chain Monte Carlo (MCMC), Variational Inference (VI), the Bayesian bootstrap, and empirical Bayes methods are widely used. MCMC generates samples from the posterior distribution and accommodates complex models, though it can be computationally demanding. VI approximates the posterior with a simpler distribution, trading some accuracy for efficiency. The Bayesian bootstrap reweights the observed data with Dirichlet-distributed weights, approximating the posterior and thereby quantifying parameter uncertainty without an explicit likelihood. Empirical Bayes methods estimate prior hyperparameters from the data themselves, often at modest computational cost.

While Bayesian methods naturally incorporate prior beliefs and deliver full posterior distributions, frequentist techniques offer computational simplicity and well-established asymptotic properties. The choice between them depends on the nature of the problem, the available data, and the computational resources at hand. The sketches below illustrate this contrast on a simple synthetic example.
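As a frequentist illustration, the following is a minimal sketch of MLE for a normal model: the likelihood is maximized numerically and checked against the closed-form estimates. The synthetic data, the parameterization (mu, log sigma), and the optimizer choice are illustrative assumptions, not specifics from the proposal.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)  # synthetic sample (illustrative)

def neg_log_likelihood(params, x):
    """Negative log-likelihood of a Normal(mu, sigma) model (additive constant dropped)."""
    mu, log_sigma = params            # optimize log(sigma) to keep sigma > 0
    sigma = np.exp(log_sigma)
    return 0.5 * np.sum(((x - mu) / sigma) ** 2) + x.size * np.log(sigma)

# MLE: maximize the likelihood by minimizing its negative
result = minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(data,), method="Nelder-Mead")
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(f"Numerical MLE: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")

# Closed-form check: sample mean and (biased) sample standard deviation
print(f"Closed form:   mu = {data.mean():.3f}, sigma = {data.std():.3f}")
```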
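On the Bayesian side, the following is a minimal sketch of MCMC: a random-walk Metropolis sampler for the posterior of a normal mean with known standard deviation. The prior, step size, chain length, and burn-in are illustrative choices rather than recommendations from the proposal.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.5, size=500)
sigma = 1.5                                   # assumed known for simplicity

def log_posterior(mu, x):
    """Log-posterior: Normal likelihood with a weakly informative N(0, 10^2) prior on mu."""
    log_lik = -0.5 * np.sum(((x - mu) / sigma) ** 2)
    log_prior = -0.5 * (mu / 10.0) ** 2
    return log_lik + log_prior

n_iter, step = 5000, 0.1
chain = np.empty(n_iter)
mu_current = 0.0
log_post_current = log_posterior(mu_current, data)

for i in range(n_iter):
    mu_prop = mu_current + step * rng.standard_normal()   # propose a random-walk move
    log_post_prop = log_posterior(mu_prop, data)
    # Accept with probability min(1, posterior ratio)
    if np.log(rng.uniform()) < log_post_prop - log_post_current:
        mu_current, log_post_current = mu_prop, log_post_prop
    chain[i] = mu_current

posterior_sample = chain[1000:]                            # discard burn-in
print(f"Posterior mean of mu:  {posterior_sample.mean():.3f}")
print(f"95% credible interval: {np.percentile(posterior_sample, [2.5, 97.5])}")
```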
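Finally, a minimal sketch of the Bayesian bootstrap for the mean of the same kind of synthetic sample: each replicate reweights the observed data with Dirichlet(1, ..., 1) weights, and the resulting distribution of weighted means approximates the posterior. The number of replicates is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=2.0, scale=1.5, size=500)

n, n_reps = data.size, 2000
# Each row is one replicate's Dirichlet(1, ..., 1) weight vector over the observations
weights = rng.dirichlet(np.ones(n), size=n_reps)   # shape (n_reps, n)
posterior_means = weights @ data                    # weighted mean per replicate

print(f"Posterior mean of the mean: {posterior_means.mean():.3f}")
print(f"95% credible interval:      {np.percentile(posterior_means, [2.5, 97.5])}")
```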