"Bayesian statistics offers a framework to handle uncertainty that is based on a more intuitive mental model than the frequentist paradigm."
"Bayesian regression has close ties to regularization techniques while also giving us a principled approach to explicitly expressing prior beliefs. This helps us combat multicollinearity and overfitting."
This post has some very nice visuals illustrating Bayesian inference and what exactly a confidence interval does and does not tell us, and how, often, a Bayesian conclusion is what we actually have in mind:
"From the posterior distribution we can get the average and the credible interval (i.e. the uncertainty). On the surface, a credible interval looks just like the type of confidence interval we calculated above with an upper and lower limit. But here you can make probability statements such as P(trueheightininterval)=95%, which is often what we are seeking. We want to know the probability that the parameter lives in the interval and not the proportion of times we'd expect the true parameter to fall within the interval bands if we ran the experiment a large number of times. This is a more intuitive way to think about uncertainty."
The post also shows how Bayesian estimators are similar to shrinkage estimators:
"In fact, Bayesian regression has close ties to regularization techniques. Ridge regression can be thought of as a Bayesian regression where slope priors are normally distributed with means equal to 0. Lasso regression can be interpreted as a Bayesian regression with Laplace priors on the slopes."
See also: Overconfident Confidence Intervals.