17 February 2025
- P(D|H)
- long-run frequency
- relatively simple analytical methods to solve for roots
- conclusions pertain to data, not parameters or hypotheses
- the observed test statistic is compared to its theoretical distribution when the NULL hypothesis is true
- p-value: the probability of obtaining the observed data, or MORE EXTREME data, if the null hypothesis is true
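A minimal R sketch of that p-value logic (the statistic and degrees of freedom below are made-up illustrative values, not taken from these notes):

```r
# Hypothetical test: observed t statistic for a regression slope, compared
# to the theoretical t distribution under the null hypothesis (slope = 0).
t_obs <- -2.33   # illustrative observed statistic
df    <- 8       # residual df for a simple regression with n = 10

# Two-sided p-value: probability of a statistic this extreme or MORE extreme
# when the null hypothesis is true.
p_value <- 2 * pt(-abs(t_obs), df = df)
p_value
```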
 | Frequentist | Bayesian |
---|---|---|
Probability | Long-run frequency \(P(D\mid H)\) | Degree of belief \(P(H\mid D)\) |
Parameters | Fixed, true | Random, distribution |
Obs. data | One possible | Fixed, true |
Inferences | Data | Parameters |
n | Slope | t | p |
---|---|---|---|
10 | -0.1022 | -2.3252 | 0.0485 |
10 | -10.2318 | -2.2115 | 0.0579 |
100 | -10.4713 | -6.6457 | \(1.7101362 \times 10^{-9}\) |
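The kind of output tabulated above would typically come from an ordinary least-squares fit; a sketch of how such a summary is produced in R (the data here are simulated stand-ins, not the original values):

```r
# Simulated stand-in data; only the workflow, not the numbers, matches above.
set.seed(1)
n <- 100
x <- runif(n, 0, 10)
y <- 2 - 10 * x + rnorm(n, sd = 15)

fit <- lm(y ~ x)
summary(fit)$coefficients["x", ]   # Estimate, Std. Error, t value, Pr(>|t|)
```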
 | Population A | Population B |
---|---|---|
Percentage change | 0.46 | 45.46 |
Prob. > 5% decline | 0 | 0.86 |
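Quantities like "Prob. > 5% decline" are simply proportions of posterior samples; a sketch assuming we already have MCMC draws of the (signed) percentage change for each population (the draws below are simulated placeholders, not the real posteriors):

```r
# Placeholder posterior draws of percentage change (negative = decline).
set.seed(1)
change_A <- rnorm(4000, mean = 0.5, sd = 3)    # Population A (hypothetical)
change_B <- rnorm(4000, mean = -45, sd = 25)   # Population B (hypothetical)

# Probability of a decline greater than 5% = proportion of draws below -5.
mean(change_A < -5)
mean(change_B < -5)
```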
\[ \begin{aligned} P(H\mid D) &= \frac{P(D\mid H) \times P(H)}{P(D)}\\[1em] \mathsf{posterior~belief~(probability)} &= \frac{\mathsf{likelihood} \times \mathsf{prior~probability}}{\mathsf{normalizing~constant}} \end{aligned} \]
The normalizing constant is required to turn a frequency distribution into a proper probability distribution (one that sums or integrates to 1).
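A tiny discrete example of that normalization (hypotheses and numbers are invented purely to show the arithmetic):

```r
# Three competing hypotheses with invented priors and likelihoods.
prior      <- c(H1 = 0.5, H2 = 0.3, H3 = 0.2)
likelihood <- c(H1 = 0.10, H2 = 0.40, H3 = 0.05)

unnormalized <- likelihood * prior   # P(D|H) x P(H), a frequency-like weight
normalizing  <- sum(unnormalized)    # P(D): summed over all hypotheses
posterior    <- unnormalized / normalizing

posterior        # proper probabilities ...
sum(posterior)   # ... that now sum to 1
```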
- \(P(D\mid H)\): the likelihood
- conclusions pertain to hypotheses
- computationally robust (sample size, balance, collinearity)
- inferential flexibility - derive any number of inferences
subjectivity? (a prior must be chosen)
In practice the posterior is analytically intractable because of the denominator: \[P(H\mid D) = \frac{P(D\mid H) \times P(H)}{P(D)}\]
\(P(D)\) - probability of data from all possible hypotheses
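For a continuous parameter \(\theta\), that denominator is an integral over every possible hypothesis, which is what usually cannot be solved in closed form: \[P(D) = \int P(D\mid \theta)\, P(\theta)\, d\theta\]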
Markov Chain Monte Carlo (MCMC) sampling
- Aim: samples reflect posterior frequency distribution
- samples used to construct posterior prob. dist.
- the sharper the multidimensional “features” of the posterior, the more samples are needed
- the chain should have traversed the entire posterior
- the initial location should not influence the result
Metropolis-Hastings (see the sketch after this list)
http://twiecki.github.io/blog/2014/01/02/visualizing-mcmc/
https://chi-feng.github.io/mcmc-demo/app.html?algorithm=GibbsSampling&target=banana
Gibbs
NUTS (No-U-Turn Sampler)
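A minimal random-walk Metropolis-Hastings sketch in R (the target here is just a standard normal log-density standing in for an unnormalized posterior; all settings are illustrative):

```r
# Unnormalized log-posterior; a standard normal stands in for log(P(D|H)P(H)).
log_post <- function(theta) dnorm(theta, mean = 0, sd = 1, log = TRUE)

n_iter  <- 10000
samples <- numeric(n_iter)
theta   <- 5                                   # deliberately poor starting value

for (i in seq_len(n_iter)) {
  proposal <- rnorm(1, mean = theta, sd = 1)   # symmetric random-walk proposal
  # Accept with probability min(1, posterior ratio); done on the log scale.
  if (log(runif(1)) < log_post(proposal) - log_post(theta)) {
    theta <- proposal
  }
  samples[i] <- theta
}

hist(samples, breaks = 50)   # should resemble the target posterior
```

Because the normalizing constant cancels in the acceptance ratio, only the unnormalized posterior \(P(D\mid H) \times P(H)\) is ever needed.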
- thinning
- burnin (warmup)
- chains
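Applied to a raw chain such as the `samples` vector from the sketch above (names and settings are illustrative), burn-in and thinning are just subsetting, and running several chains from different starting values checks that the initial location no longer matters:

```r
burnin <- 2000   # discard early draws still influenced by the starting value
thin   <- 5      # keep every 5th draw to reduce autocorrelation

kept <- samples[seq(from = burnin + 1, to = length(samples), by = thin)]
length(kept)     # retained posterior draws
```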
- MCMCpack
- WinBUGS (R2WinBUGS)
- JAGS (R2jags)
- Stan (rstan, rstanarm, brms)
- https://github.com/stan-dev/rstan/wiki/RStan-Getting-Started
- https://learnb4ss.github.io/learnB4SS/articles/install-brms.html
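Once rstan and brms are installed (links above), a minimal call might look like the sketch below; the formula, data and sampler settings are placeholders, not taken from these notes:

```r
library(brms)

# Hypothetical data: simple linear regression fitted by MCMC (NUTS via Stan).
set.seed(1)
dat <- data.frame(x = runif(100, 0, 10))
dat$y <- 2 - 10 * dat$x + rnorm(100, sd = 15)

fit <- brm(
  y ~ x,
  data   = dat,
  chains = 3,      # several chains from different starting values
  iter   = 5000,   # total iterations per chain
  warmup = 1000,   # burn-in (warmup) discarded from each chain
  thin   = 5       # keep every 5th retained draw
)

summary(fit)
```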