Given $Y \sim Q$ and a map $\phi$ with $\phi(Y) \sim P$, transformation sampling simulates $X\sim P$ by setting $X = \phi(Y)$. Examples include:
- Inversion sampling is the special case where $Y \sim U[0,1]$ and $\phi = F_{X}^{-1}$, the inverse of the cdf of $X$.
- Sampling gamma distributions: if $Y_{1},\dots,Y_{\alpha} \overset{iid.}{\sim} \mathrm{Exp}(1)$, then $\left( \sum_{i=1}^{\alpha}Y_{i} \right) / \beta \sim \mathrm{Gamma}(\alpha,\beta)$, so returning this sum samples a Gamma distribution (for integer $\alpha$, with $\beta$ the rate parameter).
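A minimal NumPy sketch of the gamma construction above (function name is my own); each $\mathrm{Exp}(1)$ variate is itself drawn by inversion, since $-\log U \sim \mathrm{Exp}(1)$ for $U \sim U[0,1]$:

```python
import numpy as np

def sample_gamma(alpha, beta, rng):
    """Sample Gamma(alpha, beta) for integer alpha, with beta the rate.

    Each Exp(1) variate is drawn by inversion: if U ~ U[0,1],
    then -log(U) ~ Exp(1).
    """
    u = rng.uniform(size=alpha)   # U_1, ..., U_alpha iid U[0,1]
    y = -np.log(u)                # Y_i ~ Exp(1) by inversion
    return y.sum() / beta         # sum of alpha Exp(1) variates, scaled by 1/beta
```

With $\alpha=3$, $\beta=2$, repeated draws should have sample mean near $\alpha/\beta = 1.5$.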
### Sampling Gaussians
1D Gaussians: if $Y \sim N(0,1)$, then $\mu+\sigma Y \sim N(\mu,\sigma^{2})$. Hence the algorithm for sampling $X \sim N(\mu,\sigma^{2})$ is:
- (1) Sample $Y \sim N(0,1)$.
- (2) Return $X=\mu +\sigma Y$.
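The two steps above can be sketched directly in NumPy (function name is my own):

```python
import numpy as np

def sample_normal(mu, sigma, rng):
    """Sample X ~ N(mu, sigma^2) by location-scale transformation."""
    y = rng.standard_normal()   # (1) Y ~ N(0, 1)
    return mu + sigma * y       # (2) X = mu + sigma * Y
```

Repeated draws should have sample mean near $\mu$ and sample standard deviation near $\sigma$.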
Multivariate Gaussians: if $\mathbf{Y} \sim N(\underline{0},I)$ has independent standard Gaussian components, then $\underline{\mu}+A\mathbf{Y} \sim N(\underline{\mu},\Sigma)$, where $\Sigma=AA^{T}$ is the **covariance matrix**. Hence the algorithm to simulate $\mathbf{X} \sim N(\underline{\mu},\Sigma)$ is:
- (1) Find $A$ with $A A^{T}=\Sigma$, e.g. the Cholesky factor of $\Sigma$; existence is guaranteed by $\Sigma$ being positive definite.
- (2) Sample $\mathbf{Y} \sim N(\underline{0}, I)$, i.e. $Y_{1},\dots,Y_{n} \overset{iid.}{\sim}N(0,1)$.
- (3) Return $\mathbf{X}=A\mathbf{Y}+\underline{\mu}$.
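The three steps above can be sketched as follows, using the Cholesky factorisation for step (1) (function name is my own):

```python
import numpy as np

def sample_mvn(mu, Sigma, rng):
    """Sample X ~ N(mu, Sigma) for positive definite Sigma."""
    A = np.linalg.cholesky(Sigma)      # (1) A with A @ A.T == Sigma
    y = rng.standard_normal(len(mu))   # (2) Y ~ N(0, I)
    return mu + A @ y                  # (3) X = mu + A @ Y
```

For example, with $\underline{\mu}=(1,-1)$ and $\Sigma=\begin{pmatrix}2&1\\1&2\end{pmatrix}$, the sample mean and sample covariance of many draws should approach $\underline{\mu}$ and $\Sigma$.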