Definition of Gamma Distribution

A continuous random variable $X$ is said to have a gamma distribution with parameters $\alpha$ and $\beta$ if its p.d.f. is given by $$ \begin{equation*} f(x)=\left\{ \begin{array}{ll} \frac{\alpha^\beta}{\Gamma(\beta)}x^{\beta -1}e^{-\alpha x}, & \hbox{$x>0;\alpha, \beta >0$;} \\ 0, & \hbox{Otherwise.} \end{array} \right. \end{equation*} $$

In this form, $\alpha$ is the rate (inverse scale) parameter and $\beta$ is the shape parameter of the gamma distribution.

Notation: $X\sim G(\alpha, \beta)$.
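As a quick sanity check, the following sketch (using SciPy, with the illustrative values $\alpha=2$, $\beta=3$, which are assumptions, not part of the definition) verifies numerically that the density integrates to one:

```python
import numpy as np
from math import gamma as gamma_fn
from scipy.integrate import quad

def gamma_pdf(x, alpha, beta):
    # f(x) = alpha^beta / Gamma(beta) * x^(beta - 1) * exp(-alpha * x), x > 0
    return alpha**beta / gamma_fn(beta) * x**(beta - 1) * np.exp(-alpha * x)

alpha, beta = 2.0, 3.0  # illustrative values, not fixed by the definition
total, _ = quad(gamma_pdf, 0, np.inf, args=(alpha, beta))
print(total)            # prints a value very close to 1.0
```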

Graph of Gamma Distribution

Following is the graph of the probability density function of the gamma distribution with $\alpha=1$ and $\beta=1,2,4$.
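A plot along these lines can be reproduced with the sketch below. Note that `scipy.stats.gamma` is parameterized by shape `a` $=\beta$ and `scale` $=1/\alpha$, so $G(\alpha, \beta)$ corresponds to `gamma(a=beta, scale=1/alpha)`:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gamma

x = np.linspace(0.01, 10, 500)
alpha = 1.0                     # rate parameter, fixed at 1 as in the figure
for beta in (1, 2, 4):          # shape parameter
    plt.plot(x, gamma.pdf(x, a=beta, scale=1/alpha), label=rf"$\beta = {beta}$")

plt.xlabel("$x$")
plt.ylabel("$f(x)$")
plt.title(r"Gamma p.d.f. with $\alpha = 1$")
plt.legend()
plt.show()
```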

Another Form of Gamma Distribution

Another form of the gamma distribution, in which $\alpha$ acts as a scale parameter, is $$ \begin{equation*} f(x)=\left\{ \begin{array}{ll} \frac{1}{\alpha^\beta \Gamma(\beta)} x^{\beta -1}e^{-\frac{x}{\alpha}}, & \hbox{$x>0;\alpha, \beta >0$;} \\ 0, & \hbox{Otherwise.} \end{array} \right. \end{equation*} $$

Notation: $X\sim G(1/\alpha, \beta)$.
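The two forms differ only in whether $\alpha$ enters as a rate or as a scale; substituting $1/\alpha$ for $\alpha$ in the scale form recovers the rate form, which is what the notation above expresses. A small numerical check (with the assumed illustrative values $\alpha=2$, $\beta=3$, $x=1.5$):

```python
from math import gamma as G, exp

alpha, beta, x = 2.0, 3.0, 1.5   # illustrative values

# Rate form: f(x) = alpha^beta / Gamma(beta) * x^(beta-1) * e^(-alpha x)
rate_form = alpha**beta / G(beta) * x**(beta - 1) * exp(-alpha * x)

# Scale form evaluated at scale parameter 1/alpha
a = 1 / alpha
scale_form = 1 / (a**beta * G(beta)) * x**(beta - 1) * exp(-x / a)

print(rate_form, scale_form)     # identical values
```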

One-Parameter Gamma Distribution

Letting $\alpha=1$ in $G(\alpha, \beta)$, the probability density function of $X$ is as follows:

$$ \begin{equation*} f(x)=\left\{ \begin{array}{ll} \frac{1}{\Gamma(\beta)}x^{\beta -1}e^{-x}, & \hbox{$x>0;\beta >0$;} \\ 0, & \hbox{Otherwise.} \end{array} \right. \end{equation*} $$

which is the one-parameter gamma distribution.

Mean of Gamma Distribution

The mean or expected value of a gamma random variable is $E(X)= \dfrac{\beta}{\alpha}$.

Proof

The expected value of a gamma random variable is $$ \begin{eqnarray*} E(X) &=& \int_0^\infty x\frac{\alpha^\beta}{\Gamma(\beta)}x^{\beta -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\int_0^\infty x^{\beta+1 -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\frac{\Gamma(\beta+1)}{\alpha^{\beta+1}}\;\quad (\text{Using }\int_0^\infty x^{n-1}e^{-\theta x}\; dx = \frac{\Gamma(n)}{\theta^n} )\\ &=& \frac{\beta}{\alpha},\;\quad (\because\Gamma(\beta+1) = \beta \Gamma(\beta)) \end{eqnarray*} $$
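A Monte Carlo check of this result (a sketch; the values $\alpha=2$, $\beta=3$ are assumed for illustration, and SciPy's parameterization is mapped as `a=beta`, `scale=1/alpha`):

```python
import numpy as np
from scipy.stats import gamma

alpha, beta = 2.0, 3.0                   # illustrative values
rng = np.random.default_rng(0)
sample = gamma.rvs(a=beta, scale=1/alpha, size=200_000, random_state=rng)

print(sample.mean(), beta / alpha)       # both close to 1.5
```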

Variance of Gamma Distribution

The variance of a gamma random variable is $V(X) = \dfrac{\beta}{\alpha^2}$.

Proof

The variance of random variable $X$ is given by

$$ \begin{equation*} V(X) = E(X^2) - [E(X)]^2. \end{equation*} $$

Let us find the expected value of $X^2$. $$ \begin{eqnarray*} E(X^2) &=& \int_0^\infty x^2\frac{\alpha^\beta}{\Gamma(\beta)}x^{\beta -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\int_0^\infty x^{\beta+2 -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\frac{\Gamma(\beta+2)}{\alpha^{\beta+2}}\\ &=& \frac{\beta(\beta+1)}{\alpha^2},\;\quad (\because\Gamma(\beta+2) = (\beta+1) \beta\Gamma(\beta)) \end{eqnarray*} $$ Thus, variance of $X$ is $$ \begin{eqnarray*} V(X)&=&E(X^2) - [E(X)]^2\\ &=&\frac{\beta(\beta+1)}{\alpha^2}-\bigg(\frac{\beta}{\alpha}\bigg)^2\\ &=&\frac{\beta}{\alpha^2}. \end{eqnarray*} $$
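SciPy can confirm both moments in closed form (a sketch, with the same assumed illustrative values as before):

```python
from scipy.stats import gamma

alpha, beta = 2.0, 3.0                       # illustrative values
mean, var = gamma.stats(a=beta, scale=1/alpha, moments="mv")

print(mean, beta / alpha)                    # E(X) = beta / alpha   = 1.5
print(var, beta / alpha**2)                  # V(X) = beta / alpha^2 = 0.75
```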

Harmonic Mean of Gamma Distribution

The harmonic mean of a gamma random variable is $H=\dfrac{\beta-1}{\alpha}$, provided $\beta>1$.

Proof

The reciprocal of the harmonic mean of a gamma random variable is, for $\beta>1$ (so that the integral below converges), $$ \begin{eqnarray*} \frac{1}{H}&=& E(1/X) \\ &=& \int_0^\infty \frac{1}{x}\frac{\alpha^\beta}{\Gamma(\beta)}x^{\beta -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\int_0^\infty x^{\beta-1 -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\frac{\Gamma(\beta-1)}{\alpha^{\beta-1}}\\ &=& \frac{\alpha}{\beta-1},\;\quad (\because\Gamma(\beta) = (\beta-1) \Gamma(\beta-1)) \end{eqnarray*} $$ Therefore, the harmonic mean of a gamma random variable is $$ \begin{equation*} H = \frac{\beta-1}{\alpha}. \end{equation*} $$
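Numerically, $E(1/X)$ can be computed by quadrature and inverted (a sketch with the assumed values $\alpha=2$, $\beta=3$, for which $H=(\beta-1)/\alpha=1$):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma

alpha, beta = 2.0, 3.0             # illustrative values; beta > 1 is required

# E(1/X): integrate (1/x) f(x) over (0, infinity)
inv_mean, _ = quad(lambda x: (1/x) * gamma.pdf(x, a=beta, scale=1/alpha),
                   0, np.inf)

print(1 / inv_mean, (beta - 1) / alpha)   # harmonic mean; both equal 1.0
```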

Mode of Gamma Distribution

The mode of a gamma random variable is $\dfrac{\beta-1}{\alpha}$, provided $\beta\geq 1$.

Proof

The p.d.f. of the gamma distribution with parameters $\alpha$ and $\beta$ is

$$ \begin{equation*} f(x) = \frac{\alpha^\beta}{\Gamma(\beta)}x^{\beta -1}e^{-\alpha x},\; x>0;\alpha, \beta >0 \end{equation*} $$ Taking log of $f(x)$, we get $$ \begin{equation*} \log f(x) = \log\bigg(\frac{\alpha^\beta}{\Gamma(\beta)}\bigg)+(\beta-1)\log x -\alpha x. \end{equation*} $$ Differentiating $\log f(x)$ w.r.t. $x$ and equating to zero, we get $$ \begin{eqnarray*} & & \frac{d\log f(x)}{dx}=0 \\ &\Rightarrow& 0+ \frac{\beta-1}{x}-\alpha =0\\ &\Rightarrow& x=\frac{\beta-1}{\alpha}. \end{eqnarray*} $$ Also, for $\beta>1$, $$ \begin{equation*} \frac{d^2\log f(x)}{dx^2}= -\frac{(\beta-1)}{x^2}<0. \end{equation*} $$ Hence, $f(x)$ attains its maximum at $x =\dfrac{\beta-1}{\alpha}$. Therefore, the mode of a gamma random variable is $\dfrac{\beta-1}{\alpha}$.
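The same maximum can be located numerically (a sketch; `scipy.optimize.minimize_scalar` minimizes, so the negative of the density is passed, and the values $\alpha=2$, $\beta=3$ are again assumed):

```python
from scipy.optimize import minimize_scalar
from scipy.stats import gamma

alpha, beta = 2.0, 3.0                    # illustrative values with beta > 1

# Maximize f(x) by minimizing -f(x) over a bounded interval
res = minimize_scalar(lambda x: -gamma.pdf(x, a=beta, scale=1/alpha),
                      bounds=(1e-6, 10), method="bounded")

print(res.x, (beta - 1) / alpha)          # both close to 1.0
```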

Raw Moments of Gamma Distribution

The $r^{th}$ raw moment of a gamma random variable is $$ \begin{equation*} \mu_r^\prime =\frac{\Gamma(\beta+r)}{\alpha^{r}\Gamma(\beta)}. \end{equation*} $$

Proof

The $r^{th}$ raw moment of a gamma random variable is $$ \begin{eqnarray*} \mu_r^\prime &=& E(X^r) \\ &=& \int_0^\infty x^r\frac{\alpha^\beta}{\Gamma(\beta)}x^{\beta -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\int_0^\infty x^{\beta+r -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\frac{\Gamma(\beta+r)}{\alpha^{\beta+r}}\\ &=& \frac{\Gamma(\beta+r)}{\alpha^{r}\Gamma(\beta)}. \end{eqnarray*} $$
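The closed form can be compared against SciPy's `moment` method for the first few $r$ (a sketch with the assumed illustrative values $\alpha=2$, $\beta=3$):

```python
from math import gamma as G
from scipy.stats import gamma

alpha, beta = 2.0, 3.0                          # illustrative values

for r in (1, 2, 3, 4):
    closed_form = G(beta + r) / (alpha**r * G(beta))
    numeric = gamma.moment(r, a=beta, scale=1/alpha)
    print(r, closed_form, numeric)              # the two columns agree
```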

M.G.F. of Gamma Distribution

The moment generating function of a gamma random variable is $\bigg(1-\dfrac{t}{\alpha}\bigg)^{-\beta}$, if $t<\alpha$.

Proof

The moment generating function of $X$ is $$ \begin{eqnarray*} M_X(t) &=& E(e^{tX}) \\ &=& \int_0^\infty e^{tx}\frac{\alpha^\beta}{\Gamma(\beta)}x^{\beta -1}e^{-\alpha x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\int_0^\infty x^{\beta -1}e^{-(\alpha-t) x}\; dx\\ &=& \frac{\alpha^\beta}{\Gamma(\beta)}\frac{\Gamma(\beta)}{(\alpha-t)^\beta}\\ & & \text{ (integral converges only if $t<\alpha$})\\ &=& \frac{\alpha^\beta}{(\alpha-t)^\beta}\\ &=& \bigg(1-\frac{t}{\alpha}\bigg)^{-\beta}, \text{ (if $t<\alpha$}) \end{eqnarray*} $$
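A quadrature check of the m.g.f. at one point (a sketch with the assumed values $\alpha=2$, $\beta=3$, and $t=0.5<\alpha$):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma

alpha, beta, t = 2.0, 3.0, 0.5     # illustrative values; note t < alpha

# E(e^{tX}): integrate e^{tx} f(x) over (0, infinity)
mgf_numeric, _ = quad(lambda x: np.exp(t * x) * gamma.pdf(x, a=beta, scale=1/alpha),
                      0, np.inf)

print(mgf_numeric, (1 - t/alpha)**(-beta))   # both approximately 2.3704
```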

Additive Property of Gamma Distribution

The sum of two independent gamma variates with a common rate parameter is also a gamma variate. That is, if $X_1$ and $X_2$ are two independent gamma variates with parameters $(\alpha, \beta_1)$ and $(\alpha, \beta_2)$ respectively, then $Y=X_1+X_2 \sim G(\alpha, \beta_1+\beta_2)$.

Proof

Let $X_1$ and $X_2$ be two independent gamma variates with parameters $(\alpha, \beta_1)$ and $(\alpha, \beta_2)$ respectively. Let $Y=X_1+X_2$.

Then the m.g.f. of $Y$ is $$ \begin{eqnarray*} M_Y(t) &=& E(e^{tY}) \\ &=& E(e^{t(X_1+X_2)}) \\ &=& E(e^{tX_1} e^{tX_2}) \\ &=& E(e^{tX_1})\cdot E(e^{tX_2})\\ & &\qquad (\because X_1, X_2 \text{ are independent })\\ &=& M_{X_1}(t)\cdot M_{X_2}(t)\\ &=& \bigg(1-\frac{t}{\alpha}\bigg)^{-\beta_1}\cdot \bigg(1-\frac{t}{\alpha}\bigg)^{-\beta_2}\\ &=& \bigg(1-\frac{t}{\alpha}\bigg)^{-(\beta_1+\beta_2)}, \end{eqnarray*} $$ which is the m.g.f. of a gamma variate with parameters $(\alpha, \beta_1+\beta_2)$. Hence, by the uniqueness theorem of m.g.f.s, $Y=X_1+X_2$ is a gamma variate with parameters $(\alpha, \beta_1+\beta_2)$.
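The additive property can also be checked by simulation; the sketch below draws independent samples (with assumed illustrative parameters) and applies a Kolmogorov-Smirnov test against $G(\alpha, \beta_1+\beta_2)$:

```python
import numpy as np
from scipy.stats import gamma, kstest

alpha, beta1, beta2 = 2.0, 3.0, 1.5            # illustrative values
rng = np.random.default_rng(0)

x1 = gamma.rvs(a=beta1, scale=1/alpha, size=100_000, random_state=rng)
x2 = gamma.rvs(a=beta2, scale=1/alpha, size=100_000, random_state=rng)

# Y = X1 + X2 should follow G(alpha, beta1 + beta2)
stat, pvalue = kstest(x1 + x2, gamma(a=beta1 + beta2, scale=1/alpha).cdf)
print(pvalue)   # a large p-value: the sample is consistent with the claim
```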

C.G.F. of Gamma Distribution

The cumulant generating function of a gamma random variable is $$ \begin{equation*} K_X(t) = \beta\bigg(\frac{t}{\alpha}+\frac{t^2}{2\alpha^2}+\cdots +\frac{(r-1)!}{\alpha^r}\frac{t^r}{r!}+\cdots\bigg). \end{equation*} $$

Proof

The cumulant generating function of $X$ is $$ \begin{eqnarray*} K_X(t)& = & \log_e M_X(t)\\ &=& \log_e \bigg(1-\frac{t}{\alpha}\bigg)^{-\beta}\\ &=&-\beta \log \bigg(1-\frac{t}{\alpha}\bigg)\\ &=& \beta\bigg(\frac{t}{\alpha}+\frac{t^2}{2\alpha^2}+\cdots \bigg)\\ & & \qquad (\because \log (1-a) = -(a+\frac{a^2}{2}+\frac{a^3}{3}+\cdots))\\ &=& \beta\bigg(\frac{t}{\alpha}+\frac{t^2}{2\alpha^2}+\cdots +\frac{(r-1)!}{\alpha^r}\frac{t^r}{r!}+\cdots\bigg) \end{eqnarray*} $$ Thus the $r^{th}$ cumulant of the gamma distribution is $$ \begin{eqnarray*} k_r & =& \text{coefficient of } \frac{t^r}{r!}\text{ in } K_X(t)\\ &=& \frac{\beta (r-1)!}{\alpha^r},\; r=1,2,\cdots \end{eqnarray*} $$ Thus $$ \begin{eqnarray*} k_1 &=& \frac{\beta}{\alpha} =\mu_1^\prime \\ k_2 &=& \frac{\beta}{\alpha^2}=\mu_2\\ k_3 &=& \frac{2\beta}{\alpha^3}=\mu_3\\ k_4 &=& \frac{6\beta}{\alpha^4}=\mu_4-3\mu_2^2\\ \Rightarrow \mu_4 &=& \frac{3\beta(2+\beta)}{\alpha^4}. \end{eqnarray*} $$ The coefficient of skewness of the gamma distribution is $$ \begin{eqnarray*} \beta_1 &=& \frac{\mu_3^2}{\mu_2^3} \\ &=& \frac{(\frac{2\beta}{\alpha^3})^2}{(\frac{\beta}{\alpha^2})^3}\\ &=& \frac{4}{\beta}. \end{eqnarray*} $$ The coefficient of kurtosis of the gamma distribution is $$ \begin{eqnarray*} \beta_2 &=& \frac{\mu_4}{\mu_2^2} \\ &=& \frac{\frac{3\beta(2+\beta)}{\alpha^4}}{(\frac{\beta}{\alpha^2})^2}\\ &=& \frac{6+3\beta}{\beta}. \end{eqnarray*} $$
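Since $\beta_1 = 4/\beta$ gives skewness $\sqrt{\beta_1}=2/\sqrt{\beta}$ and $\beta_2 = 3 + 6/\beta$ gives excess kurtosis $6/\beta$, both follow directly from SciPy, which reports skewness and excess kurtosis (a sketch with the assumed values $\alpha=2$, $\beta=3$):

```python
import numpy as np
from scipy.stats import gamma

alpha, beta = 2.0, 3.0                          # illustrative values
skew, ekurt = gamma.stats(a=beta, scale=1/alpha, moments="sk")

print(skew, 2 / np.sqrt(beta))                  # both ~1.1547
print(ekurt, 6 / beta)                          # excess kurtosis; both 2.0
```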

Particular Cases of Gamma Distribution

  • For $\beta =1$, the $G(\alpha, \beta)$ distribution becomes an exponential distribution with parameter $\alpha$.

  • For $\alpha = \dfrac{1}{2}$ and $\beta = \dfrac{\nu}{2}$, the $G(\alpha, \beta)$ distribution becomes a chi-square ($\chi^2$) distribution with $\nu$ degrees of freedom.
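Both special cases can be verified pointwise (a sketch with the assumed values $\alpha=2$ for the exponential case and $\nu=4$ for the chi-square case; recall `scale` $=1/\alpha$, so $\alpha=1/2$ means `scale=2`):

```python
import numpy as np
from scipy.stats import gamma, expon, chi2

x = np.linspace(0.1, 5, 50)

# beta = 1: G(alpha, 1) is exponential with rate alpha (scale 1/alpha)
alpha = 2.0
print(np.allclose(gamma.pdf(x, a=1, scale=1/alpha), expon.pdf(x, scale=1/alpha)))

# alpha = 1/2, beta = nu/2: G(1/2, nu/2) is chi-square with nu degrees of freedom
nu = 4
print(np.allclose(gamma.pdf(x, a=nu/2, scale=2), chi2.pdf(x, df=nu)))
```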
