\documentclass[12pt,a4paper]{article}


\usepackage{amsmath}
\usepackage{graphicx}
\begin{document}
\title{ON BIVARIATE GEOMETRIC DISTRIBUTION}
\author {\bf K. Jayakumar and Davis Antony Mundassery \\
Department of Statistics \\
University of Calicut\\
Kerala-673\;635, \ India.}
\date{}
\maketitle
\begin{abstract}
Characterizations of a bivariate geometric distribution are obtained using univariate and bivariate geometric compounding. Autoregressive models with bivariate geometric marginals are developed. Various bivariate geometric models analogous to important bivariate exponential distributions, such as the Marshall-Olkin, Downton and Hawkes bivariate exponential distributions, are presented.
\end{abstract}
\vspace{0.15in}
\textbf {Key words}: Autoregressive process, Bivariate exponential distribution, Bivariate geometric distribution, Characterization, Compounding.
\section{Introduction}

\hspace{0.25in}
In this paper, our aim is to obtain certain characterizations of the bivariate geometric distribution using compounding. We also introduce new bivariate geometric distributions.

\vspace{0.2in}
It is well known that bivariate analogues of univariate distributions can be developed by appropriately extending the generating functions of the univariate random variables. Consider the random variable $X$ defined as the number of failures preceding the first success in a sequence of Bernoulli trials. The probability generating function (pgf) of $X$ is

\vspace{0.2in}\hspace{1.5in}
$\displaystyle\ P(s)=\frac{1}{1+c(1-s)} , \ \ c \ >\ 0.$
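\vspace{0.15in}
\noindent
Indeed, if each trial succeeds with probability $p$ and fails with probability $q=1-p$, then

\vspace{0.1in}\hspace{1.5in}
$\displaystyle P(s)=\sum \limits _{k=0}^{\infty }p\, q^{k} s^{k} =\frac{p}{1-qs} =\frac{1}{1+c(1-s)} ,$

\vspace{0.1in}\noindent
with $c=q/p$, the odds of failure.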

\vspace{0.2in}
A natural extension of this can be obtained as follows: a bivariate random variable $(X, Y)$ has the bivariate geometric distribution BGD$(c_{1} ,c_{2} ,\theta ^{2} )$ if its pgf is

$ \displaystyle \pi (s_{1} ,s_{2} )=\frac{1}{(1+c_{1} (1-s_{1} ))(1+c_{2} (1-s_{2} ))-\theta ^{2} (1-s_{1} )(1-s_{2} )}. $ \hspace{0.6in} (1)

\vspace{0.2in}
\noindent
Each marginal pgf is readily seen to be that of a univariate geometric distribution.

\vspace{0.2in}
Downton (1970) gave an interpretation of a bivariate geometric distribution in terms of a shock model. Assume that two components are subjected to a sequence of shocks: at each shock, with probability $p_{1} $ the first component survives, with probability $p_{2} $ the second component survives, and with probability $1-p_{1} -p_{2} $ both components fail. Let $N_{1} $ and $N_{2} $ be the numbers of shocks to the first and second components, respectively, before the first failure. Then $(N_{1} ,N_{2} )$ follows the BGD$\displaystyle\left(\frac{p_{1} }{p_{0} } ,\frac{p_{2} }{p_{0} } ,\frac{p_{1} p_{2} }{p_{0}^{2} } \right)$ distribution with joint probabilities

\vspace{0.15in}\hspace{0.6in}
$\displaystyle P(N_{1} =n_{1} ,N_{2} =n_{2} )=\binom{n_{1} +n_{2} }{n_{1} }\, p_{1}^{n_{1} } \, p_{2}^{n_{2} } \, p_{0} , \ \ \ p_{0} +p_{1} +p_{2} =1,$

\vspace{0.1in}\hspace{4.0in}
$n_{1} ,n_{2} =0,1,2,\ldots$

\noindent {Its pgf is}

\vspace{0.15in}\hspace{1.0in}
$\displaystyle g(s_{1} ,s_{2} )\quad =\quad \frac{1}{1+\frac{p_{1} }{p_{0} } (1-s_{1} )+\frac{p_{2} }{p_{0} } (1-s_{2} )} $.
\hspace{1.0in}(2)
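
\vspace{0.2in}
\noindent
This shock model is straightforward to simulate. The following Python sketch (illustrative only; the function and parameter names are ours) draws from the model and compares an empirical probability with the stated pmf:

\begin{verbatim}
import random

def sample_downton(p1, p2):
    """Count type-1 and type-2 survivals before the first failure."""
    n1 = n2 = 0
    while True:
        u = random.random()
        if u < p1:                # first component survives the shock
            n1 += 1
        elif u < p1 + p2:         # second component survives the shock
            n2 += 1
        else:                     # both fail, probability p0
            return n1, n2

random.seed(1)
p1, p2 = 0.3, 0.5
p0 = 1.0 - p1 - p2
m = 200_000
draws = [sample_downton(p1, p2) for _ in range(m)]
emp = sum(1 for d in draws if d == (1, 1)) / m
print(emp, 2 * p1 * p2 * p0)      # pmf at (1,1): C(2,1) p1 p2 p0
\end{verbatim}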

\vspace{0.2in}
One objective of this paper is to obtain characterizations of BGD$(c_{1} ,c_{2} ,\theta ^{2} )$ using a scheme generated by compounding with the geometric distribution. For this we make use of an operator `$\oplus $' defined as follows:

\vspace{0.2in}
If $X$ has pgf $\pi (s)$, then $p\oplus X$ is defined (in distribution) by the pgf $\pi (1-p+ps)$; equivalently, $p\oplus X=\sum \limits _{j=1}^{X}N_{j} $, where $P(N_{j} =1)=1-P(N_{j} =0) = p$ and the random variables $N_{j} $ are independent. A bivariate extension of this is considered: if $(X,Y)$ has pgf $\pi (s_{1} ,s_{2} )$, then $(p\oplus X,\; p\oplus Y)$ is defined by the pgf $\pi (1-p+ps_{1} , \ 1-p+ps_{2} )$.
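
\vspace{0.2in}
\noindent
In other words, $p\oplus X$ is binomial thinning: conditionally on $X$, it is a Binomial$(X,p)$ random variable. A minimal Python sketch of the operator (names ours):

\begin{verbatim}
import random

def thin(p, x):
    """Return a draw of p 'circle-plus' x, i.e. Binomial(x, p): each of
    the x unit counts is kept independently with probability p, so the
    pgf transforms as pi(s) -> pi(1 - p + p*s)."""
    return sum(random.random() < p for _ in range(x))

random.seed(0)
print(sum(thin(0.5, 10) for _ in range(100_000)) / 100_000)  # about 5.0
\end{verbatim}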

\vspace{0.2in}
Let $\{ (X_{i} ,\; Y_{i} ),i\ge 1\} $ be a sequence of independent and identically distributed random variables with pgf $\pi (s_{1} ,s_{2} )$ . Define

\vspace{0.1in}\hspace{1.25in}
$U_{N} =X_{1} +X_{2} +\cdots +X_{N} $ \ \ and \ \ $V_{N} =Y_{1} +Y_{2} +\cdots +Y_{N} ,$


\vspace{0.1in}\noindent where $N$ follows a geometric distribution such that $P(N = n) = (1-p)^{n-1} p$, $n$ = 1,2,3,\ldots\ The pgf of $(U_{N} ,V_{N} )$ is given by

\vspace{0.1in}\hspace{1.5in}
$\displaystyle \eta (s_{1} ,s_{2} ) = \sum \limits _{n=1}^{\infty }\left(\pi (s_{1} ,s_{2} )\right) ^{n} P(N=n)$

\vspace{0.1in}\hspace{1.9in}
$\displaystyle =\frac{p\pi (s_{1} ,s_{2} )}{1-(1-p)\pi (s_{1} ,s_{2} )}. $
\hspace{1.6in}(3)
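
\vspace{0.2in}
\noindent
Equation (3) is easy to check numerically. In the sketch below (illustrative; for simplicity we take $X_{i} $ and $Y_{i} $ to be independent univariate geometric counts, so that $\pi (s_{1} ,s_{2} )$ factorizes), a Monte Carlo estimate of $E\left(s_{1}^{U_{N} } s_{2}^{V_{N} } \right)$ is compared with the right-hand side of (3):

\begin{verbatim}
import random

def geom0(p):                 # failures before the first success
    k = 0
    while random.random() >= p:
        k += 1
    return k

def geom1(p):                 # support {1, 2, 3, ...}
    return geom0(p) + 1

random.seed(2)
p, a1, a2, s1, s2 = 0.4, 0.6, 0.7, 0.5, 0.8
m = 100_000
est = 0.0
for _ in range(m):
    n = geom1(p)              # the geometric compounding variable N
    u = sum(geom0(a1) for _ in range(n))
    v = sum(geom0(a2) for _ in range(n))
    est += s1 ** u * s2 ** v
est /= m
pi = (a1 / (1 - (1 - a1) * s1)) * (a2 / (1 - (1 - a2) * s2))
print(est, p * pi / (1 - (1 - p) * pi))   # the two should agree
\end{verbatim}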

\vspace{0.2in}
Block (1977) discussed a bivariate geometric distribution which was also derived from a shock model. A bivariate random variable $(N_{1} ,N_{2} )$ has this bivariate geometric distribution with parameters $p_{00} ,p_{10} ,p_{01} $ and $p_{11} $ if it has the survival function

\vspace{0.2in}
\hspace{0.25in}
$\displaystyle\overline{F}(n_{1} ,n_{2} ) = P(N_{1} >n_{1} ,\, \, N_{2} >n_{2} ) $

\vspace{0.2in}
\hspace{0.9in}
$=\; \left\{\begin{array}{l} {p_{11}^{n_{1} } (p_{01} +p_{11} )^{n_{2} -n_{1} } \quad \quad \quad \quad \text{if}\; n_{1} \le n_{2} } \\ {p_{11}^{n_{2} } (p_{10} +p_{11} )^{n_{1} -n_{2} } \quad \quad \quad \quad \text{if}\; n_{2} \le n_{1} } \end{array}\right. $ ,\hspace{0.95in}(4)

\vspace{0.2in}
\noindent
where $p_{00} +p_{10} +p_{01} \; +\; p_{11} =1$ , $p_{10} +p_{11} <1$ , $p_{01} +\; p_{11} <1.$
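
\vspace{0.2in}
\noindent
One way to sample from (4) is through its shock interpretation (our reading: at each epoch a pair of survival indicators is drawn with cell probabilities $p_{00} ,p_{10} ,p_{01} ,p_{11} $, and $N_{j} $ is the epoch at which component $j$ first fails). A Python sketch with an empirical check of the survival function:

\begin{verbatim}
import random

def sample_block(p00, p10, p01, p11):
    """Draw (N1, N2); N_j is the 1-based index of the shock at which
    component j first fails, so P(N_j > n) matches (4)."""
    n1 = n2 = None
    k = 1
    while n1 is None or n2 is None:
        u = random.random()
        if u < p11:
            a, b = 1, 1               # both components survive
        elif u < p11 + p10:
            a, b = 1, 0               # only component 1 survives
        elif u < p11 + p10 + p01:
            a, b = 0, 1               # only component 2 survives
        else:
            a, b = 0, 0               # both fail
        if n1 is None and a == 0:
            n1 = k
        if n2 is None and b == 0:
            n2 = k
        k += 1
    return n1, n2

random.seed(3)
p00, p10, p01, p11 = 0.1, 0.2, 0.3, 0.4
m = 200_000
draws = [sample_block(p00, p10, p01, p11) for _ in range(m)]
emp = sum(1 for n1, n2 in draws if n1 > 2 and n2 > 1) / m
print(emp, p11 * (p10 + p11))   # survival (4) at n1 = 2 >= n2 = 1
\end{verbatim}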

\vspace{0.2in}
\noindent
By compounding with this bivariate geometric distribution we can generate many bivariate exponential and geometric distributions available in the literature. Now, define

\vspace{0.1in}
\hspace{1.25in}
$\displaystyle\ U_{N_{1} } =\sum \limits _{i=1}^{N_{1} }X_{i}$ \ \ and \ \ $\displaystyle\ V_{N_{2} } =\sum \limits _{i=1}^{N_{2} }Y_{i}, $

\vspace{0.2in}
\noindent
where $(N_{1} ,N_{2} )$ follows the bivariate geometric distribution given in (4). Block (1977) gave the expression for the Laplace transform of the bivariate random variable $(U_{N_{1} } ,V_{N_{2} } )$ obtained by geometric compounding of a sequence of independent and identically distributed bivariate random variables. A discrete analogue of this result is


\vspace{0.2in}
$\displaystyle \eta (s_{1} ,\, \, s_{2} )=\pi (s_{1} ,\, \, s_{2} )(p_{00} +p_{10} \eta (s_{1} ,\, 1)+p_{01} \eta (1,\, \, s_{2} )+p_{11} \eta (s_{1} ,\, \, s_{2} ))$ .
\hspace{0.35in}(5)

\vspace{0.2in}
\noindent Setting $s_{2} =1$ (respectively $s_{1} =1$) in (5), noting that $\eta (1,1)=1$ and solving, we get

\vspace{0.2in}
\hspace{1.25in}
$\left.
\begin{array}{cc}
\displaystyle\eta (s_{1} ,\, 1) & = \displaystyle \frac{(p_{00} +p_{01} )\pi (s_{1} ,\, 1)}{1-(p_{10} +p_{11} )\pi (s_{1} ,\, 1)}\\[12pt]
\displaystyle\eta (1,\, \, s_{2} ) & = \displaystyle \frac{(p_{00} +p_{10} )\pi (1,\; s_{2} )}{1-(p_{01} +p_{11} )\pi (1,\; s_{2} )}
\end{array}
\right\}$ \hspace{1.1in}(6)

\vspace{0.2in}
In Section 2, we obtain characterizations of BGD($c_{1} ,c_{2} ,\theta ^{2} )$. Autoregressive models with marginals BGD($c_{1} ,c_{2} ,\theta ^{2} )$ are developed in Section 3 and in Section 4 different bivariate geometric models are constructed using bivariate compounding.
\section{ Characterization of Bivariate Geometric Distribution}

\hspace{0.25in}
We consider now univariate geometric compounding of a set of independent and identically distributed bivariate geometric random variables. The following theorem gives a characterization of BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$.

\vspace{0.2in}
\noindent
{\bf Theorem 2.1}

\vspace{0.1in}
Let $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ be a sequence of independent and identically distributed random variables and let $N$ follow a geometric distribution such that $P( N = n ) = (1-p)^{n-1} p$, $0 < p < 1$, $n$ = 1,2,3,\ldots\ Then $(p\oplus U_{N} ,p\oplus V_{N} )$ and $(X_{i} ,Y_{i} )$ are identically distributed if and only if $(X_{i} ,Y_{i} )$ have BGD$(c_{1} ,c_{2} ,c_{1} c_{2} ).$

\vspace{0.2in}
\noindent
{\bf Proof}

\vspace{0.1in}

The pgf of $(p\oplus U_{N} ,p\oplus V_{N} )$ is

\vspace{0.2in}
\hspace{0.9in}
$\displaystyle\eta (s_{1} ,s_{2} )=E\left(s_{1}^{p\oplus U_{N} } s_{2}^{p\oplus V_{N} } \right)$

\vspace{0.2in}
\hspace{1.5in}
$=\sum \limits _{n=1}^{\infty }\left(\pi (1-p+ps_{1} ,1-p+ps_{2} )\right) ^{n} P(N=n)$

\vspace{0.2in}
\hspace{1.5in}
$ \displaystyle = \frac{p\pi (1-p+ps_{1} ,1-p+ps_{2} )}{1-(1-p)\pi (1-p+ps_{1} ,1-p+ps_{2} )}. $\hspace{0.7in} (7)

\vspace{0.2in}
\noindent
Substituting $\pi (s_{1} ,s_{2} )$ from (1) with $\theta ^{2} =c_{1} c_{2} $, we get

\vspace{0.2in}
\hspace{0.9in}
$\displaystyle\eta (s_{1} ,s_{2} )=\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}. $

\vspace{0.2in}
Conversely, assume that $(p\oplus U_{N} ,p\oplus V_{N} )$ follows BGD$(c_{1} ,c_{2} ,c_{1} c_{2} ).$ Then from (7) we have,

\vspace{0.2in}
$\displaystyle\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )} =\frac{p\pi (1-p+ps_{1} ,1-p+ps_{2} )}{1-(1-p)\pi (1-p+ps_{1} ,1-p+ps_{2} )}. $

\vspace{0.2in}
\noindent
Solving we get,

\hspace{1.25in} $\displaystyle\pi (s_{1} ,s_{2} )=\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}. $
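
\vspace{0.2in}
\noindent
Theorem 2.1 can be illustrated by simulation. In the Python sketch below (names ours), BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$ is sampled through Downton's shock model with $p_{0} =1/(1+c_{1} +c_{2} )$, $p_{1} =c_{1} p_{0} $ and $p_{2} =c_{2} p_{0} $, and the pgfs of $(p\oplus U_{N} ,p\oplus V_{N} )$ and of $(X_{i} ,Y_{i} )$ are estimated at a fixed point $(s_{1} ,s_{2} )$:

\begin{verbatim}
import random

def sample_bgd(c1, c2):
    """Downton shock-model draw from BGD(c1, c2, c1*c2)."""
    p0 = 1.0 / (1.0 + c1 + c2)
    p1, p2 = c1 * p0, c2 * p0
    n1 = n2 = 0
    while True:
        u = random.random()
        if u < p1:
            n1 += 1
        elif u < p1 + p2:
            n2 += 1
        else:
            return n1, n2

def thin(p, x):                    # p 'circle-plus' x = Binomial(x, p)
    return sum(random.random() < p for _ in range(x))

def geom1(p):                      # P(N = n) = (1-p)^(n-1) p, n >= 1
    n = 1
    while random.random() >= p:
        n += 1
    return n

random.seed(4)
c1, c2, p, s1, s2 = 1.5, 2.0, 0.3, 0.6, 0.7
m = 100_000
lhs = rhs = 0.0
for _ in range(m):
    n = geom1(p)
    pairs = [sample_bgd(c1, c2) for _ in range(n)]
    un = sum(a for a, _ in pairs)
    vn = sum(b for _, b in pairs)
    lhs += s1 ** thin(p, un) * s2 ** thin(p, vn)
    x, y = sample_bgd(c1, c2)
    rhs += s1 ** x * s2 ** y
# both approximate 1/(1 + c1(1-s1) + c2(1-s2)) = 1/2.2
print(lhs / m, rhs / m)
\end{verbatim}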

\noindent{\bf Theorem 2.2}

\vspace{0.1in}
Consider a sequence $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ of independent and identically distributed random variables and let $N$, independent of $(X_{i} ,Y_{i} )$, follow a geometric distribution such that $P( N = n ) = (1-p)^{n-1} p$, $0 < p < 1$, $n$ = 1,2,3,\ldots\ Then $(p\oplus U_{N} ,p\oplus V_{N} )$ follows BGD$(1,\; 1,\; q)$, where $q= 1- p$, if and only if $(X_{i} ,Y_{i} )$ have BGD$(1,\; 1,\; 0)$.

\vspace{0.1in}
\noindent
{\bf Proof}

\vspace{0.15in}
Taking the pgf of $(X_{i} ,Y_{i} )$ as $\displaystyle\pi (s_{1} ,s_{2} )=\frac{1}{\left(1+(1-s_{1} )\right)\left(1+(1-s_{2} )\right)} ,$

\vspace{0.2in}
\noindent
From (7) the pgf of $(p\oplus U_{N} ,p\oplus V_{N} )$ is

\vspace{0.2in}
\hspace{1.15in}
$\displaystyle\eta (s_{1} ,s_{2} )=\frac{p}{\left(1+p(1-s_{1} )\right)\left(1+p(1-s_{2} )\right)-(1-p)} $

\vspace{0.2in}
\hspace{1.75in}
$\displaystyle =\frac{1}{1+(1-s_{1} )+(1-s_{2} )+p(1-s_{1} )(1-s_{2} )}. $

\vspace{0.2in}
\noindent
Comparing with (1), we get $c_{1} =c_{2} =1$ and $\theta ^{2} =q$. Hence $(p\oplus U_{N} ,p\oplus V_{N} )$ follows BGD$(1,\; 1,\; q)$.

\vspace{0.2in}
The converse of the theorem is obtained by substituting the pgf of $(p\oplus U_{N} ,p\oplus V_{N} )$ in (7). We have

\vspace{0.2in}
$\displaystyle \frac{1}{1+(1-s_{1} )+(1-s_{2} )+p(1-s_{1} )(1-s_{2} )}$

\vspace{0.2in}
\hspace{2.2in}
$\displaystyle=\frac{p\pi (1-p+ps_{1} ,1-p+ps_{2} )}{1-(1-p)\pi (1-p+ps_{1} ,1-p+ps_{2} )} .$

\vspace{0.1in}\noindent
Solving, we obtain

\vspace{0.15in}\hspace{1.55in} $\displaystyle\pi (s_{1} ,s_{2} )=\frac{1}{(1+(1-s_{1} ))(1+(1-s_{2} ))}. $

\vspace{0.2in}
\noindent
{\bf Theorem 2.3}

\vspace{0.1in}
Suppose $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ is a sequence of independent and identically distributed random variables following BGD$(1,\; 1,\; 1)$. Then $(p\oplus U_{N} ,p\oplus V_{N} )$ and $(X_{i} ,Y_{i} )$ are identically distributed if and only if $N$ is geometric.

\vspace{0.1in}
\noindent
{\bf Proof}

\vspace{0.1in}
The ``if'' part of the proof is contained in Theorem 2.1.

\vspace{0.2in}
\noindent
Suppose that $(X_{i} ,Y_{i} )$ and $(p\oplus U_{N} ,p\oplus V_{N} )$ are identically distributed. But the pgf of $(p\oplus U_{N} ,p\oplus V_{N} )$ is given by

\vspace{0.2in}
\hspace{0.5in}
$\eta (s_{1} ,s_{2} )$ = $\sum \limits _{n=1}^{\infty }\left(\pi (1-p+ps_{1} ,1-p+ps_{2} )\right) ^{n} P(N=n)$ .

\vspace{0.2in}
\noindent
Therefore from the assumption, we get

\vspace{0.2in}
$\displaystyle\sum \limits _{n=1}^{\infty }\left(\frac{1}{1+p(1-s_{1} )+p(1-s_{2} )} \right)^{n} P(N=n)= \left(1+(1-s_{1} )+(1-s_{2} )\right)^{-1} $ .

\vspace{0.225in}
\noindent
Expanding both sides in powers of $\left((1-s_{1} )+(1-s_{2} )\right)$ by the negative binomial series,

\vspace{0.2in}
$\displaystyle \sum \limits _{n=1}^{\infty }\sum \limits _{j=0}^{\infty }\frac{(-1)^{j} (j+n-1)!\left((1-s_{1} )+(1-s_{2} )\right)^{j} p^{j} }{j!(n-1)!} P(N=n)$

\vspace{0.2in}
\hspace{2.75in}
$\displaystyle = \sum \limits _{j=0}^{\infty }(-1)^{j} \left((1-s_{1} )+(1-s_{2} )\right)^{j}.$

\vspace{0.225in}
\noindent
Comparing the coefficients of $\left((1-s_{1} )+(1-s_{2} )\right)^{j} $ , we get,

\vspace{0.2in}
\hspace{0.5in}
$\displaystyle \sum \limits _{n=1}^{\infty }\frac{n(n+1)(n+2)...(n+j-1)p^{j} P(N=n)}{j!} $ = 1, \ \ for $j$ = 1,2,3,\ldots

\vspace{0.1in}
\noindent
Therefore,

\hspace{1.0in}
$E(N)$ = $\displaystyle\frac{1}{p} $ , $\displaystyle\ E\left(N(N+1)\right) = \frac{2}{p^{2} } $ and so on.

\vspace{0.1in}
\noindent
Consider

\vspace{0.1in}
$\displaystyle\ E(1-t)^{-N} =1+\frac{t}{1!} E(N)+\frac{t^{2} }{2!} E\left(N(N+1)\right)+\frac{t^{3} }{3!} E\left(N(N+1)(N+2)\right)+...$

\vspace{0.2in}
\hspace{0.8in}
= $\displaystyle \frac{p}{p-t} $ .

\vspace{0.1in}
\noindent
But,

\hspace{1.35in}
$\displaystyle\ E(1-t)^{-N} = \sum \limits _{n=1}^{\infty }(1-t)^{-n} P(N=n) .$

\vspace{0.25in}
\noindent
Also

\vspace{0.2in}
\hspace{0.65in}
$\displaystyle p \sum \limits _{n=1}^{\infty }(1-t)^{-n} (1-p)^{n-1} =\frac{p}{1-p} \sum \limits _{n=1}^{\infty }\left(\frac{1-p}{1-t} \right)^{n} = \displaystyle\frac{p}{p-t}. $

\vspace{0.2in}
\noindent
Therefore,

\vspace{0.2in}
\hspace{0.7in}
$\sum \limits _{n=1}^{\infty }(1-t)^{-n} P(N=n)$ = $p\sum \limits _{n=1}^{\infty }(1-t)^{-n} (1-p)^{n-1} . $

\vspace{0.2in}
\noindent
Comparing, we get

\vspace{0.2in}
\hspace{1.55in}
$\displaystyle\ P( N = n ) = (1-p)^{n-1} p,$ \ \ for $n$ = 1,2,3,\ldots

\vspace{0.2in}
\noindent
{\bf Theorem 2.4}

\vspace{0.15in}
Let $N_{1} $ and $N_{2} $ be two independent random variables following geometric distributions with parameters $a$ and $b$. Suppose that $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ is a sequence of independent and identically distributed random variables. Then $(a\oplus U_{N_{1} } ,a\oplus V_{N_{1} } )$ and $(b\oplus U_{N_{2} } ,b\oplus V_{N_{2} } )$ are identically distributed if $(X_{i} ,Y_{i} )$ have BGD$(c_{1} ,c_{2} ,c_{1} c_{2} ).$

\vspace{0.2in}
\noindent
{\bf Proof}

\vspace{0.15in}
Suppose $(X_{i} ,Y_{i} )$ have BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$.
Substituting in (7) (with $p=a$ and $N=N_{1} $) we obtain the pgf of $(a\oplus U_{N_{1} } ,a\oplus V_{N_{1} } )$ as

\vspace{0.15in}\hspace{1.25in}
$\displaystyle \eta (s_{1} ,s_{2} ) = \frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )},$

\vspace{0.2in}
\noindent
which does not depend on $a$. Similarly we can show that $(b\oplus U_{N_{2} } ,b\oplus V_{N_{2} } )$ has the same pgf. Hence the theorem.

\vspace{0.2in}
In the next theorems we obtain characterizations of BGD$(c_{1} ,c_{2} ,0)$ using bivariate geometric compounding.

\vspace{0.2in}
\noindent
{\bf Theorem 2.5 }

\vspace{0.15in}
Let $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ be a sequence of independent and identically distributed random variables. Suppose $(N_{1} ,N_{2} )$ have the bivariate geometric distribution given in (4). Define $p_{00} =0$, $p_{01} =\mu _{1} $, $p_{10} =\mu _{2} $ and $p_{11} =1-(\mu _{1} +\mu _{2} ).$ Then $(U_{N_{1} } ,V_{N_{2} } )$ has BGD$\displaystyle\left(\frac{c_{1} }{\mu _{1} } ,\frac{c_{2} }{\mu _{2} } ,0\right)$ if and only if $(X_{i} ,Y_{i} )$ follow BGD$(c_{1} ,c_{2} ,c_{1} c_{2} ).$

\vspace{0.45in}
\noindent
{\bf Proof}

\vspace{0.1in}
Substituting the values of $p_{00} ,\; p_{10} ,\; p_{01} ,\; p_{11} $ and $\pi (s_{1} ,\; s_{2} )$ in (5), and using (6) we get,

\vspace{0.1in}
$\displaystyle\eta (s_{1} ,\; s_{2} )=\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}$
$\displaystyle\left(\mu _{2} \left(1+\frac{c_{1} (1-s_{1} )}{\mu _{1} } \right)^{-1}\right.$

\vspace{.2in}\hspace{1.7in}
+$\displaystyle\left.\mu _{1} \left(1+\frac{c_{2} (1-s_{2} )}{\mu _{2} } \right)^{-1} +(1-\mu _{1} -\mu _{2} )\eta (s_{1} ,\; s_{2} )\right).$

\vspace{0.2in}
\noindent
Solving for $\displaystyle\eta (s_{1} ,\, \, s_{2} )$, we get,

\vspace{0.2in}\hspace{1.85in} $\displaystyle\eta (s_{1} ,s_{2} ) = \frac{1}{\left(1+\frac{c_{1} (1-s_{1} )}{\mu _{1} } \right)\, \, \left(1+\frac{c_{2} (1-s_{2} )}{\mu _{2} } \right)} .$

\vspace{0.2in}
Conversely, assume that $\displaystyle\eta (s_{1} ,s_{2} ) = \frac{1}{\left(1+\frac{c_{1} (1-s_{1} )}{\mu _{1} } \right)\, \, \left(1+\frac{c_{2} (1-s_{2} )}{\mu _{2} } \right)} .$

\vspace{0.1in}
\noindent
From (5), we get,

\vspace{0.2in}
\noindent
$\displaystyle\frac{1}{\left(1+\frac{c_{1} (1-s_{1} )}{\mu _{1} } \right)\left(1+\frac{c_{2} (1-s_{2} )}{\mu _{2} } \right)} \quad $

\vspace{0.1in}
\hspace{1.0in}
=$\displaystyle\eta (s_{1} ,s_{2} )\left(p_{00} \, +p_{10} \frac{1}{\left(1+\frac{c_{1} (1-s_{1} )}{\mu _{1} } \right)} +{\kern 1pt} p_{01} \frac{1}{\left(1+\frac{c_{2} (1-s_{2} )}{\mu _{2} } \right)}+\right.$

\vspace{0.1in}\hspace{2.75in}
$\displaystyle\left.{\kern 1pt} p_{11} \frac{1}{\left(1+\frac{c_{1} (1-s_{1} )}{\mu _{1} } \right)\left(1+\frac{c_{2} (1-s_{2} )}{\mu _{2} } \right)} \right) .$

\vspace{0.15in}
\noindent
Substituting the values of $p_{00} ,p_{10} ,p_{01} $ and $p_{11} $ gives

\vspace{0.1in}\hspace{1.4in}
$\displaystyle \pi (s_{1} ,s_{2} )\; =\; \frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}.$
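
\vspace{0.2in}
\noindent
The compounding equation underlying Theorem 2.5 can also be verified symbolically. A short sketch using Python's sympy (illustrative; symbol names are ours) checks that the product form satisfies (5):

\begin{verbatim}
import sympy as sp

s1, s2, c1, c2, m1, m2 = sp.symbols('s1 s2 c1 c2 mu1 mu2', positive=True)
pi  = 1 / (1 + c1*(1 - s1) + c2*(1 - s2))
eta = 1 / ((1 + c1*(1 - s1)/m1) * (1 + c2*(1 - s2)/m2))
eta1 = eta.subs(s2, 1)            # marginal pgfs of the compound
eta2 = eta.subs(s1, 1)
p00, p10, p01, p11 = 0, m2, m1, 1 - m1 - m2
rhs = pi * (p00 + p10*eta1 + p01*eta2 + p11*eta)
print(sp.simplify(eta - rhs))     # 0, so (5) holds
\end{verbatim}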

\vspace{0.15in}
\noindent
{\bf Theorem 2.6}

\vspace{0.1in}
Consider a sequence of independent and identically distributed random variables $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ and let $(N_{1} ,N_{2} )$ follow the bivariate geometric distribution given in (4) with $p_{11} =p,\; p_{10} +p_{11} =p_{1} ,\; p_{01} +p_{11} =p_{2} $ and $p_{00} =0$. Then $(q_{1} \oplus U_{N_{1} } ,q_{2} \oplus V_{N_{2} } )$ follows BGD$(c_{1} ,c_{2} ,0)$ if and only if $(X_{i} ,Y_{i} )$ have BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$, where $q_{i} =1-p_{i} .$

\vspace{0.2in}
\noindent
{\bf Proof}

\vspace{0.1in}
From (5) and (6), we have

\vspace{0.2in}
\hspace{0.5in}
$\displaystyle \eta (s_{1} ,s_{2} )=\frac{\pi (s_{1} ,s_{2} )\left(\frac{p_{10} (p_{00} +p_{01} )\pi (s_{1} ,1)}{1-(p_{10} +p_{11} )\pi (s_{1} ,1)} +\frac{p_{01} (p_{00} +p_{10} )\pi (1,\; s_{2} )}{1-(p_{01} +p_{11} )\pi (1,\; s_{2} )} \right)}{1-p_{11} \pi (s_{1} ,s_{2} )} $

\vspace{0.25in}
\noindent
Assume that $(X_{i} ,Y_{i} )$ have BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$. Since $p_{00} =0$ forces $p_{1} +p_{2} =1+p$, we have $q_{1} +q_{2} =1-p$ and $(p_{1} -p)(p_{2} -p)=q_{1} q_{2} $; the pgf of $(q_{1} \oplus U_{N_{1} } ,q_{2} \oplus V_{N_{2} } )$ is then

\vspace{0.25in}\hspace{0.5in}
$\displaystyle \eta (s_{1} ,s_{2} )=\frac{q_{1} q_{2} }{1+c_{1} q_{1} (1-s_{1} )+c_{2} q_{2} (1-s_{2} )-p}$

\vspace{0.225in}
\hspace{1.9in}
$\displaystyle \left(\frac{1}{1+c_{1} q_{1} (1-s_{1} )-p_{1} } +\frac{1}{1+c_{2} q_{2} (1-s_{2} )-p_{2} } \right).$

\vspace{0.2in}
\noindent
On simplification, we get

\vspace{0.2in}
\hspace{0.5in}
$\displaystyle\eta (s_{1} ,s_{2} )=\frac{1}{\left(1+c_{1} (1-s_{1} )\right)\left(1+c_{2} (1-s_{2} )\right)} .$

\vspace{0.25in}
Conversely, suppose that $(q_{1} \oplus U_{N_{1} } ,q_{2} \oplus V_{N_{2} } )$ follows BGD $(c_{1} ,c_{2} ,0)$ . From (5) we have

\vspace{0.225in}
\noindent
$\displaystyle \frac{1}{\left(1+c_{1} (1-s_{1} )\right)\left(1+c_{2} (1-s_{2} )\right)} =$

\vspace{0.225in}
\hspace{0.35in}
$\displaystyle \frac {\pi (1-q_{1} +q_{1} s_{1} ,1-q_{2} +q_{2} s_{2} )}{1-p\pi (1-q_{1} +q_{1} s_{1} ,1-q_{2} +q_{2} s_{2} )} \left(\frac{q_{2} }{1+c_{1} (1-s_{1} )} +\frac{q_{1} }{1+c_{2} (1-s_{2} )} \right).$

\vspace{0.25in}
\noindent
Solving we get, \ \ $\displaystyle \pi (s_{1} ,s_{2} )=\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )} .$

\vspace{0.2in}
\noindent
{\bf Theorem 2.7}

\vspace{0.15in}
Let $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ be a sequence of independent and identically distributed random variables. Let $(N_{1} ,N_{2} )$ have the bivariate geometric distribution given in (4) with $p_{11} =p,\; p_{10} +p_{11} =p_{1} ,\; p_{01} +p_{11} =p_{2} $ and $p_{00} =0$. Then $(q\oplus U_{N_{1} } ,q\oplus V_{N_{2} } )$ follows BGD$(q,\; q,\; 0)$ if and only if $(X_{i} ,Y_{i} )$ follow BGD$(q_{1} ,q_{2} ,q_{1} q_{2} )$, where $q = 1-p.$

\vspace{0.35in}
\noindent
{\bf Proof}

\vspace{0.1in}
Suppose that $(X_{i} ,Y_{i} )$ have BGD$(q_{1} ,q_{2} ,q_{1} q_{2} )$. Substituting in (5), we get the pgf of $(q\oplus U_{N_{1} } ,q\oplus V_{N_{2} } )$ as

\vspace{0.2in}
\noindent
$\displaystyle \eta (s_{1} ,s_{2} )= \frac{q_{1} q_{2} }{1+q_{1} q(1-s_{1} )+q_{2} q(1-s_{2} )-p}$

\vspace{0.2in}
\hspace{2.0in}
$\displaystyle \left(\frac{1}{1+q_{1} q(1-s_{1} )-p_{1} } +\frac{1}{1+q_{2} q(1-s_{2} )-p_{2} } \right).$


\vspace{0.1in}\noindent
On simplification we get,

\hspace{1.0in}$\displaystyle\eta (s_{1} ,s_{2} )=\frac{1}{\left(1+q(1-s_{1} )\right)\left(1+q(1-s_{2} )\right)}.$

\vspace{0.2in}
Conversely, take $(q\oplus U_{N_{1} } ,q\oplus V_{N_{2} } )$ to be BGD$(q,\; q,\; 0).$
From (5), we have

\vspace{0.225in}
\noindent
$\displaystyle \frac{1}{\left(1+q(1-s_{1} )\right)\left(1+q(1-s_{2} )\right)}$

\vspace{0.2in}
\hspace{0.5in}
$\displaystyle =\frac{\pi (1-q+qs_{1} ,1-q+qs_{2} )}{1-p\pi (1-q+qs_{1} ,1-q+qs_{2} )} \left(\frac{q_{2} }{1+q(1-s_{1} )} +\frac{q_{1} }{1+q(1-s_{2} )} \right).$

\vspace{0.2in}
\noindent
Simplifying, we get

\vspace{0.1in}
\hspace{1.0in}$\displaystyle \pi (s_{1} ,s_{2} )=\frac{1}{1+q_{1} (1-s_{1} )+q_{2} (1-s_{2} )}.$

\section{Bivariate Autoregressive Geometric Process}

\hspace{0.25in}
Here we develop a first order autoregressive process with bivariate geometric marginals.

\vspace{0.2in}
\noindent
{\bf Theorem 3.1}

\vspace{0.1in}Let a first order bivariate autoregressive process have the structure

\vspace{0.15in}
$ \displaystyle (X_{0} ,Y_{0} ) = (\varepsilon _{0} ,\psi _{0} )$ \ and, for $ n $ = 1,2,3,\ldots,

\vspace{0.1in}
$(X_{n} ,Y_{n} )=(\rho \oplus X_{n-1} +U_{n} ,\ \rho \oplus Y_{n-1} +V_{n} ),$ $0<\rho <1. $ \hspace{1.25in} (8)

\vspace{0.15in}
\noindent
Then the process is stationary with BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$ marginals if and only if $(U_{n} ,V_{n} )\stackrel{d}{=}(I_{n} \varepsilon _{n} ,I_{n} \psi _{n} )$, where $ \{I_{n} ,n\ge 1 \}$ and $ \{ (\varepsilon _{n} ,\psi _{n} ),n\ge 0 \} $ are two independent sequences of independent and identically distributed random variables such that $I_{n} $ has a Bernoulli distribution with $\displaystyle\ P(I_{n} =0)=1-P(I_{n} =1)=\rho $ and $(\varepsilon _{n} ,\psi _{n} )$ follow BGD$(c_{1} ,c_{2} ,c_{1} c_{2} ).$

\vspace{0.2in}\noindent
{\bf Proof }

\vspace{0.1in}
The pgf of (8) is

\vspace{0.2in}
\hspace{0.25in}
$\displaystyle \pi _{X_{n} ,Y_{n} } (s_{1} ,s_{2} )$\ \ \ =\ \ \ $\pi _{X_{n-1} ,Y_{n-1} } (1-\rho +\rho s_{1} ,1-\rho +\rho s_{2} )\pi _{U _{n} ,\ V _{n} } (s_{1} ,s_{2} ) \ \ \ \ \ \ \ \ \ \ (9)$

\vspace{0.2in}
\noindent
Suppose that the process is stationary with BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$ marginals. Then we have

\vspace{0.15in}\hspace{0.5in}
$\displaystyle \frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )} =\frac{1}{1+c_{1} \rho (1-s_{1} )+c_{2} \rho (1-s_{2} )} \, \pi _{U ,V } (s_{1} ,s_{2} ).$

\vspace{0.2in}\noindent
Solving we get,

\vspace{0.15in}\hspace{1.2in}$\displaystyle \pi _{U ,V } (s_{1} ,s_{2} )=\rho +\frac{1-\rho }{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}.$

\vspace{0.2in}
The converse is proved by mathematical induction. For $n$ = 1, (9) gives

\vspace{0.15in}\hspace{0.5in}
$\pi _{X_{1} ,Y_{1} } (s_{1} ,s_{2} )=\pi _{X_{0} ,Y_{0} } (1-\rho +\rho s_{1} ,1-\rho +\rho s_{2} )\, \pi _{U _{1} ,V _{1} } (s_{1} ,s_{2} ).$

\vspace{0.15in}
\noindent
Under the assumption,
\vspace{0.1in}\noindent
$\displaystyle \pi _{X_{1} ,Y_{1} } (s_{1} ,s_{2} )=\frac{1}{1+c_{1} \rho (1-s_{1} )+c_{2} \rho (1-s_{2} )} \left(\rho +\frac{1-\rho }{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )} \right)$

\vspace{0.15in}\hspace{0.65in}
= $ \displaystyle \frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}.$

\vspace{0.2in}
\noindent
Hence the process is stationary with BGD$(c_{1} ,c_{2} ,c_{1} c_{2} )$ marginals.
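
\vspace{0.2in}
\noindent
A simulation sketch of the process (8) (illustrative; names ours): the innovation is $(I_{n} \varepsilon _{n} ,I_{n} \psi _{n} )$ with $P(I_{n} =0)=\rho $, and under stationarity the sample means should stay near the marginal means $c_{1} $ and $c_{2} $:

\begin{verbatim}
import random

def sample_bgd(c1, c2):            # Downton shock-model sampler
    p0 = 1.0 / (1.0 + c1 + c2)
    n1 = n2 = 0
    while True:
        u = random.random()
        if u < c1 * p0:
            n1 += 1
        elif u < (c1 + c2) * p0:
            n2 += 1
        else:
            return n1, n2

def thin(p, x):                    # p 'circle-plus' x = Binomial(x, p)
    return sum(random.random() < p for _ in range(x))

random.seed(5)
c1, c2, rho = 1.5, 2.0, 0.4
x, y = sample_bgd(c1, c2)          # start in the stationary law
xs, ys = [], []
for _ in range(50_000):
    keep = random.random() >= rho  # I_n = 1 with probability 1 - rho
    e, f = sample_bgd(c1, c2) if keep else (0, 0)
    x, y = thin(rho, x) + e, thin(rho, y) + f
    xs.append(x)
    ys.append(y)
print(sum(xs) / len(xs), c1)       # both near 1.5
print(sum(ys) / len(ys), c2)       # both near 2.0
\end{verbatim}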

\vspace{0.2in}
\noindent
{\bf Theorem 3.2}

\vspace{0.1in}
Consider an autoregressive process defined by

\vspace{0.125in}
$\displaystyle\ (X_{0} ,Y_{0} ) \ = \ (\varepsilon _{0} ,\psi _{0} )$

\vspace{0.05in}

$(X_{n} ,Y_{n} )=(V_{n} \oplus X_{n-1} +V_{n}\oplus \varepsilon _{n} ,V_{n} \oplus Y_{n-1} +V_{n} \oplus\psi _{n} )$, $n\ge 1$ \hspace{0.75in} (10)

\vspace{0.125in}
\noindent
where $\{ (\varepsilon _{n} ,\psi _{n} )\} $ is a sequence of independent and identically distributed random variables, and $\{ V_{n} \} $ is a sequence of independent and identically distributed random variables, independent of $(\varepsilon _{n} ,\psi _{n} )$, such that $V_{n} $ has the uniform distribution on (0,1). Then the process $(X_{n} ,Y_{n} )$ is stationary if and only if $(\varepsilon _{0} ,\psi _{0} )$ follows a bivariate geometric distribution.

\vspace{0.2in}\noindent
{\bf Proof}

\vspace{0.1in}
Suppose that the process is stationary and $(X_{0} ,Y_{0} ) = (\varepsilon _{0} ,\psi _{0} ).$

\vspace{0.15in}
\noindent
The pgf of (10) is

\vspace{0.2in}
\hspace{0.25in}
$\displaystyle\pi _{X_{n} ,Y_{n} } (s_{1} ,s_{2} )
=E(s_{1}^{V_{n} \oplus X_{n-1} +V_{n} \oplus \varepsilon _{n}} s_{2}^{V_{n} \oplus\ Y_{n-1} +V_{n}\oplus \psi _{n} } )$

\vspace{0.2in}
\hspace{1.2in}
=$\displaystyle\int \limits _{0}^{1}\pi _{X_{n-1} ,Y_{n-1} } (1-v_{n} +v_{n} s_{1} ,1-v_{n} +v_{n} s_{2} )$

\vspace{0.1in}
\hspace{2.25in}
$\displaystyle\pi _{\varepsilon _{n} ,\psi _{n} } (1-v_{n} +v_{n} s_{1} ,1-v_{n} +v_{n} s_{2} )\, dv_{n} .$ \hspace{0.35in} (11)

\vspace{0.15in}
\noindent
Under the assumption

\vspace{0.15in}
\hspace{1.1in}
$\displaystyle\pi _{X,Y} (s_{1} ,s_{2} )$ = $\int \limits _{0}^{1}\pi _{X,Y}^{2} (1-v+vs_{1} ,1-v+vs_{2} ) dv.$


\vspace{0.2in}\noindent
Let $\displaystyle\pi _{X,Y} (1-s_{1} ,1-s_{2} )=\gamma _{X,Y} (s_{1} ,s_{2} ).$

\vspace{0.2in}
\noindent
Hence

\hspace{1.15in} $\displaystyle\gamma _{X,Y} (s_{1} ,s_{2} )=\int \limits _{0}^{1}\gamma _{X,Y}^{2} (vs_{1} ,vs_{2} ) dv$ .

\noindent
Take $s_{j} =\delta _{j} s$ for $j$ = 1,2.

\vspace{0.2in}
\noindent
Thus,\hspace{0.8in}$\displaystyle \gamma _{X,Y} \left((\delta _{1} ,\delta _{2} )s\right)=\int \limits _{0}^{1}\gamma _{X,Y}^{2} \left((\delta _{1} ,\delta _{2} )sv\right)dv $ .

\vspace{0.2in}
\noindent
Putting $sv = t,$

\hspace{0.75in}
$\displaystyle\ s\gamma _{X,Y} \left((\delta _{1} ,\delta _{2} )s\right)=\int \limits _{0}^{s}\gamma _{X,Y}^{2} \left((\delta _{1} ,\delta _{2} )t\right)dt .$

\vspace{0.2in}
\noindent
Differentiating with respect to $s$ and dividing by $\displaystyle\gamma _{X,Y}^{2} \left((\delta _{1} ,\delta _{2} )s\right)$ , we get

\vspace{0.2in}
\hspace{1.0in}
$\displaystyle \frac{s\gamma _{X,Y}^{'} \left((\delta _{1} ,\delta _{2} )s\right)}{\gamma _{X,Y}^{2} \left((\delta _{1} ,\delta _{2} )s\right)} +\frac{1}{\gamma _{X,Y} \left((\delta _{1} ,\delta _{2} )s\right)} =1$ .

\vspace{0.2in}
\noindent
Writing

\vspace{0.15in}\hspace{0.9in}
$\displaystyle\gamma _{X,Y} \left((\delta _{1} ,\delta _{2} )s\right)=\frac{1}{1+\omega \left((\delta _{1} ,\delta _{2} )s\right)} $ ,

\vspace{0.15in}\noindent
the equation reduces to $s\, \omega ' =\omega $, whose solution is

\vspace{0.2in}
\hspace{1.1in}
$\omega \left((\delta _{1} ,\delta _{2} )s\right)=\mu ^{*} s$, \ where $\mu ^{*} $ is a function of $(\delta _{1} ,\delta _{2} ).$

\vspace{0.15in}\noindent
That is,

\vspace{0.15in}\hspace{0.95in}$\displaystyle\gamma _{X,Y} \left((\delta _{1} ,\delta _{2} )s\right)=\frac{1}{1+\mu ^{*} s} $

\vspace{0.2in}
\hspace{1.95in}
= $\displaystyle\frac{1}{1+c_{1} s_{1} +c_{2} s_{2} } ,$

\vspace{0.1in}\noindent
since $\mu ^{*} s$ is linear in $(\delta _{1} s,\delta _{2} s)=(s_{1} ,s_{2} )$ and so may be written as $c_{1} s_{1} +c_{2} s_{2} $.

\noindent
Hence

\vspace{0.2in}
\hspace{1.1in} $\displaystyle\pi _{X,Y} (s_{1} ,s_{2} ) = \frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}. $

\vspace{0.225in}
Conversely, suppose $\displaystyle\ (X_{0} ,Y_{0} ) = (\varepsilon _{0} ,\psi _{0} )$ and $(\varepsilon _{0} ,\psi _{0} )$ has bivariate geometric distribution. From (11),

\vspace{0.2in}\hspace{1.15in}
$\displaystyle\gamma _{X_{1} ,Y_{1} } (s_{1} ,s_{2} )=\int \limits _{0}^{1}\gamma _{\varepsilon _{0} ,\psi _{0} }^{2} (vs_{1} ,vs_{2} ) dv$

\vspace{0.2in}
\hspace{2.25in}
= \ \ $\displaystyle\frac{1}{1+c_{1} s_{1} +c_{2} s_{2} } .$

\noindent
Therefore,

\vspace{0.15in}
\hspace{1.3in} $\displaystyle\pi _{X_{1} ,Y_{1} } (s_{1} ,s_{2} ) = \frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )} $ .

\vspace{0.2in}\noindent
Hence $(X_{1} ,Y_{1} )$ is distributed as bivariate geometric distribution. By induction we get $(X_{n} ,Y_{n} )$ follows bivariate geometric distribution. Hence the process is stationary.

\section{Bivariate Geometric Distributions}

\hspace{0.25in}
In this section we construct various bivariate geometric distributions using the bivariate geometric compounding scheme (5) of Section 1. We discuss the discrete analogues of important bivariate exponential distributions, namely the Marshall-Olkin (1967), Downton (1970) and Hawkes (1972) bivariate exponential distributions.

\vspace{0.2in}
\noindent
{\bf Theorem 4.1}
\vspace{0.1in}

Let $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ be a sequence of independent and identically distributed random variables with pgf $\displaystyle\pi (s_{1} ,s_{2} )=\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )} .$ Suppose $(N_{1} ,N_{2} )$ has the bivariate geometric distribution given in (4) such that $p_{00} =\mu _{12} ,p_{10} =\mu _{2} ,p_{01} =\mu _{1} $ and $p_{11} =1-\mu $, where $\mu =\mu _{1} +\mu _{2} +\mu _{12} $. Then the distribution of $(U_{N_{1} } ,V_{N_{2} } )$ is the bivariate geometric analogue of the Marshall-Olkin bivariate exponential distribution.

\vspace{0.2in}
\noindent
{\bf Proof}

\vspace{0.1in}
Substituting the values of $p_{00} ,p_{10} ,p_{01} ,\; p_{11} $ and $\pi (s_{1} ,s_{2} )$ in (5) we get,

\vspace{0.2in}
$\displaystyle \eta (s_{1} ,s_{2} )=\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}$

\vspace{0.2in}
\hspace{1.5in}
$\displaystyle (\mu _{12} +\mu _{2} \eta (s_{1} ,1)+\mu _{1} \eta (1,s_{2} )+(1-\mu )\eta (s_{1} ,s_{2} )). $ \hspace{0.25in}(12)

\vspace{0.2in}
\noindent
Solving for $\eta (s_{1} ,1)$ and $\eta (1,s_{2} )$ , we get

\vspace{0.2in}
$\displaystyle \eta (s_{1} ,1)=\frac{\mu _{1} +\mu _{12} }{\mu _{1} +\mu _{12} +c_{1} (1-s_{1} )} $ \ \ and \ \ $\displaystyle \eta (1,s_{2} )=\frac{\mu _{2} +\mu _{12} }{\mu _{2} +\mu _{12} +c_{2} (1-s_{2} )} $ .

\vspace{0.2in}
\noindent
Substituting in (12) gives,

\vspace{0.2in}\noindent
$\displaystyle \eta (s_{1} ,s_{2} )=\frac{1}{1+c_{1} (1-s_{1} )+c_{2} (1-s_{2} )} $

\vspace{0.25in}
\hspace{0.15in}
$\displaystyle \left(\mu _{12} +\frac{\mu _{2} (\mu _{1} +\mu _{12} )}{\mu _{1} +\mu _{12} +c_{1} (1-s_{1} )} +\frac{\mu _{1} (\mu _{2} +\mu _{12} )}{\mu _{2} +\mu _{12} +c_{2} (1-s_{2} )} +(1-\mu )\eta (s_{1} ,s_{2} )\right).$

\vspace{0.25in}
\noindent
On simplification,

\vspace{0.25in}
$\displaystyle \eta (s_{1} ,\; s_{2} )=\frac{1}{\mu +c_{1} (1-s_{1} )+c_{2} (1-s_{2} )}$

\vspace{0.25in}
\hspace{1.3in}
$\displaystyle \left(\mu _{12} +\mu _{2} \left(1+\frac{c_{1} (1-s_{1} )}{\mu _{1} +\mu _{12} } \right)^{-1} +\mu _{1} \left(1+\frac{c_{2} (1-s_{2} )}{\mu _{2} +\mu _{12} } \right)^{-1} \right).$
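
\vspace{0.25in}
\noindent
This closed form can be checked against (12) symbolically; a sympy sketch (symbol names are ours):

\begin{verbatim}
import sympy as sp

s1, s2, c1, c2, m1, m2, m12 = sp.symbols('s1 s2 c1 c2 mu1 mu2 mu12',
                                         positive=True)
mu = m1 + m2 + m12
a, b = c1*(1 - s1), c2*(1 - s2)
pi = 1 / (1 + a + b)
eta1 = (m1 + m12) / (m1 + m12 + a)      # eta(s1, 1)
eta2 = (m2 + m12) / (m2 + m12 + b)      # eta(1, s2)
eta = (m12 + m2*eta1 + m1*eta2) / (mu + a + b)
rhs = pi * (m12 + m2*eta1 + m1*eta2 + (1 - mu)*eta)
print(sp.simplify(eta - rhs))           # 0, so (12) is satisfied
\end{verbatim}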

\vspace{0.25in} In the next theorem, we give a bivariate geometric analogue of Downton's bivariate exponential distribution.

\vspace{0.2in}
\noindent
{\bf Theorem 4.2}

\vspace{0.1in}
Suppose that $\{ (X_{i} , Y_{i} ),i\ge 1\} $ are independent and identically distributed random variables with pgf $\displaystyle\pi (s_{1} ,s_{2} ) = \left(1+\frac{1-s_{1} }{1+\mu } \right)^{-1} \left(1+\frac{1-s_{2} }{1+\mu } \right)^{-1}.$ Take $(N_{1} ,N_{2} )$ to follow the bivariate geometric distribution given in (4) with $p_{00} =(1+\mu )^{-1} ,\; p_{10} =p_{01} =0$ and $p_{11} =\mu (1+\mu )^{-1} $, $\mu >0$. Then $(U_{N_{1} } ,V_{N_{2} } )$ has a bivariate geometric distribution which is the discrete analogue of Downton's bivariate exponential distribution.

\vspace{0.2in}
\noindent
{\bf Proof}

\vspace{0.15in}
From (5) we get,

\vspace{0.225in}
\noindent
$\displaystyle \eta (s _{1} ,s _{2} )=\left(1+\frac{1-s_{1} }{1+\mu } \right)^{-1} \left(1+\frac{1-s_{2} }{1+\mu } \right)^{-1} \left((1+\mu )^{-1} +\mu (1+\mu )^{-1} \eta (s_{1} ,s_{2} )\right)$

\vspace{0.225in}
\hspace{0.3in}
$\displaystyle = \frac{1}{(1+\mu )\left(1+\frac{1-s_{1} }{1+\mu } \right)\, \, \left(1+\frac{1-s_{2} }{1+\mu } \right)-\mu } $

\vspace{0.225in}
$\displaystyle = \frac{1+\mu }{(1+\mu )^{2} +(1+\mu )(1-s_{1} )+(1+\mu )(1-s_{2} )+(1-s_{1} )(1-s_{2} )-\mu (1+\mu )} $

\vspace{0.225in} \hspace{0.3in}
$\displaystyle = \frac{1}{(1+(1-s_{1} ))(1+(1-s_{2} ))-({\tfrac{\mu }{1+\mu }} )(1-s_{1} )(1-s_{2} )}. $
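
\vspace{0.225in}
\noindent
This is the pgf of BGD$(1,\; 1,\; \mu /(1+\mu ))$. The algebra can be confirmed with a short sympy sketch (illustrative; names ours), solving (5) for $\eta $ and comparing with the final form:

\begin{verbatim}
import sympy as sp

s1, s2, mu = sp.symbols('s1 s2 mu', positive=True)
e = sp.Symbol('e')                       # stands for eta(s1, s2)
pi = 1 / ((1 + (1 - s1)/(1 + mu)) * (1 + (1 - s2)/(1 + mu)))
p00, p11 = 1/(1 + mu), mu/(1 + mu)
eta = sp.solve(sp.Eq(e, pi*(p00 + p11*e)), e)[0]
target = 1 / ((1 + (1 - s1)) * (1 + (1 - s2))
              - (mu/(1 + mu)) * (1 - s1) * (1 - s2))
print(sp.simplify(eta - target))         # 0
\end{verbatim}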

\vspace{0.225in}
\noindent
The bivariate geometric form of Hawkes' bivariate exponential distribution is given in the following theorem.

\vspace{0.2in}
\noindent
{\bf Theorem 4.3}

\vspace{0.1in}
Let $\{ (X_{i} ,Y_{i} ),i\ge 1\} $ be a sequence of independent and identically distributed random variables whose pgf is $\displaystyle\pi (s_{1} ,s_{2} )=\frac{1}{\left(1+\mu _{1} (1-s_{1} )\right)\left(1+\mu _{2} (1-s_{2} )\right)} $. Assume that $(N_{1} ,N_{2} )$ have the bivariate geometric distribution given in (4) with $p_{10} =p_{01} $, $\gamma _{1} =p_{01} +p_{00} $ and $\gamma _{2} =p_{10} +p_{00} $. Then $(U_{N_{1} } ,V_{N_{2} } )$ has the bivariate geometric form of Hawkes' bivariate exponential distribution.

\vspace{0.2in}
\noindent
{\bf Proof}

\vspace{0.1in}
Setting $s_{2} =1$ in (5), we have

\vspace{0.2in}
\hspace{1.0in}
$\displaystyle \eta (s_{1} ,1)=\pi (s_{1} ,1)\left(\gamma _{1} +(1-\gamma _{1} )\eta (s_{1} ,1)\right).$

\vspace{0.225in}
\noindent
Substituting $\pi (s_{1} ,1)$ and solving for $\eta (s_{1} ,1)$,

\vspace{0.225in}
\noindent
we get, \ \ \ \ $ \displaystyle \eta (s_{1} ,1)=\displaystyle\frac{1}{1+\frac{\mu _{1} (1-s_{1} )}{\gamma _{1} } } . $ \ \ \ \ Similarly \ \ \ $ \displaystyle \eta (1,s_{2} )=\frac{1}{1+\frac{\mu _{2} (1-s_{2} )}{\gamma _{2} } }. $

\vspace{0.2in}
\noindent
Again from (5)

\vspace{0.2in}
$\displaystyle \eta (s_{1} ,s_{2} )=\frac{1}{(1+\mu _{1} (1-s_{1} ))(1+\mu _{2} (1-s_{2} ))} $

\vspace{0.225in}
\hspace{1.25in}
$\displaystyle \left(p_{00} +p_{10} \frac{1}{1+\frac{\mu _{1} (1-s_{1} )}{\gamma _{1} } } +p_{01} \frac{1}{1+\frac{\mu _{2} (1-s_{2} )}{\gamma _{2} } } +p_{11} \eta (s_{1} ,s_{2} )\right)$

\vspace{0.25in}
\noindent
$ =\displaystyle \frac{p_{00} \left(1+\frac{\mu _{1} (1-s_{1} )}{\gamma _{1} } \right) \left(1+\frac{\mu _{2} (1-s_{2} )}{\gamma _{2} } \right)+p_{10} \left(1+\frac{\mu _{2} }{\gamma _{2} } (1-s_{2} )\right)+p_{01} \left(1+\frac{\mu _{1} }{\gamma _{1} } (1-s_{1} )\right)}{\left(1+\frac{\mu _{1} }{\gamma _{1} } (1-s_{1} )\right) \left(1+\frac{\mu _{2} }{\gamma _{2} } (1-s_{2} )\right)\left((1+\mu _{1} (1-s_{1} ))(1+\mu _{2} (1-s_{2} ))-p_{11} \right)} $

\vspace{0.25in}
Thus $(U_{N_{1} } ,V_{N_{2} } )$ has the bivariate geometric form of Hawkes' bivariate exponential distribution.
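
\vspace{0.2in}
\noindent
As with the earlier theorems, this final display can be checked symbolically with sympy (sketch; names ours):

\begin{verbatim}
import sympy as sp

s1, s2, m1, m2, p00, p10 = sp.symbols('s1 s2 mu1 mu2 p00 p10',
                                      positive=True)
p01 = p10                          # hypothesis of Theorem 4.3
g1, g2 = p01 + p00, p10 + p00
p11 = 1 - p00 - p10 - p01
a, b = 1 - s1, 1 - s2
pi = 1 / ((1 + m1*a) * (1 + m2*b))
eta1 = 1 / (1 + m1*a/g1)           # eta(s1, 1) found above
eta2 = 1 / (1 + m2*b/g2)           # eta(1, s2)
eta = pi * (p00 + p10*eta1 + p01*eta2) / (1 - p11*pi)
target = (p00*(1 + m1*a/g1)*(1 + m2*b/g2) + p10*(1 + m2*b/g2)
          + p01*(1 + m1*a/g1)) / ((1 + m1*a/g1)*(1 + m2*b/g2)
          * ((1 + m1*a)*(1 + m2*b) - p11))
print(sp.simplify(eta - target))   # 0
\end{verbatim}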

\begin{thebibliography}{99}

\bibitem{block}
Block, H. W. (1977) A family of bivariate life distributions. {\it Theory and Applications of Reliability: with Emphasis on Bayesian and Nonparametric Methods}, C. P. Tsokos and I. N. Shimi, eds, Academic Press, 349-372.

\bibitem{Downton}
Downton, F. (1970) Bivariate exponential distributions in reliability theory. {\it Journal of the Royal Statistical Society}, B, {\bf 32}, 408-417.

\bibitem{Hawkes}
Hawkes, A. G. (1972) A bivariate exponential distribution with applications to reliability. {\it Journal of the Royal Statistical Society}, B, {\bf 34}, 129-131.

\bibitem{Marshall}
Marshall, A. W. and Olkin, I. (1967) A multivariate exponential distribution. {\it Journal of the American Statistical Association}, {\bf 62}, 30-44.

\end{thebibliography}


\end{document}
