
## Computational & Applied Mathematics

*Print version* ISSN 2238-3603 · *On-line version* ISSN 1807-0302

### Comput. Appl. Math. vol.28 no.1 São Carlos 2009

#### http://dx.doi.org/10.1590/S0101-82052009000100004

**Unitary invariant and residual independent matrix distributions**

**Arjun K. Gupta^{I}; Daya K. Nagar^{II,}^{*}; Astrid M. Vélez-Carvajal^{II}**

^{I}Department of Mathematics and Statistics, Bowling Green State University, Bowling Green, Ohio 43403-0221, USA, E-mail: gupta@bgnet.bgsu.edu

^{II}Departamento de Matemáticas, Universidad de Antioquia, Calle 67, No. 53-108, Medellín, Colombia. E-mail: dayaknagar@yahoo.com

**ABSTRACT**

Define *Z*_{13} = *A*^{1/2}*Y*(*A*^{1/2})^{H} (*A* and *Y* are independent) and *Z*_{15} = *B*^{1/2}*Y*(*B*^{1/2})^{H} (*B* and *Y* are independent), where *Y*, *A* and *B* follow inverted complex Wishart, complex beta type I and complex beta type II distributions, respectively. In this article several properties, including expected values of scalar and matrix valued functions of *Z*_{13} and *Z*_{15}, are derived.

**Mathematical subject classification:** Primary: 62E15; Secondary: 62H99.

**Key words:** beta distribution, inverted complex Wishart, complex random matrix, Gauss hypergeometric function, residual independent, unitary invariant, zonal polynomial.

**1 Introduction**

This paper deals with complex random quadratic forms involving complex Wishart, inverted complex Wishart, complex beta type I and complex beta type II matrices. To define these distributions we need some notation from complex matrix algebra. Let *A* = (*a_{ij}*) be an *m* × *m* matrix of complex numbers. Then, *A*^{H} denotes the conjugate transpose of *A*; tr(*A*) = *a*_{11} + ··· + *a_{mm}*; etr(*A*) = exp(tr(*A*)); det(*A*) is the determinant of *A*; *A* = *A*^{H} > 0 means that *A* is Hermitian positive definite; 0 < *A* = *A*^{H} < *I_{m}* means that *A* and *I_{m}* - *A* are Hermitian positive definite; and *A*^{1/2} denotes a square root of *A* = *A*^{H} > 0.

Now, we define the complex Wishart, inverted complex Wishart, complex matrix variate beta type I, and complex matrix variate beta type II distributions.

**Definition 1.1.** *The m × m random Hermitian positive definite matrix X is said to have a complex Wishart distribution with parameters m, ν (> m) and* Σ = Σ^{H} > 0, *written as X ~ W_{m}(ν,* Σ*), if its p.d.f. is given by*

{Γ_{m}(ν) det(Σ)^{ν}}^{-1} det(*X*)^{ν-m} etr(-Σ^{-1}*X*), *X* = *X*^{H} > 0,

*where* Γ_{m}(*a*) *is the complex multivariate gamma function defined by*

Γ_{m}(*a*) = π^{m(m-1)/2} ∏_{j=1}^{m} Γ(*a* - *j* + 1), Re(*a*) > *m* - 1.
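The complex Wishart distribution admits a simple sampling construction: if the columns of an *m* × ν matrix *Z* are i.i.d. complex normal with covariance Σ, then *ZZ*^{H} ~ *W_{m}*(ν, Σ) and *E*(*ZZ*^{H}) = νΣ. The following is a minimal numerical sketch under those assumptions (illustrative code, not part of the paper):

```python
import numpy as np

def complex_wishart(nu, sigma, rng):
    """Draw W ~ W_m(nu, Sigma) as W = Z Z^H, columns of Z i.i.d. CN(0, Sigma)."""
    m = sigma.shape[0]
    L = np.linalg.cholesky(sigma)
    # standard complex normal entries: real and imaginary parts ~ N(0, 1/2)
    Z = (rng.standard_normal((m, nu)) + 1j * rng.standard_normal((m, nu))) / np.sqrt(2)
    Z = L @ Z
    return Z @ Z.conj().T

rng = np.random.default_rng(0)
m, nu = 3, 8
W = complex_wishart(nu, np.eye(m), rng)
assert np.allclose(W, W.conj().T)            # Hermitian
assert np.all(np.linalg.eigvalsh(W) > 0)     # positive definite (nu > m)
# Monte Carlo check of E(W) = nu * Sigma
S = sum(complex_wishart(nu, np.eye(m), rng) for _ in range(4000)) / 4000
assert np.allclose(S, nu * np.eye(m), atol=0.5)
```

The Hermitian and positive definite checks are exact up to floating point; the moment check uses a loose tolerance relative to the Monte Carlo error.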

**Definition 1.2.** *The m × m random Hermitian positive definite matrix Y is said to be distributed as inverted complex Wishart with parameters m, µ (> m) and* Ψ = Ψ^{H} > 0, *denoted by Y ~ IW_{m}(µ,* Ψ*), if its p.d.f. is given by*

{Γ_{m}(µ)}^{-1} det(Ψ)^{µ} det(*Y*)^{-(µ+m)} etr(-*Y*^{-1}Ψ), *Y* = *Y*^{H} > 0.

Note that if *Y* ~ *IW_{m}*(µ, Ψ), then *Y*^{-1} ~ *W_{m}*(µ, Ψ^{-1}). The inverted complex Wishart distribution was first derived by Tan [17] as the posterior distribution of Σ in a complex multivariate regression model. Later, Shaman [16] studied some of its properties and applied it to spectral estimation. For *m* = 1, the inverted complex Wishart density reduces to an inverted gamma density given by

{Γ(µ)}^{-1} ψ^{µ} *y*^{-(µ+1)} exp(-ψ*y*^{-1}), *y* > 0, µ > 0, ψ > 0.

This distribution is designated by *y* ~ *IG*(µ, ψ).
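For *m* = 1 the reduction can be checked numerically: if *g* ~ *G*(µ), then *y* = ψ/*g* has the inverted gamma density above, and *E*(*y*) = ψ/(µ - 1) for µ > 1. The following is only an illustrative sketch, not part of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, psi = 5.0, 2.0
# If g ~ Gamma(mu, 1), then y = psi / g follows IG(mu, psi).
y = psi / rng.gamma(mu, size=200_000)
# E(y) = psi / (mu - 1) for mu > 1
assert abs(y.mean() - psi / (mu - 1)) < 0.01
```

The tolerance is loose relative to the Monte Carlo standard error, so the check is stable across seeds.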

**Definition 1.3.** *The m × m random Hermitian positive definite matrix X is said to have a complex matrix variate beta type I distribution, denoted as X ~ (a, b), if its p.d.f. is given by*

{*B_{m}*(*a, b*)}^{-1} det(*X*)^{a-m} det(*I_{m}* - *X*)^{b-m}, 0 < *X* = *X*^{H} < *I_{m}*,

*where a > m - 1, b > m - 1 and B_{m}(a, b) is the complex multivariate beta function defined by*

*B_{m}*(*a, b*) = Γ_{m}(*a*)Γ_{m}(*b*)/Γ_{m}(*a* + *b*).

**Definition 1.4.** *The m × m random Hermitian positive definite matrix Y is said to have a complex matrix variate beta type II distribution, denoted as Y ~ (a, b), if its p.d.f. is given by*

{*B_{m}*(*a, b*)}^{-1} det(*Y*)^{a-m} det(*I_{m}* + *Y*)^{-(a+b)}, *Y* = *Y*^{H} > 0,

*where a > m - 1 and b > m - 1.*

The complex matrix variate beta distributions arise in various problems in multivariate statistical analysis. Several test statistics in complex multivariate analysis of variance and covariance are functions of complex beta matrices. These distributions can be derived using complex Wishart matrices (Tan [18]). The relationship between beta type I and type II matrices can be stated as follows. If *X* ~ (*a, b*), then (*I_{m}* - *X*)^{-1/2}*X*(*I_{m}* - *X*)^{-1/2} ~ (*a, b*). Further, if *Y* ~ (*a, b*), then *Y*^{-1} ~ (*b, a*) and (*I_{m}* + *Y*)^{-1/2}*Y*(*I_{m}* + *Y*)^{-1/2} ~ (*a, b*).
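In the scalar case (*m* = 1) these relations reduce to familiar facts about beta and beta prime variables, which can be spot-checked by simulation. This sketch is our illustration; the matrix statement itself is not tested here:

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = 4.0, 6.0
x = rng.beta(a, b, size=200_000)           # scalar analogue of X ~ beta type I
u = x / (1.0 - x)                           # scalar analogue of (I - X)^{-1/2} X (I - X)^{-1/2}
# u follows a beta type II (beta prime) law with mean a / (b - 1)
assert abs(u.mean() - a / (b - 1)) < 0.02
# inverting a type II variable swaps the parameters: E(1/u) = b / (a - 1)
assert abs((1.0 / u).mean() - b / (a - 1)) < 0.02
```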

It is well known in the statistical literature that the complex matrix variate beta type I and type II, complex Wishart (Σ = *I_{m}*), and inverted complex Wishart (Ψ = *I_{m}*) distributions are unitary invariant and residually independent (Khatri [9], Tan [18], Nagar, Bedoya and Arias [14], Bedoya, Nagar and Gupta [1], Goodman [4], Shaman [16]) and therefore belong to the class of unitary invariant and residual independent matrix (UNIARIM) distributions, _{m}, defined as follows (Khatri, Khattree and Gupta [12]).

**Definition 1.5.** *The m × m random Hermitian positive definite matrix X is said to have an UNIARIM distribution if*

(i) *for any m × m unitary matrix U, the distributions of X and UXU^{H} are identical, and*

(ii) *for any lower triangular factorization X = TT^{H}, T = (T_{ij}), the diagonal blocks T_{ii} (m_{i} × m_{i}), i = 1, ..., k, are independent, for any partition {m_{1}, m_{2}, ..., m_{k}} of m.*
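Condition (i) can be probed numerically: conjugation by a unitary *U* preserves Hermitian positive definiteness and the spectrum, which is a necessary ingredient for the distributional invariance. A quick sketch (our construction; the QR-based Haar unitary is standard but incidental):

```python
import numpy as np

rng = np.random.default_rng(3)
m = 4
# a random Hermitian positive definite X
G = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
X = G @ G.conj().T + m * np.eye(m)
# a Haar-distributed unitary U from a QR decomposition with phase correction
Q, R = np.linalg.qr(rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m)))
U = Q * (np.diag(R) / np.abs(np.diag(R)))
assert np.allclose(U.conj().T @ U, np.eye(m))                      # U is unitary
Y = U @ X @ U.conj().T
assert np.allclose(np.linalg.eigvalsh(Y), np.linalg.eigvalsh(X))   # same spectrum
assert np.allclose(Y, Y.conj().T) and np.all(np.linalg.eigvalsh(Y) > 0)
```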

When *X* has UNIARIM distribution we will write *X* ∈ _{m}. Next, we state several properties of UNIARIM distributions.

**Theorem 1.6. ***Let X *∈ _{m}. Partition X as X = (X_{ij}), X_{11}* (q × q). Then X*_{11}* and X*_{22·1}* = X*_{22}* - X*_{21}*X*_{11}^{-1}*X*_{12}* are independent, X*_{11} ∈ _{q}, and X_{22·1} ∈* _{m-q}.*

**Theorem 1.7.** *Let X *∈* _{m} and Y *∈ _{m} *be independent. Then, for any square root T of Y, the distribution of Z = TXT^{H} belongs to* _{m}. *Further, if T*_{1} *and T*_{2} *are two different square roots of Y, then T*_{1}*XT*_{1}^{H} *and T*_{2}*XT*_{2}^{H} *have identical distributions.*

From the above theorem it follows that if *U* = (*U*_{1}, *U*_{2}), *U_{i}* (*m* × *m_{i}*), *i* = 1, 2, *m*_{1} + *m*_{2} = *m*, is a random unitary matrix independent of *Z* ∈ _{m}, then *U*_{1}^{H}*ZU*_{1} ∈ _{m_{1}} and *U*_{2}^{H}*ZU*_{2} ∈ _{m_{2}} are independent. Further, for **c** ∈ ℂ^{m}, **c** ≠ **0**, (i) **c**^{H}*Z***c**/**c**^{H}**c** has the same distribution as *z*_{11}, where *Z* = (*z_{ij}*), and (ii) **c**^{H}**c**/**c**^{H}*Z*^{-1}**c** has the same distribution as 1/*z*^{11}, where *Z*^{-1} = (*z^{ij}*). Furthermore, if *E*(*Z*), *E*(*Z*^{-1}), and *E*(*Z*^{α}), *α* an integer, exist, then *E*(*Z*) = *aI_{m}*, *E*(*Z*^{-1}) = *bI_{m}* and *E*(*Z*^{α}) = _{α}*I_{m}*, where *a* = *E*(*x*_{11}*y*_{11}), *b* = *E*(*x*^{11})*E*(*y*^{11}), and the constant _{α} depends on moments of order less than or equal to *α* of *X* and *Y*.
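The identity *E*(*Z*) = *aI_{m}* with *a* = *E*(*x*_{11}*y*_{11}) can be illustrated by Monte Carlo in the simplest case where *X* and *Y* are both complex Wishart with identity scale, so that independence gives *a* = νµ. This is only an illustrative sketch under those assumptions, not the paper's derivation:

```python
import numpy as np

rng = np.random.default_rng(4)
m, nu, mu, N = 2, 4, 4, 20_000

def cwishart(n):
    # W = Z Z^H with i.i.d. standard complex normal entries of Z
    Z = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
    return Z @ Z.conj().T

acc = np.zeros((m, m), dtype=complex)
for _ in range(N):
    T = np.linalg.cholesky(cwishart(mu))   # square root of Y ~ W_m(mu, I_m)
    acc += T @ cwishart(nu) @ T.conj().T   # Z = T X T^H with X ~ W_m(nu, I_m)
EZ = acc / N
# a = E(x_11 y_11) = E(x_11) E(y_11) = nu * mu by independence
assert np.allclose(EZ, nu * mu * np.eye(m), atol=1.0)
```

The off-diagonal entries average to zero and the diagonal entries to νµ, confirming that *E*(*Z*) is a scalar multiple of the identity.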

Let *Z*^{[i]} = (*z*_{αβ}), 1 ≤ *α*, *β* ≤ *i*, *i* ≤ *m*, be a submatrix of *Z* = (*z*_{αβ}), 1 ≤ *α*, *β* ≤ *m*, and let *Y* = *TT*^{H}, *X* = *RR*^{H} be lower triangular factorizations. Then the ratios det(*Z*^{[i]})/det(*Z*^{[i-1]}), *i* = 1, ..., *m*, where det(*Z*^{[0]}) = 1, are independent and *E*[det(*Z*)^{α}] is the product of their *α*-th moments, provided the expectations involved exist.

In Section 2, we give several UNIARIM distributions by using Theorem 1.7. Section 3 gives a number of results on the random quadratic forms *A*^{1/2}*Y*(*A*^{1/2})^{H} (*A* and *Y* are independent) and *B*^{1/2}*Y*(*B*^{1/2})^{H} (*B* and *Y* are independent), where *Y* ~ *IW_{m}*(µ, *I_{m}*), *A* ~ (*a, b*) and *B* ~ (*c, d*), by exploiting the fact that they too belong to the class of UNIARIM distributions and using properties that are available for UNIARIM distributions. Finally, in the Appendix, we give certain known results on complex matrix variate beta type I and type II, complex Wishart and inverted complex Wishart distributions, confluent hypergeometric functions and zonal polynomials of Hermitian matrix argument.

**2 Generating UNIARIM Distributions**

In the previous section it is stated that if *X* ∈ _{m} and *Y* ∈ _{m} are independent, then for any square root *T* of *Y*, the distribution of *Z* = *TXT*^{H} belongs to _{m}. In this section we exploit this property to generate a number of UNIARIM distributions.

Let *A_{i}* ~ (*a_{i}, b_{i}*), *B_{i}* ~ (*c_{i}, d_{i}*), *i* = 1, 2, *A* ~ (*a, b*), *B* ~ (*c, d*) and define

Then, from Theorem 1.7, it follows that *Z_{i}* ∈ _{m}, *i* = 1, 2, 3, 4. From Cui, Gupta and Nagar [3], the p.d.f. of *Z*_{1} is available as

where _{2}*F*_{1} is the Gauss hypergeometric function of Hermitian matrix argument (James [8]). The density of *Z*_{2} and *Z*_{3} can be shown to be (Gupta, Nagar and Vélez-Carvajal [7]),

and

respectively. Note that the distribution of *Z*_{4} is the same as that of *Z*_{3}.

Next, let *X_{i}* ~ *W_{m}*(ν_{i}, *I_{m}*), *i* = 1, 2, *Y_{i}* ~ *IW_{m}*(µ_{i}, *I_{m}*), *i* = 1, 2, *X* ~ *W_{m}*(ν, *I_{m}*), and *Y* ~ *IW_{m}*(µ, *I_{m}*), where *X*_{1} and *X*_{2} are independent, *Y*_{1} and *Y*_{2} are independent, and *X* and *Y* are independent. Let

and

*Z*_{8} = *Y*^{1/2}*X*(*Y*^{1/2})^{H}.

Then, the p.d.f. of *Z*_{5} is (Gupta and Nagar [6]),

where _{δ}(·) is Herz's type II Bessel function of Hermitian matrix argument. Since *Y*_{1}^{-1} ~ *W_{m}*(µ_{1}, *I_{m}*) and *Y*_{2}^{-1} ~ *W_{m}*(µ_{2}, *I_{m}*), the p.d.f. of *Z*_{6}, obtained from the p.d.f. of *Z*_{5}, is given as

Note that *Z*_{7} ~ (*µ*, *ν*) and *Z*_{8} ~ (*µ*, *ν*), which follows from Tan [18] and Cui, Gupta and Nagar [3].

Further, define the following complex random matrices which again belong to the class _{m}:

*Z*_{9} = *A*^{1/2} *X*(*A*^{1/2})^{H},

*Z*_{10} = *X*^{1/2} *A*(*X*^{1/2})* ^{H}*,

*Z*_{11} = *B*^{1/2} *X*(*B*^{1/2})^{H},

*Z*_{12} = *X*^{1/2} *B*(*X*^{1/2})* ^{H}*,

*Z*_{13} = *A*^{1/2} *Y*(*A*^{1/2})^{H},

*Z*_{14} = *Y*^{1/2} *A*(*Y*^{1/2})* ^{H}*,

*Z*_{15} = *B*^{1/2} *Y*(*B*^{1/2})^{H},

and

*Z*_{16} = *Y*^{1/2} *B*(*Y*^{1/2})^{H}.

The random matrices *Z*_{9} and *Z*_{10} have the same density given by

where _{1}*F*_{1} is the confluent hypergeometric function of Hermitian matrix argument. The random matrices *Z*_{11} and *Z*_{12} have the same density derived as

Similarly, the random matrices *Z*_{13} and *Z*_{14} have the same density derived in Theorem 3.1 and the random matrices *Z*_{15} and *Z*_{16} have the same density obtained in Theorem 3.3.

**3 Properties of UNIARIM Distributions**

In the previous sections we have discussed a number of properties of UNIARIM distributions and generated them by noting that if *X* ∈ _{m} and *Y* ∈ _{m} are independent, then for any square root *T* of *Y*, the distribution of *Z* = *TXT*^{H} belongs to _{m}.

Several properties, distributional results, expected values, etc. of random matrices defined in Section 2 are available in the literature. The distributional results and properties of *Z*_{1}, *Z*_{2}, *Z*_{3} and *Z*_{4} were studied by Gupta, Nagar and Vélez-Carvajal [7]. Results related to *Z*_{5} and *Z*_{6} were derived by Gupta and Nagar [6] and properties of *Z*_{9}, *Z*_{10}, *Z*_{11} and *Z*_{12} were obtained by Khatri, Khattree and Gupta [12].

In this section we derive distributions of *Z*_{13} and *Z*_{15} and study their properties. First we obtain the densities of *Z*_{13} and *Z*_{15}.

**Theorem 3.1.** *Let A and Y be independent, A ~ (a, b) and Y ~ IW_{m}(µ, I_{m}). Then, the density of Z*_{13} = *A*^{1/2}*Y*(*A*^{1/2})^{H} *is derived as*

*where* _{1}*F*_{1} *is the confluent hypergeometric function of Hermitian matrix argument.*

**Proof. ** The joint density of *A* and *Y* is given by

where 0 < *A* = *A*^{H} < *I_{m}* and *Y* = *Y*^{H} > 0. Making the transformation *Z*_{13} = *A*^{1/2}*Y*(*A*^{1/2})^{H} with the Jacobian *J*(*A, Y* → *A, Z*_{13}) = det(*A*)^{-m} and integrating out *A*, we obtain

where *Z*_{13} = *Z*_{13}^{H} > 0. Now, integration using (A.1) yields the result.

The distribution of *Z*_{13} is designated by (*µ*, *a, b*).

**Corollary 3.2.** *If x and y are mutually independent, x ~ B^{I}(a, b) and y ~ IG(µ, 1), then xy ~ (µ, a, b). Further, the p.d.f. of z*_{13} = *xy is given by*

*where* _{1}*F*_{1} *is the confluent hypergeometric function of scalar argument (Luke* [13]*).*
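A quick scalar sanity check of Corollary 3.2 by simulation: independence gives *E*(*xy*) = *E*(*x*)*E*(*y*) = [*a*/(*a* + *b*)][1/(µ - 1)]. The code below is an illustration, not part of the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
a, b, mu = 3.0, 4.0, 6.0
N = 200_000
x = rng.beta(a, b, size=N)         # x ~ B^I(a, b)
y = 1.0 / rng.gamma(mu, size=N)    # y ~ IG(mu, 1)
z = x * y
# E(z) = E(x) E(y) = [a / (a + b)] * [1 / (mu - 1)]
assert abs(z.mean() - (a / (a + b)) / (mu - 1)) < 0.005
```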

**Theorem 3.3.** *Let B and Y be independent, B ~ (c, d) and Y ~ IW_{m}(µ, I_{m}). Then, the density of Z*_{15} = *B*^{1/2}*Y*(*B*^{1/2})^{H} *is derived as*

**Proof. ** The joint density of *B* and *Y* is given by

where *B* = *B*^{H} > 0 and *Y* = *Y*^{H} > 0. Transforming *Z*_{15} = *B*^{1/2}*Y*(*B*^{1/2})^{H} with the Jacobian *J*(*B, Y* → *B, Z*_{15}) = det(*B*)^{-m} and integrating out *B*, we obtain

where *Z*_{15} = *Z*_{15}^{H} > 0. The desired result is obtained by using (A.2).

The above distribution will be denoted by (*µ*, *c, d*).

**Corollary 3.4.** *If x and y are independent, x ~ B^{II}(c, d) and y ~ IG(µ, 1), then xy ~ (µ, c, d). Further, the p.d.f. of z*_{15} = *xy is given by*

*where ψ is the confluent hypergeometric function of scalar argument (Luke* [13]*).*

Our next two results are of importance in deriving marginal distributions of certain submatrices of *Z*_{13} and *Z*_{15}.

**Theorem 3.5.** *Let A and Y be independent, A ~ (a, b) and Y ~ IW_{m_{1}+m_{2}}(µ, I_{m_{1}+m_{2}}). Then,*

*are independent. Further,*

*are independent, where A*_{11·2}, *Y*_{11·2}, *A*_{22·1} *and Y*_{22·1} *are Schur complements of A*_{22}, *Y*_{22}, *A*_{11} *and Y*_{11}, *respectively*.

**Proof.** From Theorem A.8 and Theorem A.9, *A*_{11}, *Y*_{11}, *A*_{22·1} and *Y*_{22·1} are independent, *A*_{11} ~ (*a, b*), *Y*_{11} ~ *IW*(µ - *m*_{2}, *I*_{m_{1}}), *A*_{22·1} ~ (*a* - *m*_{1}, *b*) and *Y*_{22·1} ~ *IW*(µ, *I*_{m_{2}}). Now, application of Theorem 3.1 yields the desired result. The proof of the second part is similar.

**Theorem 3.6.** *Let B and Y be independent, Y ~ IW*(µ, *I_{m}*) *and B ~* (*c, d*). *Then,*

*are independent. Further,*

*are independent, where Y*_{11·2}, *B*_{11·2}, *Y*_{22·1} *and B*_{22·1} *are Schur complements of Y*_{22}, *B*_{22}, *Y*_{11} *and B*_{11}, *respectively.*

**Proof.** From Theorem A.8 and Theorem A.10, *Y*_{11}, *B*_{11}, *Y*_{22·1} and *B*_{22·1} are independent, *Y*_{11} ~ *IW*(µ - *m*_{2}, *I*_{m_{1}}), *B*_{11} ~ (*c, d* - *m*_{2}), *Y*_{22·1} ~ *IW*(µ, *I*_{m_{2}}) and *B*_{22·1} ~ (*c* - *m*_{1}, *d*). Now, application of Theorem 3.3 yields the desired result. The proof of the second part is similar.

*3.1 Properties of Z*_{13}

In this section some properties of the random matrix *Z*_{13} are derived using the fact that *Z*_{13} belongs to the class of UNIARIM distributions.

(i) Let *Z*_{13} = (*Z*_{13ij}), *Z*_{1311} (*m*_{1} × *m*_{1}), *m*_{1} + *m*_{2} = *m*. Then, using Theorem 1.6, Theorem 1.7 and Theorem 3.5, *Z*_{1311} and its Schur complement *Z*_{1322·1} are independent, *Z*_{1311} ~ (µ - *m*_{2}, *a, b*) and *Z*_{1322·1} ~ (µ, *a* - *m*_{1}, *b*). Further, *Z*_{1322} and its Schur complement *Z*_{1311·2} are independent, *Z*_{1322} ~ (µ - *m*_{1}, *a, b*) and *Z*_{1311·2} ~ (µ, *a* - *m*_{2}, *b*).

(ii) For a *q* × *m* complex non-random matrix *C* of rank *q* (< *m*),

(*CC*^{H})^{-1/2}*CZ*_{13}*C*^{H}(*CC*^{H})^{-1/2} ~ (µ - *m*_{2}, *a, b*)

and

(iii) For **c** ∈ ℂ^{m}, **c** ≠ **0**,

and

Note that the distributions of these forms do not depend on **c**. Thus, if **y** (*m* × 1) is a complex random vector independent of *Z*_{13}, and *P*(**y** ≠ **0**) = 1, then it follows that

and

(iv) Let *Z*_{13} = (*z*_{13ij}) and *Z*_{13}^{-1} = (*z*_{13}^{ij}). Then, *z*_{13ii} ~ (µ - *m* + 1, *a, b*), *i* = 1, ..., *m*, and 1/*z*_{13}^{ii} ~ (µ, *a* - *m* + 1, *b*), *i* = 1, ..., *m*.

(v) Let *Z*_{13}^{[i]} = (*z*_{13jk}), 1 ≤ *j*, *k* ≤ *i*. Define

Then, the random variables *v*_{1}, ..., *v_{m}* are mutually independent and, using Theorem A.11, Theorem A.7 and Corollary 3.2, *v_{i}* ~ (µ - *i* + 1, *a, b*), *i* = 1, ..., *m*. Further, the distribution of det(*Z*_{13}) is the same as that of ∏_{i=1}^{m} *v_{i}*.

*3.2 Moments of functions of Z*_{13}

In this section we derive several expected values of scalar and complex matrix valued functions of the complex random matrix *Z*_{13}.

Using the representation *Z*_{13} = *A*^{1/2} *Y*(*A*^{ 1/2})^{H} and (A.10)-(A.17), one obtains

Further, using (A.4), (A.5) and the above expressions, the expected values of (tr *Z*_{13})^{2} and (tr *Z*_{13}^{-1})^{2} are evaluated as

and

Now, applying the results [*n*]_{(2)} = *n*(*n* + 1), [*n*]_{(1,1)} = *n*(*n* - 1), [-*n* + *m*]_{(2)} = (*n* - *m*)(*n* - *m* - 1), [-*n* + *m*]_{(1,1)} = (*n* - *m*)(*n* - *m* + 1), _{(2)}(*I_{m}*) = *m*(*m* + 1)/2 and _{(1,1)}(*I_{m}*) = *m*(*m* - 1)/2 in the above expressions and simplifying, we obtain

and

Similarly, using (A.6)-(A.8), the expected values of (tr *Z*_{13})^{3} and (tr *Z*_{13}^{-1})^{3} are obtained as

and

respectively. Now, using the results [*n*]_{(3)} = *n*(*n* + 1)(*n* + 2), [*n*]_{(2,1)} = *n*(*n* + 1)(*n* - 1), [*n*]_{(1,1,1)} = *n*(*n* - 1)(*n* - 2), [-*n* + *m*]_{(3)} = -(*n* - *m*)(*n* - *m* - 1)(*n* - *m* - 2), [-*n* + *m*]_{(2,1)} = -(*n* - *m*)(*n* - *m* - 1)(*n* - *m* + 1), [-*n* + *m*]_{(1,1,1)} = -(*n* - *m*)(*n* - *m* + 1)(*n* - *m* + 2), _{(3)}(*I_{m}*) = *m*(*m* + 1)(*m* + 2)/6, _{(2,1)}(*I_{m}*) = 2*m*(*m*^{2} - 1)/3 and _{(1,1,1)}(*I_{m}*) = *m*(*m* - 1)(*m* - 2)/6 in the above expressions and simplifying, we obtain

where µ > *m* + 1, and

where *a* > *m* + 2. Similarly, the expected values of tr(*Z*_{13}) tr(*Z*_{13}^{2}) and tr(*Z*_{13}^{-1}) tr(*Z*_{13}^{-2}) are obtained as

and

respectively. Since *E*(*Z*_{13}^{α}) = _{α}*I_{m}*, we have *E*[tr(*Z*_{13}^{α})] = _{α}*m*. Thus, the coefficient of *m* in *E*[tr(*Z*_{13}^{α})] is _{α}. Therefore, evaluating *E*[tr(*Z*_{13}^{2})], *E*[tr(*Z*_{13}^{3})], *E*[tr(*Z*_{13}^{-2})] and *E*[tr(*Z*_{13}^{-3})] using the technique described above, and computing the coefficients of *m* in the resulting expressions, one obtains

and

*3.3 Properties of Z*_{15}

In this section we give several properties of *Z*_{15}.

(i) Let *Z*_{15} = (*Z*_{15ij}), *Z*_{1511} (*m*_{1} × *m*_{1}), *m*_{1} + *m*_{2} = *m*. Then, using Theorem 1.6, Theorem 1.7 and Theorem 3.6, *Z*_{1511} and *Z*_{1522·1} are independent, *Z*_{1511} ~ (µ - *m*_{2}, *c, d* - *m*_{2}) and *Z*_{1522·1} ~ (µ, *c* - *m*_{1}, *d*). Further, *Z*_{1522} and *Z*_{1511·2} are independent, *Z*_{1522} ~ (µ - *m*_{1}, *c, d* - *m*_{1}) and *Z*_{1511·2} ~ (µ, *c* - *m*_{2}, *d*).

(ii) For a *q* × *m* complex non-random matrix *C* of rank *q* (< *m*),

(*CC*^{H})^{-1/2}*CZ*_{15}*C*^{H}(*CC*^{H})^{-1/2} ~ (µ - *m*_{2}, *c, d* - *m*_{2})

and

(*CC*^{H})^{-1/2}(*CZ*_{15}^{-1}*C*^{H})^{-1}(*CC*^{H})^{-1/2} ~ (µ, *c* - *m*_{1}, *d*)

(iii) If **y** (*m* × 1) is a non-random complex vector with **y** ≠ **0**, or a complex random vector independent of *Z*_{15} with *P*(**y** ≠ **0**) = 1, then it follows that

and

(iv) Let *Z*_{15} = (*z*_{15ij}) and *Z*_{15}^{-1} = (*z*_{15}^{ij}). Then, for *i* = 1, ..., *m*, *z*_{15ii} ~ (µ - *m* + 1, *c, d* - *m* + 1) and 1/*z*_{15}^{ii} ~ (µ, *c* - *m* + 1, *d*).

(v) Let *Z*_{15}^{[i]} = (*z*_{15jk}), 1 ≤ *j*, *k* ≤ *i*. Define

Then, the random variables *v*_{1}, ..., *v_{m}* are mutually independent and, using Theorem A.11, Theorem A.12 and Corollary 3.4, *v_{i}* ~ (µ - *i* + 1, *c, d* - *i* + 1), *i* = 1, ..., *m*. Further, the distribution of det(*Z*_{15}) is the same as that of ∏_{i=1}^{m} *v_{i}*.

*3.4 Moments of functions of Z*_{15}

This section deals with expected values of scalar and complex matrix valued functions of the complex random matrix *Z*_{15}.

Using the representation *Z*_{15} = *Y*^{1/2}*B*(*Y*^{1/2})^{H} ~ (µ, *c, d*), (A.10)-(A.13), (A.18), (A.19), and the technique of computing expected values given in Subsection 3.2, we obtain

and

**Acknowledgements. ** The authors would like to thank the worthy referee for his comments and suggestions which improved the presentation of this article.

**REFERENCES**

[1] E. Bedoya, D.K. Nagar and A.K. Gupta, *Moments of the complex matrix variate beta distribution.* PanAmerican Mathematical Journal, **17**(2) (2007), 21-32.

[2] Y. Chikuse, *Partial differential equations for hypergeometric functions of complex matrices and their applications.* Annals of the Institute of Statistical Mathematics, **28** (1976), 187-199.

[3] Xinping Cui, A.K. Gupta and D.K. Nagar, *Wilks' factorization of the complex matrix variate Dirichlet distributions.* Revista Matemática Complutense, **18**(2) (2005), 315-328.

[4] N.R. Goodman, *Statistical analysis based on a certain multivariate complex Gaussian distribution (an introduction).* Annals of Mathematical Statistics, **34** (1963), 152-177.

[5] A.K. Gupta and D.K. Nagar, *Matrix Variate Distributions.* Chapman & Hall/CRC, Boca Raton (2000).

[6] A.K. Gupta and D.K. Nagar, *Product of independent complex Wishart matrices.* International Journal of Statistics and System, **2**(1) (2007), 59-73.

[7] A.K. Gupta, D.K. Nagar and A.M. Vélez-Carvajal, *Quadratic forms of independent complex beta matrices.* International Journal of Applied Mathematics and Statistics, **13** (D08) (2008), 76-91.

[8] A.T. James, *Distributions of matrix variate and latent roots derived from normal samples.* Annals of Mathematical Statistics, **35** (1964), 475-501.

[9] C.G. Khatri, *Classical statistical analysis based on a certain multivariate complex Gaussian distribution.* Annals of Mathematical Statistics, **36** (1965), 98-114.

[10] C.G. Khatri, *On certain distribution problems based on positive definite quadratic functions in normal vectors.* Annals of Mathematical Statistics, **37**(2) (1966), 468-479.

[11] C.G. Khatri, *On the moments of traces of two matrices in three situations for complex multivariate normal populations.* Sankhyā, **A32** (1970), 65-80.

[12] C.G. Khatri, R. Khattree and R.D. Gupta, *On a class of orthogonal invariant and residual independent matrix distributions.* Sankhyā, **B53**(1) (1991), 1-10.

[13] Y.L. Luke, *The Special Functions and Their Approximations.* Volume I, Academic Press, New York (1969).

[14] D.K. Nagar, E. Bedoya and E.L. Arias, *Non-central complex matrix-variate beta distribution.* Advances and Applications in Statistics, **4**(3) (2004), 287-302.

[15] D.K. Nagar and A.K. Gupta, *Expectations of functions of complex Wishart matrix.* Applied Mathematics and Computation (submitted for publication).

[16] P. Shaman, *The inverted complex Wishart distribution and its application to spectral estimates.* Journal of Multivariate Analysis, **10** (1980), 51-59.

[17] W.Y. Tan, *On the complex analogue of Bayesian estimation of a multivariate regression model.* Annals of the Institute of Statistical Mathematics, **25** (1973), 135-152.

[18] W.Y. Tan, *Some distribution theory associated with complex Gaussian distribution.* Tamkang Journal, **7** (1968), 263-302.

Received: 27/V/08. Accepted: 03/XII/08.

#768/08.

* Supported by the Comité para el Desarrollo de la Investigación, Universidad de Antioquia research grant no. IN515CE.

**Appendix**

The confluent hypergeometric function _{1}*F*_{1} of Hermitian matrix argument has the integral representation (James [8] and Chikuse [2]),

valid for all Hermitian *X*, Re(*α*) > *m* - 1 and Re(*γ* - *α*) > *m* - 1. The confluent hypergeometric function ψ of the *m* × *m* Hermitian matrix *R* is defined by (Chikuse [2]),

where Re(*R*) > 0 and Re(*α*) > *m* - 1.

Let _{κ}(*X*) be a zonal polynomial of an *m* × *m* Hermitian matrix *X* corresponding to the partition κ = (*k*_{1}, ..., *k_{m}*), *k*_{1} + ··· + *k_{m}* = *k*, *k*_{1} ≥ ··· ≥ *k_{m}* ≥ 0. Then, for small values of *k*, explicit formulas for _{κ}(*X*) are available as (James [8], Khatri [11]),

Also, substituting *X* = *I_{m}* in (A.3)-(A.8), it is easy to see that _{(1)}(*I_{m}*) = *m*, _{(2)}(*I_{m}*) = *m*(*m* + 1)/2, _{(1,1)}(*I_{m}*) = *m*(*m* - 1)/2, _{(3)}(*I_{m}*) = *m*(*m* + 1)(*m* + 2)/6, _{(2,1)}(*I_{m}*) = 2*m*(*m*^{2} - 1)/3 and _{(1,1,1)}(*I_{m}*) = *m*(*m* - 1)(*m* - 2)/6. The *complex generalized hypergeometric coefficient* [*a*]_{κ} is defined as

[*a*]_{κ} = ∏_{i=1}^{m} (*a* - *i* + 1)_{k_{i}},

_{κ}with (*a*)* _{r}* =

*a*(

*a*+ 1)...(

*a*+

*r*- 1) = (

*a*)

_{r}_{-1}(

*a*+

*r*- 1) for

*r*= 1,2, ..., and (

*a*)

_{0}= 1. Further, substituting appropriately in (A.9), it is easy to observe that [

*a*]

_{(2)}=

*a*(

*a*+ 1), =

*a*(

*a*- 1), [

*a*]

_{(3)}=

*a*(

*a*+ 1)(

*a*+ 2), [

*a*]

_{(2,1)}=

*a*(

*a*+ 1)(

*a*- 1) and =

*a*(

*a*- 1)(

*a*- 2).

If *Y* ~ *IW_{m}*(µ, Ψ), *A* ~ (*a, b*) and *B* ~ (*c, d*), then (Bedoya, Nagar and Gupta [1], James [8], Khatri [10], Nagar and Gupta [15], Shaman [16]),

and

where *R* is an *m × m* Hermitian matrix.

**Theorem A.7.** *Let A ~ W_{m}(ν, I_{m}) and A = TT^{H}, where T = (t_{ij}) is a complex lower triangular matrix with t_{ii} > 0 and t_{ij} = t_{1ij} + i t_{2ij}, j < i. Then, t_{ij},* 1 ≤ *j* ≤ *i* ≤ *m, are independently distributed, t_{ii}*^{2} ~ *G(ν - i + *1*),* 1 ≤ *i* ≤ *m, and t_{ij} ~ N(*0, 1*),* 1 ≤ *j* < *i* ≤ *m.*

**Proof.** See Goodman [4].

The univariate gamma distribution denoted by *G*(*a*) is defined by the p.d.f.

{Γ (*a*)}^{-1} exp(-*x*)*x*^{a-1}, *x* > 0, *a* > 0.
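Theorem A.7 implies *E*(*t_{ii}*^{2}) = ν - *i* + 1, which is easy to verify by Monte Carlo on the Cholesky factors of simulated complex Wishart matrices. The following is an illustrative sketch only:

```python
import numpy as np

rng = np.random.default_rng(6)
m, nu, N = 3, 7, 20_000
diag2 = np.zeros(m)
for _ in range(N):
    # A = Z Z^H ~ W_m(nu, I_m) with i.i.d. standard complex normal entries of Z
    Z = (rng.standard_normal((m, nu)) + 1j * rng.standard_normal((m, nu))) / np.sqrt(2)
    T = np.linalg.cholesky(Z @ Z.conj().T)    # A = T T^H
    diag2 += np.real(np.diag(T)) ** 2
diag2 /= N
# t_ii^2 ~ G(nu - i + 1)  =>  E(t_ii^2) = nu - i + 1, i = 1, ..., m
assert np.allclose(diag2, nu - np.arange(m), atol=0.15)
```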

**Theorem A.8.** *Let Y ~ IW_{m}(µ,* Ψ*) and partition Y and* Ψ *as*

*where Y*_{11} *and* Ψ_{11} *are q × q matrices. Then, Y*_{11} *and its Schur complement Y*_{22·1} *are independent, Y*_{11} ~ *IW_{q}*(µ - *m* + *q*, Ψ_{11}) *and Y*_{22·1} ~ *IW_{m-q}*(µ, Ψ_{22·1}). *Further, Y*_{22} *and its Schur complement Y*_{11·2} *are independent, Y*_{22} ~ *IW_{m-q}*(µ - *q*, Ψ_{22}) *and Y*_{11·2} ~ *IW_{q}*(µ, Ψ_{11·2}).

**Theorem A.9.** *Let A ~ (a, b) and partition A as*

*Then,*

(i) *A*_{11} *and its Schur complement A*_{22·1} *are independent, A*_{11} ~ (*a, b*) *and A*_{22·1} ~ (*a* - *q, b*)*, and*

(ii) *A*_{22} *and its Schur complement A*_{11·2} *are independent, A*_{22} ~ (*a, b*) *and A*_{11·2} ~ (*a* - *m* + *q, b*).

**Proof. ** See Tan [18].

**Theorem A.10.** *Let B ~ (c, d) and partition B as*

*Then,* (i) *B*_{11} *and its Schur complement B*_{22·1} *are independent, B*_{11} ~ (*c, d - m + q*) *and B*_{22·1} ~ (*c - q, d*)*, and* (ii) *B*_{22} *and its Schur complement B*_{11·2} *are independent, B*_{22} ~ (*c, d - q*) *and B*_{11·2} ~ (*c - m + q, d*).

**Proof. ** See Tan [18].

**Theorem A.11.** *Let A ~ (a, b) and A = TT^{H}, where T = (t_{ij}) is a complex lower triangular matrix with positive diagonal elements. Then, t*_{11}^{2}, ..., *t_{mm}*^{2} *are independently distributed, t_{ii}*^{2} ~ *B^{I}(a - i + *1*, b), i = *1*, ..., m.*

The univariate beta type I distribution, denoted by *B^{I}*(*a, b*), is defined by the p.d.f.

{*B*(*a, b*)}^{-1} *x*^{a-1} (1 - *x*)^{b-1}, 0 < *x* < 1,

where *B*(*a, b*) = Γ(*a*)Γ(*b*)/Γ(*a* + *b*).
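The scalar beta function identity *B*(*a, b*) = Γ(*a*)Γ(*b*)/Γ(*a* + *b*) can be spot-checked directly (illustrative code; the function name is ours):

```python
from math import gamma, pi

def beta_fn(a, b):
    """B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b)."""
    return gamma(a) * gamma(b) / gamma(a + b)

assert abs(beta_fn(2, 3) - 1 / 12) < 1e-12   # integral of x (1 - x)^2 over (0, 1)
assert abs(beta_fn(0.5, 0.5) - pi) < 1e-9    # B(1/2, 1/2) = pi
```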

**Theorem A.12.** *Let B ~ (c, d) and B = TT^{H}, where T = (t_{ij}) is a complex lower triangular matrix with positive diagonal elements. Then, t*_{11}^{2}, ..., *t_{mm}*^{2} *are independently distributed, t_{ii}*^{2} ~ *B^{II}(c - i + *1*, d - m + i), i = *1*, ..., m.*

The univariate beta type II distribution, denoted by *B*^{II}(*a, b*), is defined by the p.d.f.

{*B*(*a, b*)}^{-1} *x*^{a-1} (1 + *x*)^{-(a+b)}, *x* > 0.