2006, Vol. 17, No. 2, pp. 493∼498
The UMVUE of [P(Y > X)]^k
in a Two Parameter Exponential Distribution
Joongdae Kim1)

1) Associate Professor, Department of Computer Information, Andong Sciences College, Andong, 760-300, Korea. E-mail: jdkim@andong-c.ac.kr
Abstract
We shall consider the UMVUE of $[P(Y>X)]^k$ in a two parameter exponential distribution.
Keywords: Exponential distribution, UMVUE
1. Introduction
A two parameter exponential distribution is given by
$$f(x;\mu,\sigma)=\frac{1}{\sigma}\,e^{-(x-\mu)/\sigma},\qquad x>\mu,$$
where $\sigma>0$ and $\mu\in\mathbb{R}$.
The problem of estimating the probability that a random variable X is less than an independent random variable Y arises in reliability. When X represents the random value of a stress that a device will be subjected to in service and Y represents the strength that varies from item to item in the population of devices, the reliability R, i.e. the probability that a randomly selected device functions successfully, is equal to P(Y > X). The same problem arises in statistical tolerancing, where X represents the diameter of a shaft and Y the diameter of a bearing that is to be mounted on the shaft; the probability that the bearing fits without interference is then P(Y > X). In biometry, X represents a patient's remaining years of life when treated with drug A and Y when treated with drug B; if the choice of drug is left to the patient, the person's deliberations will center on whether P(Y > X) is less than or greater than 1/2.
Woo and Lee (2001) studied the MLE and the UMVUE of the right-tail probability in a Lévy distribution, and Kim et al. (2003) studied an inference on P(Y < X) in an exponential distribution.
Here we shall find the UMVUE of $[P(Y>X)]^k$ in a two parameter exponential distribution when the scale parameters are known.
2. The UMVUE of [P(Y > X)]^k
Let $X$ and $Y$ be independent random variables with p.d.f.'s
$$f_X(x;\mu_x,\sigma_x)=\frac{1}{\sigma_x}\,e^{-(x-\mu_x)/\sigma_x},\quad x>\mu_x,\qquad\text{and}\qquad f_Y(y;\mu_y,\sigma_y)=\frac{1}{\sigma_y}\,e^{-(y-\mu_y)/\sigma_y},\quad y>\mu_y.$$
From the result of Kim et al. (2003), the reliability is given as follows:
$$R\equiv P(Y>X)=\begin{cases}\dfrac{1}{\rho+1}\,e^{\delta/\sigma_y}, & \text{if }\delta<0,\\[6pt] 1-\dfrac{\rho}{\rho+1}\,e^{-\delta/\sigma_x}, & \text{if }\delta\ge 0,\end{cases}$$
where $\rho=\sigma_x/\sigma_y$ and $\delta=\mu_y-\mu_x$.
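As a quick sanity check of this closed form, one can compare it against a Monte Carlo estimate of $P(Y>X)$. The sketch below is ours (not part of the paper); the function name `reliability` and the parameter values are illustrative assumptions.

```python
# Sketch: compare the closed-form R with a Monte Carlo estimate.
import numpy as np

def reliability(mu_x, s_x, mu_y, s_y):
    """Closed-form R = P(Y > X) for two-parameter exponentials."""
    rho, delta = s_x / s_y, mu_y - mu_x
    if delta < 0:
        return np.exp(delta / s_y) / (rho + 1.0)
    return 1.0 - (rho / (rho + 1.0)) * np.exp(-delta / s_x)

rng = np.random.default_rng(0)
mu_x, s_x, mu_y, s_y = 1.0, 2.0, 0.5, 1.5        # arbitrary illustrative values
x = mu_x + rng.exponential(s_x, 1_000_000)       # X = mu_x + Exp(scale s_x)
y = mu_y + rng.exponential(s_y, 1_000_000)       # Y = mu_y + Exp(scale s_y)
print(reliability(mu_x, s_x, mu_y, s_y), (y > x).mean())  # agree to ~3 decimals
```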
Assume independent random samples $X_1,X_2,\ldots,X_m$ and $Y_1,Y_2,\ldots,Y_n$ are drawn from $f_X(x)$ and $f_Y(y)$, respectively.
Let $X_{(1)},X_{(2)},\ldots,X_{(m)}$ and $Y_{(1)},Y_{(2)},\ldots,Y_{(n)}$ be the corresponding order statistics. Then, from Johnson et al. (1994), $X_{(1)}$ and $Y_{(1)}$ are complete sufficient statistics for $\mu_x$ and $\mu_y$, respectively, when $\sigma_x$ and $\sigma_y$ are known.
An unbiased estimator of $R=P(Y>X)$ is given by
$$Z\equiv\begin{cases}1, & \text{if }X_1<Y_1,\\ 0, & \text{elsewhere.}\end{cases}$$
Using the Lehmann–Scheffé theorem in Rohatgi (1976), the UMVUE of $R=P(Y>X)$ is
$$\begin{aligned}E(Z\mid X_{(1)},Y_{(1)})&=P(X_1<Y_1\mid X_{(1)},Y_{(1)})\\&=P(X_1-X_{(1)}<Y_1-Y_{(1)}+D\mid X_{(1)},Y_{(1)})\\&=P(X_1-X_{(1)}<Y_1-Y_{(1)}+D\mid D),\qquad D\equiv Y_{(1)}-X_{(1)},\end{aligned}$$
which is a function of $D$, $\sigma_x$, and $\sigma_y$, since the distributions of $X_1-X_{(1)}$ and $Y_1-Y_{(1)}$ do not involve $\mu_x$ and $\mu_y$, respectively. Hence an unbiased estimator of $R=P(Y>X)$ based on $D$ will be its UMVUE when $\sigma_x$ and $\sigma_y$ are known.
Theorem 1. Let
$$U\equiv\begin{cases}\dfrac{(n-1)(\sigma_x+m\sigma_y)}{mn(\sigma_x+\sigma_y)}\,e^{D/\sigma_y}, & \text{if }D<0,\\[8pt] 1-\dfrac{(m-1)(n\sigma_x+\sigma_y)}{mn(\sigma_x+\sigma_y)}\,e^{-D/\sigma_x}, & \text{if }D\ge 0.\end{cases}$$
Then $U$ is the UMVUE of $R=P(Y>X)$ when $\sigma_x$ and $\sigma_y$ are known.
Proof. From the result of Ali et al. (2004), the pdf of $D$ is given by
$$f_D(d)=\begin{cases}\dfrac{mn}{n\sigma_x+m\sigma_y}\,e^{-m(\delta-d)/\sigma_x}, & \text{if }d<\delta,\\[8pt] \dfrac{mn}{n\sigma_x+m\sigma_y}\,e^{-n(d-\delta)/\sigma_y}, & \text{if }d\ge\delta.\end{cases}$$
From the pdf of $D$ and the Lehmann–Scheffé theorem, it suffices to show that the statistic $U$ is an unbiased estimator.
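The stated density of $D$ can also be spot-checked by simulation. The sketch below is ours, with illustrative parameter values; it uses the standard fact that the minimum of $m$ i.i.d. two parameter exponentials with scale $\sigma$ is again exponential with scale $\sigma/m$.

```python
# Sketch: compare the stated density f_D with a histogram of simulated D.
import numpy as np

m, n = 5, 7
mu_x, s_x, mu_y, s_y = 0.0, 1.0, 0.4, 2.0        # illustrative values
delta = mu_y - mu_x

def f_D(d):
    c = m * n / (n * s_x + m * s_y)
    return np.where(d < delta,
                    c * np.exp(-m * (delta - d) / s_x),
                    c * np.exp(-n * (d - delta) / s_y))

rng = np.random.default_rng(1)
x1 = mu_x + rng.exponential(s_x / m, 500_000)    # X_(1) ~ Exp(mu_x, s_x/m)
y1 = mu_y + rng.exponential(s_y / n, 500_000)    # Y_(1) ~ Exp(mu_y, s_y/n)
hist, edges = np.histogram(y1 - x1, bins=80, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
print(np.abs(hist - f_D(mids)).max())            # small, up to binning noise
```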
For $\delta\ge 0$,
$$\begin{aligned}E(U)={}&\frac{(n-1)(\sigma_x+m\sigma_y)}{mn(\sigma_x+\sigma_y)}\int_{-\infty}^{0}e^{d/\sigma_y}\,\frac{mn}{n\sigma_x+m\sigma_y}\,e^{-m\delta/\sigma_x+md/\sigma_x}\,dd\\ &+\int_{0}^{\delta}\frac{mn}{n\sigma_x+m\sigma_y}\,e^{-m\delta/\sigma_x+md/\sigma_x}\,dd+\int_{\delta}^{\infty}\frac{mn}{n\sigma_x+m\sigma_y}\,e^{-nd/\sigma_y+n\delta/\sigma_y}\,dd\\ &-\frac{(m-1)(n\sigma_x+\sigma_y)}{mn(\sigma_x+\sigma_y)}\left[\frac{mn}{n\sigma_x+m\sigma_y}\int_{0}^{\delta}e^{-d/\sigma_x-m\delta/\sigma_x+md/\sigma_x}\,dd+\frac{mn}{n\sigma_x+m\sigma_y}\int_{\delta}^{\infty}e^{-d/\sigma_x-nd/\sigma_y+n\delta/\sigma_y}\,dd\right].\end{aligned}$$
By exponential integrals, we can obtain the expectation of $U$:
$$E(U)=1-\frac{\sigma_x}{\sigma_x+\sigma_y}\,e^{-\delta/\sigma_x},\qquad \delta\ge 0.$$
Similarly, for $\delta<0$,
$$E(U)=\frac{\sigma_y}{\sigma_x+\sigma_y}\,e^{\delta/\sigma_y},$$
and hence the statistic $U$ is an unbiased estimator of $R=P(Y>X)$. Therefore the statistic $U$ is the UMVUE of $R=P(Y>X)$.
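Theorem 1 lends itself to a direct numerical check. The following sketch is ours, with illustrative parameter values and a case with $\delta\ge 0$; it averages $U$ over repeated samples and compares the result with $R$.

```python
# Sketch: the sample mean of U over many replications should approach R.
import numpy as np

def U_stat(d, m, n, s_x, s_y):
    """The estimator U of Theorem 1 (sigma_x, sigma_y known)."""
    c = m * n * (s_x + s_y)
    if d < 0:
        return (n - 1) * (s_x + m * s_y) / c * np.exp(d / s_y)
    return 1.0 - (m - 1) * (n * s_x + s_y) / c * np.exp(-d / s_x)

m, n = 5, 7
mu_x, s_x, mu_y, s_y = 0.0, 1.0, 0.4, 2.0        # delta = 0.4 >= 0
rng = np.random.default_rng(2)
d = (mu_y + rng.exponential(s_y / n, 100_000)) - (mu_x + rng.exponential(s_x / m, 100_000))
R = 1 - s_x / (s_x + s_y) * np.exp(-(mu_y - mu_x) / s_x)
print(np.mean([U_stat(v, m, n, s_x, s_y) for v in d]), R)  # agree to MC error
```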
Let
$$R_k\equiv[P(Y>X)]^k=\begin{cases}\dfrac{\sigma_y^{k}}{(\sigma_x+\sigma_y)^{k}}\,e^{k\delta/\sigma_y}, & \text{if }\delta<0,\\[8pt] \displaystyle\sum_{i=0}^{k}(-1)^{i}\binom{k}{i}\left(\frac{\sigma_x}{\sigma_x+\sigma_y}\right)^{i}e^{-i\delta/\sigma_x}, & \text{if }\delta\ge 0,\end{cases}$$
where $k$ is a positive integer less than both sample sizes $m$ and $n$.
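Since the $\delta\ge 0$ branch is just the binomial expansion of $R^k$, the two displayed branches should agree with the $k$-th power of $R$. The short check below is ours, with illustrative values.

```python
# Sketch: the displayed expression for R^k equals R**k.
from math import comb, exp

def R(mu_x, s_x, mu_y, s_y):
    delta = mu_y - mu_x
    if delta < 0:
        return s_y / (s_x + s_y) * exp(delta / s_y)
    return 1 - s_x / (s_x + s_y) * exp(-delta / s_x)

def R_k(k, mu_x, s_x, mu_y, s_y):
    delta = mu_y - mu_x
    if delta < 0:
        return (s_y / (s_x + s_y)) ** k * exp(k * delta / s_y)
    return sum((-1) ** i * comb(k, i) * (s_x / (s_x + s_y)) ** i
               * exp(-i * delta / s_x) for i in range(k + 1))

for k in (1, 2, 3):                               # illustrative values
    print(R_k(k, 0.0, 1.0, 0.4, 2.0), R(0.0, 1.0, 0.4, 2.0) ** k)
```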
Now, when $\sigma_x$ and $\sigma_y$ are known, we shall consider the UMVUE of $R_k$ for $k<\min(m,n)$.
An unbiased estimator of $R_k$ is given by
$$Z_k\equiv\begin{cases}1, & \text{if }X_1<Y_1,\ X_2<Y_2,\ \ldots,\ X_k<Y_k,\\ 0, & \text{else.}\end{cases}$$
From the Lehmann–Scheffé theorem in Rohatgi (1976), the UMVUE of $R_k$ is
$$\begin{aligned}E(Z_k\mid X_{(1)},Y_{(1)})&=P(X_1<Y_1,\,X_2<Y_2,\ldots,X_k<Y_k\mid X_{(1)},Y_{(1)})\\&=P(X_1-X_{(1)}<Y_1-Y_{(1)}+D,\ X_2-X_{(1)}<Y_2-Y_{(1)}+D,\ \ldots,\ X_k-X_{(1)}<Y_k-Y_{(1)}+D\mid D),\end{aligned}$$
which will be a function of $D$, $\sigma_x$, and $\sigma_y$, since the distributions of $X_i-X_{(1)}$ and $Y_i-Y_{(1)}$ do not involve $\mu_x$ and $\mu_y$, respectively. Therefore, by the Lehmann–Scheffé theorem in Rohatgi (1976), an unbiased estimator of $R_k=[P(Y>X)]^k$ that is a function of $D$ will be the UMVUE of $R_k$ when $\sigma_x$ and $\sigma_y$ are known.
Theorem 2. Let
$$U_k\equiv\begin{cases}\dfrac{(n-k)(k\sigma_x+m\sigma_y)\,\sigma_y^{\,k-1}}{mn(\sigma_x+\sigma_y)^{k}}\,e^{kD/\sigma_y}, & \text{if }D<0,\\[8pt] \displaystyle\sum_{i=0}^{k}(-1)^{i}\binom{k}{i}\frac{(m-i)(n\sigma_x+i\sigma_y)\,\sigma_x^{\,i-1}}{mn(\sigma_x+\sigma_y)^{i}}\,e^{-iD/\sigma_x}, & \text{if }D\ge 0.\end{cases}$$
If $k$ is a positive integer less than both sample sizes $m$ and $n$, and $\sigma_x$ and $\sigma_y$ are known, then $U_k$ is an unbiased estimator, and hence the UMVUE, of $R_k=[P(Y>X)]^k$.
Proof. For $\delta<0$,
$$\begin{aligned}E(U_k)={}&\frac{(n-k)(k\sigma_x+m\sigma_y)\,\sigma_y^{\,k-1}}{mn(\sigma_x+\sigma_y)^{k}}\left[\int_{-\infty}^{\delta}\frac{mn}{n\sigma_x+m\sigma_y}\,e^{-m\delta/\sigma_x+d(k/\sigma_y+m/\sigma_x)}\,dd+\int_{\delta}^{0}\frac{mn}{n\sigma_x+m\sigma_y}\,e^{n\delta/\sigma_y-d(n/\sigma_y-k/\sigma_y)}\,dd\right]\\ &+\sum_{i=0}^{k}(-1)^{i}\binom{k}{i}\frac{(m-i)(n\sigma_x+i\sigma_y)\,\sigma_x^{\,i-1}}{mn(\sigma_x+\sigma_y)^{i}}\int_{0}^{\infty}\frac{mn}{n\sigma_x+m\sigma_y}\,e^{n\delta/\sigma_y-(n/\sigma_y+i/\sigma_x)d}\,dd.\end{aligned}$$
By exponential integrals,
$$E(U_k)=[P(Y>X)]^{k}+B\cdot\frac{\sigma_y\,e^{n\delta/\sigma_y}}{n\sigma_x+m\sigma_y},$$
where
$$B\equiv\frac{1}{(\sigma_x+\sigma_y)^{k}}\left[\sum_{i=0}^{k}(-1)^{i}\binom{k}{i}(m-i)\,\sigma_x^{i}(\sigma_x+\sigma_y)^{k-i}-\sigma_y^{\,k-1}(k\sigma_x+m\sigma_y)\right].$$
From the definition of $B$, the coefficient of $\sigma_y^{k}$ is zero, the coefficient of $\sigma_x\sigma_y^{\,k-1}$ is
$$\binom{k}{0}m\binom{k}{k-1}-\binom{k}{1}(m-1)\binom{k-1}{k-1}-k=0,$$
and the coefficient of $\sigma_x^{\,k-t}\sigma_y^{\,t}$ $(t<k-1)$ is given by
$$\sum_{i=0}^{k-t}(-1)^{i}\binom{k}{i}\binom{k-i}{t}(m-i)=-\frac{k!}{t!\,(k-t-1)!}\sum_{j=0}^{k-t-1}(-1)^{j}\binom{k-t-1}{j}=0.$$
Therefore, $E(U_k)=[P(Y>X)]^{k}$ when $\delta<0$.
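The coefficient argument above amounts to the polynomial identity $B=0$; for a fixed small $k$ it can be verified symbolically, as in the following sketch of ours using SymPy ($m$ and the scale parameters left symbolic).

```python
# Sketch: verify B = 0 symbolically for k = 3.
import sympy as sp

k = 3
m = sp.Symbol('m')
sx, sy = sp.symbols('sigma_x sigma_y', positive=True)
B = (sum((-1) ** i * sp.binomial(k, i) * (m - i)
         * sx ** i * (sx + sy) ** (k - i) for i in range(k + 1))
     - sy ** (k - 1) * (k * sx + m * sy)) / (sx + sy) ** k
print(sp.simplify(B))   # prints 0
```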
For $\delta>0$,
$$\begin{aligned}E(U_k)={}&\frac{(n-k)(k\sigma_x+m\sigma_y)\,\sigma_y^{\,k-1}}{mn(\sigma_x+\sigma_y)^{k}}\int_{-\infty}^{0}\frac{mn}{n\sigma_x+m\sigma_y}\,e^{-m\delta/\sigma_x+(k/\sigma_y+m/\sigma_x)d}\,dd\\ &+\sum_{i=0}^{k}(-1)^{i}\binom{k}{i}\frac{(m-i)(n\sigma_x+i\sigma_y)\,\sigma_x^{\,i-1}}{mn(\sigma_x+\sigma_y)^{i}}\left[\int_{0}^{\delta}\frac{mn}{n\sigma_x+m\sigma_y}\,e^{-m\delta/\sigma_x+(m-i)d/\sigma_x}\,dd+\frac{mn}{n\sigma_x+m\sigma_y}\int_{\delta}^{\infty}e^{n\delta/\sigma_y-(n/\sigma_y+i/\sigma_x)d}\,dd\right].\end{aligned}$$
By exponential integrals,
$$E(U_k)=[P(Y>X)]^{k}+B\cdot\frac{\sigma_x\,e^{-m\delta/\sigma_x}}{(n\sigma_x+m\sigma_y)(\sigma_x+\sigma_y)^{k}},$$
where
$$B\equiv(n-k)\,\sigma_y^{k}-\sum_{i=0}^{k}(-1)^{i}\binom{k}{i}(n\sigma_x+i\sigma_y)(\sigma_x+\sigma_y)^{k-i}\sigma_x^{\,i-1}.$$
From the definition of $B$, it can similarly be shown that $B=0$, and hence $E(U_k)=[P(Y>X)]^{k}$ when $\delta\ge 0$.
Therefore, $U_k$ is an unbiased estimator of $R_k=[P(Y>X)]^{k}$ and hence it is the UMVUE of $R_k=[P(Y>X)]^{k}$. Q.E.D.
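As with Theorem 1, the result can be checked by simulation. The sketch below is ours, with illustrative values, $k=2$, and a case with $\delta\ge 0$; it transcribes $U_k$ and averages it over repeated samples.

```python
# Sketch: the sample mean of U_k should approach [P(Y > X)]^k.
import numpy as np
from math import comb, exp

def U_k(d, k, m, n, s_x, s_y):
    """The estimator U_k of Theorem 2 (sigma_x, sigma_y known, k < min(m, n))."""
    if d < 0:
        return ((n - k) * (k * s_x + m * s_y) * s_y ** (k - 1)
                / (m * n * (s_x + s_y) ** k) * exp(k * d / s_y))
    return sum((-1) ** i * comb(k, i) * (m - i) * (n * s_x + i * s_y)
               * s_x ** (i - 1) / (m * n * (s_x + s_y) ** i)
               * exp(-i * d / s_x) for i in range(k + 1))

k, m, n = 2, 6, 8
mu_x, s_x, mu_y, s_y = 0.0, 1.0, 0.4, 2.0        # delta = 0.4 >= 0
rng = np.random.default_rng(3)
d = (mu_y + rng.exponential(s_y / n, 100_000)) - (mu_x + rng.exponential(s_x / m, 100_000))
Rk = (1 - s_x / (s_x + s_y) * exp(-(mu_y - mu_x) / s_x)) ** k
print(np.mean([U_k(v, k, m, n, s_x, s_y) for v in d]), Rk)  # agree to MC error
```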
To get the UMVUE of $\mathrm{Var}(U)=E(U^2)-E^2(U)$, take $\widehat{\mathrm{Var}}(U)=U^2-\widehat{R^2}$, where $\widehat{R^2}$ is the UMVUE of $R^2$; this is unbiased, since $E(U^2-\widehat{R^2})=E(U^2)-R^2=\mathrm{Var}(U)$. From Theorem 2, the UMVUE of $R^2$ is given by
$$\widehat{R^2}=U_2=\begin{cases}\dfrac{(n-2)(2\sigma_x+m\sigma_y)\,\sigma_y}{mn(\sigma_x+\sigma_y)^{2}}\,e^{2D/\sigma_y}, & \text{if }D<0,\\[8pt] 1-\dfrac{2(m-1)(n\sigma_x+\sigma_y)}{mn(\sigma_x+\sigma_y)}\,e^{-D/\sigma_x}+\dfrac{(m-2)(n\sigma_x+2\sigma_y)\,\sigma_x}{mn(\sigma_x+\sigma_y)^{2}}\,e^{-2D/\sigma_x}, & \text{if }D\ge 0.\end{cases}$$
Clearly, $0<U_2<1$ for $m,n>2$.
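The following sketch (ours, with illustrative values) shows how the plug-in estimate $\widehat{\mathrm{Var}}(U)=U^2-U_2$ would be computed from a single pair of samples; since it is only unbiased on average, an individual value may be negative.

```python
# Sketch: compute Var-hat(U) = U^2 - U_2 from one simulated sample.
import numpy as np

def U1(d, m, n, s_x, s_y):
    c = m * n * (s_x + s_y)
    if d < 0:
        return (n - 1) * (s_x + m * s_y) / c * np.exp(d / s_y)
    return 1 - (m - 1) * (n * s_x + s_y) / c * np.exp(-d / s_x)

def U2(d, m, n, s_x, s_y):
    c, c2 = m * n * (s_x + s_y), m * n * (s_x + s_y) ** 2
    if d < 0:
        return (n - 2) * (2 * s_x + m * s_y) * s_y / c2 * np.exp(2 * d / s_y)
    return (1 - 2 * (m - 1) * (n * s_x + s_y) / c * np.exp(-d / s_x)
            + (m - 2) * (n * s_x + 2 * s_y) * s_x / c2 * np.exp(-2 * d / s_x))

m, n = 6, 8
mu_x, s_x, mu_y, s_y = 0.0, 1.0, 0.4, 2.0        # illustrative values
rng = np.random.default_rng(4)
x = mu_x + rng.exponential(s_x, m)               # one sample from f_X
y = mu_y + rng.exponential(s_y, n)               # one sample from f_Y
d = y.min() - x.min()                            # D = Y_(1) - X_(1)
print(U1(d, m, n, s_x, s_y) ** 2 - U2(d, m, n, s_x, s_y))
```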
References
1. Johnson, N.L., Kotz, S. and Balakrishnan, N. (1994), Continuous Univariate Distributions, Vol. 1, 2nd edition, John Wiley & Sons, New York.
2. Kim, J., Moon, Y. and Kang, J. (2003), Inference on P(Y<X) in an Exponential Distribution, J. of Korean Data & Information Science Society, 14-4, 989-995.
3. Rohatgi, V.K. (1976), An Introduction to Probability Theory and Mathematical Statistics, John Wiley & Sons, New York.
4. Woo, J. and Lee, H. (2001), The MLE and the UMVUE of the Right-tail Probability in a Levy Distribution, J. of Korean Data & Information Science Society, 12-2, 65-69.
[received date: Jan. 2006, accepted date: Mar. 2006]