Bayesian and Frequentist Prediction Using Progressive Type II Censored with Binomial Removals


Ahmed A. Soliman1, Ahmed H. Abd Ellah2, Nasser A. Abou-Elheggag2, Rashad M. El-Sagheer3*

1Faculty of Science, Islamic University, Madinah, Saudi Arabia; 2Mathematics Department, Faculty of Science, Sohag University, Sohag, Egypt; 3Mathematics Department, Faculty of Science, Al-Azhar University, Cairo, Egypt

Email: *Rashadmath@Yahoo.com

Received June 30, 2013; revised July 27, 2013; accepted August 9, 2013

Copyright © 2013 Ahmed A. Soliman et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

ABSTRACT

In this article, we study the problem of predicting future records and order statistics (two-sample prediction) based on progressive type-II censored data with random removals, where the number of units removed at each failure time has a discrete binomial distribution. We use the Bayes procedure to derive both point predictions and prediction interval bounds. Bayesian point prediction under symmetric and asymmetric loss functions is discussed. The maximum likelihood (ML) prediction intervals using the "plug-in" procedure for future records and order statistics are derived. An example is discussed to illustrate the application of the results under this censoring scheme.

Keywords: Bayesian Prediction; Burr-X Model; Progressive Censoring; Random Removals

1. Introduction

In many practical problems of statistics, one wishes to use the results of previous data (past samples) to predict a future observation (a future sample) from the same population. One way to do this is to construct an interval which will contain the future observation with a specified probability; this interval is called a prediction interval. Prediction has been applied in medicine, engineering, business, and other areas as well. Hahn and Meeker [1] have discussed the usefulness of constructing prediction intervals. Bayesian prediction bounds for a future observation based on certain distributions have been discussed by several authors. Bayesian prediction bounds for future observations from the exponential distribution were considered by Dunsmore [2], Lingappaiah [3], Evans and Nigm [4], and Al-Hussaini and Jaheen [5]. Bayesian prediction bounds for future lifetimes under the Weibull model have been derived by Evans and Nigm [6,7], and Bayesian prediction bounds for observables having the Burr type-XII distribution were obtained by Nigm [8], Al-Hussaini and Jaheen [9,10], and Ali Mousa and Jaheen [11,12]. Prediction was reviewed by Patel [13], Nagaraja [14], Kaminsky and Nelson [15], and Al-Hussaini [16], and for details on the history of statistical

prediction, analysis, and applications, see, for example, Aitchison and Dunsmore [17] and Geisser [18]. Bayesian prediction bounds for the Burr type-X model based on records have been derived by Ali Mousa [19], and Bayesian prediction bounds for the scaled Burr type-X model were obtained by Jaheen and AL-Matrafi [20]. Bayesian prediction with outliers and random sample size for the Burr-X model was obtained by Soliman [21], and Bayesian prediction bounds for order statistics in the one- and two-sample cases from the Burr type-X model were obtained by Sartawi and Abu-Salih [22]. Recently, Ahmadi and Balakrishnan [23] discussed how one can predict future usual records (order statistics) from an independent Y-sequence based on order statistics (usual records) from an independent X-sequence and developed nonparametric prediction intervals. Ahmadi and Mir Mostafaee [24] and Ahmadi et al. [25] obtained prediction intervals for order statistics as well as for the mean lifetime from a future sample based on observed usual records from an exponential distribution, using the classical and Bayesian approaches, respectively.

The rest of the paper is organized as follows. In Section 2, we present some preliminaries: the model, the prior, and the posterior distribution. In Section 3, the Bayesian predictive distribution for the future lower records (two-sample prediction) is derived based on progressive type-II censoring with random removals. In Section 4, the ML predictions, both point and interval, using the "plug-in" procedure are derived. In Section 5, the Bayesian predictive distribution for the future order statistics based on progressive type-II censoring with random removals is obtained. In Section 6, the ML predictions, both point and interval, using the "plug-in" procedure for the future order statistics are derived. A practical example using a data set generated as a progressively type-II censored random sample from the Burr-X distribution, together with a simulation study carried out in order to compare the performance of the different methods of prediction, is presented in Section 7. Finally, we conclude the paper in Section 8.

2. The Model, Prior and Posterior Distribution

Let the random variable X have a Burr-X distribution with parameter θ. The probability density function (pdf) and the cumulative distribution function (cdf) of X are, respectively,

  f(x) = 2\theta x \exp(-x^2) [1 - \exp(-x^2)]^{\theta - 1},   x > 0, \theta > 0,
  F(x) = [1 - \exp(-x^2)]^{\theta},   x > 0, \theta > 0.          (1)
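As a quick companion to (1), the following Python sketch (our own, not part of the paper) evaluates the Burr-X pdf and cdf and draws variates by inverting the cdf; the function names are illustrative assumptions.

```python
import numpy as np

def burr_x_pdf(x, theta):
    """pdf f(x) = 2*theta*x*exp(-x^2)*(1 - exp(-x^2))^(theta-1), x > 0."""
    x = np.asarray(x, dtype=float)
    e = np.exp(-x**2)
    return 2.0 * theta * x * e * (1.0 - e) ** (theta - 1.0)

def burr_x_cdf(x, theta):
    """cdf F(x) = (1 - exp(-x^2))^theta, x > 0."""
    return (1.0 - np.exp(-np.asarray(x, dtype=float) ** 2)) ** theta

def burr_x_rvs(theta, size, rng=None):
    """Inverse-cdf sampling: x = sqrt(-log(1 - u^(1/theta))), u ~ Uniform(0,1)."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=size)
    return np.sqrt(-np.log(1.0 - u ** (1.0 / theta)))

# small check: the cdf of sampled values should look uniform on (0, 1)
x = burr_x_rvs(theta=1.6374, size=5, rng=0)
print(burr_x_cdf(x, 1.6374))
```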

Suppose that (X_{1:m:n}, R_1), (X_{2:m:n}, R_2), ..., (X_{m:m:n}, R_m) denotes a progressively Type-II censored sample, where X_{1:m:n} < X_{2:m:n} < ... < X_{m:m:n}, with random removals R_1 = r_1, R_2 = r_2, ..., R_m = r_m; at each failure time the number of removed units follows a binomial distribution with probability p (see Algorithm 1 in Section 7). Given the removals, the conditional likelihood function can be written as

  L(\theta; x \mid R = r) = C \prod_{i=1}^{m} f(x_i) [1 - F(x_i)]^{r_i},          (2)

where C = n(n - r_1 - 1)(n - r_1 - r_2 - 2) \cdots (n - r_1 - r_2 - \cdots - r_{m-1} - m + 1) and 0 \le r_i \le n - m - r_1 - \cdots - r_{i-1} for i = 2, 3, ..., m - 1.

Substituting (1) into (2) and expanding [1 - U_i^{\theta}]^{r_i} by the binomial theorem, we get

  L(\theta; x) \propto \theta^{m} \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G \exp\Big[\theta \sum_{i=1}^{m} (1 + k_i) \ln U_i\Big],          (3)

where

  G = (-1)^{k_1 + \cdots + k_m} \binom{r_1}{k_1} \cdots \binom{r_m}{k_m},   U_i = 1 - \exp(-x_i^2).          (4)

If the parameter θ is unknown, the log-likelihood function corresponding to (3) is

  \ln L(\theta) = \text{const} + m \ln\theta + \ln\Big[\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G \exp\Big(\theta \sum_{i=1}^{m} (1+k_i)\ln U_i\Big)\Big].          (5)

The first derivative of ln L(θ) with respect to θ is

  \frac{\partial \ln L(\theta)}{\partial \theta} = \frac{m}{\theta} + \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G \sum_{i=1}^{m} (1+k_i)\ln U_i.          (6)

Setting \partial \ln L(\theta)/\partial\theta = 0, the maximum likelihood estimator of θ is obtained as

  \hat{\theta}_{MLE} = \frac{-m}{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G \sum_{i=1}^{m} (1+k_i)\ln U_i}.          (7)
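As a hedged alternative to the expanded form above, one can also obtain the MLE by maximizing the progressively censored log-likelihood (2) for the Burr-X model numerically. The sketch below does this with scipy; the function names, the bounded search interval (1e-6, 50), and the use of the Example 1 data from Section 7 are our own choices, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_lik(theta, x, r):
    """Negative log-likelihood from (2): sum of log f(x_i) + r_i*log(1 - F(x_i)),
    with the theta-free constant C dropped."""
    x, r = np.asarray(x, float), np.asarray(r, float)
    u = 1.0 - np.exp(-x**2)                      # U_i = 1 - exp(-x_i^2)
    log_f = np.log(2.0 * theta * x) - x**2 + (theta - 1.0) * np.log(u)
    log_surv = np.log1p(-u**theta)               # log(1 - U_i^theta)
    return -np.sum(log_f + r * log_surv)

def burr_x_mle(x, r):
    """Maximize over theta > 0 on a bounded interval (assumed wide enough)."""
    res = minimize_scalar(neg_log_lik, bounds=(1e-6, 50.0),
                          args=(np.asarray(x), np.asarray(r)), method="bounded")
    return res.x

# progressively censored sample of Example 1 (Section 7)
x_obs = [0.4406, 0.5449, 0.6728, 0.9914, 1.2655, 1.3680, 1.4271]
removals = [1, 1, 0, 0, 1, 0, 0]
print(burr_x_mle(x_obs, removals))
```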

We consider a gamma conjugate prior for θ of the form

  \pi(\theta) \propto \theta^{\alpha - 1} \exp(-\beta\theta),   \theta > 0,\ \alpha, \beta > 0.          (8)

From (3) and (8), the posterior density of θ is given by

  \pi^{*}(\theta \mid x) = \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, \theta^{m+\alpha-1} \exp[-\theta(\beta - q_k)]}{\Gamma(m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}},          (9)

where

  q_k = \sum_{i=1}^{m} (1 + k_i) \ln U_i

(note that q_k \le 0, so \beta - q_k > 0).
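The nested sums over (k_1, ..., k_m) in (9) can be evaluated directly when the removal numbers are small, since there are only \prod_i (r_i + 1) terms. A minimal sketch, assuming the prior values (α, β) = (2, 3) used later in the simulation study and our own function name, computes the posterior mean E[θ | x]:

```python
import numpy as np
from itertools import product
from math import comb

def posterior_theta_mean(x, r, alpha, beta):
    """Posterior mean from (9):
       E[theta|x] = (m+alpha) * sum_k G (beta-q_k)^-(m+alpha+1)
                              / sum_k G (beta-q_k)^-(m+alpha),
    the sum running over k = (k_1,...,k_m) with 0 <= k_i <= r_i."""
    x, r = np.asarray(x, float), np.asarray(r, int)
    m = len(x)
    log_u = np.log(1.0 - np.exp(-x**2))          # ln U_i (negative)
    num = den = 0.0
    for k in product(*[range(ri + 1) for ri in r]):
        k = np.asarray(k)
        g = (-1.0) ** k.sum() * np.prod([comb(ri, ki) for ri, ki in zip(r, k)])
        q_k = np.sum((1.0 + k) * log_u)          # q_k = sum (1+k_i) ln U_i <= 0
        den += g * (beta - q_k) ** -(m + alpha)
        num += g * (beta - q_k) ** -(m + alpha + 1)
    return (m + alpha) * num / den

x_obs = [0.4406, 0.5449, 0.6728, 0.9914, 1.2655, 1.3680, 1.4271]
removals = [1, 1, 0, 0, 1, 0, 0]
print(posterior_theta_mean(x_obs, removals, alpha=2.0, beta=3.0))
```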

3. Bayesian Prediction for Record Value

Suppose n independent items are put on a test and the lifetime distribution of each item is given by (1). Let X_1, X_2, ..., X_m be the ordered m failures observed under the progressive Type-II censoring plan with binomial removals R_1, ..., R_m, and let Y_1, Y_2, ..., Y_{m_1} be a second independent sample (of size m_1) of future lower records observed from the same distribution (the future sample). Our aim is to make Bayesian predictions about the s-th future lower record Y_s. The marginal pdf of Y_s for given θ (see Ahmadi and Mir Mostafaee [24]) is

  f_{V_s}(Y_s \mid \theta) = \frac{[-\ln F(Y_s)]^{s-1}}{(s-1)!}\, f(Y_s),          (10)

where, for the Burr-X model,

  F(Y_s) = [1 - \exp(-Y_s^2)]^{\theta},
  f(Y_s) = 2\theta Y_s \exp(-Y_s^2) [1 - \exp(-Y_s^2)]^{\theta - 1},          (11)

and

  U_s = 1 - \exp(-Y_s^2),   q_s = -\ln U_s,   w_s = \frac{2 Y_s \exp(-Y_s^2)}{1 - \exp(-Y_s^2)}.          (12)

Applying (11) and (12) in (10), we obtain

  f_{V_s}(Y_s \mid \theta) = \frac{\theta^{s}\, q_s^{\,s-1}\, w_s \exp(-\theta q_s)}{(s-1)!}.          (13)
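Lower records from the Burr-X model are easy to simulate, because -ln F(Y_s) = θ q_s is a Gamma(s, 1) partial sum of standard exponentials; this is also the fact behind the plug-in bounds of Section 4. A small sketch (our own helper names, not the paper's code) evaluates the record density (13) and simulates a record sequence:

```python
import numpy as np
from math import factorial

def lower_record_pdf(y, s, theta):
    """Density (13) of the s-th lower record:
       theta^s * q^(s-1) * w * exp(-theta*q) / (s-1)!,
    with q = -log(1 - exp(-y^2)) and w = 2*y*exp(-y^2)/(1 - exp(-y^2))."""
    y = np.asarray(y, float)
    u = 1.0 - np.exp(-y**2)
    q = -np.log(u)
    w = 2.0 * y * np.exp(-y**2) / u
    return theta**s * q**(s - 1) * w * np.exp(-theta * q) / factorial(s - 1)

def simulate_lower_records(theta, n_records, rng=None):
    """Use that -log F(Y_s) is the partial sum z_s of s i.i.d. Exp(1) variables:
       Y_s = sqrt(-log(1 - exp(-z_s / theta)))."""
    rng = np.random.default_rng(rng)
    z = np.cumsum(rng.exponential(size=n_records))
    return np.sqrt(-np.log(1.0 - np.exp(-z / theta)))

print(lower_record_pdf(0.5, s=2, theta=1.6374))
print(simulate_lower_records(1.6374, 5, rng=1))   # decreasing record sequence
```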


Combining the posterior density (9) with (13) and integrating out θ, we obtain the Bayes predictive density

  f^{*}(Y_s \mid x) = \int_{0}^{\infty} f_{V_s}(Y_s \mid \theta)\, \pi^{*}(\theta \mid x)\, d\theta
  = \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, q_s^{\,s-1} w_s\, (\beta - q_k + q_s)^{-(s+m+\alpha)}}{B(s, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}}.          (14)

The Bayesian prediction bounds for the future Y_s, s = 1, 2, ..., m_1, are obtained by evaluating Pr(Y_s \ge t_1 \mid x) for some given value of t_1. It follows from (14) that

  \Pr(Y_s \ge t_1 \mid x) = \int_{t_1}^{\infty} f^{*}(Y_s \mid x)\, dY_s
  = \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, I_{t_1}(Y_s)}{B(s, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}},          (15)

where

  I_{t_1}(Y_s) = \int_{t_1}^{\infty} \frac{q_s^{\,s-1} w_s}{(\beta - q_k + q_s)^{s+m+\alpha}}\, dY_s.          (16)

The predictive bounds of a two-sided interval with cover 100τ% for the future lower record Y_s may thus be obtained by solving the following two equations for the lower bound L_s and the upper bound U_s:

  \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, I_{L_s}(Y_s)}{B(s, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}} = \frac{1+\tau}{2},          (17)

and

  \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, I_{U_s}(Y_s)}{B(s, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}} = \frac{1-\tau}{2},          (18)

where I_{L_s}(Y_s) and I_{U_s}(Y_s) are given by Equation (16) with t_1 replaced by L_s and U_s, respectively.
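A minimal numerical sketch of (15)-(18) follows, assuming scipy is available. The root-finding bracket [1e-6, 10], the helper names, and the choice of quad/brentq are our assumptions rather than the authors' implementation.

```python
import numpy as np
from math import comb, lgamma
from itertools import product
from scipy.integrate import quad
from scipy.optimize import brentq

def bayes_record_tail(t, s, x, r, alpha, beta):
    """Pr(Y_s >= t | x) from (15)-(16), by numerical integration over Y_s."""
    x, r = np.asarray(x, float), np.asarray(r, int)
    m = len(x)
    log_u = np.log(1.0 - np.exp(-x**2))
    ks = list(product(*[range(ri + 1) for ri in r]))
    g = np.array([(-1.0) ** sum(k) * np.prod([comb(ri, ki) for ri, ki in zip(r, k)])
                  for k in ks])
    qk = np.array([np.sum((1.0 + np.asarray(k)) * log_u) for k in ks])

    def integrand(y, a):          # q^(s-1) * w * (a + q)^-(s+m+alpha), a = beta - q_k
        u = 1.0 - np.exp(-y**2)
        q = -np.log(u)
        w = 2.0 * y * np.exp(-y**2) / u
        return q ** (s - 1) * w * (a + q) ** -(s + m + alpha)

    num = sum(gi * quad(integrand, t, np.inf, args=(beta - qi,))[0]
              for gi, qi in zip(g, qk))
    beta_fn = np.exp(lgamma(s) + lgamma(m + alpha) - lgamma(s + m + alpha))  # B(s, m+alpha)
    den = beta_fn * np.sum(g * (beta - qk) ** -(m + alpha))
    return num / den

def bayes_record_interval(s, x, r, alpha, beta, tau=0.95):
    """Solve (17)-(18): tail = (1+tau)/2 at the lower bound, (1-tau)/2 at the upper."""
    f = lambda t, p: bayes_record_tail(t, s, x, r, alpha, beta) - p
    lower = brentq(f, 1e-6, 10.0, args=((1 + tau) / 2,))
    upper = brentq(f, 1e-6, 10.0, args=((1 - tau) / 2,))
    return lower, upper

x_obs = [0.4406, 0.5449, 0.6728, 0.9914, 1.2655, 1.3680, 1.4271]
removals = [1, 1, 0, 0, 1, 0, 0]
print(bayes_record_interval(1, x_obs, removals, alpha=2.0, beta=3.0, tau=0.95))
```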

Now, using (14), the Bayesian point predictors of the future lower record value Y_s under the squared error (SE) and LINEX loss functions, denoted \hat{Y}_{BS} and \hat{Y}_{BL}, are given respectively by

  \hat{Y}_{BS} = \int_{0}^{\infty} Y_s f^{*}(Y_s \mid x)\, dY_s
  = \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, I_{1}(Y_s)}{B(s, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}},          (19)

and

  \hat{Y}_{BL} = -\frac{1}{c} \ln\Big[\int_{0}^{\infty} \exp(-c Y_s)\, f^{*}(Y_s \mid x)\, dY_s\Big]
  = -\frac{1}{c} \ln\Bigg[\frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, I_{2}(Y_s)}{B(s, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}}\Bigg],          (20)

where

  I_{1}(Y_s) = \int_{0}^{\infty} \frac{Y_s\, q_s^{\,s-1} w_s}{(\beta - q_k + q_s)^{s+m+\alpha}}\, dY_s,          (21)

and

  I_{2}(Y_s) = \int_{0}^{\infty} \frac{\exp(-c Y_s)\, q_s^{\,s-1} w_s}{(\beta - q_k + q_s)^{s+m+\alpha}}\, dY_s.          (22)

One can use a numerical integration technique to evaluate the integrals given by (21) and (22).
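Once the predictive density (14) has been normalized (for example with the sums evaluated as in the sketch above), the SE and LINEX point predictors in (19)-(22) reduce to two one-dimensional integrals. The generic, hedged helper below (our own, with a toy exponential density as the check) illustrates the two loss functions:

```python
import numpy as np
from scipy.integrate import quad

def point_predictors(pred_pdf, c=1.0, upper=np.inf):
    """Given a normalized predictive density pred_pdf(y) on (0, inf), return
    the squared-error predictor E[Y] as in (19) and the LINEX predictor
    -(1/c) * log E[exp(-c*Y)] as in (20)."""
    mean = quad(lambda y: y * pred_pdf(y), 0.0, upper)[0]
    mgf = quad(lambda y: np.exp(-c * y) * pred_pdf(y), 0.0, upper)[0]
    return mean, -np.log(mgf) / c

# toy check with a unit-mean exponential predictive density:
# SE predictor = 1, LINEX predictor = log(1 + c)/c
print(point_predictors(lambda y: np.exp(-y), c=1.0))
```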

Special case: It is of particular interest to predict the first unobserved lower record value Y_1. Setting s = 1 in (17) and (18), the lower and upper Bayesian prediction bounds with cover τ for Y_1 are obtained from the numerical solution of the following equations:

  \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, I_{L_1}(Y_1)}{B(1, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}} = \frac{1+\tau}{2},          (23)

and

  \frac{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, I_{U_1}(Y_1)}{B(1, m+\alpha) \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}} = \frac{1-\tau}{2},          (24)

where I_{L_1}(Y_1) and I_{U_1}(Y_1) are given by Equation (16), and the resulting equations are solved numerically.

4. ML Prediction for Record Value

The commonly used frequentist approaches, such as the maximum likelihood estimate and the "plug-in" procedure, in which a point estimate of the unknown parameter is substituted into the predictive distribution, are reviewed and discussed. In this section, the ML predictions, both point and interval, using the "plug-in" procedure for the future lower records based on the progressively Type-II censored sample defined by (2) are derived. Replacing θ in the marginal pdf of V_s in (13) by the MLE \hat{\theta} obtained from (7) gives

  \hat{f}_{V_s}(Y_s) = \frac{\hat{\theta}^{s}\, q_s^{\,s-1}\, w_s \exp(-\hat{\theta} q_s)}{(s-1)!}.          (25)

The ML prediction bounds for the future Y_s are obtained by evaluating Pr(Y_s \ge t_2 \mid x) for some given value of t_2. It follows from (25) that

  \Pr(Y_s \ge t_2 \mid x) = \int_{t_2}^{\infty} \hat{f}_{V_s}(Y_s)\, dY_s = \frac{\mathrm{Ingamma}(t_2'; s, 1)}{\Gamma(s)},          (26)

where Ingamma is the incomplete gamma function defined by

  \mathrm{Ingamma}(t; t_1, t_2) = \int_{0}^{t} z^{t_1 - 1} \exp(-t_2 z)\, dz,   and   t_2' = -\hat{\theta} \ln[1 - \exp(-t_2^2)].          (27)

The predictive bounds of a two-sided interval with cover τ for the future lower record Y_s may thus be obtained by solving the following two equations for the lower bound L_s and the upper bound U_s:

  \frac{1}{\Gamma(s)} \mathrm{Ingamma}(L_s'; s, 1) = \frac{1+\tau}{2},          (28)

and

  \frac{1}{\Gamma(s)} \mathrm{Ingamma}(U_s'; s, 1) = \frac{1-\tau}{2},          (29)

where L_s' = -\hat{\theta} \ln[1 - \exp(-L_s^2)] and U_s' = -\hat{\theta} \ln[1 - \exp(-U_s^2)].

Special case: When s = 1 in (28) and (29), the lower and upper ML prediction bounds with cover τ for Y_1 are obtained from the numerical solution of the following equations:

  \mathrm{Ingamma}(L_1'; 1, 1) = \frac{1+\tau}{2},          (30)

and

  \mathrm{Ingamma}(U_1'; 1, 1) = \frac{1-\tau}{2}.          (31)

The ML point prediction of the lower record value Y_s is given, from (25), by

  \hat{Y}_{ML} = \int_{0}^{\infty} Y_s\, \hat{f}_{V_s}(Y_s)\, dY_s = \frac{\hat{\theta}^{s}}{(s-1)!} \int_{0}^{\infty} Y_s\, q_s^{\,s-1}\, w_s \exp(-\hat{\theta} q_s)\, dY_s.          (32)

One can use a numerical integration technique to evaluate the integral in (32).

5. Bayesian Prediction for Order Statistics

Order statistics arise in many practical situations, in particular in the reliability of systems. A system is called a k-out-of-m system if it consists of m components and functions satisfactorily provided that at least k components function. If the lifetimes of the components are independently and identically distributed, then the lifetime of the system coincides with that of the (m - k + 1)-th order statistic from the underlying distribution. Therefore, order statistics play a key role in studying the lifetimes of such systems; see Arnold et al. [26] and David and Nagaraja [27] for more details concerning the applications of order statistics.

Suppose n independent items are put on a test and the lifetime distribution of each item is given by (1). Let X_1, X_2, ..., X_m be the ordered m failures observed under the progressive Type-II censoring plan with binomial removals R_1, ..., R_m, and let Y_1 \le Y_2 \le ... \le Y_{m_2} be a second independent random sample (of size m_2) of future order statistics observed from the same distribution. Our aim is to obtain Bayesian predictions about the s-th order statistic of the future sample.

Let Y_s be the s-th ordered lifetime in the future sample of m_2 lifetimes. The density function of Y_s for given θ is of the form

  h(Y_s \mid \theta) = D(s)\, [F(Y_s)]^{s-1}\, [1 - F(Y_s)]^{m_2 - s}\, f(Y_s),          (33)

where D(s) = m_2! / [(s-1)!\,(m_2 - s)!].
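To connect (33) with the k-out-of-m discussion above, the following sketch (our own helper names, with θ = 1.6374 chosen only for illustration) evaluates the Burr-X order-statistic density and checks by simulation that the (m - k + 1)-th ordered component failure time of a k-out-of-m system has this density:

```python
import numpy as np
from math import comb
from scipy.integrate import quad

def order_stat_pdf(y, s, m2, theta):
    """Density (33) of the s-th order statistic from m2 i.i.d. Burr-X lifetimes."""
    big_f = (1.0 - np.exp(-y**2)) ** theta                               # F(y)
    f = 2.0 * theta * y * np.exp(-y**2) * (1.0 - np.exp(-y**2)) ** (theta - 1.0)
    d = s * comb(m2, s)                                                  # m2!/((s-1)!(m2-s)!)
    return d * big_f ** (s - 1) * (1.0 - big_f) ** (m2 - s) * f

# a k-out-of-m system fails at the (m - k + 1)-th ordered component failure
theta, m2, k = 1.6374, 5, 3
s = m2 - k + 1
rng = np.random.default_rng(0)
comps = np.sqrt(-np.log(1.0 - rng.uniform(size=(100000, m2)) ** (1.0 / theta)))
sim_mean = np.sort(comps, axis=1)[:, s - 1].mean()
num_mean = quad(lambda y: y * order_stat_pdf(y, s, m2, theta), 0, np.inf)[0]
print(sim_mean, num_mean)   # the two means should agree closely
```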

For the Burr-X model, substituting (11) and (12) into (33), we obtain

  h(Y_s \mid \theta) = D(s) \sum_{l=0}^{m_2 - s} (-1)^{l} \binom{m_2 - s}{l}\, 2\theta\, Y_s \exp(-Y_s^2)\, U_s^{\theta(s+l)-1}.          (34)

Therefore, from (9) and (34), the Bayes predictive density function of Y_s is given by

  f^{*}(Y_s \mid x) = \int_{0}^{\infty} h(Y_s \mid \theta)\, \pi^{*}(\theta \mid x)\, d\theta
  = \frac{(m+\alpha)\, D(s) \sum_{l=0}^{m_2-s} \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} a_l\, G\, w_s\, [\beta - q_k + (s+l)\, q_s]^{-(m+\alpha+1)}}{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}},          (35)

where a_l = (-1)^{l} \binom{m_2 - s}{l}, l = 0, 1, ..., m_2 - s, and q_s, w_s, U_s are given by Equation (12).

The Bayesian prediction bounds for the future Y_s, s = 1, 2, ..., m_2, are obtained by evaluating Pr(Y_s \ge t_1 \mid x) for some given value of t_1. It follows from (35) that

  \Pr(Y_s \ge t_1 \mid x) = \int_{t_1}^{\infty} f^{*}(Y_s \mid x)\, dY_s
  = \frac{D(s) \sum_{l=0}^{m_2-s} \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} \dfrac{a_l\, G}{s+l} \Big\{(\beta - q_k)^{-(m+\alpha)} - [\beta - q_k + (s+l)\, q_{t_1}]^{-(m+\alpha)}\Big\}}{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}},          (36)

where q_{t_1} = -\ln[1 - \exp(-t_1^2)].

The predictive bounds of a two-sided interval with cover 100τ% for the future order statistic Y_s may thus be obtained by solving the following two equations for the lower bound LL_s and the upper bound UU_s:

  \Pr(Y_s \ge LL_s \mid x) = \frac{1+\tau}{2},          (37)

and

  \Pr(Y_s \ge UU_s \mid x) = \frac{1-\tau}{2},          (38)

where the probabilities are evaluated from (36) with t_1 replaced by LL_s and UU_s, respectively.

Now, using (35), the Bayesian point predictors of the future order statistic Y_s under the SE (BS) and LINEX (BL) loss functions are given, respectively, by

  \hat{Y}_{BS} = \int_{0}^{\infty} Y_s\, f^{*}(Y_s \mid x)\, dY_s
  = \frac{(m+\alpha)\, D(s) \sum_{l=0}^{m_2-s} \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} a_l\, G\, I_{1}(Y_s)}{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}},          (39)

and

  \hat{Y}_{BL} = -\frac{1}{c} \ln\Big[\int_{0}^{\infty} \exp(-c Y_s)\, f^{*}(Y_s \mid x)\, dY_s\Big]
  = -\frac{1}{c} \ln\Bigg[\frac{(m+\alpha)\, D(s) \sum_{l=0}^{m_2-s} \sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} a_l\, G\, I_{2}(Y_s)}{\sum_{k_1=0}^{r_1} \cdots \sum_{k_m=0}^{r_m} G\, (\beta - q_k)^{-(m+\alpha)}}\Bigg],          (40)

where

  I_{1}(Y_s) = \int_{0}^{\infty} Y_s\, w_s\, [\beta - q_k + (s+l)\, q_s]^{-(m+\alpha+1)}\, dY_s,          (41)

and

  I_{2}(Y_s) = \int_{0}^{\infty} \exp(-c Y_s)\, w_s\, [\beta - q_k + (s+l)\, q_s]^{-(m+\alpha+1)}\, dY_s.          (42)

We can use a numerical integration technique to evaluate the integrals given by (41) and (42).

6. ML Prediction for Order Statistics

In this section, the ML predictions, both point and interval, using the "plug-in" procedure for the future order statistics based on the progressively Type-II censored sample defined by (2) are derived. Replacing θ in the density function (34) of Y_s by the MLE \hat{\theta} computed from (7), the density function of Y_s for given \hat{\theta} becomes

  \hat{h}(Y_s) = D(s) \sum_{l=0}^{m_2 - s} (-1)^{l} \binom{m_2 - s}{l}\, 2\hat{\theta}\, Y_s \exp(-Y_s^2)\, U_s^{\hat{\theta}(s+l)-1}.          (43)

The ML prediction bounds for the future Y_s are obtained by evaluating Pr(Y_s \ge t_2 \mid x) for some given value of t_2. It follows from (43) that

  \Pr(Y_s \ge t_2 \mid x) = \int_{t_2}^{\infty} \hat{h}(Y_s)\, dY_s
  = D(s) \sum_{l=0}^{m_2-s} \frac{(-1)^{l}}{s+l} \binom{m_2 - s}{l} \Big\{1 - [1 - \exp(-t_2^2)]^{\hat{\theta}(s+l)}\Big\}.          (44)

The predictive bounds of a two-sided interval with cover τ for the future order statistic Y_s may thus be obtained by solving the following two equations for the lower bound LL_s and the upper bound UU_s:

  D(s) \sum_{l=0}^{m_2-s} \frac{(-1)^{l}}{s+l} \binom{m_2 - s}{l} \Big\{1 - [1 - \exp(-LL_s^2)]^{\hat{\theta}(s+l)}\Big\} = \frac{1+\tau}{2},
  D(s) \sum_{l=0}^{m_2-s} \frac{(-1)^{l}}{s+l} \binom{m_2 - s}{l} \Big\{1 - [1 - \exp(-UU_s^2)]^{\hat{\theta}(s+l)}\Big\} = \frac{1-\tau}{2}.          (45)

From (43), the ML point prediction of the order statistic Y_s is given by

  \hat{Y}_{ML} = \int_{0}^{\infty} Y_s\, \hat{h}(Y_s)\, dY_s
  = D(s) \sum_{l=0}^{m_2-s} (-1)^{l} \binom{m_2 - s}{l}\, 2\hat{\theta} \int_{0}^{\infty} Y_s^{2} \exp(-Y_s^2)\, U_s^{\hat{\theta}(s+l)-1}\, dY_s.          (46)

One can use a numerical integration technique to evaluate the integral in (46).

7. Illustrative Example and Simulation Study

Example 1: In this example, a progressively Type-II censored sample with random removals from the Burr-X distribution has been generated using the following algorithm (a Python sketch of this algorithm is given at the end of this section).

Algorithm 1.
1) Specify the value of n.
2) Specify the value of m.
3) Specify the values of the parameters θ and p.
4) Generate a random sample of size m from the Burr-X distribution and sort it.
5) Generate a random number r_1 from bio(n - m, p).
6) Generate a random number r_i from bio(n - m - r_1 - ... - r_{i-1}, p) for each i = 2, 3, ..., m - 1.
7) Set r_m according to the relation: r_m = n - m - (r_1 + ... + r_{m-1}) if n - m - (r_1 + ... + r_{m-1}) > 0, and r_m = 0 otherwise.

In this sample, the exact values of θ and p are, respectively, 1.6374 and 0.4, with n = 10 and m = 7. The sample obtained is given as follows (x_i, r_i):
(0.4406, 1), (0.5449, 1), (0.6728, 0), (0.9914, 0), (1.2655, 1), (1.3680, 0), (1.4271, 0).

We used the above sample to compute:
1) the Bayesian point predictions under the SE and LINEX loss functions;
2) the 95% Bayesian prediction intervals of the s-th unobserved lower records (order statistics);
3) the maximum likelihood point predictions (ML);
4) the 95% maximum likelihood prediction intervals of the s-th unobserved lower records (order statistics);
5) The results obtained are given in Tables 1-4.

Example 2 (Simulation Study): In this example, we discuss the results of a simulation study comparing the performance of the prediction methods obtained in this paper. First, we generate m_1 = 5 lower record values (order statistics) from the Burr-X distribution with θ = 1.6374. Using the generated data, we compute the 90% and 95% Bayesian prediction intervals for the future lower records (order statistics) from the Burr-X distribution; by repeating the generation 1000 times we obtain the coverage percentage (C.P), and we use the prior (α, β) = (2, 3). Tables 5-8 show the 90% and 95% Bayesian (B) and maximum likelihood (ML) prediction intervals for the future s-th lower record (order statistics). The sample obtained is given as follows:
(0.4640, 1), (0.5124, 1), (0.7323, 0), (0.8293, 1), (0.8665, 0), (0.8713, 1), (1.1322, 0), (1.4969, 0).
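The following Python sketch is our own implementation of Algorithm 1, not the authors' code. Instead of step 4's shortcut of sorting m unconditioned draws, it draws each new failure as the minimum of the surviving units from the left-truncated Burr-X distribution, which is one standard way to generate progressively Type-II censored data; the removal counts follow the binomial scheme of steps 5-7, and the seed value is arbitrary.

```python
import numpy as np

def progressive_binomial_sample(n, m, theta, p, rng=None):
    """Progressively Type-II censored sample of size m from a Burr-X(theta)
    life test with n units and binomial(p) random removals (Algorithm 1)."""
    rng = np.random.default_rng(rng)
    # removal counts: r_i ~ Binomial(n - m - r_1 - ... - r_{i-1}, p); r_m takes the rest
    r = np.zeros(m, dtype=int)
    left = n - m
    for i in range(m - 1):
        r[i] = rng.binomial(left, p)
        left -= r[i]
    r[m - 1] = max(left, 0)

    # draw the ordered failures sequentially: after each failure and removal,
    # the survivors are i.i.d. from the Burr-X distribution truncated at `last`
    x = np.zeros(m)
    alive = n
    last = 0.0
    for i in range(m):
        f_last = (1.0 - np.exp(-last**2)) ** theta          # F(last)
        u = 1.0 - (1.0 - rng.uniform()) ** (1.0 / alive)    # cdf value of the minimum
        f_new = f_last + u * (1.0 - f_last)
        x[i] = np.sqrt(-np.log(1.0 - f_new ** (1.0 / theta)))   # invert the Burr-X cdf
        alive -= 1 + r[i]
        last = x[i]
    return x, r

xs, rs = progressive_binomial_sample(n=10, m=7, theta=1.6374, p=0.4, rng=2013)
print(np.round(xs, 4), rs)
```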

8. Conclusions

In this paper, we consider the two-sample prediction problem wherein the observed progressively Type-II censored samples with random removals from the Burr-X distribution form the informative samples, and we discuss how point predictions and prediction intervals can be constructed for future lower records (order statistics). Bayesian and ML predictions, both the point predictions and the prediction intervals, are presented and discussed in this paper. The commonly used frequentist approaches, such as the maximum likelihood estimate and the "plug-in" procedure, in which a point estimate of the unknown parameter is substituted into the predictive distribution, are reviewed and discussed. A numerical example using simulated data was used to illustrate the procedures developed here. Finally, simulation studies are presented to compare the performance of the different methods of prediction. A study of 1000 randomly generated future samples from the same distribution shows that the actual prediction levels are satisfactory. From the results we note the following:

1) The results in Tables 1-4 show that the lengths of the ML prediction intervals using the "plug-in" procedure (MLPI) are shorter than those of the prediction intervals using the Bayes procedure.

2) The simulation results show that, for all cases (lower records and order statistics), the coverage percentages (C.P) of the prediction intervals are close to the nominal levels.

3) In general, the simulation results show that the "plug-in" procedure (MLPI) performs better than the Bayes method (BPI), in the sense of shorter interval lengths.


Table 1. Point and interval BP for the future lower record Y_S.

Y_S   SE       LINEX (c1 = -1)   LINEX (c2 = 0.0001)   LINEX (c3 = 1)   95% BPI [Lower, Upper]   Length
Y_1   0.9681   1.0863            0.9735                0.8672           [0.1892, 1.9866]         1.7973
Y_2   0.5931   0.6535            0.5933                0.5407           [0.0727, 1.3417]         1.2690
Y_3   0.3998   0.4366            0.4010                0.3679           [0.0305, 1.0073]         0.9768
Y_4   0.2799   0.3033            0.2849                0.2596           [0.0133, 0.7844]         0.7711
Y_5   0.2000   0.2150            0.2335                0.1869           [0.0059, 0.6211]         0.6152

Table 2. Point and interval 95% MLPI for Y_S.

Y_S   ML       [Lower, Upper]     Length
Y_1   1.0686   [0.3295, 2.0415]   1.7120
Y_2   0.6966   [0.1810, 1.4055]   1.2245
Y_3   0.4958   [0.1082, 1.0717]   0.9635
Y_4   0.3644   [0.0671, 0.8455]   0.7784
Y_5   0.2721   [0.0426, 0.6769]   0.6343

Table 3. Point and interval BP for the future order statistics Y_S.

Y_S   SE       LINEX (c1 = -1)   LINEX (c2 = 0.0001)   LINEX (c3 = 1)   95% BPI [Lower, Upper]   Length
Y_1   0.4847   0.5150            0.4848                0.4563           [0.0777, 0.9992]         0.9215
Y_2   0.7202   0.7540            0.7201                0.6875           [0.2425, 1.2453]         1.0028
Y_3   0.9334   0.9716            0.9334                0.8963           [0.4145, 1.4887]         1.0742
Y_4   1.1730   1.2201            1.1730                1.1278           [0.6040, 1.7968]         1.1928
Y_5   1.5292   1.6070            1.5292                1.4592           [0.8470, 2.3554]         1.5085

Table 4. Point and interval 95% MLPI for Y_S.

Y_S   ML       [Lower, Upper]     Length
Y_1   0.5895   [0.1978, 1.0525]   0.8546
Y_2   0.8301   [0.4219, 1.2973]   0.8754
Y_3   1.0394   [0.6046, 1.5407]   0.9361
Y_4   1.2707   [0.7848, 1.8482]   1.0634
Y_5   1.6134   [1.0033, 2.4023]   1.3990

Table 5. Two-sample prediction for the future lower record: 90% and 95% BPI for Y_S, S = 1, 2, ..., 5, and their actual coverage, with α = 2, β = 3, θ = 1.6374, p = 0.4, n = 12, m = 8.

Y_S   90% BPI [Lower, Upper]   Length   C.P     95% BPI [Lower, Upper]   Length   C.P
Y_1   [0.2723, 1.7970]         1.5247   0.911   [0.1902, 1.9814]         1.7912   0.960
Y_2   [0.1174, 1.1967]         1.0793   0.921   [0.0743, 1.3332]         1.2589   0.964
Y_3   [0.0550, 0.8810]         0.8261   0.911   [0.0318, 0.9962]         0.9644   0.959
Y_4   [0.0266, 0.6709]         0.6443   0.903   -                        0.7573   0.949
Y_5   [0.0131, 0.5184]         0.5053   0.909   [0.0064, 0.6070]         0.6006   0.954


Table 6. Two-sample prediction for the future lower record: 90% and 95% MLPI for Y_S, S = 1, 2, ..., 5, and their actual coverage, with α = 2, β = 3, θ = 1.6374, p = 0.4, n = 12, m = 8.

Y_S   90% MLPI [Lower, Upper]   Length   C.P     95% MLPI [Lower, Upper]   Length   C.P
Y_1   [0.3850, 1.8439]          1.4589   0.896   [0.3018, 2.0242]          1.7224   0.949
Y_2   [0.2104, 1.2501]          1.0397   0.913   [0.1593, 1.3820]          1.2227   0.960
Y_3   [0.1250, 0.9341]          0.8091   0.908   [0.0918, 1.0444]          0.9526   0.955
Y_4   [0.0770, 0.7204]          0.6434   0.899   [0.0550, 0.8160]          0.7610   0.945
Y_5   [0.0484, 0.5628]          0.5144   0.910   [0.0338, 0.6464]          0.6126   0.953

Table 7. Two-sample prediction for the future order statistics: 90% and 95% BPI for Y_S, S = 1, 2, ..., 5, and their actual coverage, with α = 2, β = 3, θ = 1.6374, p = 0.4, n = 12, m = 8, m_2 = 5.

Y_S   90% BPI [Lower, Upper]   Length   C.P     95% BPI [Lower, Upper]   Length   C.P
Y_1   [0.1201, 0.8999]         0.7796   0.912   [0.0801, 0.9867]         0.9065   0.952
Y_2   [0.3104, 1.1472]         0.8368   0.897   [0.2470, 1.2344]         0.9874   0.943
Y_3   [0.4938, 1.3862]         0.8924   0.916   [0.4191, 1.4797]         1.0606   0.963
Y_4   [0.6910, 1.6811]         0.9901   0.924   [0.6075, 1.7900]         1.1825   0.963
Y_5   [0.9442, 2.1960]         1.2518   0.920   [0.8483, 2.3511]         1.5028   0.971

Table 8. Two-sample prediction for the future order statistics: 90% and 95% MLPI for Y_S, S = 1, 2, ..., 5, and their actual coverage, with α = 2, β = 3, θ = 1.6374, p = 0.4, n = 12, m = 8, m_2 = 5.

Y_S   90% MLPI [Lower, Upper]   Length   C.P     95% MLPI [Lower, Upper]   Length   C.P
Y_1   [0.2221, 0.9442]          0.7224   0.906   [0.1752, 1.0249]          0.8498   0.952
Y_2   [0.4488, 1.1905]          0.7417   0.897   [0.3924, 1.2725]          0.8801   0.950
Y_3   [0.6356, 1.4294]          0.7938   0.914   [0.5740, 1.5187]          0.9447   0.961
Y_4   [0.8232, 1.7236]          0.9004   0.914   [0.7548, 1.8292]          1.0744   0.955
Y_5   [1.0568, 2.2351]          1.1784   0.901   [0.9753, 2.3875]          1.4122   0.955

9. Acknowledgements

The authors would like to express their thanks to the editor and referees for their useful comments and suggestions on the original version of this manuscript.

REFERENCES

[1] G. J. Hahn and W. Q. Meeker, "Statistical Intervals: A Guide for Practitioners," John Wiley and Sons, Hoboken, 1991. doi:10.1002/9780470316771

[2] I. R. Dunsmore, “The Bayesian Predictive Distribution in Life Testing Models,” Technometrics, Vol. 16, No. 3, 1974, pp. 455-460.

doi:10.1080/00401706.1974.10489216

[3] G. S. Lingappaiah, "Bayesian Approach to Prediction and the Spacings in the Exponential Distribution," Annals of the Institute of Statistical Mathematics, Vol. 31, No. 1, 1979, pp. 391-401.

[4] I. G. Evans and A. M. Nigm, "Bayesian One-Sample Prediction for the Two-Parameter Weibull Distribution," IEEE Transactions on Reliability, Vol. 29, No. 2, 1980, pp. 410-413.

[5] E. K. Al-Hussaini and Z. F. Jaheen, "Parametric Prediction Bounds for the Future Median of the Exponential Distribution," Statistics, Vol. 32, No. 3, 1999, pp. 267-275. doi:10.1080/02331889908802667

[6] I. G. Evans and A. M. Nigm, "Bayesian Prediction for the Left Truncated Exponential Distribution," Technometrics, Vol. 22, No. 2, 1980, pp. 201-204. doi:10.1080/00401706.1980.10486135

[7] I. G. Evans and A. M. Nigm, "Bayesian Prediction for Two-Parameter Weibull Lifetime Models," Communications in Statistics - Theory and Methods, Vol. 9, 1980, pp. 649-658. doi:10.1080/03610928008827909

[8] A. M. Nigm, "Prediction Bounds for the Burr Model," Communications in Statistics - Theory and Methods, Vol. 17, No. 1, 1988, pp. 287-297. doi:10.1080/03610928808829622

[9] E. K. Al-Hussaini and Z. F. Jaheen, "Bayesian Prediction Bounds for the Burr Type XII Model," Communications in Statistics - Theory and Methods, Vol. 24, No. 7, 1995, pp. 1829-1842.

[10] E. K. Al-Hussaini and Z. F. Jaheen, "Bayesian Prediction Bounds for the Burr Type XII Distribution in the Presence of Outliers," Journal of Statistical Planning and Inference, Vol. 55, No. 1, 1996, pp. 23-37. doi:10.1016/0378-3758(95)00184-0

[11] M. A. M. Ali Mousa and Z. F. Jaheen, "Bayesian Prediction Bounds for the Burr Type XII Model Based on Doubly Censored Data," Statistics, Vol. 48, No. 2, 1997, pp. 337-344.

[12] M. A. M. Ali Mousa and Z. F. Jaheen, "Bayesian Prediction for the Two-Parameter Burr Type XII Model Based on Doubly Censored Data," Journal of Applied Statistical Science, Vol. 7, No. 2-3, 1998, pp. 103-111.

[13] J. K. Patel, "Prediction Intervals - A Review," Communications in Statistics - Theory and Methods, Vol. 18, No. 7, 1989, pp. 2393-2465.

[14] H. N. Nagaraja, "Prediction Problems," In: N. Balakrishnan and A. P. Basu, Eds., The Exponential Distribution: Theory and Applications, Gordon and Breach, New York, 1995, pp. 139-163.

[15] K. S. Kaminsky and P. I. Nelson, "Prediction of Order Statistics," In: N. Balakrishnan and C. R. Rao, Eds., Handbook of Statistics, Elsevier Science, Amsterdam, 1998, pp. 431-450.

[16] E. K. Al-Hussaini, "Prediction: Advances and New Research," International Conference on Mathematics and the 21st Century, Cairo, 15-22 January 2000, pp. 25-33.

[17] J. Aitchison and I. R. Dunsmore, "Statistical Prediction Analysis," Cambridge University Press, Cambridge, 1975. doi:10.1017/CBO9780511569647

[18] S. Geisser, "Predictive Inference: An Introduction," Chapman and Hall, 1993.

[19] M. A. M. Ali Mousa, "Inference and Prediction for the Burr Type X Model Based on Records," Statistics, Vol. 35, No. 4, 2001, pp. 415-425. doi:10.1080/02331880108802745

[20] Z. F. Jaheen and B. N. AL-Matrafi, "Bayesian Prediction Bounds from the Scaled Burr Type X Model," Computers and Mathematics with Applications, Vol. 44, No. 5, 2002, pp. 587-594.

[21] A. A. Soliman, "Bayesian Prediction with Outliers and Random Sample Size for the Burr Type X Model," Mathematics and Physics Society, Vol. 73, No. 1, 1998, pp. 1-12.

[22] H. A. Sartawi and M. S. Abu-Salih, "Bayesian Prediction Bounds for the Burr Type X Model," Communications in Statistics - Theory and Methods, Vol. 20, No. 7, 1991, pp. 2307-2330.

[23] J. Ahmadi and N. Balakrishnan, "Prediction of Order Statistics and Record Values from Two Independent Sequences," Journal of Theoretical and Applied Statistics, Vol. 44, No. 4, 2010, pp. 417-430.

[24] J. Ahmadi and S. M. T. K. Mir Mostafaee, "Prediction Intervals for Future Records and Order Statistics Coming from Two-Parameter Exponential Distribution," Statistics and Probability Letters, Vol. 79, No. 7, 2009, pp. 977-983. doi:10.1016/j.spl.2008.12.002

[25] J. Ahmadi, S. M. T. K. Mir Mostafaee and N. Balakrishnan, "Bayesian Prediction of Order Statistics Based on k-Record Values from Exponential Distribution," Statistics, Vol. 45, No. 4, 2011, pp. 375-387.

[26] B. C. Arnold, N. Balakrishnan and H. N. Nagaraja, "A First Course in Order Statistics," Wiley, New York, 1992.

[27] H. A. David and H. N. Nagaraja, "Order Statistics," 3rd Edition, John Wiley and Sons, Hoboken, 2003.

