RESEARCH SUPPORT
UNIVERSITY OF MICHIGAN BUSINESS SCHOOL
DECEMBER 1996

CHARACTERISATIONS, STOCHASTIC EQUATIONS, AND THE GIBBS SAMPLER

WORKING PAPER #9612-33

STEPHEN WALKER
IMPERIAL COLLEGE, 180 QUEEN'S GATE, LONDON

AND

PAUL DAMIEN
UNIVERSITY OF MICHIGAN BUSINESS SCHOOL

Characterisations, Stochastic Equations, and the Gibbs Sampler

Stephen Walker¹ and Paul Damien²

¹ Department of Mathematics, Imperial College, 180 Queen's Gate, London SW7 2BZ, UK.
² Department of Statistics and Management Science, School of Business, University of Michigan, Ann Arbor, 48109-1234, USA.

SUMMARY

We obtain characterisations of distributions on the real line and solve stochastic equations using the Gibbs sampler. Particular stochastic equations considered are of the type X =d B(X + C) and X =d BX + C, and we consider solutions when B and C are not necessarily independent.

1 Introduction

Dufresne (1996) considers the stochastic difference equation X_{n+1} = B_{n+1}(X_n + C_n), where {B_n, n ≥ 1} and {C_n, n ≥ 1} are independent iid copies of B and C having densities f_B and f_C, respectively. Vervaat (1979) provides

E(log B) < 0,   E(log|C|)^+ < ∞

as being sufficient conditions for the existence and uniqueness of the limit distribution of X_n. Under the above conditions let X_n →d X; then the limit X satisfies the stochastic equation X =d B(X + C), where B, C and X are independent. Solutions to this stochastic equation are apparently hard to come by (Dufresne, 1996). In this paper we characterise solutions via the Gibbs sampler (Smith and Roberts, 1993). A closely related stochastic equation is given by X =d BX + C. We will find solutions to this stochastic equation when B and C are not independent.

However, the paper is firstly concerned with characterising continuous densities on the real line, for example the normal, gamma, exponential and beta distributions. We do this by considering a Gibbs sampler over a joint density f(x, u). As is well known, the Gibbs sampler generates a sequence {X_n} by taking U_n from f(.|X_n) and then taking X_{n+1} from f(.|U_n). In many instances it is possible to define U_n = h_2(V_{2n}, X_n), for some appropriate random variable V_{2n} and function h_2, and also to define X_{n+1} = h_1(V_{1n}, U_n), again for some appropriate random variable V_{1n} and function h_1. Putting these together it may be possible to construct the sequence directly via X_{n+1} = h(V_n, X_n). It is well known that under mild regularity conditions X_n →d X, and the limit variable X satisfies X =d h(V, X). This is a stochastic equation with solution f_X(x) = ∫ f(x, u) du. Moreover, this solution is unique, and therefore X =d h(V, X) also characterises X (provided f(x|u) and f(u|x) define f_X uniquely).

We start in Section 2 with the characterisations of some well known densities on the real line. The characterisation of the exponential density leads to a solution of the stochastic equation X =d B(X + C) when B is uniform on (0,1) and C has an exponential distribution. The characterisation of the gamma distribution leads to a solution of the stochastic equation X =d BX + C when B has a beta distribution and C an exponential distribution. In Section 3 we obtain new solutions to the stochastic equation X =d B(X + C), including a solution with B and C not independent.
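To make the construction concrete, here is a minimal sketch (ours, not the authors'; Python with NumPy is an assumption, and the helper names are hypothetical) of the two-conditional Gibbs sampler, written for the joint density f(x, u) ∝ I(0 < u < exp(-x), x > 0) that is used for the exponential characterisation in Section 2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical helpers for the joint density f(x, u) ∝ I(0 < u < exp(-x), x > 0):
# U | X = x is uniform on (0, exp(-x)); X | U = u is uniform on (0, -log u).
def sample_u_given_x(x):
    return rng.uniform(0.0, np.exp(-x))

def sample_x_given_u(u):
    return rng.uniform(0.0, -np.log(u))

def gibbs_chain(x0, n_iter):
    """Alternate the two conditional draws; the X-marginal converges to E(1)."""
    xs = np.empty(n_iter)
    x = x0
    for n in range(n_iter):
        u = sample_u_given_x(x)
        x = sample_x_given_u(u)
        xs[n] = x
    return xs

xs = gibbs_chain(x0=1.0, n_iter=50_000)
print(xs[1000:].mean(), xs[1000:].var())   # both should be close to 1 for E(1)
```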

2 Characterisations of densities

Let X be a random variable defined on the real line, let U(0,1) represent the uniform distribution on the interval (0,1), let E(λ) represent the exponential distribution with mean 1/λ (λ > 0), and let =d denote equality in distribution. Our characterisations are of the type X =d h(V, X), where h is a function and V a collection of independent variables, also independent of X.

Exponential(1) density.

THEOREM 1. Let V_1 and V_2 be iid U(0,1). Then

X =d V_1(X - log V_2)   (1)

if, and only if, X ~ E(1).

PROOF. First we describe how such a characterisation arose. The density for an E(1) variable is given up to proportionality by exp(-x) I(x > 0). We construct a joint density for X and U, a random variable defined on (0,1), given up to proportionality by I(0 < u < exp(-x), x > 0). Clearly the marginal density for X is E(1). We can now construct a random sequence {X_n} such that X_n →d E(1) using the Gibbs sampler (Smith and Roberts, 1993). This involves sampling X_{n+1} from f(.|U_n), having taken U_n from f(.|X_n). We can define U_n = V_{2n} exp(-X_n), where V_{2n} ~ U(0,1) (independent of X_n), and X_{n+1} = -V_{1n} log U_n, where V_{1n} ~ U(0,1) (independent of U_n and X_n). Joining these two together gives X_{n+1} = V_{1n}(X_n - log V_{2n}). From the convergence properties of the Gibbs sampler, and provided the sequence is uniquely associated with the exponential density (that is, the two conditional densities uniquely determine the joint density of X and U), the Theorem is proved. However, we will only use the above argument to propose the characterisations and will rely on more traditional methods to prove the result.
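Collapsing the two conditional draws above into the single recursion X_{n+1} = V_{1n}(X_n - log V_{2n}) gives an immediate way to simulate from (1); a minimal sketch (our addition, assuming Python with NumPy) is the following, with the empirical mean and variance of the chain settling near 1, as for E(1).

```python
import numpy as np

rng = np.random.default_rng(1)

def exponential_recursion(x0, n_iter):
    """Iterate X_{n+1} = V1 * (X_n - log V2) with V1, V2 iid U(0,1)."""
    x = x0
    xs = np.empty(n_iter)
    for n in range(n_iter):
        v1, v2 = rng.uniform(size=2)
        x = v1 * (x - np.log(v2))
        xs[n] = x
    return xs

xs = exponential_recursion(x0=5.0, n_iter=100_000)
print(xs[1000:].mean(), xs[1000:].var())   # both approximately 1, as for E(1)
```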

Therefore let us assume that X ~ E(1) and consider the Laplace transform of Y = V_1(X - log V_2), noting that -log V_2 ~ E(1). If φ_Y(θ) = E exp(-θY) then clearly

φ_Y(θ) = ∫_0^1 φ_X(θu)/(1 + θu) du.   (2)

If now φ_X(θ) = 1/(1 + θ) then φ_Y(θ) = 1/(1 + θ), proving the 'if' assertion.

To prove uniqueness we consider moments. Let μ_i = ∫ x^i dF_X(x). First, it is straightforward to show using (2) that if (1) is true then

Σ_{i≥0} (-1)^i θ^i μ_i/i! = Σ_{n≥0} Σ_{m≥0} (-1)^{m+n} θ^{n+m} μ_n/[n!(m + n + 1)].

Therefore, for all i,

(-1)^i μ_i/i! = Σ_{m+n=i} (-1)^{m+n} μ_n/[n!(n + m + 1)],

leading to

μ_i/i! = (i + 1)^{-1} Σ_{j=0}^{i} μ_j/j!,

from which the result μ_i = i! follows. Since Σ_i μ_i t^i/i! is absolutely convergent for |t| < 1, the moments characterise the distribution of X, which is identified as being E(1), completing the proof.

COROLLARY 1. Let V_1 and V_2 be iid U(0,1). Then

X =d V_1(X - λ^{-1} log V_2)   (3)

if, and only if, X ~ E(λ).

The proof to this follows the same procedure as outlined in the proof of Theorem 1. It is clear that (3) provides a solution to the stochastic equation X =d B(X + C). The characterisation of E(λ) given in Corollary 1 is not new (Kotz and Steutel, 1988); however, that a proof is available via the Gibbs sampler is intriguing.
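The moment recursion obtained in the proof of Theorem 1 can also be verified numerically; the following small sketch (our addition, plain Python) iterates μ_i/i! = (i + 1)^{-1} Σ_{j≤i} μ_j/j! from μ_0 = 1 and recovers μ_i = i!.

```python
from math import factorial

# Solve mu_i/i! = (1/(i+1)) * sum_{j<=i} mu_j/j! for mu_i, starting from mu_0 = 1.
a = [1.0]                       # a_j stores mu_j / j!
for i in range(1, 11):
    s = sum(a)                  # sum over j = 0, ..., i-1
    a_i = s / i                 # a_i * (i+1) = s + a_i  =>  a_i = s / i
    a.append(a_i)

mu = [a_i * factorial(i) for i, a_i in enumerate(a)]
print(mu)                       # 1, 1, 2, 6, 24, ...  i.e. mu_i = i!
```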

PROPOSITION 1. If B ~ U(0,1) and C ~ E(λ) then the unique solution to the stochastic equation X =d B(X + C) is given by X ~ E(λ).

Normal(0,1) density.

In this section let N(μ, σ²) represent the normal distribution with mean μ and variance σ². Proceeding as for the exponential density, we define the joint density of X and U up to proportionality by I(0 < u < exp(-x²/2)). The Gibbs sequence is defined as follows: U_n = V_{2n} exp(-X_n²/2) and X_{n+1} = (2V_{1n} - 1)√(-2 log U_n). Putting these together gives X_{n+1} = (2V_{1n} - 1)√(X_n² - 2 log V_{2n}). Therefore,

THEOREM 2. Let V_1 and V_2 be iid U(0,1). Then

X =d (2V_1 - 1)√(X² - 2 log V_2)   (4)

if, and only if, X ~ N(0,1).

PROOF. First we note that if X is symmetric about 0 and X² is chi-squared with one degree of freedom then X is normal, and note that if (4) defines a random variable X on (-∞, ∞) then it is symmetric about 0. Define Y = (2V_1 - 1)²(X² - 2 log V_2) and note that (2V_1 - 1)² is B(1/2, 1), where B denotes the beta distribution, and -log V_2 is E(1). Therefore

φ_Y(θ) = (1/2) ∫_0^1 u^{-1/2} φ_{X²}(θu)/(1 + 2θu) du.

If now X ~ N(0,1) then φ_{X²}(θ) = 1/√(1 + 2θ), leading to φ_Y(θ) = 1/√(1 + 2θ), provided |2θ| < 1, proving the 'if' assertion. To show uniqueness we need to consider the integral equation

φ(θ) = (1/2) ∫_0^1 u^{-1/2} (1 + 2θu)^{-1} φ(θu) du,   (5)

where φ is the Laplace transform of Y = X². Our aim is to show the unique solution is given by φ(θ) = 1/√(1 + 2θ), or f_Y(y) = (2π)^{-1/2} y^{-1/2} exp(-y/2) I(y > 0). The moments for

this density are given by μ_i = 2^i Γ(i + 1/2)/Γ(1/2), and since Σ_i μ_i t^i/i! is absolutely convergent for |t| < 1/2, it follows that these moments characterise f_Y uniquely. It is easy to show that (5) leads to

Σ_{i≥0} (-1)^i θ^i μ_i/i! = (1/2) Σ_{n≥0} Σ_{m≥0} (-1)^{n+m} 2^m θ^{n+m} μ_n/[n!(n + m + 1/2)].

Therefore

μ_i/i! = (2i + 1)^{-1} Σ_{j=0}^{i} 2^{i-j} μ_j/j!,

the unique solution being given by μ_i = 2^i Γ(i + 1/2)/Γ(1/2), completing the proof.

COROLLARY 2. Let V_1 and V_2 be iid U(0,1). Then

X =d μ + (2V_1 - 1)√((X - μ)² - 2σ² log V_2)   (6)

if, and only if, X ~ N(μ, σ²).

Gamma(a, b) density.

Let G(a, b) (a, b > 0) represent the gamma distribution with mean a/b. In the following we will additionally assume that a > 1. If X ~ G(a, 1) then the density for X is given by 1/Γ(a) x^{a-1} exp(-x) I(x > 0). Here we introduce the variable U which has joint density with X given by f(x, u) ∝ I(0 < u < x^{a-1}, x > 0) exp(-x). The Gibbs sequence is given by U_n = V_{2n} X_n^{a-1} and X_{n+1} = U_n^{1/(a-1)} - log V_{1n}. Combining these leads to X_{n+1} = X_n V_{2n}^{1/(a-1)} - log V_{1n}. Therefore,

THEOREM 3. Let V_1 and V_2 be iid U(0,1). Then

X =d X V_2^{1/(a-1)} - log V_1   (7)

if, and only if, X ~ G(a, 1) (a > 1).

PROOF. Define Y = X V_2^{1/(a-1)} - log V_1 and note that, for a > 1, V_2^{1/(a-1)} is B(a - 1, 1). Then we have

φ_Y(θ) = [(a - 1)/(1 + θ)] ∫_0^1 u^{a-2} φ_X(θu) du.   (8)

If X ~ G(a, 1) then φ_X(θ) = (1 + θ)^{-a}, and with this

φ_Y(θ) = [(a - 1)/(1 + θ)] Σ_{m≥0} (-1)^m Γ(a + m) θ^m/[Γ(a) m! (m + a - 1)],

for |θ| < 1, leading to

φ_Y(θ) = (1 + θ)^{-1} Σ_{m≥0} (-1)^m Γ(a - 1 + m) θ^m/[Γ(a - 1) m!],

and hence φ_Y(θ) = (1 + θ)^{-a}, proving the 'if' assertion. To show uniqueness we consider

φ_X(θ) = [(a - 1)/(1 + θ)] ∫_0^1 u^{a-2} φ_X(θu) du.

This leads to

Σ_{i≥0} (-1)^i θ^i μ_i/i! = (a - 1) Σ_{n≥0} Σ_{m≥0} (-1)^{n+m} θ^{n+m} μ_n/[n!(n + a - 1)]

and hence

μ_i/i! = (a - 1) Σ_{j=0}^{i} μ_j/[j!(j + a - 1)],

the unique solution being given by μ_i = Γ(i + a)/Γ(a). Again, since Σ_i μ_i t^i/i! is absolutely convergent for |t| < 1, this implies these moments characterise uniquely the distribution for X, which is identified as being G(a, 1), completing the proof.

COROLLARY 3. Let V_1 and V_2 be iid U(0,1). Then

X =d X V_2^{1/(a-1)} - b^{-1} log V_1   (9)

if, and only if, X ~ G(a, b) (a > 1).
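The recursions in Theorem 2 and Corollary 3 can be iterated directly, just as in the exponential case; the sketch below (ours, assuming Python with NumPy; parameter values are illustrative) checks the empirical moments of the resulting chains against those of N(0,1) and G(a, b).

```python
import numpy as np

rng = np.random.default_rng(2)

def iterate(update, x0, n_iter, burn=1000):
    """Run X_{n+1} = update(X_n, V1, V2) with V1, V2 iid U(0,1); drop a burn-in."""
    x, xs = x0, np.empty(n_iter)
    for n in range(n_iter):
        v1, v2 = rng.uniform(size=2)
        x = update(x, v1, v2)
        xs[n] = x
    return xs[burn:]

# Theorem 2: X =d (2 V1 - 1) sqrt(X^2 - 2 log V2) characterises N(0, 1).
normal_draws = iterate(lambda x, v1, v2: (2*v1 - 1) * np.sqrt(x*x - 2*np.log(v2)),
                       x0=0.0, n_iter=100_000)
print(normal_draws.mean(), normal_draws.var())      # approximately 0 and 1

# Corollary 3: X =d X V2^{1/(a-1)} - b^{-1} log V1 characterises G(a, b), a > 1.
a, b = 3.0, 2.0
gamma_draws = iterate(lambda x, v1, v2: x * v2**(1/(a - 1)) - np.log(v1)/b,
                      x0=1.0, n_iter=100_000)
print(gamma_draws.mean(), gamma_draws.var())        # approximately a/b and a/b^2
```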

Clearly (9) provides a solution to the stochastic equation X =d BX + C. This solution is already known (Vervaat, 1979, 3.8.2). The use of V^{1/(a-1)} (V uniform) to represent the B(a - 1, 1) distribution in Vervaat (1979) takes on added significance.

A characterisation of G(a, 1) exists for all a > 0 by considering the joint density f(x, u) ∝ x^{a-1} I(u < exp(-x)). For the Gibbs sequence we obtain U_n = V_{2n} exp(-X_n) and X_{n+1} = -V_{1n}^{1/a} log U_n, leading to the characterisation/stochastic equation X =d V_1^{1/a}{X - log V_2}.

Beta(a, b) density.

We state without proof the characterisation for B(a, b) for a > 1, b > 0.

THEOREM 4. Let V_1 and V_2 be iid U(0,1). Then

X =d V_1^{1/b} V_2^{1/(a-1)} X + 1 - V_1^{1/b}   (10)

if, and only if, X ~ B(a, b) (a > 1).

This provides a solution to the stochastic equation X =d BX + C when B and C are not independent, i.e. B = V_1^{1/b} V_2^{1/(a-1)} and C = 1 - V_1^{1/b}.

The extreme value density.

The extreme value density is given up to proportionality by f(x) ∝ exp(-exp(x)) I(x > 0); consider the joint density f(x, u) ∝ exp(-u) I(u > exp(x) > 1). The Gibbs sequence is given by U_n = exp(X_n) - log V_{1n} and exp(X_{n+1}) = U_n^{V_{2n}}, leading to the characterisation/stochastic equation

exp(X) =d {exp(X) - log V_1}^{V_2}.

For some general theory, let f_X be a strictly monotone decreasing density on (0, ∞) with inverse g = f_X^{-1}. Define the joint density of X and U by f(x, u) ∝ I(0 < u < f_X(x)). The Gibbs sampler is then given by U_n = V_{2n} f_X(X_n) and X_{n+1} = V_{1n} g(U_n), where V_{1n} and V_{2n} are independent U(0,1) variables, leading to the stochastic equation X =d V_1 g[V_2 f_X(X)]. This has solution f_X. Therefore,

PROPOSITION 2. A random variable X defined on (0, ∞) has a strictly monotone decreasing density f if, and only if, X =d V_1 f^{-1}[V_2 f(X)], where V_1 and V_2 are independent uniform(0,1) variables.

3 X =d B(X + C) and U =d BU + C

The aim in this section is to provide solutions to the above stochastic equations via the construction of a suitable Gibbs sampler. An example of such a solution has already been given in Proposition 1. Solutions are based on the following Theorem.

THEOREM 5. If f(x, u) is a joint density with conditional densities f(x|u) and f(u|x) such that f(x|u) = (1/u) f_B(x/u) and f(u|x) = f_C(u - x), where f_B and f_C are the densities for independent B and C, respectively, then f_X(x) is the unique solution to the stochastic equation X =d B(X + C), f_U(u) is the unique solution to the stochastic equation U =d BU + C, and X =d BU and U =d X + C.

EXAMPLE. Let f(x, u) ∝ x^{a-1}(u - x)^{b-1} exp(-cu) I(0 < x < u) for a, b, c > 0. Then f(x|u) ∝ x^{a-1}(u - x)^{b-1} I(0 < x/u < 1) and f(u|x) ∝ (u - x)^{b-1} exp(-cu) I(u - x > 0). Therefore B ~ B(a, b) and C ~ G(b, c). Therefore X ~ G(a, c) solves X =d B(X + C) and U ~ G(a + b, c) solves U =d BU + C. Moreover, we deduce G(a, c) =d B(a, b) × G(a + b, c).

In fact we can show that the solution in the example is the only one available via the Gibbs sampler for which C > 0 a.s. and 0 < B < 1 a.s. To find solutions we need to identify independent random variables B and C such that the conditionals f(x|u) = (1/u) f_B(x/u) and f(u|x) = f_C(u - x) uniquely define the joint density f(x, u). The result is contained in the following Theorem.

THEOREM 6. f(x|u) = (1/u) f_B(x/u) (0 < B < 1) and f(u|x) = f_C(u - x)

(C > 0) define f(x, u) uniquely if, and only if, C ~ G(b, c) and B ~ B(a, b).

PROOF. Clearly the 'if' assertion is trivial. Now suppose that

f(x, u) = (1/u) f_B(x/u) f_U(u) = f_C(u - x) f_X(x),

from which it is immediate that U and X/U are independent, as are X and U - X (note that U > X > 0). If we let V = U - X and W = X we see that V/W and V + W are independent. Since V and W are independent, Lukacs' Theorem (Lukacs, 1955) states that V and W are both gamma variables with the same scale parameter, say c; let U - X ~ G(b, c) and X ~ G(a, c). Therefore C ~ G(b, c),

f(x, u) ∝ (u - x)^{b-1} exp(-c(u - x)) x^{a-1} exp(-cx) I(0 < x < u),

and so f(x|u) ∝ x^{a-1}(u - x)^{b-1} I(0 < x < u), which implies that B ~ B(a, b).

We can obtain a further solution to the stochastic equation X =d BX + C when B and C are not independent (see also Theorem 4) by considering the Pareto distribution with density f(x) ∝ x^{-a-1} I(x > α) and parameters a, α > 0.

THEOREM 7. If B = V_1 V_2^{-1/(a+1)} and C = α(1 - V_1), where V_1 and V_2 are iid U(0,1), then the unique solution to the stochastic equation X =d BX + C is given by the Pareto distribution with parameters (a, α).
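The following simulation sketch (our addition, assuming Python with NumPy; the parameter values are illustrative) checks Theorem 7 by iterating X_{n+1} = B_n X_n + C_n with the dependent pair B = V_1 V_2^{-1/(a+1)}, C = α(1 - V_1), and also checks the identity G(a, c) =d B(a, b) × G(a + b, c) deduced in the Example after Theorem 5.

```python
import numpy as np

rng = np.random.default_rng(3)

# Theorem 7: with B = V1 * V2^{-1/(a+1)} and C = alpha * (1 - V1), V1, V2 iid U(0,1),
# the solution of X =d B X + C is Pareto(a, alpha), whose mean is a*alpha/(a-1) for a > 1.
a_par, alpha = 3.0, 2.0
x, xs = alpha, []
for n in range(100_000):
    v1, v2 = rng.uniform(size=2)
    x = v1 * v2**(-1/(a_par + 1)) * x + alpha * (1 - v1)
    xs.append(x)
xs = np.array(xs[1000:])
print(xs.mean(), a_par * alpha / (a_par - 1))         # both approximately 3

# Example after Theorem 5: G(a, c) =d B(a, b) * G(a + b, c).
a, b, c = 2.0, 1.5, 1.0
lhs = rng.gamma(a, 1/c, size=200_000)
rhs = rng.beta(a, b, size=200_000) * rng.gamma(a + b, 1/c, size=200_000)
print(lhs.mean(), rhs.mean(), lhs.var(), rhs.var())    # means approx a/c, variances approx a/c^2
```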

References

Dufresne, D. (1996). On the stochastic equation L(X) = L[B(X + C)] and a property of gamma distributions. Bernoulli 2(3), 287-291.

Kotz, S. and Steutel, F.W. (1988). A note on the characterisation of exponential distributions. Statist. Probab. Letters 6, 201-203.

Lukacs, E. (1955). A characterization of the gamma distribution. Ann. Math. Statist. 26, 319-324.

Smith, A.F.M. and Roberts, G.O. (1993). Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods. J. R. Statist. Soc. B 55, 3-23.

Vervaat, W. (1979). On a stochastic difference equation and a representation of non-negative infinitely divisible random variables. Adv. Appl. Probab. 11, 750-783.