
STUDENT SOLUTIONS MANUAL

Probability, Statistics, and Random Processes for Electrical Engineering Third Edition

Prentice Hall: New York, Boston, San Francisco, London, Toronto, Sydney, Tokyo, Singapore, Madrid, Mexico City, Munich, Paris, Cape Town, Hong Kong, Montreal


The elements of S are the pairs: (1,1) (1,2) (1,3) (2,1) (2,2) (2,3) (3,1) (3,2) (3,3) (4,1) (4,2) (4,3) (5,1) (5,2) (5,3) (6,1) (6,2) (6,3). The elements of each event are indicated below.

Comparing the tables for A and C we see that A ∩ C = {(3,1), (4,2), (5,3), (6,4)}.

a) Each testing of a pen has two possible outcomes: "pen good" (g) or "pen bad" (b). The experiment consists of testing pens until a good pen is found.
b) We now simply record the number of pens tested, so S = {1, 2, 3, 4, 5}.
c) The outcome now consists of a substring of b's and one g in any order, followed by a final g.

Chapter 2: Basic Concepts of Probability Theory

If we sketch the events A and B we see that B = A ∪ C. We also see that the intervals corresponding to A and C have no points in common, so A ∩ C = ∅.

\\ c also sc-: that (r.

\I= (r. n)n(-oo. \I= (- 7" .r I

that is, C = Aᶜ ∩ B.

since .4isa then

since B is a subset of A.

(Figure: region of the student's wake and sleep times; the axes run from 6 am through noon to midnight.)

Note that the problem specifies that the student wakes up before returning to sleep. A ∩ B is found by taking the intersection of the regions in parts b) and c). We obtain the three triangular regions shown below, interpreted as follows: the student wakes up at or after noon; or the student wakes up after 9 am and goes back to sleep before noon; or the student goes to sleep before 7 …


b) We express each event as the union of elementary events and then apply Axiom III′:

We first find the elements in each event of interest and then apply Axiom III′:

PI . I v B I = P[ I = P[ ( I > I +PI P J I+ 1'1 I4l l + I 'I ] = t P[A r'l

where we used Corollary 1.

Identities of this type are shown by application of the axioms. We begin by treating (A ∪ B) as a single event, then apply Corollary 5 to A ∪ B, use distributivity, and finally apply Corollary 5 to (A ∩ C) ∪ (B ∩ C).

Each transmission is equivalent to tossing a fair coin. If the outcome is heads, then the transmission is successful. If it is tails, then another transmission is required. As in Example 2.11 the probability that j transmissions are required is:

= 1/6, since a multiple of both 2 and 3 is a multiple of 6.

11 - f[A ~ Bl = - ---- SII1CC

fJ> u (A n B)] anc.J

1 B~ C! (_!_)" = - since A n 2 64 =

Assume that the probability of any subinterval I of [−1, 2] is proportional to its length. If we let I = [−1, 2], then
a) P[A] = length([−1, 0))/3 = 1/3
P[A ∩ B] = P[∅] = 0
P[A ∩ C] = P[∅] = 0
b) P[A ∪ B] = P[[−1, 0) ∪ (0, 1)] = P[[−1, 0)] + P[(0, 1)] = 2/3
P[A ∪ C] = P[[−1, 0) ∪ (0, 2]] = 1/3 + 2/3 = 1
P[A ∪ B ∪ C] = P[[−1, 0) ∪ (0, 2]] = P[S] = 1
Now use the axioms and corollaries:
P[A ∪ B] = P[A] + P[B] − P[A ∩ B] = 1/3 + 1/3 − 0 = 2/3
P[A ∪ C] = P[A] + P[C] − P[A ∩ C] = 1/3 + 2/3 − 0 = 1

P[arbitrary sequence = correct sequence] = …
P[success in two tries] = 1 − P[failure in both tries]

The order in which the 4 toppings are selected does not matter, so this is sampling without ordering.
If toppings may not be repeated, Eq. (2.25) gives C(15, 4) = 1365 possible deluxe pizzas.
If toppings may be repeated, we have sampling with replacement and without ordering. The number of such arrangements is C(15 + 4 − 1, 4) = C(18, 4) = 3060 possible deluxe pizzas.
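A quick numeric check of these two counts (a minimal Octave sketch; the 15-topping pool is the value implied by the counts above):

ntop = 15; k = 4;                        % 15 available toppings, choose 4
no_repeat   = nchoosek(ntop, k)          % without replacement, without ordering: 1365
with_repeat = nchoosek(ntop + k - 1, k)  % with replacement, without ordering: 3060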

There are 3! permutations, of which only one corresponds to the correct order. Assuming equiprobable permutations:
P[correct order] = 1/3! = 1/6.


The number of ways of choosing the items gives the number of equiprobable outcomes in the sample space. We are interested in the outcomes in which m of the chosen items are defective and M − m are nondefective. Therefore
P = (number of ways of choosing m defective and M − m nondefective items) / (total number of outcomes).

Assuming the forwards and defensemen do not have assigned positions (that is, left/center/right for forwards and left/right for defensemen), then
number of teams = (number of forward combinations) × (number of defense combinations) × (number of goalie combinations) = 140.


If the forwards and defensemen have assigned positions, then the number of teams increases because there are several ways to assign the chosen players to positions.

From Problem 2.2 we have that A ⊃ B; therefore
P[A | B] = P[A ∩ B]/P[B] = P[B]/P[B] = 1.

P(hgJ = J•jhJll~ I h]=ixf = ~ P[hbv J = .lt. X.! X 1.I =. ~

P[hhhgj = .lxl xl.xl. = 152 I• S I 3 P[hhhbol ~

xl-xl.x. x I =_!_ ~ I .I I~

b) Pfl pen tested I= Pi g] 1'121 = P[hg]

@ a ) P[A] = Plhand rests in last 10 minutes ! P[A] = p\, +p\, +···+p(l()=. - t

P[ BJ= P\fJ + PH + P~ + Ps9 + p,~, P[BIA) = Pj.rl fi !J) =


tl =t (( + + (t f I + . + (t t

PI B I - t ((1 ) ~~ + ·· · + (t )~Q

a) The results follow directly from the definition of conditional probability:
P[A | B] = P[A ∩ B]/P[B].
If A ∩ B = ∅, then P[A ∩ B] = 0 by Corollary 3 and thus P[A | B] = 0.
If A ⊂ B, then A ∩ B = A and P[A | B] = P[A]/P[B] ≥ P[A].
If A ⊃ B, then A ∩ B = B and P[A | B] = P[B]/P[B] = 1.
b) If P[A | B] = P[A ∩ B]/P[B] > P[A], then multiplying both sides by P[B] we have P[A ∩ B] > P[A]P[B].
We then also have that P[B | A] = P[A ∩ B]/P[A] > P[B].
We conclude that if P[A | B] > P[A], then A and B tend to occur jointly.

a) We use conditional probability to solve this problem. Let Iₖ = {item found in kth test}. A lot is accepted if no defective item is found in tests 1 and 2.

items in mare defccti\cl = +2J = P[l'=2 IX = 2Jf>IX = 21 -- .ll. - l. ~~ -I

/)I X =2. > =ll = tt = t PIX =2.1' =01 =tt=t PIX =-2.Y =Ol =tt=t P[X - - 2, >'

PI X = - 2, >' = - 2] = tt = t c)

fll>" --t 2J =tt=f= P[Y= - 2]

r = 21 r = k 1= P[ Y = k I x = 2l Pl x - 21 P[J' = k]


Independence of Events

l' cnts A amJ lJ'

For two events we check whether Eq. (2.11) holds for each pair:

J-= PIPJJ -+= P(.l)PIC1 =1.1

P(/J n CI-1'11 =-1 =P[/JJI'[C'J -=t~ .,f

Therefore the pairs of events are independent. For three events, pairwise independence is not enough; we also need to check whether
P[A ∩ B ∩ C] = P[A]P[B]P[C].
Here P[A ∩ B ∩ C] ≠ P[A]P[B]P[C], which implies that the triplet of events is not independent.

A is the union of the mutually exclusive events A ∩ B and A ∩ Bᶜ, so
P[A ∩ Bᶜ] = P[A] − P[A ∩ B] = P[A] − P[A]P[B] = P[A](1 − P[B]) = P[A]P[Bᶜ]
by Corollary 1, since A and B are independent. Thus A and Bᶜ are independent. Finally, the same argument with A and Bᶜ interchanged shows that Aᶜ and Bᶜ are also independent.


We use a tree diagram to show the sequence of events. First we choose an urn, so A or Aᶜ occurs. We then select a ball, so B or Bᶜ occurs.
Now A and B are independent events if P[B | A] = P[B]. But
P[B] = P[B | A]P[A] + P[B | Aᶜ]P[Aᶜ].

P[k errors] = C(n, k) pₑᵏ (1 − pₑ)ⁿ⁻ᵏ

or P(n + 1) = P(n)P.
E(2) = E(1)P = E(0)PP = E(0)P².
In general E(n) = E(0)Pⁿ. To find Pⁿ, note that if P has eigenvalues λ₁, λ₂ and eigenvectors e₁, e₂, then P = EΛE⁻¹, where E has e₁ and e₂ as columns, and
Pⁿ = (EΛE⁻¹)(EΛE⁻¹)⋯(EΛE⁻¹) = EΛⁿE⁻¹.
Here P has eigenvalues λ₁ = 1 and λ₂ = …, with corresponding eigenvectors …
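A small Octave sketch of the diagonalization step, using a hypothetical 2x2 transition matrix P (the actual entries are not legible above):

P = [0.9 0.1; 0.2 0.8];     % assumed example transition matrix
[E, L] = eig(P);            % columns of E are eigenvectors, L = diag(lambda1, lambda2)
n = 10;
Pn = E * L^n / E;           % P^n = E * Lambda^n * E^{-1}
disp(Pn);
disp(P^n);                  % direct computation, for comparison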

b) The following Octave code computes the sample mean and variance of 1000 samples of the random variable in the above experiment:

a = ...;                                  % lower endpoint (value not recoverable)
b = 15;
Y = (b-a)*rand(1000,1) + a*ones(1000,1);
mean(Y)
var(Y)

The new mapping maps a pair of outcomes of S to each value in S₁, so

PI ) = -., J = J>~ c) X - 0 corrcsponuo., to II>: >

a) S = {0000, 0001, …, 1111}

c) Pu- 11a- fJ, . p , 'I~-. l.?

Chapter 3: Discrete Random Variables

bills are drawn without replacement:
Outcomes along the diagonal cannot occur because sampling is without replacement. All other outcomes have equal probability.

= ,~ = f since 72 outcomes gi'c .\ 2 P( r - 511 == ~ = -rli =+ since 18 outwme 81 - L "' = I~> = + I X IS

I rom Problem 3.7b: a)


Expected Value and Moments of Discrete Random Variable

V f•\ RI • \" J =~( ~)11 1 I

Σᵢ₌₁ᵏ [(i + 1)² − i²] = (k + 1)² − 1,
since the sum is telescoping.

P[l , = 0] =-f J'[JA= 11 E[J I J = 0 X 1-t I X 1.5 =1. J

b) P[N = 4] = C(8, 4)(0.25)⁴(0.75)⁴ = 0.0865, since the order in which the successes occur does not matter.

First suppose (n + 1)p is not an integer. Then
pₖ / pₖ₋₁ = ((n − k + 1)/k)(p/q),
which is greater than 1 as k increases from 0 to ⌊(n + 1)p⌋ and less than 1 for k > (n + 1)p. Therefore pₖ attains its maximum at k_MAX = ⌊(n + 1)p⌋. If (n + 1)p = k_MAX is an integer, then the above implies that pₖ attains the same maximum value at k_MAX and at k_MAX − 1.

N = # of error-free characters until the first error.
E[N] = Σₖ k(1 − p)ᵏ p = (1 − p)p Σₖ k(1 − p)ᵏ⁻¹ = (1 − p)/p.

0.99 = /'f\ > k,, ] = I 4] 0.9

I r II allain 0.9.

41 - 0.891. lhcrefon: I\\ 0 cmplo) ccs art! almost sullicicnl to

P[X = 0] = e⁻²·⁵ = 0.0821

Use Octave to plot the pmf:  a = 0.1;

Binomial vs. Poisson comparison (α = np = mean of the Poisson RV):

n = 10, p = 0.1 (np = 1):   k = 1: binomial 0.3874, Poisson 0.3679;  k = 2: binomial 0.1937, Poisson 0.1839
n = 100, p = 0.01 (np = 1): k = 1: binomial 0.3697, Poisson 0.3679;  k = 2: binomial 0.1849, Poisson 0.1839

We see that for np = constant, as n increases and p decreases, the accuracy of the approximation improves.
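The table entries can be regenerated with a short Octave sketch (the binomial pmf is written out with nchoosek so no extra package is needed):

for np = [10 0.1; 100 0.01]'                      % each column is [n; p], with n*p = 1
  n = np(1); p = np(2);
  for k = [1 2]
    pb = nchoosek(n, k) * p^k * (1-p)^(n-k);      % binomial pmf
    pp = exp(-n*p) * (n*p)^k / factorial(k);      % Poisson pmf with alpha = n*p
    printf("n=%3d p=%.2f k=%d  binomial=%.4f  Poisson=%.4f\n", n, p, k, pb, pp);
  end
end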


~ ~ \' lllll'f'orm .Ill 1 f

/~ IXI =-2 E(t r J-4= ~/ 1 -4=0.5

h) l~p l-1·1-2.\ 1 +3 1= -2E[X 1 j+3

=41. 1XJ]- 12£(X 2 J+ 9 -(-8) '

VARI J' l= 4(-lf ) - 12( '~') +9-64 = I05

~ 990] = 1- c. 1 = 1 -~NO + O.S?72.J =0.00 134 c11

In I 000 +0.5772 1


We need to calculate Fₖ, the fraction of the population with wealth less than k, and Wₖ, the proportion of wealth …

I·~= /)IX ~ AI- cip'- 1 = 1-

Chapter 4: One Random Variable

4.1 The Cumulative Distribution Function

X and Y are discrete random variables:

2 3 ..J 5 6 0 I I 2 2 3 I I 2 2 3 3

I I I I I I - 1 0 I 2 3 4

=1- (t -+e-:!' 10, ) =5. 15x l0 10

The Probability Density Function

We find c through the normalization condition:

.\ b a continuous random variable. FIX

f!.r dr = .l.i.l2 = . [4..l..J= . 1. :! ! t l

b) To find the probability that X = 0, we need to integrate the delta function:

x nr\! as shn\\ n he low:

(J') - lR £5(·1' -l 29) + 1.8 o(·I'+ I S) +l.b'( A .I'+ 5) + lo( M ·I' - I) + .lrY( H . I' - 3 )

The conditional cdf given B is obtained using Eq. (4.23):

I IJ)- - - Pr .r > o.25J

.\ n 1'1' ~ n ) J"'l'olll t Il P. 11e fi mtw11 11 uy ,_. . b. F·' r.r Io . X ~ J =

The second moment is obtained in a similar manner.
The variance is then VAR[X] = E[X²] − E[X]² = …


mean of the exponential RV:
E[X] = ∫ x f_X(x) dx = …
For the second moment of the exponential RV, see Problem 4.48 for a solution of this integral.
Using Eq. (4.28), we have:

~IT' dr = I I'!e ,: J2;u &a ,·

where we used integration by parts.
Consider the latter term:
∫₀^∞ x/(π(1 + x²)) dx = (1/2π) ln(1 + x²) |₀^∞ = ∞.
Thus the integrals do not exist and E[X] does not exist.


We break the integral into three parts:
E[Y] = −a F_X(−a) + ∫ x f_X(x) dx + a(1 − F_X(a))
E[Y²] = a² F_X(−a) + ∫ x² f_X(x) dx + a²(1 − F_X(a))

Fll' l =- (l)fl[l'5- I] +( I)P[J '---v----' le I

e ( r~ + 2x + 2>1: from .t\ppcndi\ B

We write the general expression for the expected value in three terms:
E[Y] = −b P[X ≤ −b] + b P[X ≥ b] + 0 × P[−b ≤ X ≤ b]


Similarly, the second moment is:
E[Y²] = b² P[X ≤ −b] + b² P[X ≥ b] + …

1''-4! l , (x)tl\' + jr, _K_ , (b-tl) • (" 11) ( \ + a)' •( , (r)dr

b) II' Xi~ a Laplacian random variable. then - oo ' I --2PI X ~-2 1+ 2P(X ~ 2l+ f 2(xt l)e'clrt J' 2(r l)e-'c/, . 0 ~ ~

1- 4 x t e-1 + 4 x te 2 + 4 f 1 (x + I)2 e' ch: ~ 4

that is, qₖ is the midpoint of each interval. Therefore
E[(X − q(X))²] = Σₖ ∫ (x − (k + ½)d)² f_X(x) dx = d²/12   (from part b),
since there are M terms in the sum, each contributing d²/12 times the probability 1/M of its interval, and since Md = 2x_max. This is the same result that was obtained in Example 4.20.
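A brief simulation check of the d²/12 result (an Octave sketch; the input range and the number of levels M = 8 are assumed for illustration):

xmax = 4; M = 8; d = 2*xmax/M;       % assumed example parameters
x = xmax*(2*rand(1, 1e5) - 1);       % uniform input on [-xmax, xmax]
q = (floor(x/d) + 0.5)*d;            % midpoint (uniform) quantizer
err = x - q;
printf("simulated error variance = %.4f, d^2/12 = %.4f\n", var(err), d^2/12);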


Important Continuous Random Variables

a) P[X ≤ d] = …
P[kd ≤ X ≤ (k + 1)d] = F_X((k + 1)d) − F_X(kd)

b) Finu XA. k - I. 2. 3. 4 such that

@ a ) We start b> linding the conditional c .f. I

'I here fore decide ..0.. if

d) I he o\crall probnbility of error i'i then:

Nntc that the \aluc oL\'in the intcnal (0. o) are mapped onto L = 0 :


(:) = (I - e- y. )c5(:) + .~ F . (: + cr) = (1 - e-~ )15(:)+ fz 'wi ll have thi'i massa ty =- 4(0) +2=2.

y = g(x) = −4x + 2
First we find the cdf of Y:
F_Y(y) = P[−4X + 2 ≤ y] = P[−4X ≤ y − 2]

Therefore Pb O. c) If.f,(x) is an even function ofx. thcn.f\(x) /I( x) anu thus /)lr)

P[Y ≤ y] = 0 for y ≤ 0; for y > 0, P[Y ≤ y] = P[eˣ ≤ y] = P[X ≤ ln y].

b) If \ is a Gaussian random variable. then y50


The Markov and Chebyshev Inequalities

a) For a uniform random variable in the given interval, compare the exact probability P[|X − m| > c] with the Chebyshev bound VAR[X]/c².
b) For the Laplacian random variable, E[X] = 0 and VAR[X] = 2/α².
Exact: P[|X − m| > c] = P[|X| > c] = e^{−αc}.   Chebyshev bound: 2/(α²c²).
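A short Octave comparison of the exact Laplacian tail with the Chebyshev bound (a sketch with an assumed α = 1):

alpha = 1;
c = 1:5;
exact = exp(-alpha*c);              % P[|X| > c] for the Laplacian
bound = (2/alpha^2) ./ c.^2;        % Chebyshev bound VAR[X]/c^2
disp([c; exact; bound]);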


Hl .\' l =1 d¢, ( w) I ·' . o J c:I II'

-h--1'-h [-J.b2 +J. b2] 2 2 - t (h - h) = O E[ \' 2] = 1 - ct'¢\(w)l ·2

I [ I 'h' I . ' ] j(h-a) --o +-ua " "

= f 1: ,= n(p: + lJ) ' ' !' I_

VAR[N] = E[N²] − (np)² = n²p² − np² + np − (np)² = np(1 − p)

probabilit> p \\ith probability 1- f1

~ Each C(lmponent has reliability:

fJ[ v I] • \ J lli'CCJif.C•

\'I and X is a 7ero-mean, unit-va riance rnndo111 variable.


a) H_X = log 6   b) H_{X|A} = log 3,  H_X − H_{X|A} = log 6 − log 3 = log 2

b) X= 4, 1 = 3 or 5, Hr

1 nuifo11n H\. in [-a,a], .f-x. xk) = 2c

-log tJ. -log (Jdx)) 1 -log 6 - lo110 -

-log 6 -lvl! (fAiil(~r)) I -log 1. -lug -

1 1 == log -(I - lo''0 -'211

The difference of the differential entropies is log(a − (−a)) − log(a − 0) = log 2.


Codeworu 0 3/8 10 1/8 110 1/ 16 UlO lj;32 11110 1/32 11111

FJX I= I xCa+ 2xCa~ +3xCa' +4 xCa 1

= 0.2 8.[~ = 0. 18. P. = 0.12

Chapter 5: Pairs of Random Variables

5.1 Two Random Variables

a) The underlying sample space S consists of the pairs of outcomes of Carlos's and Michael's experiments. The random pair (X, Y) is generated from S by the mapping shown below:

b) The probabilities for (X, Y) are determined from the probabilities of the equivalent events in S. For example, the equivalent event for (X = 0, Y = 1) is the subset

J>f X= O.r = OJ= P[ ) = t, P[X = O.Y = lj = P( IO LI OJ) =+t+tf =+ P[ .\'=OJ' = 2 J = P[ . - ') ] - J>[ I 1 2 l ,l]-- 1..U.·1 + .l.l - .! J

Not product form

re'l = P' O>v< max(X.>') 1 below hy identifying tht.: probability mass at the valut.:s (x,, y 1) in th ~: plane. We show the associntcd marginal pm r.-; of \' nnd Y along the

corresponding margino;;. u) J'

Sh iflto higher \aluc ll I.\

P[X ≥ Y] = P[(X, Y) ∈ {(−1, −1), (0, −1), (1, −1), (0, 0), (1, 0), (1, 1)}] = …


The Joint pdf of X and Y

a) The vector (X, Y) assumes values in the triangular region

The joint cdf is evaluated separately in the four regions indicated below. In each instance we consider an arbitrary point (x, y) in a region and calculate the probability that {X ≤ x, Y ≤ y}. The limiting value of F_XY(x, y) as x increases is attained at F_XY(1, y) (see Region 3 above); similarly, F_XY(x, y) approaches F_XY(x, x) as y increases (Region 2).

P[X ≤ x] = F_XY(x, ∞) = x²
P[Y ≤ y] = F_XY(∞, y) = 2y − y²

"L\' s-t . r :5~]= (!) =t since(+-~) is in Region 2 n[l.I l.y > l

F_X(x) = lim_{y→∞} F_XY(x, y) = 1 for all x.
F_X(x) cannot be equal to 1 for all x; therefore it is not a valid cdf.


@ a ) Fur 0 . ru !: xu we integrate along the strip indicated below.

J'or 0 O.y > 0 F (x. r) = \)

J'J'( (.\". r' )dY'dl'' = J'J.e-.r' ~c/r' J2 ,·, 1

b) We find the probability of an event involving X and Y by integrating the joint pdf over the region that corresponds to the event. In the case below, for each value of x we integrate the joint pdf over y from minus infinity to x.

F_X(x) = lim_{y→∞} F_XY(x, y) = 1 − e^{−x}
by property (iii) of the joint cdf, and so f_X(x) = d/dx F_X(x).
Alternatively, we could have integrated the joint pdf using Eq. (5.17a,b).


∫∫ e^{−(x² + y²)/2σ²} dx dy = ∫₀^{2π} ∫₀^∞ e^{−r²/2σ²} r dr dθ,
where we let x = r cos θ and y = r sin θ.

Independence of Two Random Variables

The table below shows the probabilities for the pairs of outcomes.

Outcome ol too;s

I ull pair., Remainder

the joint pmfand associated rnmginnl

⇒ X and Y are not independent, since
p_XY(0, 0) ≠ p_X(0) p_Y(0).

- II P[ \ ()I PI \' = 31 fll .\ - II -+.= fll X = 21 Pj)

@a) P(a t ] = I - 2 f cl\ ' · ·~ = I=

complement of ,-.hile region


5.6 Joint Moments and Expected Value of a Function of Two Random Variables

The expected value of a sum of random variables is the sum of the expected values of the individual random variables.
a) E[(X + Y)²] = E[X²] + 2E[XY] + E[Y²]
b) E[(X + Y)²] − E[X + Y]² = E[X²] + 2E[XY] + E[Y²] − E[X]² − 2E[X]E[Y] − E[Y]²
= VAR[X] + VAR[Y] + 2(E[XY] − E[X]E[Y])
= VAR[X] + VAR[Y] if E[XY] = E[X]E[Y], that is, if X and Y are uncorrelated.

i) E[X] = (−1)(1/3) + (0)(1/3) + (1)(1/3) = 0.  E[Y] = 0 (Y has the same pmf as X).
E[XY] = (−1)(−1)(…) + (−1)(1)(…) = 0 ⇒ X and Y are orthogonal.
COV(X, Y) = E[XY] − E[X]E[Y] = 0 ⇒ X and Y are uncorrelated.
However, X and Y are not independent since P[X = i, Y = j] ≠ P[X = i]P[Y = j].
ii) E[X] = E[Y] = 0 as before.
E[XY] = (1)(1)(1/4) + (−1)(1)(1/4) + (1)(−1)(1/4) + (−1)(−1)(1/4) = 0
⇒ X and Y are uncorrelated and orthogonal.
Furthermore, X and Y are independent since P[X = i, Y = j] = P[X = i]P[Y = j].
iii) E[X] = E[Y] = 0.
E[XY] = (−1)(−1)(…) + (1)(…) ≠ 0 ⇒ X and Y are not uncorrelated and not orthogonal.
X and Y are not independent since P[X = i, Y = j] ≠ P[X = i]P[Y = j].


NO I E: Probkm 5.67 refers to Problem 5.28.

= ~ = r[l] 2 2x (1- x)dx = ~ x)d:r

V AU[ X]=~- ( ~) = G 3. 13 t. \' > j = 2rydyd-.c: = 2

= 0 orthogonal & unconelated

I 2 1' x( l - '.l') tl = Y, r.

not orthogonal & uncorrclatcd

If we view this as a quadratic equation in t, then the equation is nonnegative and therefore has at most one (double) real root. Therefore the discriminant is nonpositive:
4E[XY]² − 4E[X²]E[Y²] ≤ 0
E[XY]² ≤ E[X²]E[Y²]
|E[XY]| ≤ √(E[X²]E[Y²])


Conditional Probability and Conditional Expectation

The conditional pmfs take a column or row of probabilities from the joint pmf and renormalize it to have unit mass.
a)

d) I he conditional C\pccted value is obtained 'aluc is obtained from l'q (5.51 b):

r 4· (5.49b). and the C\pectetl

l [X IY- 2 j - Ox f+ I x-:!-+ 2x-1-= ~

. f!. .2. - j_ f l:! X Jl.. II• ~ 7 X lb - 8

El >' I X =21 - 2 'f ) _ 1 S ') I _ r. 1- 1 xu, rzXv,+-Xu,-"ji;' ~

a) The conditional pdf is found using Eq. (5.45):
i) −1 ≤ x ≤ 1,  −(1 − |x|) ≤ y ≤ 1 − |x|

iii) .f, (I' I X)=-( 2 1-J:)

2 1(1 -x) r ' ,~., nX 2 ( l - X) dt = J( I - X c/r

= f( I - 2x + x 2 >i" II

of Binomial Coefficient

N i!' called tlw gcn,•t ali zed Binomial RV.

fo-x rfn(r d,·- £[UJ- I

fooo f[N2 jr)fn(r)dr = f t)f[X1 >t] =e-"'e_"', P ( I t l f'>t 0 ]=P[T>t+l, T>t,] _ P(rT > I+t ,>r"''(T> t0 J]

I 00. Xl be a Raylei gh

lu is the aduitional time.


> 11 ] is given abme. "here I is the total time.

= Σₙ P[N = n] P[K = m − n].
Therefore the pmf of a sum of independent discrete random variables is the convolution of the pmfs of the individual random variables.
b) We evaluate the summation from part a) where M and N are binomial:

-t("]p'(lk )p"' '( I _1l'(l -piw"'i:(n)( k ) . I

Therefore the sum of independent binomial random variables is also binomial.
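The convolution result can be checked numerically in Octave (a sketch with hypothetical parameters n = 5, m = 7, p = 0.3):

p = 0.3; n = 5; m = 7;                                          % assumed example parameters
pmf1 = arrayfun(@(k) nchoosek(n, k)*p^k*(1-p)^(n-k), 0:n);
pmf2 = arrayfun(@(k) nchoosek(m, k)*p^k*(1-p)^(m-k), 0:m);
pmf_sum = conv(pmf1, pmf2);                                     % pmf of N + M on 0..n+m
pmf_bin = arrayfun(@(k) nchoosek(n+m, k)*p^k*(1-p)^(n+m-k), 0:(n+m));
max(abs(pmf_sum - pmf_bin))                                     % should be essentially zero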


Σₖ (a₁ᵏ/k!) e^{−a₁} (a₂^{m−k}/(m − k)!) e^{−a₂} = ((a₁ + a₂)^m / m!) e^{−(a₁ + a₂)}.
Therefore the sum of independent Poisson random variables is also Poisson.

f (:> = . /·~ (:) = l - In = -~ O s .:~l

leigh random \ariables "ith a. = fl = I. /


J Ix xe_. -' f = Iv •ve_. ·'

' I ( II' + z II' : . .fuz (ll', : ) = 2_/n -2- ,-2-

. hi1..· h implies that 2


I (.r.- m 1 ) = -;-px a1 2 o

c) ' l'h(• riots iu pa.rlli a) a.nd b) are tlw s;unt' uuly wlw11 f1 =I. lu this

= E.(£[X'2Y p·J] = £(Y2 f.j.\ 2 IF]] 2

… between inspections is the sum of the M interarrival times,
where the Xᵢ are iid exponential random variables with mean 1.

f_T(t) = Σⱼ f_T(t | M = j) P[M = j].
The sum of j independent exponential random variables is Erlang:
f_T(t | M = j) = λ e^{−λt} (λt)^{j−1} / (j − 1)!.
Therefore
f_T(t) = Σⱼ λ e^{−λt} ((λt(1 − p))^{j−1} / (j − 1)!) p = λp e^{−λt} e^{λt(1−p)} = λp e^{−λpt}
∴ T is an exponential random variable.

c) Choose p so that
0.90 = P[T > t] = e^{−λpt} ⇒ …
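A brief Octave simulation of this result (assumed values λ = 1 and p = 0.2): sum a geometric number of exponential interarrival times and compare the sample mean with 1/(λp).

lambda = 1; p = 0.2;                        % assumed example parameters
nsamp = 1e4;
T = zeros(1, nsamp);
for i = 1:nsamp
  M = 1 + floor(log(rand)/log(1-p));        % geometric RV on {1, 2, ...}
  T(i) = sum(-log(rand(1, M))/lambda);      % sum of M exponential(lambda) interarrivals
end
printf("sample mean = %.3f, predicted 1/(lambda*p) = %.3f\n", mean(T), 1/(lambda*p));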

Chapter 6: Vector Random Variables

6.1 Vector Random Variables
a)

l he prohnhilit) is giH:n b) the volume of the !'>phcn::

l, (X) =t (X + y + t) CO' = t [.\)·I:, +

/x 1 lXl>fx:(·c2lx,)Jx,(xslxhx:z) 1·-

Jo XtZ2 r~ J.r.~. = ' o )~ du• 1 - 2 J-

'L hat is Z is uuif. di;,t . in

Expected Values of Vector Random Variables

/·[MJ =thl \'.1 +!EIX2J=1f+t->= l N Ull!llUl I . \ ,1 - \I

=tE[t(xl-x2 ) ] =f( !.l.\'1 )-2£[ .\ 1 ]EIXl]+£fX; l] =

l[. t.a _.: '-) • + ..l..J = 1. = 1.. j_j_ ..1

r~[1(.\ , +X2)t( XI -xl f]

= rl; £ [ ( .\',' - X J) (X 1 -

- t /·I,\ 1' 1-t £IX, JE[XJl ,

To find EIX' l for an c\poncntial mndom \ariahlc. in E:\amplc 4.43.

~ Fl \ ' ] = ;, Ll \II l=tf--t=t.

lhc thanu.:h:rbtic function


a) Using the pmrs found in Problem 6.5a: FJ \I = 3xt= I

c.tlculute the joint moments:

ur t E[ .\'7. J t+ t+ F[ L/.1 =1 1 *r = E[., , 1= 3 x 1x

f~l X 2 I = 12 X t + 22 X t + 3, X ( t ) = ~: 1

~p 2 I=I ' X ( t)' =211

£r z 1=1, x -rr = ~~ 2

Finall> the covariance matrix is: 27 1- l xJ. _ I

a) The linear transformation giving Z = (U, V, W)ᵀ in terms of X = (X₁, X₂, X₃)ᵀ is:
U = X₁,  V = X₁ + X₂,  W = X₁ + X₂ + X₃


+ 2wl).Pl'(tVJ + W:~)lu,ct1w,t~c/,•u - d cPu((a + 2b)w)~~·((a + b)tt•) 2 - e'u.-c exp [- ~(aw + 2bw) ] cxp [-~(aw + bw)~] 11

1 exp [ 2 2 ) /2•(2a' + fiab + :'ib


(v- c 2 + Gab I 5b2 )

find the cigcmeclors:

e1 -1le -1. e ·1 2 - I I

te, t-fel =0 e1 - If, IJ '

In the last step. \\C normalized the eigenvccto~ so they have norm I . h)

P io.; gi\-cn b)' the matrix \\

I [-1~I] d) A = [~ ~] - .J2 A=I'' -J2I [ I ~I] K, = A =[~ ~] P--


@>a) A. [~ ~] M o ~] I 0 [Y. o M

we necu to co cu ate: .I

= e Jlll/11 - - J.' .,IA·II . 1'11c terms •111 t '1e expo nent are:

-l(w . I --~21 -l[J.ll'~ - U'1 II' 2 +! •2 . !ttl


Estimation of Random Variables

From Problem 5.61 we have the terms needed to evaluate Eq. (6.55): COV[X, Y] = …

)' = Pn -+[x ty+ !] f(x) +[x + l]

a) I he moments ore then given b;:

+ . ) _ 1. 2.. _. ;I \' 2 1-- 11 j(' , .2 ( ,. + I ) (/,· -1.[.! • ' l -1 l - l 12 - II 1

tY = .~- i: = o:~~ = .~ =v \Rf I 1:[.\T I =t f. f xy( x+ I +t)dr =t[f. f. (x y t-.\/ >

= jr '" [ "~ + J. ++.q., ~ + x . +. 2 +] _ =1. [ +J. + ~ ~ = J1 I


The optimum linear estimator for X given the observations is X̂ = (a₁, a₂)([y; z] − [m_Y; m_Z]) + m_X
= −0.04(x + y) + 1.08(…),
where from Eq. (6.63a)
[a₁; a₂] = [VAR[Y]  COV[Y, Z]; COV[Y, Z]  VAR[Z]]⁻¹ [COV[X, Y]; COV[X, Z]].

Generating Correlated Vector Random Variables

|K − λI| = λ² − 4λ + 2 = 0  ⇒  λ₁, λ₂ = 2 ± √2.
The unnormalized eigenvectors are:

.80l01 I.16:H2 1.9!101 -.·1 '190

Check \ l+- [ .80101 L16342 l.9ll07 -.48190

.8010 1 J.tl:H07 ] = [ ~ 1 ] l.lG3 J2 -. 18190 I 4
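To generate vectors with a prescribed covariance K in practice, one can factor K and transform iid unit-variance samples; a minimal Octave sketch with an assumed K:

K = [4 1; 1 2];               % assumed target covariance matrix
[E, L] = eig(K);
A = E*sqrt(L);                % K = A*A'
X = randn(2, 5000);           % iid zero-mean, unit-variance components
Y = A*X;                      % correlated samples
cov(Y')                       % sample covariance; should be close to K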


E[Yₖ Yₖ₊₁] = ¼ E[(Xₖ + Xₖ₋₁)(Xₖ₊₁ + Xₖ)].
Since the Xₖ's are independent, the cross terms above are all zero except the one whose indices coincide.

b) The following Octave code generates a sequence of 1000 samples:
X = normal_rnd(0, 1, 1, 1000);
Y = (X + [0 X(1:length(X)-1)])./2;

a) Consider n causa l trnnslormation matri~:

J[ X - y;][a [ah O (' .X 0 -~

a OJ [ Xa y; h - ~ c: J [ I OJ [ h c -Yz a -Yz h +~ c = 0 2

Match terms lnr the entries in the matrices in the kli-haml !"ide to th1.' ctlrrcsponding entric-. un the right-hand side:


~ah -tac =0 . > 3b = c

=2 j h2 -h(Jh )+1(91J2 ) = 2

0 J and [.ffi JX..JX

Problems Requiring Cumulative Knowledge

a) If \\e assume the signals are zero mean. then th\! cornpon~.:nt~ of~ correspond to the jointI) Gaussian random variables in [~. 5.1 H "hich arc transferred into an

independent pair 1 b) the inner transformation given in I '· 5.45:

Consider how two consecuti ve bloc "-s Kt and jl arc lranslimned into It and Y-,:

"hich C\pnndcd gi' es: >" I


I he em ariuncc mntri\ for 1: is: (j

a£[XJ + b£[1''] = JE[Xl - 7 HI r I

-t~(~·> (ci>:~ ( o~ )a 1 d lx..•) -'- 2 :\' (aw )atJ1 ~ ( &w )b + ~ x t cu.: ) '< (b. )b:l]t.=O

b2 £[Y 2 J 1- 2•7b£(.\")£(> J £(Z2 ) - £(Zf = a 2 \ 'AR[X) + b2£(> ] VA R( rj = 91 ' t R( .\ I + -t91: IR[>


#) Note first t1mt

since E[X1 .X 1 ) = £( x:z] if i - j and E[XaXJ] = t:[ ') 2 ir i "f:. J. 'I hus

t(XE)\ 2 ] , .S \'- 1)f(.\1 1 = E. YJE.(.•.'\ :!] + t( V7 )£[X" f( ~'jf[ -

t!S' 2 1 E[S> 2 f[N]t"f r 2 ] + £[N 2 )£[X~]

b) HN>t note that

Chapter 7: Sums of Random Variables

The Sample Mean and the Laws of Large Numbers

@ F o r n - 16. Fq. (7.20) gives

. Oj [l \111, - OI [l \1,,1 20J

= 1 − P[(S₁₀₀ − 15)/√12.75 > (20 − 15)/√12.75] ≈ 1 − Q(1.4) = 0.92
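The same number can be computed in Octave with the Q-function written in terms of erfc (a sketch; the parameters n = 100, p = 0.15 are those implied by the mean 15 and variance 12.75 above):

Q = @(x) 0.5*erfc(x/sqrt(2));
n = 100; p = 0.15;
m = n*p; v = n*p*(1-p);
approx = 1 - Q((20 - m)/sqrt(v))       % about 1 - Q(1.4) = 0.92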

The number of faulty pens over the given number of weeks is a Poisson random variable with mean 15. According to Problem 7.34 we have, for a > 15,
By trial and error we find a = 28, so the student …

(N(f) ~ nl=>t ~s,,I)= X,+Xl+· · · l ~

a) Thcreture the lon. ;·Lerm

rep lat·ement ratP

0 X, dt' = I' t't"' =2x~

., he figure bcJm-. ShO\\S the relation beh\CCn u(l) and the (' ' s. 1

c) I rum the abO\ c figure:

'"' r r au'>dt' = lim-1I 2: j, J,

I '. = Ilim ' CI . . I~ t=l

d) For the residual life in a cyc le

-:>same cost as f(1r age of a cycle.


Calculating Distributions Using the Discrete Fourier Transform

The following Octave code produces the required FFTs:

N = 8;
P = 1/2;
n = [0:N-1];
cms = fft(binomial_pdf(n, N, P), 16);
% You can also evaluate the characteristic function directly:
% w = 2.*pi.*n./N;
% cms = (1 - P + P.*e.^(j.*w)).^N;
pmf = ifft(cms.*cms);
figure; stem([1:16], pmf);

The following Octave code produces the FFTs needed to obtain the pdfs:

function fx = ift(phix, n, N)
  phixs = [phix((N/2+1):N) phix(1:(N/2))];
  fxs = fft(phixs)./(2.*pi);
  fx = fftshift(fxs);
end

N = 512;
n = [-(N/2):(N/2-1)];
d = 2.*pi.*n./N;
alphaX = 1; alphaY = 2;
phiX = 1./(1 + alphaX.^2.*n.^2);
phiY = 1./(1 + alphaY.^2.*n.^2);
phiZ = phiX.*phiY;
pdf = ift(phiZ, n, N);
figure; plot(d, pdf, "b");
hold on;
plot(d, ift(phiX, n, N), "g");
plot(d, ift(phiY, n, N), "r");


E[S,.J - 1l >I l AHI 8]=P(X1 > 8] P[X, > R] . P( \',,

=Q( 8 1°f Q( - l f = .02 112 c) P( ma~(.r, . X., ) c =¥, 11

m1 = G_N'(1) = np
G_N''(1) = E[N(N − 1)] = E[N² − N] = m2 − m1
⇒ m2 = G_N''(1) + m1 = n(n − 1)p² + np
G_N'''(1) = E[N(N − 1)(N − 2)] = m3 − 3m2 + 2m1
⇒ m3 = G_N'''(1) + 3m2 − 2m1 = n(n − 1)(n − 2)p³ + 3n(n − 1)p² + np
G_N''''(1) = E[N(N − 1)(N − 2)(N − 3)] = m4 − 6m3 + 11m2 − 6m1
⇒ m4 = n(n − 1)(n − 2)(n − 3)p⁴ + 6n(n − 1)(n − 2)p³ + 7n(n − 1)p² + np
where we used the fact that:
G_N'(1) = np,  G_N''(1) = n(n − 1)p²,  G_N'''(1) = n(n − 1)(n − 2)p³,  G_N''''(1) = n(n − 1)(n − 2)(n − 3)p⁴

We can now proceed with the calculation:
(1/n²) E[K²] = (1/n²)(n(n − 1)p² + np) = (1 − 1/n)p² + p/n
(1/n³) E[K³] = (1/n³)(n(n − 1)(n − 2)p³ + 3n(n − 1)p² + np)


(1/n⁴) E[K⁴] = (1/n⁴)(n(n − 1)(n − 2)(n − 3)p⁴ + …)
This gives the desired covariance matrix.

You can use the following Octave commands:

x = normal_rnd(0, 1, 2, 2000);
y = A*x;
plot(y(1,:), y(2,:), "+");
cxy1 = y(1,:).*y(2,:);
z = reshape(cxy1, 20, 100);
hist(mean(z));

% unknown means and variances
for i = 1:100
  mx(i) = mean(y(1, i:i+20));
  my(i) = mean(y(2, i:i+20));
  xy(i) = sum(y(1, i:i+20).*y(2, i:i+20));   % sum of products over the window
  cxy2(i) = (xy(i) - 20*mx(i)*my(i))/19;
end
plot(cxy2);
mean(cxy2')

Chapter 8: Statistics

Maximum Likelihood Estimation

ln f(x₁, …, xₙ | θ) = −n ln θ − (1/θ) Σⱼ xⱼ
0 = (d/dθ) ln f(x₁, …, xₙ | θ) = −n/θ + (1/θ²) Σⱼ xⱼ  ⇒  θ̂ = (1/n) Σⱼ Xⱼ

b) By the invariance property: …
Try the direct approach anyway:
0 = (d/dλ) ln f(x₁, …, xₙ | λ) = (d/dλ)[n ln λ − λ Σⱼ xⱼ] = …

θ̂ = (1/n) Σᵢ Xᵢ is a scaled version of an n-Erlang random variable,
with pdf f_Y(y) for y > 0, where f_Y is n-Erlang.
θ̂ is unbiased and consistent because it …
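A quick Octave check of this estimator (a sketch with an assumed true mean θ = 2):

theta = 2;                            % assumed true parameter
for n = [10 100 1000]
  x = -theta*log(rand(1, n));         % n exponential samples with mean theta
  printf("n = %4d   theta_hat = %.3f\n", n, mean(x));
end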


d ~ =-LJ - Ina +(a- l)ln .r, ) cia •~I

_Q_In/( X I p) tip

∂²/∂p² ln f(X | p) = −k/p² − (n − k)/(1 − p)²


E[−∂²/∂p² ln f(X | p)] = E[k]/p² + (n − E[k])/(1 − p)² = n/p + n/(1 − p) = n/(p(1 − p))
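A small Octave check of this result (a sketch with assumed n = 50, p = 0.3): the sample variance of the ML estimate p̂ = k/n should be close to p(1 − p)/n, the reciprocal of the quantity above.

n = 50; p = 0.3;                          % assumed example parameters
nrep = 2e4;
k = sum(rand(n, nrep) < p);               % nrep binomial(n, p) samples
phat = k/n;                               % ML estimates
printf("var(phat) = %.5f, p*(1-p)/n = %.5f\n", var(phat), p*(1-p)/n);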

The ith measurement is X, = m + N, where hi N, j - 0 and V1\ Rl NJ] sa mple mea n is Af10u = I00 and the variance is a = JIO. Eq. (~.52) with =. = 1.96 gives

1 ,100+ 1.~ )= L,x = 256

The experiment involves n measurements of a Poisson random variable. We take as the estimator the sum of the total number of orders,
X = Σᵢ Nᵢ (equivalent to taking the sample mean).

… ≥ 30.847 ⇒ Reject H₀.
Assume that X̄ₙ is used in the test.

is Gaussian with m = 8,a.?

N~..·~ man-Pearson Criterion:

II, I (T-q)~ ' I I ( SJ > ' +-,-I In .:o. t-,-InA(x) =- .1l 1n . ---,, l Y. r. fl


f J21!II n (' IT-SfY l

I ._____,___. 2 1!(1

/~,;;;:: 0.99 = f>l y > t" j/11] = Q( t" -~) ll vn ._____,___. 1 ''(•

then t_α = 8 + 2.326/√22 = 8.4959.

H₀: X is Gaussian with m = 0, σ² = …
H₁: X is Gaussian with m ≠ 0, σ² = …

a) Proceeding as in Example 8.28:
α = 0.10 = P[|X̄ₙ| > c | H₀] = 2Q(c√n) ⇒ c = …
b) P[Type II error] = P[|X̄ₙ| ≤ c | H₁]; this can be plotted with:

plot(mu, 1 .- (-normal_cdf(-1.6449 .- 8*mu) + normal_cdf(1.6449 .- 8*mu)))

H₀: X is Gaussian with m = μ, σ² known
H₁: X is Gaussian with m > μ, σ² known
These are composite hypotheses.

l sc the folltm ing decision regions Reject

\J~lh:: p' y Reject Ho if :r, y

Reject lfu if a,~ l!

Use the following Octave code:

sig2 = [0:0.1:4];
plot(sig2, chisquare_cdf(63*2.5./sig2, 63));

Bayesian Decision Methods 1111 : // 1:

X i\ exponential '" ith m =5.1 - f'o = ;;,

=5 l ol '> hOlt lite so ld as long

ln(l- a) / a _ 1.3861 - (8.2) k

"· i11l pc I error] =

~( 8 )( 1-1 0 PII ) pc II error] = L/' •

~x i5+(4 ) ,o-' x. ~ ( ) 1o ~'x 51

Student Solutions Manual if lg(.~) -81 > ()

Ma,imit.: thi~. then minimit.: thi'

Thi'> implies that pm/C!riori estimate.

selects (> so that./(0

Number of degrees of freedom = 9; 10% significance level ⇒ threshold ≈ 14.7

9.01  9.43  10.74  7.81  77.41  0.27  9.43  9.93  83.26

Number of degrees of freedom = 9; 10% significance level ⇒ threshold ≈ 14.7
Since 21.7 ≥ 14.7 ⇒ Reject H₀.
b) The hypothesis is that the digits are uniformly distributed in {0, 1, …, 9}.

Chapter 9: Random Processes

9.1 & 9.2 Definition and Specification of a Stochastic Process

\\'e flucl the prohnhili ie" of the he cquh·aJ,•nt e\'ents of;:

X,= J> in tt'rms of the

=~ = P [~4 f2I .. 4:1] = :tI

P[X1 = 1. X2- 1] P[X1

. \., X 2 independent RV's

outside the inlerval [0,1]:

P[. Ytf) =OJ= 1 fort ¢ (0, J) !'or t E [0,

So finally P[X(t) $ .rj = f


m_X(t) = E[X(t)] = ∫₀¹ x dx = ½.
The correlation is again found by conditioning on T:
g(u)g(u + τ) is a periodic function of u, so we can change the limits of integration in E[X(t)X(t + τ)] to (0, 1):

cos2rrt aJJd t'us2il'(f

rlilii- H'ttl :-ia,n

1 fm I, T such that (wt + f:l) 2,: OJ=~=

+ g( n)] = V 1\ Rl X" I

c) R1 (n,.11.! ) =£[r >:,_]=E[(X , +g(n1 ))(.\', +~(11!)) 1 = E[X, X ,.]+g(n,)E[Xn ] +g(n,)/~ 1 \', J + g(n,)~(n , ) I

>.-=X• + /n 1/ ic;c;imil.u


@ b ) I or It 11,, 111_ 1(/I.X )\\e define t\\O nuxilia!) \ariablc-. II' and/: 1

.1; 11 111 II JZII 1(ll, II'.;) - / \lf 1

= .f;111l ( : ) /1 l1, ), I 1/, 1(II

we need to define two auxiliary vuriublc!:l a-. \\1.!11.

r yf l(l,l I(I_! (ZI+li',V - :)_/;111 1111 ,(w.:)c/ll'c/·

Sum Process, Binomial Counting Process, and Random Walk

a) For n′ > n, …
by the independent increments property
P[Sₙ′ − Sₙ = j − i | Sₙ = i]

P[S., - i. Sno - A·J P[S, 0 = ~. 'l, 1 - S. P[Sn 0 J. 8n 1

@a) !'eutoolU Protru

(Sample path of the process plotted for n = 1, 2, …, 10.)

- Sn -I] 0 S. _ = i - l·jP(.Sn - S111 l·]J'[c;',. 1 - Snt - tj


~f(X"J + 4£[Xn-d = ~P + ~P = P = '-'" [ 4.An 1 2+4 2 A- \"n-1 1

= ~£(4X"X"-r' + 2. Y! + 2X,.H Xn-t

f. ZnZn+ ] = f(Zr. t'(Z +1r] = p1 for k > 1

l ( ) 11 I > 11 1>. ) 11

C.OVI \'n.X. )=E[(X, - m)(X_, - m)] - FI \',X . ]- ml

t ,~rn ~+ r,, ) 1 ] - m 1 =! £[l' 2 J -.L m = 1 V\RP ' I l(lrA = O


Fork > I: I. onl> the tcrmc; in the shaded region arc nonzero. I or k 0. the upper diagonal has n - I

I diagonal has '' - I entries 0 diagonal has n entries 1I

diagonal has n entries 1


Poisson and Associated Random Processes

let V,- the numher of items dispensed. Note that \\C must have A \\here I is the numhcr or coins ucpositcd in the machine. 1 his affects the IO\\Cr limit in the second equation tx:km.

P[N₁(t) = k] = Σⱼ P[N₁(t) = k | N(t) = j] P[N(t) = j]
= Σⱼ C(j, k) pᵏ (1 − p)^{j−k} ((λt)^j / j!) e^{−λt}
= ((λtp)^k / k!) e^{−λtp}.
This shows that if we create a new process by selecting events from a Poisson process according to Bernoulli trials, the resulting process is also Poisson.

kl = f>[ N(I-c/) = j. i\'(1)- k I PfN(t) = kj

P[N(I - tl) = .ilPI J\ (I) - I\ U d)

.(pAl)' . . :. . _____:_ e

This shows that random splitting of a Poisson process results in independent Poisson random processes.
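A simulation sketch of this thinning property (assumed values λ = 2, p = 0.3, t = 1000): keep each Poisson event independently with probability p and check that the kept events arrive at rate λp.

lambda = 2; p = 0.3; t = 1000;                          % assumed example parameters
arrivals = cumsum(-log(rand(1, 3*lambda*t))/lambda);    % Poisson arrival times via exponential interarrivals
arrivals = arrivals(arrivals <= t);
keep = rand(size(arrivals)) < p;                        % Bernoulli selection of each event
printf("kept rate = %.3f, lambda*p = %.3f\n", sum(keep)/t, lambda*p);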

@a) P(Z(t) = OIZIO) =OJ= J'(c·:eu

# transitions in (O,t)] = =

?;,1+ot\l+M 1 1 +at 1 _

1 (. lll..) = 1 + 2at I lot

= ;::oo :L-1 -tl -1 ( -1 --tC\t-nt)JJ+• = 1 +at2o:t P(Z


9.5 Gaussian Random Processes, Wiener Process and Brownian Motion

R_X(t₁, t₂) = E[A²] cos ωt₁ cos ωt₂ + E[AB](cos ωt₁ sin ωt₂ + sin ωt₁ cos ωt₂) + E[B²] sin ωt₁ sin ωt₂
= σ²(cos ωt₁ cos ωt₂ + sin ωt₁ sin ωt₂) = σ² cos ω(t₁ − t₂)
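A sketch verifying this covariance by simulation (assumed σ = 1, ω = 2π, and two arbitrary time points):

sigma = 1; w = 2*pi;                               % assumed parameters
t1 = 0.2; t2 = 0.7;
A = sigma*randn(1, 1e5); B = sigma*randn(1, 1e5);  % iid N(0, sigma^2) amplitudes
X1 = A*cos(w*t1) + B*sin(w*t1);
X2 = A*cos(w*t2) + B*sin(w*t2);
printf("sample E[X1*X2] = %.3f, sigma^2*cos(w*(t1-t2)) = %.3f\n", ...
       mean(X1.*X2), sigma^2*cos(w*(t1-t2)));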

b) Because A and B are jointly Gaussian random variables, X(t) = A cos ωt + B sin ωt and X(t + s) = A cos ω(t + s) + B sin ω(t + s) are also jointly Gaussian, with zero means and covariance matrix

cos ro.\ =a s1n ros

P(X(t) + . ::; y 11 X(t + s) + Jt(l + .s) = 1-:y(t) ••\ (!l a - /'/1Y2 - I'< I t s)) = =

hvJ(Yt - Jli)fx,.>(Y'l- Y• 3-(!f1-piJ1 /2tiC ,-(n -111-11'),/:lna


Stationary Random Processes

. I co..; 2m I ( ·l> cos 2;ct

V1\ Rl.Jl co~ 2;ct 1 cos 21r12 I l cos 27Ci t cos 2lfL2

Aulocovariancc docs no t ucpcnd onl> on I t - 11

is wide Eense staLiona.ry

fx(c 1 )X(r 2 )X(t.t)(:ch :tz, :r3)dztd:t2dx3 = P(:rt ) ~ :r2 I cf:r2, :r3 .


Express the aho\'P. prCJbabilitics in term of the Xn's:

• - t' ) - Yt · 2''' 1 v "' + •v\. a)-1 ) ·t )(., 1

x. . ~) is idcnticitl 1-o that of (Xh Xz, Xn,

Sinccthe joint prlf of(Xn,-hXn,. X nz-1• Xn;. \',, I . Xn~ 111 12· . , Xr&J-'" 12) if Xn is a :,lu liomny proce >~,+r = YJ]

[~(.\'n 1 t'" + Xr. t r-1)- Yl! . ~(Xn, XnJ+r-d- l/3] = P [~(. \', + x.) = Ya.~Ct',,-n 1 +~ ·I X,,_. +d = !12· ~(Xn3 -ns-t-2 +Xn,-ns+l) = 1/3]

∴ (*) holds if Xₙ is a stationary random process, and in particular if Xₙ is an iid process.
c) In order for Xₙ and Yₙ to be jointly stationary, their joint distributions should be invariant to shifts of the time origin. In parts a) and b) we expressed the joint pmf of the Yₙ's in terms of the Xₙ's; the same holds for transformations of X(t) and Y(t). For example,

/. j lcus0 II = l,. o

I hcrclllrc il' s:.unplco, of' X(l) anti HI) arc join11y Un ussiu n. then snmplcs nf /.(f) and XU) arc also jointly (,au lation

of X.t t) c\lf! determined by time averages of s( t).

( I )( - I )dt + 1,1. (I )( I )dl t r


@Recall the application ofthe clmal7 lnequalil) (Fq. 9.67) during the di scussion of the mean squ~trc periodic process. We had:

lf.\'(1) i~ mean-square periodic. then

Rcpci.IICd appl ic::tl ions 0 r lh is argument lo I I and 11 irnpl ics

I he spcc 1.11 case m cyclostJtlonar).

implies Eq (9.70h) and hcnLc that X( f) is "ide-sense

Continuity, Derivatives, and Integrals of Random Processes

Pl ..\'(1) dbcontinuous at lul = P[s -


X(t) is m.s. continuous.
We can also determine continuity from the autocorrelation function:
Next we determine whether R_X(t₁, t₂) is continuous at (t₀, t₀):

R, Uu 1- s,' I 0 + c2)- R_, (I0 .fu) =e A111[1\fl, II I ,1, I cl) -("-..II.,

Thus the m.s. derivative does not exist.
d) X(t) is m.s. integrable if the following integral exists:


X(l) is 111.s. inlcgmhlc.

1hen f'mm l q. (9.9 I ): Ill I (/ ) =

VARI X, is mean crgod ic

In order for the estimator to be a valid estimate of R_X(τ), Y(t) = X(t)X(t + τ) must be mean-ergodic. Note that the mean of Y(t)
does not depend on t. Thus X(t)X(t + τ) is mean ergodic iff C_Y(t₁, t₂) = C_{X(t)X(t+τ)}(t₁, t₂) is such that


~a) I kr~ \\ C . uppnse that ''e observe.\:, onl) l ~

\:1\llllling rrti(\.'S'> felr C\1!01 ,,\,~)

>,L /., ~ l~"ll" I= E[u(a - X, )] = !'[.\', = £f\'(t1 )X'(t1 )] ==E1Xe

Ir /·.'fXI - 0, then .Y(I) is a WSS random prm:css.

a) The correlation between Fourier coefficients is:
E[Xₖ Xₘ*] = E[(1/T) ∫₀ᵀ X(t′) e^{−j2πkt′/T} dt′ · (1/T) ∫₀ᵀ X(t″) e^{j2πmt″/T} dt″]
If X(t) is m.s. periodic, then R_X(τ) is periodic and the inner integral is

Generating Random Processes

a) % P9.118 part a
clear all; close all;
y = zeros(200, 10, 3);       % dimensions are: (n, realization, p)
p = [0.25 0.5 0.75];
for sample = 1:1:10
  for i = 1:1:3
    rn = rand;
    step = -1*(rn < p(j));
    y(sample, 1, i, j) = step;
    for n = 2:1:200
      rn = rand;
      step = -1*(rn < p(j));
      y(sample, n, i, j) = alpha(i)*y(sample, n-1, i, j) + step;
    end
  end
end


figure(sample*4+il; plot(l:200,y(sample,1:200,i,l), •--',1:200,y(sample,l:200,i,2)); legend ( 'p = 0. 5', 'p = 0. 25') ; xlabel ( 'n') ylabel('Yn, random process') str • sprintf('Problem 9.123a, alpha • \l.lf',alpha(i)); title Cstr); end end m "' mean (y); v • vat(y); \plotting mean and variance fori 1:1:3 figut.e(200+i); subploL(2,1,1); ploL(l:l:200, m(1, 1: 200,i,l), ' --' J:1:200, m(l, l :200,i,2)); legend( ' p = 0.5 ' , ' p = 0.25'); x 1 a be 1 C' n ylabel( 1 mean of Yn') str • sprintf( ' Problem 9.123a, alpha "' \l.lf',alpha(i)); t.:itle(str); subplot(2,1,2); plot(l:200, v(l,1:200,i,l), 1:200, vC1,1:200,i,2)); legend('p = 0.5', 'p = 0.25 xlabel ( 'n l ylabel( variance of Yn title (str); end I

\histogt·am for sample = 1:1:5 fori • 1:1:3 figure(300+sample *4+i); for j .. 1:1 : 2 subplot(2,l,j); hist (y (sample, : , i, j)); xlabel('Yn , output ' ) ylabel ('Histogram cow1t') st.:r = sprintf( ' Problem 9.123a, histog1am for alpha p = \l.lf,sample#%d' ,alpha(]) ,p(j),sample); title(str); l:l end end end 2

b)
clear all; close all;
y = zeros(50, 200, 2);       % dimensions are: (realization, n, p)
alpha = 0.5;
step = 0;
p = [0.5 0.25];
for sample = 1:1:50
  for j = 1:1:2
    rn = rand;
    step = -1*(rn < p(j));
    y(sample, 1, j) = step;
    for n = 2:1:200
      rn = rand;
      step = -1*(rn < p(j));
      y(sample, n, j) = alpha*y(sample, n-1, j) + step;
    end
  end
end
m = mean(y);
v = var(y);
% plotting mean and variance:
figure(100);
subplot(2,1,1);
plot(1:1:200, m(1,1:200,1), '--', 1:1:200, m(1,1:200,2));
legend('p = 0.5', 'p = 0.25');
xlabel('n');
ylabel('mean of Yn');

c)
clear all; close all;
y = zeros(50, 200, 2);       % y dimensions: (realization, n, p)
inc1(1:4, 1:50) = 0;
inc2(1:4, 1:50) = 0;
alpha = 0.5;
step = 0;
p = [0.5 0.25];
for sample = 1:1:50
  for j = 1:1:2
    rn = rand;
    step = -1*(rn < p(j));
    y(sample, 1, j) = step;
    for n = 2:1:200
      rn = rand;
      step = -1*(rn < p(j));
      y(sample, n, j) = alpha*y(sample, n-1, j) + step;
    end
  end
  inc1(1, sample) = y(sample, 50, 1)  - y(sample, 1, 1);
  inc1(2, sample) = y(sample, 100, 1) - y(sample, 51, 1);
  inc1(3, sample) = y(sample, 150, 1) - y(sample, 101, 1);
  inc1(4, sample) = y(sample, 200, 1) - y(sample, 151, 1);
  inc2(1, sample) = y(sample, 50, 2)  - y(sample, 1, 2);
  inc2(2, sample) = y(sample, 100, 2) - y(sample, 51, 2);
  inc2(3, sample) = y(sample, 150, 2) - y(sample, 101, 2);
  inc2(4, sample) = y(sample, 200, 2) - y(sample, 151, 2);
end

Student Solutions Manual hiat (inc l (2,:) ,5); xlabel ( 'inctements [51-100] '); ylabel('number of samples'); title('Ptoblem 9.123c, p = 0.5'); subplot (2,2,3l; hist (incl (3,: l ,5 ) ; xlabel('increments (101-150) '); ylabel('number of samples'); title('Problem 9.123c, p = 0.5'); subplot(2,2,4l; hist (incl (4,: l, 5); xl.!bel ( 1 inc.tements [151-200) '); ylabel( 1 number of samples'); title('Problem 9.123c, p = 0.5 1 ) ; replot; Cigure (2); ( j nc2 ' , 5) ; sulJploL(2,2,1); hist (inc2 (1,:), 5); title('P~oblem 9.123c, p = 0.25'); xlabel('increments [1-50] '); ylabel('number of samples'); subplot(2,2,2l; hi st.(inc 2(2, :l ,5 ) ; title('Problem 9.123c, p = 0.25'); xlabel('increments [51-100] '); ylabel ( 'numher of samples') ; GUbplot(2,2,3 ) ; hJ.Gt (inc2 (3,:) ,5); tltle( 1 Problem 9.123c, p = 0.25'); xlabel ('increments [101-150] 'l; ylabel ('number of samples') ; subplot(2,2,4); hi s l (inc2 (4,: l ,5); xlabel ( 'inc1.·ements (151-200) 1 ) ; ylabel('number of samples'); tit l e( 1 Problem 9.123c, p = 0.25'); \ h ls t;