# Reasoning with Uncertain Information


### Joint probability

| (B (BAT_OK), M (MOVES), L (LIFTABLE), G (GAUGE)) | Joint probability |
|---|---|
| (True, True, True, True) | 0.5686 |
| (True, True, True, False) | 0.0299 |
| (True, True, False, True) | 0.0135 |
| (True, True, False, False) | 0.0007 |

Ex. p(B, M, L, G) = p(G|B) p(M|B, L) p(B) p(L)
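The tabulated joints can be checked mechanically. A minimal sketch, assuming CPT values for this network (p(B)=0.95, p(L)=0.7, p(G|B)=0.95, p(G|¬B)=0.1, p(M|B,L)=0.9, p(M|B,¬L)=0.05, p(M|¬B,·)=0.0) chosen so that they reproduce the four rows above:

```python
from itertools import product

# Assumed CPTs for the B (BAT_OK), L (LIFTABLE), M (MOVES), G (GAUGE) network.
# B and L are roots; G depends on B; M depends on B and L.
p_B = 0.95
p_L = 0.7
p_G_given_B = {True: 0.95, False: 0.1}               # p(G=True | B)
p_M_given_BL = {(True, True): 0.9, (True, False): 0.05,
                (False, True): 0.0, (False, False): 0.0}  # p(M=True | B, L)

def joint(b, m, l, g):
    """p(B=b, M=m, L=l, G=g) = p(G=g|B=b) p(M=m|B=b,L=l) p(B=b) p(L=l)."""
    pb = p_B if b else 1 - p_B
    pl = p_L if l else 1 - p_L
    pg = p_G_given_B[b] if g else 1 - p_G_given_B[b]
    pm = p_M_given_BL[(b, l)] if m else 1 - p_M_given_BL[(b, l)]
    return pg * pm * pb * pl

# Reproduce the first row of the table, and check the joint sums to 1.
print(round(joint(True, True, True, True), 4))   # 0.5686
print(round(sum(joint(*v) for v in product([True, False], repeat=4)), 6))  # 1.0
```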

### Conditional probability

- p(A|B) = p(A, B) / p(B)
- Ex. The probability that the battery is charged given …

### 19.1 Review of Probability Theory (3/4)

Figure 19.1 A Venn Diagram

### Bayes' rule

- p(A|B) = p(B|A) p(A) / p(B)
- Abbreviation for …

### Computing a conditional probability given the evidence E = e

| Joint | Probability |
|---|---|
| p(P, Q, R) | 0.3 |
| p(P, Q, ¬R) | 0.2 |
| p(P, ¬Q, R) | 0.2 |
| p(P, ¬Q, ¬R) | 0.1 |
| p(¬P, Q, R) | 0.05 |
| p(¬P, Q, ¬R) | 0.1 |
| p(¬P, ¬Q, R) | 0.05 |
| p(¬P, ¬Q, ¬R) | 0.0 |

Ex. The probability that Q is true given the evidence ¬R:

p(Q, ¬R) = p(P, Q, ¬R) + p(¬P, Q, ¬R) = 0.2 + 0.1 = 0.3

p(¬Q, ¬R) = p(P, ¬Q, ¬R) + p(¬P, ¬Q, ¬R) = 0.1 + 0.0 = 0.1

p(¬R) = p(Q, ¬R) + p(¬Q, ¬R) = 0.4

p(Q|¬R) = p(Q, ¬R) / p(¬R) = 0.3 / 0.4 = 0.75
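Conditioning on evidence is just marginalizing the joint and dividing. A short sketch that recomputes p(Q|¬R) from the table above:

```python
# Full joint distribution over (P, Q, R), taken from the table above.
joint = {
    (True,  True,  True):  0.3,
    (True,  True,  False): 0.2,
    (True,  False, True):  0.2,
    (True,  False, False): 0.1,
    (False, True,  True):  0.05,
    (False, True,  False): 0.1,
    (False, False, True):  0.05,
    (False, False, False): 0.0,
}

def p(event):
    """Probability of the set of worlds where `event` holds."""
    return sum(pr for (P, Q, R), pr in joint.items() if event(P, Q, R))

p_Q_and_notR = p(lambda P, Q, R: Q and not R)   # 0.2 + 0.1 = 0.3
p_notR = p(lambda P, Q, R: not R)               # 0.4
print(round(p_Q_and_notR / p_notR, 4))          # 0.75
```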


### Characteristics of Bayesian networks

- Node Vi is conditionally independent of any subset of nodes that are not descendants of Vi, given Vi's parents.
- p(V1, V2, ..., Vk) = ∏_{i=1}^{k} p(Vi | Pa(Vi))
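The factorization can be checked numerically. A sketch on the same four-node network (CPT values assumed as in the joint-probability example) verifying both that the product of the CPTs is a proper distribution and that M is independent of its non-descendant G given its parents B and L:

```python
from itertools import product

# Assumed CPT values for the B, L, M, G network.
p_B, p_L = 0.95, 0.7
p_G = {True: 0.95, False: 0.1}                      # p(G=True | B)
p_M = {(True, True): 0.9, (True, False): 0.05,
       (False, True): 0.0, (False, False): 0.0}     # p(M=True | B, L)

def bern(p, true):
    """p(X = true) for a binary variable with p(X = True) = p."""
    return p if true else 1 - p

def joint(b, l, m, g):
    # p(B, L, M, G) = p(B) p(L) p(M|B,L) p(G|B): one factor per node.
    return bern(p_B, b) * bern(p_L, l) * bern(p_M[(b, l)], m) * bern(p_G[b], g)

worlds = list(product([True, False], repeat=4))
assert abs(sum(joint(*w) for w in worlds) - 1) < 1e-12  # a proper distribution

# p(M | B, L, G) equals p(M | B, L): G adds nothing once the parents are fixed.
num = joint(True, True, True, True)
den = sum(joint(True, True, m, True) for m in (True, False))
print(round(num / den, 6))   # 0.9  (= p(M=True | B=True, L=True))
```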


### 19.3 Patterns of Inference in Bayes Networks (1/3)

- Causal or top-down inference
  - Ex. The probability that the arm moves given that the block is liftable:

p(M|L) = p(M, B|L) + p(M, ¬B|L)

= p(M|B, L) p(B|L) + p(M|¬B, L) p(¬B|L)

= p(M|B, L) p(B) + p(M|¬B, L) p(¬B)   (B and L are independent)
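The last line above, evaluated numerically under the chapter's assumed CPT values (p(B) = 0.95, p(M|B,L) = 0.9, p(M|¬B,L) = 0.0):

```python
# Causal (top-down) inference: p(M|L) = p(M|B,L) p(B) + p(M|¬B,L) p(¬B).
# The numeric CPT values below are assumptions.
p_B = 0.95
p_M_given_BL = {True: 0.9, False: 0.0}   # p(M=True | B=b, L=True), keyed by b

p_M_given_L = p_M_given_BL[True] * p_B + p_M_given_BL[False] * (1 - p_B)
print(round(p_M_given_L, 3))   # 0.855
```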

### 19.3 Patterns of Inference in Bayes Networks (2/3)

- Diagnostic or bottom-up inference
  - Using an effect (or symptom) to infer a cause.
  - Ex. The probability that the block is not liftable given that the arm does not move:

p(¬M|¬L) = 0.9525   (using causal reasoning)

p(¬L|¬M) = p(¬M|¬L) p(¬L) / p(¬M) = 0.9525 × 0.3 / p(¬M) = 0.28575 / p(¬M)   (Bayes' rule)

p(L|¬M) = p(¬M|L) p(L) / p(¬M) = 0.03665 / p(¬M)

Since p(¬L|¬M) + p(L|¬M) = 1, p(¬M) = 0.28575 + 0.03665 = 0.3224, so

p(¬L|¬M) = 0.28575 / 0.3224 = 0.88632
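The normalization step can be made explicit. A minimal sketch that reuses the two unnormalized products above (0.28575 and 0.03665) rather than re-deriving them:

```python
# Diagnostic (bottom-up) inference: Bayes' rule plus normalization.
# The two unnormalized posterior terms are the slide's intermediate values.
unnorm_notL = 0.9525 * 0.3   # p(¬M|¬L) p(¬L) = 0.28575
unnorm_L = 0.03665           # p(¬M|L) p(L), as given above

p_notM = unnorm_notL + unnorm_L        # the normalizer p(¬M)
posterior = unnorm_notL / p_notM       # p(¬L | ¬M)
print(round(posterior, 5))             # 0.88632
```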

### 19.3 Patterns of Inference in Bayes Networks (3/3)

- Explaining away
  - ¬B explains ¬M, making ¬L less certain.

p(¬L|¬B, ¬M) = p(¬B, ¬M|¬L) p(¬L) / p(¬B, ¬M)   (Bayes' rule)

= p(¬M|¬B, ¬L) p(¬B|¬L) p(¬L) / p(¬B, ¬M)   (def. of conditional prob.)

= p(¬M|¬B, ¬L) p(¬B) p(¬L) / p(¬B, ¬M)   (structure of the Bayes network)

= 0.30 << 0.88632 = p(¬L|¬M)
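Explaining away can be demonstrated by enumeration. A sketch with the chapter's assumed CPTs: the arm never moves when the battery is dead (p(M|¬B,·) = 0), so once ¬B is observed, ¬M carries no further information about L and the posterior for ¬L falls back to its prior 0.3:

```python
from itertools import product

# Assumed CPTs for the B, L, M fragment of the network.
p_B, p_L = 0.95, 0.7
p_M = {(True, True): 0.9, (True, False): 0.05,
       (False, True): 0.0, (False, False): 0.0}   # p(M=True | B, L)

def joint(b, l, m):
    pb = p_B if b else 1 - p_B
    pl = p_L if l else 1 - p_L
    pm = p_M[(b, l)] if m else 1 - p_M[(b, l)]
    return pb * pl * pm

def p_notL_given(cond):
    """p(¬L | cond), where cond filters worlds by (b, m)."""
    worlds = [(b, l, m) for b, l, m in product([True, False], repeat=3)
              if cond(b, m)]
    den = sum(joint(*w) for w in worlds)
    num = sum(joint(b, l, m) for b, l, m in worlds if not l)
    return num / den

post_m = p_notL_given(lambda b, m: not m)             # p(¬L | ¬M)
post_bm = p_notL_given(lambda b, m: not b and not m)  # p(¬L | ¬B, ¬M)
print(round(post_bm, 4), post_bm < post_m)            # 0.3 True
```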

### 19.5 Uncertain Evidence

- Evidence nodes must represent propositions about whose truth or falsity we can be certain.
  - Each uncertain evidence node should have a child node, about which we can be certain.
  - Ex. Suppose the robot is not certain that its arm did not move.
    - Introduce M': "The arm sensor says that the arm moved."
      - We can be certain that this proposition is either true or false.
    - Use p(¬L|¬B, ¬M') instead of p(¬L|¬B, ¬M).
  - Ex. Suppose we are uncertain about whether or not the battery is charged.
    - Introduce G: "Battery gauge."

### 19.6 D-Separation (1/3)

- e d-separates Vi and Vj if, for every undirected path in the Bayes network between Vi and Vj, there is some node Vb on the path having one of the following three properties:
  - Vb is in e, and both arcs on the path lead out of Vb.
  - Vb is in e, and one arc on the path leads in to Vb and one arc leads out.
  - Neither Vb nor any descendant of Vb is in e, and both arcs on the path lead in to Vb.
- Vb blocks the path given e when any one of these conditions holds for a path.
- Two nodes Vi and Vj are conditionally independent given a set of nodes e if e d-separates Vi and Vj.


### Ex.

- I(G, L|B) by rules 1 and 3
  - By rule 1, B blocks the (only) path between G and L, given B.
  - By rule 3, M also blocks this path given B.
- I(G, L)
  - By rule 3, M blocks the path between G and L.
- I(B, L)
  - By rule 3, M blocks the path between B and L.
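The three blocking rules can be applied mechanically. A minimal checker, assuming the network's arcs are B→G, B→M, and L→M; it tests each interior node of an undirected path against the rules quoted above:

```python
# d-separation checker for the (assumed) network with arcs B→G, B→M, L→M.
parents = {"B": set(), "L": set(), "G": {"B"}, "M": {"B", "L"}}

def descendants(v):
    kids = {c for c, ps in parents.items() if v in ps}
    out = set(kids)
    for c in kids:
        out |= descendants(c)
    return out

def blocked(path, e):
    """True if some interior node of the undirected path blocks it given e."""
    for i in range(1, len(path) - 1):
        prev, vb, nxt = path[i - 1], path[i], path[i + 1]
        in_prev = prev in parents[vb]   # arc prev→vb leads *in* to vb
        in_next = nxt in parents[vb]    # arc nxt→vb leads *in* to vb
        if vb in e and not in_prev and not in_next:
            return True                 # rule 1: vb in e, both arcs lead out
        if vb in e and in_prev != in_next:
            return True                 # rule 2: vb in e, one arc in, one out
        if in_prev and in_next and vb not in e and not (descendants(vb) & e):
            return True                 # rule 3: both arcs lead in, and neither
                                        # vb nor any descendant is in e
    return False

print(blocked(["G", "B", "M", "L"], {"B"}))   # True: I(G, L | B), rules 1 and 3
print(blocked(["G", "B", "M", "L"], set()))   # True: I(G, L), rule 3 at M
print(blocked(["B", "M", "L"], set()))        # True: I(B, L), rule 3 at M
print(blocked(["B", "M", "L"], {"M"}))        # False: B, L dependent given M
```

The last call shows the flip side of rule 3: observing the common child M makes its parents B and L dependent (explaining away).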

### Polytree

- A DAG for which there is just one path, along arcs in either direction, between any two nodes in the DAG.

### A node is above Q

- The node is connected to Q only through Q's parents.

### A node is below Q

- The node is connected to Q only through Q's immediate successors.

### Three types of evidence

- All evidence nodes are above Q.
- All evidence nodes are below Q.
- There are evidence nodes both above and below Q.

### Calculating p(Q|P5, P4)

- Evidence is "above"

p(Q|P5, P4) = Σ_{P6,P7} p(Q, P6, P7|P5, P4)

= Σ_{P6,P7} p(Q|P6, P7, P5, P4) p(P6, P7|P5, P4)

= Σ_{P6,P7} p(Q|P6, P7) p(P6, P7|P5, P4)   (structure of the Bayes network)

= Σ_{P6,P7} p(Q|P6, P7) p(P6|P5, P4) p(P7|P5, P4)   (d-separation)

= Σ_{P6,P7} p(Q|P6, P7) p(P6|P5) p(P7|P4)   (d-separation)
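The derivation can be validated on a concrete polytree. A sketch with arbitrary, assumed CPT values on the structure P5→P6, P4→P7, (P6, P7)→Q, comparing the final factored sum against brute-force enumeration of the joint:

```python
from itertools import product

# Arbitrary assumed CPTs for the polytree P5→P6, P4→P7, (P6,P7)→Q.
p_P5, p_P4 = 0.6, 0.3
p_P6 = {True: 0.8, False: 0.2}     # p(P6=True | P5)
p_P7 = {True: 0.7, False: 0.1}     # p(P7=True | P4)
p_Q  = {(True, True): 0.9, (True, False): 0.5,
        (False, True): 0.4, (False, False): 0.05}   # p(Q=True | P6, P7)

def bern(p, v):
    return p if v else 1 - p

# Factored sum: p(Q|P5=T, P4=T) = Σ_{P6,P7} p(Q|P6,P7) p(P6|P5) p(P7|P4).
factored = sum(p_Q[(p6, p7)] * bern(p_P6[True], p6) * bern(p_P7[True], p7)
               for p6, p7 in product([True, False], repeat=2))

# Brute force from the full joint over (P4, P5, P6, P7, Q).
def joint(p4, p5, p6, p7, q):
    return (bern(p_P4, p4) * bern(p_P5, p5) * bern(p_P6[p5], p6)
            * bern(p_P7[p4], p7) * bern(p_Q[(p6, p7)], q))

num = sum(joint(True, True, p6, p7, True)
          for p6, p7 in product([True, False], repeat=2))
den = sum(joint(True, True, p6, p7, q)
          for p6, p7, q in product([True, False], repeat=3))
print(abs(factored - num / den) < 1e-12)   # True
```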

### Calculating p(P5|P1)

- Evidence is "below"
- Here, we use Bayes' rule:

p(P5|P1) = p(P1|P5) p(P5) / p(P1)

p(P1|P5) is then evaluated top-down, as in the evidence-above case, and 1/p(P1) is treated as a normalizing constant.
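A minimal numeric sketch of this pattern on an assumed two-node fragment P5→P1 (arbitrary CPT values), with the denominator p(P1) obtained by summing out P5:

```python
# Evidence below: observe P1 = True, infer its parent P5 via Bayes' rule.
# CPT values are arbitrary assumptions.
p_P5 = 0.4
p_P1_given_P5 = {True: 0.9, False: 0.2}   # p(P1=True | P5)

p_P1 = (p_P1_given_P5[True] * p_P5
        + p_P1_given_P5[False] * (1 - p_P5))   # normalizer p(P1)
posterior = p_P1_given_P5[True] * p_P5 / p_P1  # p(P5 | P1)
print(round(posterior, 4))   # 0.75
```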

### Calculating p(Q|P12, P13, P14, P11)

- Evidence nodes lie both above and below Q:

p(Q|P12, P13, P14, P11) = k p(P12, P13|Q, P14, P11) p(Q|P14, P11)   (Bayes' rule)

= k p(P12, P13|Q) p(Q|P14, P11)   (d-separation)

p(P12, P13|Q) = Σ_{P9} p(P12, P13|P9, Q) p(P9|Q) = Σ_{P9} p(P12, P13|P9) p(P9|Q)   (d-separation)

The remaining factors are expanded the same way, summing out the intermediate node P10:

p(P14|P11) = Σ_{P10} p(P14|P10, P11) p(P10|P11) = Σ_{P10} p(P14|P10, P11) p(P10)

p(P15, P10|P11) = p(P15|P10, P11) p(P10|P11) = p(P15|P10, P11) p(P10)

- With evidence e+ above Q and e- below Q:

p(Q|e) = p(Q|e+, e-)

= k p(e-|Q, e+) p(Q|e+)   (Bayes' rule)

= k p(e-|Q) p(Q|e+)   (d-separation)

where k is a normalizing constant.
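The identity can be checked on an assumed chain E+ → Q → E- with arbitrary CPT values: the normalized product k p(e-|Q) p(Q|e+) matches the posterior computed from the full joint:

```python
# Combining evidence above (E+) and below (E-) on an assumed chain E+→Q→E-.
p_Eplus = 0.5
p_Q  = {True: 0.7, False: 0.2}     # p(Q=True | E+)
p_Em = {True: 0.9, False: 0.3}     # p(E-=True | Q)

def bern(p, v):
    return p if v else 1 - p

def joint(ep, q, em):
    return bern(p_Eplus, ep) * bern(p_Q[ep], q) * bern(p_Em[q], em)

# Brute force: p(Q=True | E+=True, E-=True).
num = joint(True, True, True)
den = num + joint(True, False, True)
brute = num / den

# Factored: k p(E-|Q) p(Q|E+), with k fixed by normalization over Q.
terms = {q: p_Em[q] * bern(p_Q[True], q) for q in (True, False)}
factored = terms[True] / sum(terms.values())
print(abs(brute - factored) < 1e-12)   # True
```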

### A Numerical Example (1/2)

p(Q|U) = k p(U|Q) p(Q)

p(P|Q) = p(P|R, Q) p(R) + p(P|¬R, Q) p(¬R) = 0.95 × 0.01 + 0.80 × 0.99 ≈ 0.80

p(P|¬Q) = p(P|R, ¬Q) p(R) + p(P|¬R, ¬Q) p(¬R) = 0.90 × 0.01 + 0.01 × 0.99 ≈ 0.019

### A Numerical Example (2/2)

- Other techniques

p(U|Q) = p(U|P) p(P|Q) + p(U|¬P) p(¬P|Q) = 0.7 × 0.8 + 0.2 × 0.2 = 0.60

p(U|¬Q) = p(U|P) p(P|¬Q) + p(U|¬P) p(¬P|¬Q) = 0.7 × 0.019 + 0.2 × 0.98 ≈ 0.21

p(Q|U) = k × 0.60 × 0.05 = 0.03k

p(¬Q|U) = k × 0.21 × 0.95 ≈ 0.20k

k = 1 / (0.03 + 0.20) = 4.35

∴ p(Q|U) = 4.35 × 0.03 ≈ 0.13
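The whole example can be chained together. A sketch assuming the priors p(Q) = 0.05 and p(R) = 0.01 that appear in the example; exact arithmetic (no intermediate rounding) still gives 0.13 once rounded:

```python
# End-to-end recomputation of the numerical example: R and Q are parents
# of P, and U depends on P; the evidence is U = True.
p_Q, p_R = 0.05, 0.01
p_P = {(True, True): 0.95, (False, True): 0.80,    # p(P=True | R, Q)
       (True, False): 0.90, (False, False): 0.01}
p_U = {True: 0.7, False: 0.2}                      # p(U=True | P)

def p_P_given(q):
    return p_P[(True, q)] * p_R + p_P[(False, q)] * (1 - p_R)

def p_U_given(q):
    pp = p_P_given(q)
    return p_U[True] * pp + p_U[False] * (1 - pp)

# p(Q|U) = k p(U|Q) p(Q), normalized over Q.
unnorm = {True: p_U_given(True) * p_Q, False: p_U_given(False) * (1 - p_Q)}
posterior = unnorm[True] / sum(unnorm.values())
print(round(posterior, 2))   # 0.13
```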

### Additional Readings

- [Feller 1968]: probability theory
- [Goldszmidt, Morris & Pearl 1990]: non-monotonic inference through probabilistic methods
- [Pearl 1982a, Kim & Pearl 1983]: message-passing algorithm
- [Russell & Norvig 1995, pp. 447ff]: polytree methods
- [Shachter & Kenley 1989]: Bayesian networks for continuous random variables
- [Wellman 1990]: qualitative networks
- [Neapolitan 1990]: probabilistic methods in expert systems
- [Henrion 1990]: probability inference in Bayesian networks
- [Jensen 1996]: Bayesian networks: the HUGIN system
- [Neal 1991]: relationships between Bayesian networks and neural networks
- PATHFINDER
- CPCSBN
- [Shortliffe 1976, Buchanan & Shortliffe 1984]: MYCIN, which uses certainty factors
- [Duda, Hart & Nilsson 1987]: PROSPECTOR, which uses sufficiency and necessity indices
- Fuzzy logic and possibility theory
- [Dempster 1968, Shafer 1979]: Dempster-Shafer combination rules
- [Nilsson 1986]: probabilistic logic
- [Tversky & Kahneman 1982]: humans generally lose consistency when facing uncertainty
- [Shafer & Pearl 1990]: collected papers on uncertain inference
- Proceedings & journals: Uncertainty in Artificial Intelligence (UAI); International Journal of Approximate Reasoning
