
In the final step of constructing a Bayesian network, the local probability distributions p(x_i | pa_i) were assessed. In the example, where all variables are discrete, one distribution for X_i was assessed for every configuration of Pa_i. Example distributions are shown in Figure 8.
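As a concrete illustration of what is assessed here, the local distribution of a discrete node can be held as a conditional probability table that maps each parent configuration to a distribution over the node's values. A minimal sketch in Python; the node, its states, and the numbers are hypothetical, not the values from Figure 8:

```python
# Sketch of a conditional probability table (CPT) for one discrete node
# X_i: one distribution over its values per configuration of Pa_i.
# Node names, states, and probabilities are hypothetical examples.

# p(EyebrowRaised | Expression): one row per parent state.
cpt_eyebrow = {
    ("happiness",): {"raised": 0.7, "neutral": 0.3},
    ("anger",):     {"raised": 0.1, "neutral": 0.9},
}

def local_prob(cpt, parent_config, value):
    """Look up p(x_i | pa_i) for a given parent configuration."""
    return cpt[parent_config][value]

print(local_prob(cpt_eyebrow, ("happiness",), "raised"))  # 0.7
```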

Inference in a Bayesian Network

Once a Bayesian network has been constructed (from prior knowledge, data, or a combination), the next step is to determine various probabilities of interest from the model. In the problem concerning detection of facial expressions, the probability that the happiness expression is present, given observations of the other variables, is to be determined. This probability is not stored directly in the model, and hence needs to be computed. In general, the computation of a probability of interest given a model is known as probabilistic inference. Because a Bayesian network for X determines a joint probability distribution for X, the network can be used to compute any probability of interest. For example, from the Bayesian network in Figure 8, the probability of a certain expression given observations of the other variables can be computed as follows:

Equation 6

\[
p(\mathit{Expression} \mid P_1, P_2, \ldots, P_{10})
  = \frac{p(\mathit{Expression}, P_1, P_2, \ldots, P_{10})}
         {p(P_1, P_2, \ldots, P_{10})}
  = \frac{p(\mathit{Expression}, P_1, P_2, \ldots, P_{10})}
         {\sum_{\mathit{Expression}'} p(\mathit{Expression}', P_1, P_2, \ldots, P_{10})}
\]
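For intuition, the direct computation of Equation 6 can be sketched as a sum over the joint distribution; the `joint` function below is a hypothetical stand-in for the full joint table the network defines, and the expression labels are assumed for illustration:

```python
# Brute-force evaluation of Equation 6: normalise the joint probability
# of (expression, evidence) over all possible expression values.
# `joint` is an assumed helper returning p(expression, p1, ..., p10).

EXPRESSIONS = ["happiness", "anger", "sadness", "neutral"]  # assumed labels

def posterior_direct(joint, evidence, expression):
    """p(Expression | P1..P10) computed straight from the joint table."""
    numerator = joint(expression, *evidence)
    denominator = sum(joint(e, *evidence) for e in EXPRESSIONS)
    return numerator / denominator
```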

For problems with many variables, however, this direct approach is not practical. Fortunately, at least when all variables are discrete, the conditional independencies encoded in a Bayesian network can be exploited to make the computation more efficient. In the example, given the conditional independencies in Equation 5, Equation 6 becomes:

Equation 7

\[
p(\mathit{Expression} \mid P_1, P_2, \ldots, P_{10})
  = \frac{p(\mathit{Expression})\, p(P_1 \mid \mathit{Expression}) \cdots p(P_{10} \mid \mathit{Expression})}
         {\sum_{\mathit{Expression}'} p(\mathit{Expression}')\, p(P_1 \mid \mathit{Expression}') \cdots p(P_{10} \mid \mathit{Expression}')}
\]
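Under the independencies of Equation 5, this factorised form amounts to naive-Bayes-style inference: each factor is a single table lookup, so evaluating the posterior touches only the prior and ten conditional tables per expression value, rather than a joint table over all variables. A minimal sketch, assuming the prior and per-feature conditionals are available as lookup tables (the table formats below are illustrative):

```python
# Inference per Equation 7: the posterior is proportional to the prior
# p(Expression) times the product of the p(P_k | Expression) factors.
# `prior` and `conditionals` are assumed lookup tables, not the paper's data.

def posterior_factored(prior, conditionals, evidence, expression):
    """p(Expression | P1..P10) via the factorisation of Equation 7.

    prior:        dict expression -> p(Expression)
    conditionals: list of dicts, one per P_k, keyed (expression, value)
    evidence:     observed values of P1..P10, in order
    """
    def score(e):
        # Unnormalised term p(e) * p(P_1 | e) * ... * p(P_10 | e)
        s = prior[e]
        for cpt, value in zip(conditionals, evidence):
            s *= cpt[(e, value)]
        return s

    # Normalise over all expression values (the sum over Expression').
    return score(expression) / sum(score(e) for e in prior)
```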
