Justify all answers!
1) Find the most general unifier for the following pairs, if it exists
(variables are identifiers with a ? prefix).
a) P(A, ?z, C), P(?x, ?x, ?z)
b) Foo(?x, Foo(?x)), Foo(Foo(?y), ?y)
c) Q(?y, G(A, B)), Q(G(?x, ?x), ?y)
2) Write down the axioms and facts for example below in FOPC:
The taxonomic hierarchy for animals contains birds, which
in turn contain penguins and sparrows (which are
disjoint).
All birds can fly, unless they are penguins. No penguins can fly.
Anything that can fly has wings. Poopoo is a penguin.
a) Convert the knowledge base into normal form
b) Show how this knowledge base should be indexed, using
a table.
c) Prove or disprove by using resolution (or argue that neither
can be done):
1) Poopoo has wings.
2) Poopoo can fly.
3) Poopoo is an animal.
4) If Poopoo is a sparrow, then Poopoo cannot fly.
d) Would restricting the proofs to INPUT RESOLUTION only change the answers to (c)?
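As a sanity check for part (c), a tiny propositional resolution prover can be run on the knowledge base grounded on Poopoo. The clause encoding below is one possible formalization of the story (it is an assumption, not the normal form that part (a) asks you to derive yourself).

```python
# Propositional resolution by saturation, on one possible clause-form
# encoding of the story grounded on Poopoo (an assumption, not the
# required answer to part (a)). Literals are strings; '~' marks negation.

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """All resolvents of two clauses (clauses are frozensets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def refutes(kb, goal_clauses):
    """Saturation-based refutation: True iff kb + goal_clauses derive {}."""
    clauses = {frozenset(c) for c in kb} | {frozenset(c) for c in goal_clauses}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True      # empty clause: contradiction
                    new.add(frozenset(r))
        if new <= clauses:
            return False                 # saturated without contradiction
        clauses |= new

KB = [
    {'~Penguin', 'Bird'},          # penguins are birds
    {'~Bird', 'Animal'},           # birds are animals
    {'~Bird', 'Penguin', 'Fly'},   # birds fly unless they are penguins
    {'~Penguin', '~Fly'},          # no penguins can fly
    {'~Fly', 'Wings'},             # anything that flies has wings
    {'~Penguin', '~Sparrow'},      # penguins and sparrows are disjoint
    {'Penguin'},                   # Poopoo is a penguin
]
# 2) "Poopoo can fly": adding the claim yields a contradiction,
#    so its negation is provable.
print(refutes(KB, [{'Fly'}]))
# 3) "Poopoo is an animal": refute the negation of the claim.
print(refutes(KB, [{'~Animal'}]))
```

Note that this ground, purely propositional check cannot substitute for the first-order resolution proofs the exercise asks for; it only confirms which queries are refutable under this particular encoding.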
3) It is known that there is a correlation between smoking and cancer.
Medical associations claim that smoking causes cancer. Tobacco companies
claim that there is a hidden element that causes both cancer and smoking.
Draw a Bayes network for each of the above claims.
4) Consider the following variables and their immediate causes (parents):
variable   parents
------------------
A          none
E          none
B          A
C          A, E
D          A
F          B, C
a) Is this network a poly-tree?
b) Determine the truth of the following independence
statements (using d-separation):
1) I({A}, {E} | {})
2) I({A}, {E} | {B})
3) I({A}, {E} | {C})
4) I({A}, {E} | {F})
5) I({A}, {C, F} | {B})
6) I({A}, {F} | {B, C})
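The d-separation answers can be checked mechanically with the moral ancestral graph criterion: X and Y are d-separated by Z iff X and Y are disconnected in the moralized ancestral subgraph of X ∪ Y ∪ Z after deleting Z. A sketch of that check for the network above:

```python
# d-separation via the moral ancestral graph criterion, for the network
# with parents as listed in question 4. A checking aid only -- the
# exercise still expects a justification in terms of blocked paths.

PARENTS = {'A': [], 'E': [], 'B': ['A'], 'C': ['A', 'E'],
           'D': ['A'], 'F': ['B', 'C']}

def ancestors(nodes):
    """The given nodes plus all of their ancestors."""
    seen, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(PARENTS[n])
    return seen

def d_separated(xs, ys, zs):
    keep = ancestors(set(xs) | set(ys) | set(zs))
    # Moralize: connect co-parents of each child, drop edge directions.
    edges = set()
    for child in keep:
        ps = [p for p in PARENTS[child] if p in keep]
        edges |= {frozenset((p, child)) for p in ps}
        edges |= {frozenset((p, q)) for p in ps for q in ps if p != q}
    # Delete the conditioning set, then test undirected reachability.
    nodes = keep - set(zs)
    edges = {e for e in edges if not (e & set(zs))}
    frontier, reached = set(xs), set(xs)
    while frontier:
        nxt = {m for e in edges for m in e
               if e & frontier and m in nodes} - reached
        reached |= nxt
        frontier = nxt
    return not (reached & set(ys))

for z in ([], ['B'], ['C'], ['F'], ['B', 'C']):
    print('I({A},{E} |', z, ') =', d_separated(['A'], ['E'], z))
print('I({A},{C,F} | [B]) =', d_separated(['A'], ['C', 'F'], ['B']))
```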
c) Determine at least 2 different ways to convert the network into
a poly-tree using clustering.
d) Suppose that P(A) = 0.9, P(E) = 0.3, P(B|A) = 0.1, P(B|~A) = 0.8.
Find P(A|E).
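For part (d), note that only the CPTs for A, E, and B are given; since A and E are both root nodes, the joint over these three variables factors as P(A)P(E)P(B|A), and P(A|E) can be checked by direct enumeration:

```python
# Numeric check for 4(d) by enumerating the joint over A, E, B, the
# only variables whose CPTs are given: P(A, E, B) = P(A) P(E) P(B|A),
# since A and E are root nodes.

from itertools import product

P_A, P_E = 0.9, 0.3
P_B_GIVEN_A = {True: 0.1, False: 0.8}   # P(B = true | A = a)

def joint(a, e, b):
    pa = P_A if a else 1 - P_A
    pe = P_E if e else 1 - P_E
    pb = P_B_GIVEN_A[a] if b else 1 - P_B_GIVEN_A[a]
    return pa * pe * pb

num = sum(joint(True, True, b) for b in (True, False))
den = sum(joint(a, True, b) for a, b in product((True, False), repeat=2))
print('P(A | E) =', num / den)
```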
5) You need to construct a 3-input neural network whose output is 1
exactly when two of its inputs are 1, and 0 otherwise
(assume inputs are in {0, 1}).
a) Can this be done without hidden units?
b) Show a network using threshold elements that performs
the required computation (specify the weights, too!)
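Any candidate network for part (b) can be verified by brute force over the eight input combinations. The sketch below checks one possible two-layer design (these particular weights and thresholds are my choice, not the only solution): one hidden unit fires when at least two inputs are on, another when all three are on, and the output fires for the first but not the second.

```python
# Brute-force check of one candidate threshold network for "exactly two
# of three inputs are 1". The weights/thresholds here are an arbitrary
# choice, not the unique solution. A unit outputs 1 iff its weighted
# input sum reaches its threshold.

def unit(weights, threshold, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def network(x1, x2, x3):
    h1 = unit([1, 1, 1], 2, [x1, x2, x3])   # at least two inputs on
    h2 = unit([1, 1, 1], 3, [x1, x2, x3])   # all three inputs on
    return unit([1, -2], 1, [h1, h2])       # h1 but not h2

for x1 in (0, 1):
    for x2 in (0, 1):
        for x3 in (0, 1):
            print((x1, x2, x3), '->', network(x1, x2, x3))
```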
6) Using the perceptron learning rule (the "delta" rule), show the steps
of learning the function OR with a single perceptron
and a step activation function with threshold 1. Initially all
weights are 1, and the learning rate is 0.4.
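A trace of the learning process can be generated with a few lines of code. The sketch below assumes exactly the setup stated (two input weights, fixed step threshold of 1, no bias unit); if the exercise intends an always-1 bias input, add it as a third input fixed at 1 with its own weight, which changes the trace.

```python
# Perceptron ("delta") rule trace on OR, assuming only the two input
# weights (no bias unit) and a fixed step threshold of 1, as stated.
# Update rule: w_i <- w_i + eta * (target - output) * x_i.

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
ETA, THRESHOLD = 0.4, 1.0
w = [1.0, 1.0]

for epoch in range(10):
    errors = 0
    for x, target in DATA:
        net = sum(wi * xi for wi, xi in zip(w, x))
        y = 1 if net >= THRESHOLD else 0
        if y != target:
            errors += 1
            w = [wi + ETA * (target - y) * xi for wi, xi in zip(w, x)]
        print('epoch', epoch, 'x =', x, 'out =', y, 'target =', target,
              'weights =', w)
    if errors == 0:
        break   # a full pass with no mistakes: converged
```

With this particular initialization the starting weights already compute OR, so the trace converges with no weight updates; with a bias weight included, the first epoch produces corrections.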
7) Is it possible to combine back-propagation with genetic algorithms, and
if so, how?
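One well-known answer is a memetic (Lamarckian) hybrid: a genetic algorithm searches over weight initializations, and each individual is locally refined by a few gradient-descent steps before its fitness is measured. The sketch below illustrates the scheme on a deliberately tiny model (one linear unit fit to y = 2x + 1); all hyperparameters are arbitrary choices, and the GA half uses mutation only, for brevity.

```python
# Memetic GA + gradient-descent hybrid, sketched on a tiny linear model
# y = w*x + b fit to data from y = 2x + 1. The GA evolves (w, b) pairs;
# each individual is refined by a few gradient steps (the "backprop"
# half) before selection. Hyperparameters are arbitrary.

import random
random.seed(0)

DATA = [(x, 2 * x + 1) for x in (-2, -1, 0, 1, 2)]

def loss(w, b):
    return sum((w * x + b - y) ** 2 for x, y in DATA) / len(DATA)

def gradient_steps(w, b, steps=5, lr=0.05):
    """Plain gradient descent on the mean squared error."""
    for _ in range(steps):
        dw = sum(2 * (w * x + b - y) * x for x, y in DATA) / len(DATA)
        db = sum(2 * (w * x + b - y) for x, y in DATA) / len(DATA)
        w, b = w - lr * dw, b - lr * db
    return w, b

pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(20)]
for gen in range(30):
    # Lamarckian step: locally improve every individual, then select.
    pop = [gradient_steps(w, b) for w, b in pop]
    pop.sort(key=lambda ind: loss(*ind))
    survivors = pop[:10]
    children = [(w + random.gauss(0, 0.3), b + random.gauss(0, 0.3))
                for w, b in survivors]
    pop = survivors + children

best = min(pop, key=lambda ind: loss(*ind))
print('best (w, b):', best, 'loss:', loss(*best))
```

An alternative (Baldwinian) variant measures fitness after the gradient steps but passes the *unrefined* genome to the next generation; either answers the question in the affirmative.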
8) a) Construct by hand a decision tree for the following table,
where the first decision is based on A, then if necessary B, etc.
input variables      decision
A   B   C   D
-----------------------------
T   1   F   5        fly
T   1   F   4        fly
T   1   F   3        fly
T   1   T   3        fly
T   2   F   3        walk
T   2   F   4        walk
F   1   F   2        fly
F   1   T   4        crawl
F   1   T   5        crawl
b) Which node will be pruned first, based on least information gain?
c) Is a more compact decision tree possible (perhaps with a different
ordering of the variables)? If so, which tree?
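The information-gain computations for parts (b) and (c) can be checked with a short script over the table above (attribute values are taken verbatim from the table):

```python
# Entropy / information-gain computation for the decision-tree table,
# as a check for parts (b) and (c). The last column of each row is the
# decision.

from math import log2
from collections import Counter

ROWS = [
    ('T', 1, 'F', 5, 'fly'),
    ('T', 1, 'F', 4, 'fly'),
    ('T', 1, 'F', 3, 'fly'),
    ('T', 1, 'T', 3, 'fly'),
    ('T', 2, 'F', 3, 'walk'),
    ('T', 2, 'F', 4, 'walk'),
    ('F', 1, 'F', 2, 'fly'),
    ('F', 1, 'T', 4, 'crawl'),
    ('F', 1, 'T', 5, 'crawl'),
]
ATTRS = {'A': 0, 'B': 1, 'C': 2, 'D': 3}

def entropy(rows):
    counts = Counter(r[-1] for r in rows)
    total = len(rows)
    return -sum(c / total * log2(c / total) for c in counts.values())

def gain(attr, rows=ROWS):
    """Information gain of splitting `rows` on the named attribute."""
    i = ATTRS[attr]
    remainder = 0.0
    for v in {r[i] for r in rows}:
        sub = [r for r in rows if r[i] == v]
        remainder += len(sub) / len(rows) * entropy(sub)
    return entropy(rows) - remainder

for a in ATTRS:
    print(a, round(gain(a), 4))
```

Note that A and C induce the same multiset of class distributions on this table, so their gains are equal; the script only ranks attributes, while the exercise asks for the trees themselves.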
Deadline: Tuesday, June 12
Note: submission (as well as WORK) on this exercise is SOLO (that is, NOT in pairs or groups of size > 1)