601.664 Artificial Intelligence


Homework 4: Uncertainty

Question 1. We wish to transmit an n-bit message to a receiving agent. The bits in the message are independently corrupted (flipped) during transmission with probability ε each. With an extra parity bit sent along with the original information, a message can be corrected by the receiver if at most one bit in the entire message (including the parity bit) has been corrupted. Suppose we want to ensure that the correct message is received with probability at least 1-δ. What is the maximum feasible value of n? Calculate this value for the case ε = 0.001, δ = 0.01.
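
One way to sanity-check a hand calculation here: with the parity bit, n + 1 bits are transmitted, so the requirement is that at most one of the n + 1 independently corrupted bits is flipped with probability at least 1 - δ, i.e. (1-ε)^(n+1) + (n+1)·ε·(1-ε)^n ≥ 1-δ. A minimal Python sketch (ours, not part of the assignment) that searches for the largest such n:

    # Minimal sketch (ours): find the largest n such that an n-bit message plus
    # one parity bit is received correctly, i.e. at most one of the n + 1
    # independently corrupted bits is flipped, with probability >= 1 - delta.
    eps, delta = 0.001, 0.01

    def p_correct(n, eps):
        # P(0 flips) + P(exactly 1 flip) among the n + 1 transmitted bits
        return (1 - eps) ** (n + 1) + (n + 1) * eps * (1 - eps) ** n

    n = 0
    while p_correct(n + 1, eps) >= 1 - delta:
        n += 1
    print(n, p_correct(n, eps))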

Question 2. Deciding to put probability theory to good use, we encounter a slot machine with three independent wheels, each producing one of the four symbols BAR, BELL, LEMON, or CHERRY with equal probability. The slot machine has the following payout scheme for a bet of 1 coin, where “?” means we don’t care what comes up on that wheel (note that order matters):
BAR/BAR/BAR pays 20 coins
BELL/BELL/BELL pays 15 coins
LEMON/LEMON/LEMON pays 5 coins
CHERRY/CHERRY/CHERRY pays 3 coins
CHERRY/CHERRY/? pays 2 coins
CHERRY/?/? pays 1 coin
Note that the machine returns the highest possible payout as defined by this scheme.

  1. Compute the expected “payback” percentage of the machine. In other words, for each coin played, what is the expected coin return? (Note: the cost of inserting the 1 coin does not matter.)
  2. Compute the probability that playing the slot machine once will result in a win.
  3. Compute the variance of the coin return of the machine.
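
One way to sanity-check parts 1-3 is exhaustive enumeration of the 4^3 = 64 equally likely reel outcomes. The sketch below is ours, not part of the assignment; for part 2 it treats any positive payout as a win, which is one possible reading of that term:

    from itertools import product

    # Brute-force enumeration sketch (ours) over the 64 equally likely outcomes.
    SYMBOLS = ["BAR", "BELL", "LEMON", "CHERRY"]

    def payout(w1, w2, w3):
        # Highest applicable payout line, per the scheme above.
        if (w1, w2, w3) == ("BAR", "BAR", "BAR"):
            return 20
        if (w1, w2, w3) == ("BELL", "BELL", "BELL"):
            return 15
        if (w1, w2, w3) == ("LEMON", "LEMON", "LEMON"):
            return 5
        if (w1, w2, w3) == ("CHERRY", "CHERRY", "CHERRY"):
            return 3
        if (w1, w2) == ("CHERRY", "CHERRY"):
            return 2
        if w1 == "CHERRY":
            return 1
        return 0

    payouts = [payout(*o) for o in product(SYMBOLS, repeat=3)]
    n = len(payouts)                                          # 64 outcomes
    expected = sum(payouts) / n                               # part 1: expected coin return
    p_win = sum(p > 0 for p in payouts) / n                   # part 2: positive payout counted as a win
    variance = sum((p - expected) ** 2 for p in payouts) / n  # part 3
    print(expected, p_win, variance)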

Question 3. We have a bag of three biased coins a, b, and c with probabilities of coming up heads of 20%, 60%, and 80%, respectively. One coin is drawn randomly from the bag (with equal likelihood of drawing each of the three coins), and then the coin is flipped three times to generate the outcomes X1, X2, and X3.

  1. Draw the Bayesian network corresponding to this setup and define the necessary CPTs.
  2. Calculate which coin was most likely to have been drawn from the bag if the observed flips come out heads twice and tails once.
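
For part 2, the posterior over the three coins follows from Bayes’ rule; the binomial factor counting the orderings of two heads and one tail is the same for every coin and cancels when normalizing. A minimal sketch (ours) for checking the arithmetic:

    # Bayes' rule sketch (ours): posterior over the drawn coin given two heads
    # and one tail. The ordering factor is common to all coins and cancels.
    priors = {"a": 1 / 3, "b": 1 / 3, "c": 1 / 3}
    p_heads = {"a": 0.2, "b": 0.6, "c": 0.8}

    def likelihood(coin, heads=2, tails=1):
        return p_heads[coin] ** heads * (1 - p_heads[coin]) ** tails

    unnormalized = {c: priors[c] * likelihood(c) for c in priors}
    z = sum(unnormalized.values())
    posterior = {c: v / z for c, v in unnormalized.items()}
    print(posterior)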

Question 4. Consider the Bayesian network in the "Burglar and Earthquake" example in the lecture (slide 6).

  1. If no evidence is observed, are Burglary and Earthquake independent? Verify by numerical calculation.
  2. If we observe Alarm = true, are Burglary and Earthquake independent? Justify your answer by calculating whether the probabilities involved satisfy the definition of conditional independence.
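
A numerical check can be done by enumerating the joint distribution over Burglary, Earthquake, and Alarm (any call variables on the slide marginalize out for these queries). The sketch below is ours and uses the standard AIMA parameter values as an assumption; substitute the CPTs from slide 6 if they differ:

    from itertools import product

    # Independence-check sketch (ours) over Burglary (B), Earthquake (E), Alarm (A).
    # The CPT values below are the standard AIMA numbers and are an ASSUMPTION;
    # replace them with the values from slide 6 if they differ.
    P_B = 0.001
    P_E = 0.002
    P_A = {(True, True): 0.95, (True, False): 0.94,
           (False, True): 0.29, (False, False): 0.001}

    def joint(b, e, a):
        pb = P_B if b else 1 - P_B
        pe = P_E if e else 1 - P_E
        pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
        return pb * pe * pa

    def prob(query, evidence):
        # P(query | evidence); both arguments are dicts over {"B", "E", "A"}.
        num = den = 0.0
        for b, e, a in product([True, False], repeat=3):
            assign = {"B": b, "E": e, "A": a}
            p = joint(b, e, a)
            if all(assign[k] == v for k, v in evidence.items()):
                den += p
                if all(assign[k] == v for k, v in query.items()):
                    num += p
        return num / den

    # Part 1: P(B = true) vs. P(B = true | E = true)
    print(prob({"B": True}, {}), prob({"B": True}, {"E": True}))
    # Part 2: P(B = true | A = true) vs. P(B = true | A = true, E = true)
    print(prob({"B": True}, {"A": True}), prob({"B": True}, {"A": True, "E": True}))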

Question 5. Show that any second-order Markov process can be rewritten as a first-order Markov process with an augmented set of state variables. Can this always be done parsimoniously, i.e., without increasing the number of parameters needed to specify the transition model?
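
One standard construction, sketched here as a hint rather than a full answer (the name Y_t is ours): take the augmented state to be the pair of the two most recent state variables, Y_t = (X_{t-1}, X_t). Then, for values of Y_{t+1} consistent with the conditioning,

    P(Y_{t+1} | Y_0, ..., Y_t) = P(X_t, X_{t+1} | X_0, ..., X_t)
                               = P(X_{t+1} | X_0, ..., X_t)      (X_t is fixed by the conditioning)
                               = P(X_{t+1} | X_{t-1}, X_t)       (second-order Markov property),

which depends only on Y_t = (X_{t-1}, X_t), so the augmented process is first-order Markov.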

Question 6. A professor wants to know whether students are getting enough sleep. Each day, the professor observes whether the students sleep in class and whether they have red eyes. The professor has the following domain theory:

Formulate this information as a dynamic Bayesian network that the professor could use to filter or predict from a sequence of observations. Then reformulate it as a hidden Markov model that has only a single observation variable. Give the complete probability tables for the model.
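
A layout sketch (ours, with placeholder numbers to be replaced by the probabilities from the domain theory): the hidden state is EnoughSleep_t, the DBN evidence variables are RedEyes_t and SleepInClass_t (assumed here to be children of EnoughSleep_t only), and the single HMM observation variable is the pair (RedEyes, SleepInClass), whose emission probabilities are the product of the two individual probabilities.

    # Layout sketch (ours); all numeric values are PLACEHOLDERS to be replaced
    # by the probabilities given in the domain theory.
    states = ["enough_sleep", "not_enough_sleep"]
    observations = [(re, sic) for re in (False, True) for sic in (False, True)]

    prior = {"enough_sleep": 0.5, "not_enough_sleep": 0.5}              # placeholder
    transition = {                                                      # placeholder
        "enough_sleep":     {"enough_sleep": 0.8, "not_enough_sleep": 0.2},
        "not_enough_sleep": {"enough_sleep": 0.3, "not_enough_sleep": 0.7},
    }
    p_red_eyes = {"enough_sleep": 0.2, "not_enough_sleep": 0.7}         # placeholder
    p_sleep_in_class = {"enough_sleep": 0.1, "not_enough_sleep": 0.3}   # placeholder

    def obs_prob(state, red_eyes, sleeps_in_class):
        pr = p_red_eyes[state] if red_eyes else 1 - p_red_eyes[state]
        ps = p_sleep_in_class[state] if sleeps_in_class else 1 - p_sleep_in_class[state]
        return pr * ps

    # Emission table of the single-observation HMM: one row per state, one entry
    # per (red_eyes, sleeps_in_class) pair; each row sums to 1.
    emission = {s: {o: obs_prob(s, *o) for o in observations} for s in states}
    print(emission)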