introduction

Description

Assignment due Sunday, March 17, 2024 by 11:00pm


Answer the following questions. You have to upload a PDF file as the primary resource. You can upload any additional file as a secondary resource. Please note that you need to provide clear and detailed explanations for all the solutions that you provide.

Answer the following questions. Upload your answers in a pdf file.

Question 1

KB entails a sentence α (KB |= α) if and only if α is true in every model in which KB is true; that is, M(KB) is a subset of M(α). One way to implement inference is to enumerate all the models and check that α is true in every model in which KB is true.

Assume a simplified version of the problem with breezes and pits. Squares next to pits are breezy, and breezy squares are next to squares with pits.

The agent did not detect a breeze at square [1,1] (column, row). The agent detected a Breeze in [2,1]. Thus, your knowledge base is KB : (¬ B1,1) ∧ (B2,1), where Bx,y is true if there is a breeze in [x,y].

Below you can see all possible models of adjacent pits: A pit is represented as a black cell.

1.1. Surround with a line the possible worlds above that are models of KB

1.2. Consider the sentence α1 = “Square [1,2] does not have a pit.” Surround with a line the possible worlds below that are models of α1.

1.3. Does KB |= α1? Explain your answer

1.4. Consider the sentence α2 = “Square [2,2] does not have a pit.” Surround with a line the possible worlds below that are models of α2.

Question 2:

Assume that you are given the following configuration. Compute the probability P3,1. Each square other than [1,1] contains a pit with a probability of 0.3.

Hint: Use section 12.7 for a similar example.

2.1 What is the evidence?

2.2. Write the formula for the full joint distribution. How many entries are there?

2.2 Use conditional independence to simplify the summation.

Question 3:

Given the network below, calculate the marginal and conditional probabilities P(¬p3), P(p2|¬p3), P(p1|p2, ¬p3) and P(p1|¬p3, p4). Apply inference by enumeration. P(p1) = 0.4, P(p2|p1) = 0.8, P(p3|p2) = 0.2, P(p3|¬p2) = 0.3, P(p4|p2) = 0.8, P(p4|¬p2) = 0.5. Optional: Can you consider the case of using variable elimination?

Assignment Information
Weight:

20%

Learning Outcomes Added
LO1_FundamentalsAI: Identify key concepts relating to various AI techniques.
LO2_ReasoningAI: Apply logic, probabilistic reasoning, and knowledge representation strategies in solving AI problems.

Above are the assignment requirements. Please note that I have completed the assignment; the task I need you to complete is to review everything and fix any mistakes.


Unformatted Attachment Preview

Assignment due Sunday, March 17, 2024 by 11:00pm
Answer the following questions. You have to upload a PDF file as the primary resource. You can
upload any additional file as a secondary resource. Please note that you need to provide clear and
detailed explanations for all the solutions that you provide.
Answer the following questions. Upload your answers in a pdf file.
Question 1
KB entails a sentence α (KB |= α) if and only if α is true in every model in which KB is true; that is, M(KB) is a subset of M(α). One way to implement inference is to enumerate all the models and check that α is true in every model in which KB is true.
Assume a simplified version of the problem with breezes and pits. Squares next to pits are
breezy, and breezy squares are next to squares with pits.
The agent did not detect a breeze at square [1,1] (column, row). The agent detected a Breeze in
[2,1]. Thus, your knowledge base is KB : (¬ B1,1) ∧ (B2,1), where Bx,y is true if there is a
breeze in [x,y].
Below you can see all possible models of adjacent pits: A pit is represented as a black cell.
1.1. Surround with a line the possible worlds above that are models of KB
(1) We can see that there is no breeze in square [1,1], which means that there cannot be a pit in any of the squares adjacent to [1,1].
(2) We can also see that there is a breeze in square [2,1], which indicates that there is a pit in at least one of the squares adjacent to [2,1].
Model 1:
Model 1 cannot be a model of KB because there is a pit in square [1,2]; this contradicts the fact that [1,1] has no breeze.
Model 2:
This is a model of KB because it has a pit in [3,1], which explains the breeze at [2,1], and no pit is adjacent to [1,1].
Model 3:
This model is not valid because it has a pit in [1,2], which is impossible since [1,1] has no breeze.
Model 4:
This model works because there is a pit in [2,2] to cause the breeze in [2,1], and no pit is adjacent to [1,1].
Model 5:
Similar to Model 1, it is not a model of KB, since the pit in [1,2] would result in a breeze at [1,1].
Model 6:
Does not fit, since it lacks a pit adjacent to [2,1] to explain the detected breeze.
Model 7:
Aligns with KB, since the pit is in [3,1]; this produces the breeze at [2,1], and [1,1] is not affected.
1.2. Consider the sentence α1 = “Square [1,2] does not have a pit.” Surround with a line the
possible worlds below that are models of α1.
Model 1:
Correct → the cell at [1,2] is white.
Model 2:
Incorrect → the cell at [1,2] is black.
Model 3:
Correct → the cell at [1,2] is white.
Model 4:
Incorrect → the cell at [1,2] is black.
Model 5:
Correct → the cell at [1,2] is white.
Model 6:
Correct → the cell at [1,2] is white.
Model 7:
Incorrect → the cell at [1,2] is black.
Model 8:
Correct → the cell at [1,2] is white.
These are the possible configurations in which square [1,2] does not contain a pit, i.e. the models of α1.
1.3. Does KB |= α1? Explain your answer
The relationship KB |= α1 holds if, in every model in which KB is true, α1 is also true; that is, every model that satisfies KB must also satisfy α1 for the entailment to hold.

Given that KB : (¬B1,1) ∧ (B2,1), meaning there is no breeze in square [1,1] and there is a breeze in square [2,1], we can conclude:

No breeze in [1,1] → there cannot be a pit in any square adjacent to [1,1], i.e. in [1,2] or [2,1].
Breeze in [2,1] → there must be at least one pit in one of the squares adjacent to [2,1].

Based on this, we consider the possible worlds in which KB is true and check whether α1 is true in all of them. The models of KB are exactly those that have no pit in [1,2] and that have a pit in a square adjacent to [2,1] but not adjacent to [1,1], i.e. in [2,2] or [3,1]. Since every model that satisfies KB has no pit in [1,2], every model of KB also satisfies α1, and therefore KB |= α1.

This conclusion rests on the assumption that no squares outside the part of the grid we consider could influence the truth of KB.
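To double-check this entailment mechanically, here is a minimal Python model-checking sketch. It assumes, as in the standard Wumpus-world figure, that the eight possible worlds range over pits in squares [1,2], [2,2] and [3,1]; the coordinates and helper names are illustrative, not taken from the assignment figure.

```python
from itertools import product

# Assumed frontier squares over which the eight possible worlds vary.
SQUARES = [(1, 2), (2, 2), (3, 1)]

def breezy(square, pits):
    """A square is breezy iff at least one orthogonally adjacent square has a pit."""
    x, y = square
    neighbours = {(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)}
    return any(p in neighbours for p in pits)

def kb_holds(pits):
    """KB: no breeze in [1,1] and a breeze in [2,1]."""
    return (not breezy((1, 1), pits)) and breezy((2, 1), pits)

def alpha1_holds(pits):
    """alpha1: square [1,2] does not contain a pit."""
    return (1, 2) not in pits

# Enumerate all 2^3 = 8 worlds; KB |= alpha iff alpha holds in every model of KB.
worlds = [frozenset(sq for sq, has_pit in zip(SQUARES, bits) if has_pit)
          for bits in product([False, True], repeat=len(SQUARES))]
kb_models = [w for w in worlds if kb_holds(w)]

print("models of KB:", [sorted(w) for w in kb_models])
print("KB |= alpha1:", all(alpha1_holds(w) for w in kb_models))   # expected: True
print("KB |= alpha2:", all((2, 2) not in w for w in kb_models))   # expected: False
```

Under these assumptions the models of KB are exactly those with a pit in [2,2] or [3,1] and none in [1,2], so α1 holds in all of them, while α2 (no pit in [2,2]) does not.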
1.4. Consider the sentence α2 = “Square [2,2] does not have a pit.” Surround with a line the
possible worlds below that are models of α2.
Model 1:
Cell at [2,2] white → model is correct
Model 2:
Cell at [2,2] black → model is incorrect
Model 3:
Cell at [2,2] white → model is correct
Model 4:
Cell at [2,2] black → model is incorrect
Model 5:
Cell at [2,2] white → model is correct
Model 6:
Cell at [2,2] black → model is incorrect
Model 7:
Cell at [2,2] white → model is correct
Model 8:
Cell at [2,2] black → model is incorrect
These are the possible worlds that satisfy α2, i.e. those in which square [2,2] does not contain a pit.
Question 2:
Assume that you are given the following configuration. Compute the probability P3,1. Each
square other than [1,1] contains a pit with a probability of 0.3.
Hint: Use section 12.7 for a similar example.
Given the configuration, and taking into account that each square other than [1,1] contains a pit with probability 0.3, we can compute the probability P3,1. Since the pit variables are (unconditionally) independent of each other, the probability of a pit in a specific square is not influenced by the presence of a pit in any other square. Before conditioning on any breeze evidence, P3,1 is therefore just the given prior probability of 0.3.
2.1 What is the evidence?
Based on the configuration of the grid and how it influences the computation of the probability of a pit being in a specific square, the evidence is:
(1) Each square other than [1,1] contains a pit with probability 0.3.
(2) The observations at the known squares [1,1], [2,1] and [3,1] in the given configuration of the grid.
2.2. Write the formula for the full joint distribution. How many entries are there?
P(D) = ∏ i∈squares P(Di)
P(Di) is either Pi or ¬Pi, depending on whether square i has a pit or not.
Since each square has two possible states (pit or no pit), there are 2^n entries in the full joint distribution over n squares. If we only consider squares [2,1] and [3,1] (square [1,1] is excluded), then n = 2, i.e. 2^2 = 4 entries in the full joint distribution:
(1) Neither [2,1] nor [3,1] has a pit: P(¬P2,1, ¬P3,1) = 0.7 × 0.7 = 0.49
(2) Only [2,1] has a pit: P(P2,1, ¬P3,1) = 0.3 × 0.7 = 0.21
(3) Only [3,1] has a pit: P(¬P2,1, P3,1) = 0.7 × 0.3 = 0.21
(4) Both [2,1] and [3,1] have a pit: P(P2,1, P3,1) = 0.3 × 0.3 = 0.09
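As a sanity check on the number of entries, here is a minimal Python sketch that enumerates the full joint distribution under the assumption above (only squares [2,1] and [3,1] are modelled, each with an independent pit probability of 0.3); the variable names are illustrative.

```python
from itertools import product

PIT_PRIOR = 0.3
SQUARES = ["P21", "P31"]          # hypothetical variable names for [2,1] and [3,1]

joint = {}
for assignment in product([True, False], repeat=len(SQUARES)):
    # Independence: the probability of an assignment is the product of the
    # per-square probabilities (0.3 for a pit, 0.7 for no pit).
    prob = 1.0
    for has_pit in assignment:
        prob *= PIT_PRIOR if has_pit else (1 - PIT_PRIOR)
    joint[assignment] = prob

for assignment, prob in joint.items():
    print(dict(zip(SQUARES, assignment)), round(prob, 2))
print("number of entries:", len(joint))                      # 2^2 = 4
print("entries sum to:", round(sum(joint.values()), 10))     # 1.0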
2.2 Use conditional independence to simplify the summation.
To use conditional independence to simplify the summation over the full joint distribution, we exploit the fact that the presence of a pit in any given square is independent of the presence of a pit in any other square. In other words, the probability of a pit in one square has no influence on the probability of a pit in another, which allows us to treat each pit variable separately. A joint distribution over conditionally independent variables can be expressed as the product of the individual probabilities. For example, if variables A and B are conditionally independent given C:

P(A, B | C) = P(A | C) ⋅ P(B | C)

Going back to our grid, if we want the probability of a specific configuration of pits, we do not need to enumerate all combinations; we can instead multiply the individual probabilities. If Piti is the event that there is a pit in square i and ¬Piti the event that there is none, then for the configuration in which both [2,1] and [3,1] have pits:

P(D) = P(Pit2,1) ⋅ P(Pit3,1)

Note that P(D) here is the probability of one specific configuration of the two squares, not a summation over different configurations. More generally, every term in the summation factors in the same way, with P(Piti) = 0.3 and P(¬Piti) = 0.7 for each square i. When we sum over a square whose value does not affect the query, its factor sums to P(Piti) + P(¬Piti) = 0.3 + 0.7 = 1 and simply drops out of the summation. Summing over all four configurations of [2,1] and [3,1],

P(Pit2,1)⋅P(Pit3,1) + P(¬Pit2,1)⋅P(Pit3,1) + P(Pit2,1)⋅P(¬Pit3,1) + P(¬Pit2,1)⋅P(¬Pit3,1) = 1,

which confirms that the distribution is normalized.
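The following short sketch illustrates the simplification: marginalising the independent pit variables of the other squares out of the summation leaves only the prior of the query square, because each summed-out factor contributes 0.3 + 0.7 = 1. The set of "other" squares used here is an illustrative assumption.

```python
from itertools import product

PIT_PRIOR = 0.3
OTHER_SQUARES = ["P22", "P12"]    # hypothetical squares that get summed out

def prior(has_pit):
    return PIT_PRIOR if has_pit else 1 - PIT_PRIOR

# Brute-force marginal: P(query square has a pit) summed over the other squares,
# with every term factoring into a product of independent priors.
brute_force = 0.0
for assignment in product([True, False], repeat=len(OTHER_SQUARES)):
    term = prior(True)                 # the query square [3,1] has a pit
    for has_pit in assignment:
        term *= prior(has_pit)         # independent factor for each other square
    brute_force += term

# Simplified form: the sum over the other squares factors out and equals 1,
# leaving just the prior of the query square.
simplified = prior(True)

print(round(brute_force, 4), simplified)   # both 0.3
```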
Question 3:
Given the network below, calculate the marginal and conditional probabilities P(¬p3), P(p2|¬p3), P(p1|p2, ¬p3) and P(p1|¬p3, p4). Apply inference by enumeration. P(p1) = 0.4, P(p2|p1) = 0.8, P(p3|p2) = 0.2, P(p3|¬p2) = 0.3, P(p4|p2) = 0.8, P(p4|¬p2) = 0.5. Optional: Can you consider the case of using variable elimination?
Answer:
We are given:
P(p1) = 0.4
P(p2 | p1) = 0.8
P(p3 | p2) = 0.2
P(p3 | ¬p2) = 0.3
P(p4 | p2) = 0.8
P(p4 | ¬p2) = 0.5
From the network figure, P(p2 | ¬p1) = 0.5; this value is used in the calculations below.
Solution:
(1) Computation of P(¬p3): here we apply inference by enumeration (the law of total probability), summing the joint over all values of p1, p2 and p4, weighted by their probabilities:
P(¬p3)
= P(p1) P(p2|p1) P(¬p3|p2) P(p4|p2) + P(p1) P(p2|p1) P(¬p3|p2) P(¬p4|p2)
+ P(p1) P(¬p2|p1) P(¬p3|¬p2) P(p4|¬p2) + P(p1) P(¬p2|p1) P(¬p3|¬p2) P(¬p4|¬p2)
+ P(¬p1) P(p2|¬p1) P(¬p3|p2) P(p4|p2) + P(¬p1) P(p2|¬p1) P(¬p3|p2) P(¬p4|p2)
+ P(¬p1) P(¬p2|¬p1) P(¬p3|¬p2) P(p4|¬p2) + P(¬p1) P(¬p2|¬p1) P(¬p3|¬p2) P(¬p4|¬p2)
Substituting the given values:
= (0.4 × 0.8 × 0.8 × 0.8) + (0.4 × 0.8 × 0.8 × 0.2) + (0.4 × 0.2 × 0.7 × 0.5) + (0.4 × 0.2 × 0.7 × 0.5)
+ (0.6 × 0.5 × 0.8 × 0.8) + (0.6 × 0.5 × 0.8 × 0.2) + (0.6 × 0.5 × 0.7 × 0.5) + (0.6 × 0.5 × 0.7 × 0.5)
= 0.2048 + 0.0512 + 0.028 + 0.028 + 0.192 + 0.048 + 0.105 + 0.105
= 0.762
(2) Computation of P(p2 | ¬p3): we first determine the joint probability of p2 and ¬p3, then apply the definition of conditional probability by dividing by P(¬p3).
P(p2, ¬p3) = Σ P1,P4 P(P1, p2, ¬p3, P4) = Σ P1,P4 P(P1) P(p2|P1) P(¬p3|p2) P(P4|p2)
= P(p1) P(p2|p1) P(¬p3|p2) P(p4|p2) + P(p1) P(p2|p1) P(¬p3|p2) P(¬p4|p2)
+ P(¬p1) P(p2|¬p1) P(¬p3|p2) P(p4|p2) + P(¬p1) P(p2|¬p1) P(¬p3|p2) P(¬p4|p2)
= 0.4 × 0.8 × 0.8 × 0.8 + 0.4 × 0.8 × 0.8 × 0.2 + 0.6 × 0.5 × 0.8 × 0.8 + 0.6 × 0.5 × 0.8 × 0.2
= 0.2048 + 0.0512 + 0.192 + 0.048 = 0.496
Using P(¬p3) = 0.762 from step (1):
P(p2 | ¬p3) = P(p2, ¬p3) / P(¬p3) = 0.496 / 0.762 ≈ 0.6509
(3) Computation of P(p1 | p2, ¬p3): we compute the joint probability of p1, p2 and ¬p3, then divide by the joint probability of p2 and ¬p3:
P(p1, p2, ¬p3) = Σ P4 P(p1, p2, ¬p3, P4) = Σ P4 P(p1) P(p2|p1) P(¬p3|p2) P(P4|p2)
= P(p1) P(p2|p1) P(¬p3|p2) P(p4|p2) + P(p1) P(p2|p1) P(¬p3|p2) P(¬p4|p2)
= 0.4 × 0.8 × 0.8 × 0.8 + 0.4 × 0.8 × 0.8 × 0.2 = 0.2048 + 0.0512 = 0.256
Using P(p2, ¬p3) = 0.496 from step (2):
P(p1 | p2, ¬p3) = P(p1, p2, ¬p3) / P(p2, ¬p3) = 0.256 / 0.496 ≈ 0.5161
(4) Computation of P(p1 | ¬p3, p4): we compute the joint probability of p1, ¬p3 and p4, then divide by the joint probability of ¬p3 and p4:
P(p1, ¬p3, p4) = Σ P2 P(p1, P2, ¬p3, p4) = Σ P2 P(p1) P(P2|p1) P(¬p3|P2) P(p4|P2)
= P(p1) P(p2|p1) P(¬p3|p2) P(p4|p2) + P(p1) P(¬p2|p1) P(¬p3|¬p2) P(p4|¬p2)
= 0.4 × 0.8 × 0.8 × 0.8 + 0.4 × 0.2 × 0.7 × 0.5 = 0.2048 + 0.028 = 0.2328
P(¬p1, ¬p3, p4) = Σ P2 P(¬p1, P2, ¬p3, p4) = Σ P2 P(¬p1) P(P2|¬p1) P(¬p3|P2) P(p4|P2)
= P(¬p1) P(p2|¬p1) P(¬p3|p2) P(p4|p2) + P(¬p1) P(¬p2|¬p1) P(¬p3|¬p2) P(p4|¬p2)
= 0.6 × 0.5 × 0.8 × 0.8 + 0.6 × 0.5 × 0.7 × 0.5 = 0.192 + 0.105 = 0.297
P(¬p3, p4) = P(p1, ¬p3, p4) + P(¬p1, ¬p3, p4) = 0.2328 + 0.297 = 0.5298
P(p1 | ¬p3, p4) = P(p1, ¬p3, p4) / P(¬p3, p4) = 0.2328 / 0.5298 ≈ 0.4394
Conclusion:
P(¬p3) = 0.762, P(p2|¬p3) ≈ 0.6509, P(p1|p2, ¬p3) ≈ 0.5161, P(p1|¬p3, p4) ≈ 0.4394
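As a cross-check of these four results, here is a minimal inference-by-enumeration sketch over the chain p1 → p2 → {p3, p4}. It uses the given CPTs and assumes P(p2|¬p1) = 0.5, the value read off the network figure and used in the calculations above.

```python
from itertools import product

P_P1 = 0.4
P_P2_GIVEN_P1 = {True: 0.8, False: 0.5}   # False entry assumed from the network figure
P_P3_GIVEN_P2 = {True: 0.2, False: 0.3}
P_P4_GIVEN_P2 = {True: 0.8, False: 0.5}

def joint(p1, p2, p3, p4):
    """Full joint probability via the chain rule of the network."""
    prob = P_P1 if p1 else 1 - P_P1
    prob *= P_P2_GIVEN_P1[p1] if p2 else 1 - P_P2_GIVEN_P1[p1]
    prob *= P_P3_GIVEN_P2[p2] if p3 else 1 - P_P3_GIVEN_P2[p2]
    prob *= P_P4_GIVEN_P2[p2] if p4 else 1 - P_P4_GIVEN_P2[p2]
    return prob

def prob(event):
    """Enumerate all 2^4 worlds and sum those consistent with the partial event."""
    total = 0.0
    for values in product([True, False], repeat=4):
        world = dict(zip(["p1", "p2", "p3", "p4"], values))
        if all(world[name] == value for name, value in event.items()):
            total += joint(*values)
    return total

print(round(prob({"p3": False}), 4))                                    # 0.762
print(round(prob({"p2": True, "p3": False}) / prob({"p3": False}), 4))  # 0.6509
print(round(prob({"p1": True, "p2": True, "p3": False})
            / prob({"p2": True, "p3": False}), 4))                      # 0.5161
print(round(prob({"p1": True, "p3": False, "p4": True})
            / prob({"p3": False, "p4": True}), 4))                      # 0.4394
```

On the optional variable-elimination question: when p4 is neither queried nor observed (as in P(¬p3) and P(p2|¬p3)), its factor sums to 1 and can be dropped before enumerating; eliminating such variables early is exactly the saving that variable elimination formalises.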