## Erik J. Olsson

Print publication date: 2005

Print ISBN-13: 9780199279999

Published to Oxford Scholarship Online: July 2005

DOI: 10.1093/0199279993.001.0001


# Appendix C Proofs of Observations

Source: *Against Coherence*. Publisher: Oxford University Press.

Observation 2.1: $P(H/E_1,\ldots,E_w)=\dfrac{1}{1+(n-1)\left(\dfrac{1-i}{i(n-1)}\right)^w}$ in the generalized Huemer model.

Proof: By Bayes's theorem,

$$P(H/E_1,\ldots,E_w)=\frac{P(E_1,\ldots,E_w/H)P(H)}{P(E_1,\ldots,E_w)},$$

which, by generalized conditional independence, equals

$$\frac{P(E_1/H)\cdots P(E_w/H)P(H)}{P(E_1/H)\cdots P(E_w/H)P(H)+P(E_1/\neg H)\cdots P(E_w/\neg H)P(\neg H)}.$$

Since $P(H)=1/n$, $P(E_j/H)=i$, and $P(E_j/\neg H)=(1-i)/(n-1)$ in the generalized Huemer model, this equals

$$\frac{i^w\cdot\frac{1}{n}}{i^w\cdot\frac{1}{n}+\left(\frac{1-i}{n-1}\right)^w\cdot\frac{n-1}{n}}=\frac{1}{1+(n-1)\left(\frac{1-i}{i(n-1)}\right)^w}.$$
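The closed form can be checked numerically. The following Python sketch is an addition to the text, and it presupposes a particular reading of the generalized Huemer model: $n$ equiprobable alternatives, so $P(H)=1/n$; each witness reports truly with probability $i$ and otherwise picks one of the $n-1$ false alternatives at random.

```python
def posterior_direct(i, n, w):
    """P(H / E_1, ..., E_w) computed directly via Bayes's theorem."""
    p_h = 1.0 / n
    like_h = i ** w                          # P(E_1, ..., E_w / H)
    like_not_h = ((1 - i) / (n - 1)) ** w    # P(E_1, ..., E_w / not-H)
    return like_h * p_h / (like_h * p_h + like_not_h * (1 - p_h))

def posterior_closed_form(i, n, w):
    """The closed form stated in Observation 2.1."""
    return 1.0 / (1 + (n - 1) * ((1 - i) / (i * (n - 1))) ** w)

# The two computations agree over a range of parameter values.
for i in (0.3, 0.6, 0.9):
    for n in (2, 5, 10):
        for w in (1, 3, 7):
            assert abs(posterior_direct(i, n, w) - posterior_closed_form(i, n, w)) < 1e-12
```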

Observation 3.1: Given (i)–(viii) in section 3.2.3, $P(H/E_1,E_2)=\dfrac{P(R)+P(H)^2P(U)}{P(R)+P(H)P(U)}.$

Proof: By Bayes's theorem:

$$P(H/E_1,E_2)=\frac{P(E_1,E_2/H)P(H)}{P(E_1,E_2)}.\tag{1}$$

We will now calculate the right-hand side of (1), noting that

$$P(E_1,E_2/H)=P(E_1,E_2/H,R)P(R/H)+P(E_1,E_2/H,U)P(U/H).\tag{2}$$

From (2) and our background assumptions we deduce

$$P(E_1,E_2/H)=P(R)+P(H)^2P(U).\tag{3}$$

Turning to the denominator of (1),

$$P(E_1,E_2)=P(E_1,E_2/H)P(H)+P(E_1,E_2/\neg H)P(\neg H).\tag{4}$$

By (4) and our assumptions,

$$P(E_1,E_2)=P(H)\big(P(R)+P(H)^2P(U)\big)+P(\neg H)P(H)^2P(U)=P(H)\big(P(R)+P(H)P(U)\big).\tag{5}$$

Finally, by combining (3) and (5) we get (after some simplification)

$$P(H/E_1,E_2)=\frac{P(R)+P(H)^2P(U)}{P(R)+P(H)P(U)}.\tag{6}$$
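As a sanity check (an addition, not part of the original proof), the closed form of Observation 3.1 can be recomputed step by step in Python, under the background assumptions used above: reliability is independent of H, a reliable witness reports the truth, an unreliable one reports E with probability P(H), and P(R)+P(U)=1.

```python
def posterior_two_reports(a, b, c):
    """P(H / E1, E2) built up as in the proof; a = P(R), b = P(H), c = P(U)."""
    like_h = a + c * b * b                        # P(E1, E2 / H)
    like_not_h = c * b * b                        # P(E1, E2 / not-H)
    evidence = like_h * b + like_not_h * (1 - b)  # P(E1, E2)
    return like_h * b / evidence

def closed_form(a, b, c):
    """The closed form stated in Observation 3.1."""
    return (a + b * b * c) / (a + b * c)

for b in (0.1, 0.5, 0.9):
    for c in (0.2, 0.7):
        a = 1 - c
        assert abs(posterior_two_reports(a, b, c) - closed_form(a, b, c)) < 1e-12
```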

Observation 3.2: $P(H/E)=P(R)+P(H)P(U)$.

Proof: We first note

$$P(E/H)=P(E/H,R)P(R)+P(E/H,U)P(U)=P(R)+P(H)P(U)\tag{1}$$

$$P(E)=P(E/H)P(H)+P(E/\neg H)P(\neg H)=P(H)\big(P(R)+P(H)P(U)\big)+P(\neg H)P(H)P(U)=P(H).\tag{2}$$

By Bayes's theorem,

$$P(H/E)=\frac{P(E/H)P(H)}{P(E)}=\frac{\big(P(R)+P(H)P(U)\big)P(H)}{P(H)}=P(R)+P(H)P(U).\tag{3}$$
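A corresponding check for the single-report case (an added illustration, on the model assumptions used in the proof above: a reliable witness reports the truth, an unreliable one reports E with probability P(H), and P(R)+P(U)=1):

```python
def posterior_one_report(a, b, c):
    """P(H / E) built up as in the proof; a = P(R), b = P(H), c = P(U)."""
    like_h = a + b * c                            # P(E / H)
    like_not_h = b * c                            # P(E / not-H)
    evidence = like_h * b + like_not_h * (1 - b)  # P(E), which equals P(H)
    assert abs(evidence - b) < 1e-12
    return like_h * b / evidence

# Observation 3.2: the posterior is P(R) + P(H)P(U).
for b in (0.2, 0.5, 0.8):
    for c in (0.3, 0.9):
        a = 1 - c
        assert abs(posterior_one_report(a, b, c) - (a + b * c)) < 1e-12
```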

Observation 3.3: If $P(U)$ and $P(H)$ are non-extreme, then $P(H/E)>P(H)$.

Proof: By Observation 3.2, $P(H/E)=P(R)+P(H)P(U)$. Since $P(R)=1-P(U)$, we have $P(H/E)-P(H)=(1-P(U))(1-P(H))$, which is positive given that $P(U)$ and $P(H)$ are non-extreme.

Observation 3.4: $P(H/E_1,E_2)>P(H/E_1)$.

Proof: Let a=P(R), b=P(H), and c=P(U). We have assumed, as part of the model, that c < 1 and a + c = 1. The statement to be proved follows from these two assumptions given Observations 3.1 and 3.2. What we need to prove is

$$\frac{a+b^2c}{a+bc}>a+bc.$$

This is established as follows:

$$\frac{a+b^2c}{a+bc}>a+bc\iff a+b^2c>(a+bc)^2\iff a+b^2c-(a+bc)^2>0.$$

Using $a+c=1$,

$$a+b^2c-(a+bc)^2=ac-2abc+ab^2c=ac(1-b)^2,$$

which is positive, since $c<1$ ensures $a>0$ and $b$ and $c$ are non-extreme.
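A small numerical sweep (added here as an illustration) confirms Observation 3.4: with $a=P(R)$, $b=P(H)$, $c=P(U)$, $a+c=1$, and $b$, $c$ non-extreme, the posterior after two agreeing reports (Observation 3.1) exceeds the posterior after one (Observation 3.2).

```python
for b in (0.05, 0.3, 0.6, 0.95):
    for c in (0.1, 0.5, 0.9):
        a = 1 - c
        after_two = (a + b * b * c) / (a + b * c)  # Observation 3.1
        after_one = a + b * c                      # Observation 3.2
        assert after_two > after_one
```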

Observation 3.5: If $P(H/E_1)=P(H/E_2)=P(H)$, then $P(H/E_1,E_2)=P(H/E_1)$.

Proof: We first show that $P(H/E_1)=P(H/E_2)=P(H)$ only if $P(U)=1$. By Observation 3.2, $P(H/E_1)=P(H)$ only if $P(R)+P(H)P(U)=P(H)$ which, by algebra, entails $P(U)=1$. Reasoning as in the proof of Observation 3.3, we can now show that $P(U)=1$ entails

$$\frac{P(R)+P(H)^2P(U)}{P(R)+P(H)P(U)}=P(R)+P(H)P(U).$$

By Observation 3.1, the left-hand side of that equality equals $P(H/E_1,E_2)$, and by Observation 3.2 the right-hand side equals $P(H/E_1)$.

Observation 4.1: Suppose (1) $P(E_i/H)=P(E_i)$, (2) $P(E_1,E_2/H)=P(E_1/H)P(E_2/H)$, and (3) $P(E_1,E_2/\neg H)=P(E_1/\neg H)P(E_2/\neg H)$. Then $P(H/E_1,E_2)=P(H)$.

Proof: Note first that (1) entails $P(E_i/\neg H)=P(E_i)$ as well, by the theorem of total probability. Bayes's theorem yields:

$$P(H/E_1,E_2)=\frac{P(E_1,E_2/H)P(H)}{P(E_1,E_2/H)P(H)+P(E_1,E_2/\neg H)P(\neg H)}=\frac{P(E_1)P(E_2)P(H)}{P(E_1)P(E_2)P(H)+P(E_1)P(E_2)P(\neg H)}=P(H),$$

using (2), (3), and (1) in turn.

Observation 4.2: (Tomoji Shogenji 2002) Suppose that report E lacks individual credibility, so that P(H/E)=P(H). Then P(L)=(n−1)P(R), and hence P(L)>P(R) when n>2, and P(L)=P(R) when n=2. Moreover, if P(L)=P(R) and n>2, then P(H/E)>P(H).

Proof: If a witness is a truth-teller, her report will be E if and only if H is actually true. If she is a randomizer, she will report E one out of n times no matter what is actually the case. If she is a liar, she will report E only if it is actually the case that H is false; and if it is not the case that H, she tells E one out of n−1 times. Hence,

$$P(E/\neg H)=\frac{1}{n}P(U)+\frac{1}{n-1}P(L).$$

The probability that a given witness reports E given that H is true is

$$P(E/H)=P(R)+\frac{1}{n}P(U).$$

Let us now assume that P(H/E)=P(H) or, equivalently, P(E/H)=P(E). Since P(E) is a weighted average of P(E/H) and P(E/¬H), it follows that

$$P(R)+\frac{1}{n}P(U)=\frac{1}{n}P(U)+\frac{1}{n-1}P(L)\tag{1}$$

whence

$$P(L)=(n-1)P(R).\tag{2}$$

But P(L)=1−P(R)−P(U), and so

$$1-P(R)-P(U)=(n-1)P(R).\tag{3}$$

It follows from (3) that P(L)=P(R), when n=2, and P(L)>P(R), when n>2. By analogous reasoning, if P(L)=P(R) and n>2, then P(H/E)>P(H).
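The equalities above can be illustrated numerically (an added check, using the likelihoods $P(E/H)=P(R)+P(U)/n$ and $P(E/\neg H)=P(U)/n+P(L)/(n-1)$ derived in the proof):

```python
def likelihoods(p_r, p_u, p_l, n):
    """P(E/H) and P(E/not-H) for a truth-teller/randomizer/liar witness."""
    p_e_given_h = p_r + p_u / n                # truth-teller + randomizer terms
    p_e_given_not_h = p_u / n + p_l / (n - 1)  # randomizer + liar terms
    return p_e_given_h, p_e_given_not_h

# If P(L) = (n - 1) P(R), the likelihoods coincide, so P(H/E) = P(H).
n, p_r = 4, 0.1
p_l = (n - 1) * p_r
p_u = 1 - p_r - p_l
lh, lnh = likelihoods(p_r, p_u, p_l, n)
assert abs(lh - lnh) < 1e-12

# If instead P(L) = P(R) > 0 and n > 2, then P(E/H) > P(E/not-H),
# so the report confirms H.
lh, lnh = likelihoods(0.2, 0.6, 0.2, n)
assert lh > lnh
```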

Observation 4.3: Suppose truth-telling (R), randomization (U), and lying (L) are mutually exclusive and exhaustive hypotheses about the witness's reliability. Then $P(E_1/H,E_2)\approx P(E_1/H)$.

Informal argument: We have

$$P(E_1/H,E_2)=P(E_1/R,H,E_2)P(R/H,E_2)+P(E_1/U,H,E_2)P(U/H,E_2)+P(E_1/L,H,E_2)P(L/H,E_2)$$

and

$$P(E_1/H)=P(E_1/R,H)P(R)+P(E_1/U,H)P(U)+P(E_1/L,H)P(L).$$

The 'liar terms' in these equations will equal 0 since $P(E_1/L,H,E_2)=0$ and $P(E_1/L,H)=0$. Hence,

$$P(E_1/H,E_2)=P(R/H,E_2)+\frac{1}{n}P(U/H,E_2)$$

and

$$P(E_1/H)=P(R)+\frac{1}{n}P(U).$$

Clearly, if the reporter has delivered a true report, that fact should raise the probability of her being reliable and diminish the probability of her being a mere randomizer: $P(R/H,E_2)>P(R)$ and $P(U/H,E_2)<P(U)$. As a consequence, we should expect $P(E_1/H,E_2)\approx P(E_1/H)$. We note that it does not matter whether the lying is coordinated or uncoordinated.

Observation 4.4: Suppose truth-telling (R), randomization (U), and lying (L) are mutually exclusive and exhaustive hypotheses about the witness's reliability. Then $P(E_1/\neg H,E_2)\approx P(E_1/\neg H)$, if the liars are uncoordinated. Moreover, $P(E_1/\neg H,E_2)>P(E_1/\neg H)$, if the liars are coordinated and n is large.

Proof: In general,

$$P(E_1/\neg H,E_2)=P(E_1/R,\neg H,E_2)P(R/\neg H,E_2)+P(E_1/U,\neg H,E_2)P(U/\neg H,E_2)+P(E_1/L,\neg H,E_2)P(L/\neg H,E_2).$$

The truth-teller term equals 0, since $P(E_1/R,\neg H,E_2)=0$, and $P(E_1/U,\neg H,E_2)=\frac{1}{n}$. In the case of coordinated lying, the probability of one lying witness's testifying to the same effect as another lying witness is 1, that is to say, $P(E_1/L,\neg H,E_2)=1$. Hence,

$$P(E_1/\neg H,E_2)=\frac{1}{n}P(U/\neg H,E_2)+P(L/\neg H,E_2).\tag{1}$$

For uncoordinated lying, on the other hand, $P(E_1/L,\neg H,E_2)=\frac{1}{n-1}$, and so

$$P(E_1/\neg H,E_2)=\frac{1}{n}P(U/\neg H,E_2)+\frac{1}{n-1}P(L/\neg H,E_2).\tag{2}$$

Now compare each of these two equations with

$$P(E_1/\neg H)=\frac{1}{n}P(U)+\frac{1}{n-1}P(L).\tag{3}$$

By (2) and (3), $P(E_1/\neg H,E_2)\approx P(E_1/\neg H)$, if the liars are uncoordinated, since although $P(U)>P(U/\neg H,E_2)$, this will be counteracted by the fact that $P(L/\neg H,E_2)>P(L)$.

It remains to be shown that $P(E_1/\neg H,E_2)>P(E_1/\neg H)$, if the liars are coordinated and n is large. Clearly, (3) goes to 0 as n goes to ∞. Let us see what happens to (1) as n goes to ∞. The left-hand term in (1) obviously goes to 0. But what happens to the right-hand term? An application of Bayes's theorem gives

$$P(L/\neg H,E_2)=\frac{P(\neg H,E_2/L)P(L)}{P(\neg H,E_2/R)P(R)+P(\neg H,E_2/U)P(U)+P(\neg H,E_2/L)P(L)}.$$

Since $P(\neg H,E_2/R)P(R)=0$ and $P(\neg H,E_2/U)<P(\neg H,E_2/L)$,

$$P(L/\neg H,E_2)>\frac{P(L)}{P(U)+P(L)}.$$

Hence, while (3), $P(E_1/\neg H)$, goes to 0 as n approaches ∞, (1), $P(E_1/\neg H,E_2)$, approaches a constant greater than 0. We may conclude that (1) is greater than (3) if n is large.
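The limiting behaviour can be illustrated numerically (an added sketch; the priors are arbitrary, with non-zero weight on randomizers and liars). Given ¬H and E2, the posterior over witness types puts weight P(U)/n on U and P(L)/(n−1) on L, so the coordinated expression stays bounded away from 0 while the uncoordinated one vanishes.

```python
def coordinated(p_u, p_l, n):
    """Equation (1): P(E1/not-H, E2) with coordinated liars."""
    w_u = p_u / n        # P(not-H, E2 / U) P(U), up to the common factor P(not-H)
    w_l = p_l / (n - 1)  # P(not-H, E2 / L) P(L), same factor
    p_u_post = w_u / (w_u + w_l)  # P(U / not-H, E2); truth-tellers get weight 0
    p_l_post = w_l / (w_u + w_l)  # P(L / not-H, E2)
    return p_u_post / n + p_l_post

def uncoordinated(p_u, p_l, n):
    """Equation (3): P(E1/not-H)."""
    return p_u / n + p_l / (n - 1)

p_u = p_l = 0.4
for n in (10, 100, 10000):
    # The coordinated value stays above P(L)/(P(U)+P(L)) = 0.5 here,
    # while the uncoordinated one vanishes as n grows.
    assert coordinated(p_u, p_l, n) > 0.5 > uncoordinated(p_u, p_l, n)
```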

Observation 7.1: Suppose that the following hold:

(i) $E_1$ and $E_2$ are independent reports on $A_1$ and $A_2$.

(ii) $P(A_1/E_1)=P(A_1)$ and $P(A_2/E_2)=P(A_2)$.

(iii) $A_1\wedge A_2$, $A_1\wedge\neg A_2$, $\neg A_1\wedge A_2$, and $\neg A_1\wedge\neg A_2$ all have non-zero probability.

Then $P(A_1,A_2/E_1,E_2)=P(A_1,A_2)$.

Proof: By Bayes's theorem,

$$P(A_1,A_2/E_1,E_2)=\frac{P(E_1,E_2/A_1,A_2)P(A_1,A_2)}{P(E_1,E_2)}.\tag{1}$$

By conditional independence, $P(E_1,E_2/A_1,A_2)=P(E_1/A_1)P(E_2/A_2)$. It follows from (ii) and familiar probabilistic facts that $P(E_i/A_i)=P(E_i)$, $i=1,2$. Hence,

$$P(E_1,E_2/A_1,A_2)=P(E_1)P(E_2).\tag{2}$$

By (iii) and the theorem of total probability,

$$P(E_1,E_2)=P(E_1,E_2/A_1,A_2)P(A_1,A_2)+P(E_1,E_2/A_1,\neg A_2)P(A_1,\neg A_2)+P(E_1,E_2/\neg A_1,A_2)P(\neg A_1,A_2)+P(E_1,E_2/\neg A_1,\neg A_2)P(\neg A_1,\neg A_2).$$

By conditional independence, the right-hand side of that equation equals

$$P(E_1/A_1)P(E_2/A_2)P(A_1,A_2)+P(E_1/A_1)P(E_2/\neg A_2)P(A_1,\neg A_2)+P(E_1/\neg A_1)P(E_2/A_2)P(\neg A_1,A_2)+P(E_1/\neg A_1)P(E_2/\neg A_2)P(\neg A_1,\neg A_2).$$

As already noticed, it follows from (ii) that $P(E_1/A_1)=P(E_1)$ and $P(E_2/A_2)=P(E_2)$. It also follows from (ii) that $P(E_2/\neg A_2)=P(E_2)$ and $P(E_1/\neg A_1)=P(E_1)$. Combining all this yields

$$P(E_1,E_2)=P(E_1)P(E_2).\tag{3}$$

It follows from (1), (2), and (3) that $P(A_1,A_2/E_1,E_2)=P(A_1,A_2)$, which ends the proof.
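Observation 7.1 can also be verified by brute force over an explicit joint distribution (an added illustration; all numerical values are arbitrary). The distribution below satisfies (i)–(iii): the four cells of $(A_1,A_2)$ are non-zero, the reports are conditionally independent, and each report is statistically irrelevant to its target.

```python
# Joint distribution over (A1, A2): all four cells non-zero (assumption (iii)).
p_a = {(True, True): 0.2, (True, False): 0.3,
       (False, True): 0.4, (False, False): 0.1}

# Reports without individual credibility: P(Ei / Ai) = P(Ei / not-Ai) = p_ei,
# with E1, E2 conditionally independent given (A1, A2) (assumption (i)).
p_e1, p_e2 = 0.35, 0.6

def joint(a1, a2, e1, e2):
    """P(A1 = a1, A2 = a2, E1 = e1, E2 = e2)."""
    pe1 = p_e1 if e1 else 1 - p_e1
    pe2 = p_e2 if e2 else 1 - p_e2
    return p_a[(a1, a2)] * pe1 * pe2

p_e1e2 = sum(joint(a1, a2, True, True) for (a1, a2) in p_a)
posterior = joint(True, True, True, True) / p_e1e2
assert abs(posterior - p_a[(True, True)]) < 1e-12  # P(A1, A2 / E1, E2) = P(A1, A2)
```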

Observation 8.1: $$P_1(H/E_1)=\langle h_r\rangle=\int_0^1\frac{h}{h+\bar{h}\,\dfrac{h\bar{r}_1}{h+\bar{h}r_1}}\,dr_1=\frac{1+h}{2}$$

Proof: By arithmetic simplification,

$$\frac{h}{h+\bar{h}\,\dfrac{h\bar{r}_1}{h+\bar{h}r_1}}=\frac{h(h+\bar{h}r_1)}{h(h+\bar{h}r_1)+\bar{h}h\bar{r}_1}=\frac{h+\bar{h}r_1}{h+\bar{h}r_1+\bar{h}\bar{r}_1}=h+\bar{h}r_1.\tag{1}$$

In general,

$$\int_0^1(h+\bar{h}r_1)\,dr_1=h+\frac{\bar{h}}{2}.\tag{2}$$

From (1) and (2),

$$P_1(H/E_1)=h+\frac{1-h}{2}=\frac{1+h}{2}.\tag{3}$$
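The integral can be checked numerically (an added sketch): the integrand simplifies pointwise to $h+\bar{h}r_1$, as in step (1), and integrates to $(1+h)/2$.

```python
def integrand(h, r1):
    """The integrand of Observation 8.1, before simplification."""
    hbar, r1bar = 1 - h, 1 - r1
    return h / (h + hbar * (h * r1bar) / (h + hbar * r1))

def integral(h, steps=100000):
    """Midpoint-rule approximation of the integral over r1 in [0, 1]."""
    dr = 1.0 / steps
    return sum(integrand(h, (k + 0.5) * dr) for k in range(steps)) * dr

for h in (0.2, 0.5, 0.8):
    # Pointwise simplification used in step (1) of the proof:
    assert abs(integrand(h, 0.3) - (h + (1 - h) * 0.3)) < 1e-12
    # Value of the integral, as in step (3):
    assert abs(integral(h) - (1 + h) / 2) < 1e-6
```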

Observation 8.2: The function

$$f(h)=\frac{(1+h)^2}{1+3h}$$

takes on its minimum for h=1/3 in the interval h ∈ (0, 1).

Proof: To find the minimum of this function we calculate its derivative with respect to h, set this derivative equal to 0, and solve for h ∈ (0, 1). By arithmetic simplification and differentiation,

$$f'(h)=\frac{2(1+h)(1+3h)-3(1+h)^2}{(1+3h)^2}=\frac{(1+h)(3h-1)}{(1+3h)^2}.$$

We set the derivative equal to 0 and solve for h ∈ (0, 1):

$$(1+h)(3h-1)=0\quad\Longleftrightarrow\quad h=\tfrac{1}{3}\ \text{or}\ h=-1.$$

The only critical point for h ∈ (0, 1) is h=1/3, which can be verified to be a minimum from the sign change of $f'$.
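A final numerical check (added; it takes the function of Observation 8.2 to be $f(h)=(1+h)^2/(1+3h)$, as stated above, which in the model of Observation 8.1 is the posterior after two agreeing reports):

```python
def f(h):
    """f(h) = (1 + h)^2 / (1 + 3h), the function of Observation 8.2."""
    return (1 + h) ** 2 / (1 + 3 * h)

def f_prime(h):
    """Its derivative, as computed in the proof: (1 + h)(3h - 1) / (1 + 3h)^2."""
    return (1 + h) * (3 * h - 1) / (1 + 3 * h) ** 2

assert abs(f_prime(1 / 3)) < 1e-12        # critical point at h = 1/3
assert f_prime(0.1) < 0 < f_prime(0.9)    # f decreases before 1/3, increases after
grid_min = min(f(k / 1000) for k in range(1, 1000))
assert abs(grid_min - f(1 / 3)) < 1e-5    # a grid search agrees; f(1/3) = 8/9
```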