
Re: Carrier's numbers and math in OHJ

Posted: Sun Mar 01, 2015 10:38 pm
by Bernard Muller
to Peter,
No, a Bayesian analysis would have arrived at 5 out of 6 or 83.333%.
But that's not what Carrier would have found with his math. I said he would have found a probability of 20%.
Bayes' Theorem:
P( h | e1, e2, e3 ) =
P( e1, e2, e3 | h ) * P( h ) / ( P( e1, e2, e3 | h ) * P( h ) + P( e1, e2, e3 | ~h ) * P( ~h ) )
But Carrier did not use these equations for calculating the overall result of his sets of consequent odds.
He used a simple multiplication between odds.

Cordially, Bernard

Re: Carrier's numbers and math in OHJ

Posted: Sun Mar 01, 2015 10:42 pm
by Peter Kirby
Bernard Muller wrote:to Peter,
No, a Bayesian analysis would have arrived at 5 out of 6 or 83.333%.
But that's not what Carrier would have found with his math. I said he would have found a probability of 20%.
You are wrong.

You have created a straw man.

Your straw man does not offer a calculation that uses Bayes' theorem correctly.

Carrier uses Bayes' theorem correctly.

You have misunderstood Carrier.
But Carrier did not use these equations for calculating the overall result of his sets of consequent odds.
He used a simple multiplication between odds.
I have already proven the equivalence between Bayes' Theorem, the original form, and "Bayes' Theorem, Odds Form."

The only problem--once again--is that you do not have a proper understanding of what the "Bayes' Theorem, Odds Form" is saying.

The proof follows (again):

I would pay special attention here to the fact that, with "Bayes' Theorem, Odds Form", Carrier is giving an equation for:

P( h | e.b ) / P ( ~h | e.b )

Which is to say, the ratio of [or, the division of] the conditional probability of h under "e.b" (evidence and background) to [or, by] the conditional probability of ~h under "e.b" (evidence and background).

Recall that Bayes' Theorem itself is this:

P( A | B ) = P( B | A ) * P( A ) / P( B )

Now that implies the following two equations (by substitution):

P( h | e ) = P( e | h ) * P( h ) / P( e )
P( ~h | e ) = P( e | ~h ) * P( ~h ) / P( e )

Now if you divide the terms of the first equation by the terms of the second equation, on each side respectively, you get:

P( h | e ) / P( ~h | e ) = P( e | h ) / P( e | ~h ) * P( h ) / P( ~h ) * ( 1 / P( e ) ) / ( 1 / P( e ) )

Which simplifies to:

P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e | h ) / P( e | ~h )

Which, if you then make every probability "conditional to the background evidence," as Carrier does, by appending "b" everywhere, you get his:

P( h | e.b ) / P( ~h | e.b ) = P( h | b ) / P( ~h | b ) * P( e | h.b ) / P( e | ~h.b )


Some choose to make sure that "b" is explicit when writing Bayesian equations, some don't, and some alternate (Carrier appears to alternate).
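The equivalence proven above can also be checked numerically. A minimal Python sketch, with arbitrary illustrative inputs (not Carrier's), showing that the standard form and the odds form of Bayes' theorem give the same posterior:

```python
# Numerical check that the odds form and the standard form of
# Bayes' theorem agree. Inputs are arbitrary illustrative values.
p_h, p_not_h = 0.3, 0.7        # prior P(h), P(~h)
p_e_h, p_e_not_h = 0.6, 0.2    # likelihoods P(e|h), P(e|~h)

# Standard form: P(h|e) = P(e|h)P(h) / (P(e|h)P(h) + P(e|~h)P(~h))
standard = p_e_h * p_h / (p_e_h * p_h + p_e_not_h * p_not_h)

# Odds form: posterior odds r = [P(h)/P(~h)] * [P(e|h)/P(e|~h)],
# then convert the odds r back to a probability with r / (1 + r).
r = (p_h / p_not_h) * (p_e_h / p_e_not_h)
odds_based = r / (1 + r)

print(standard, odds_based)    # both 0.5625 (up to float rounding)
```

Either route gives the same number, which is the whole point of the proof above.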

Re: Carrier's numbers and math in OHJ

Posted: Sun Mar 01, 2015 10:47 pm
by Peter Kirby
For the record, I guess:

Carrier's Inputs - a fortiori ("upper bound")

P( h ) = 0.33333
P( ~h ) = 0.66667

P( e1 | h ) = 0.4608
P( e2 | h ) = 0.72
P( e3 | h ) = 1
P( e4 | h ) = 1

P( e1 | ~h ) = 1
P( e2 | ~h ) = 1
P( e3 | ~h ) = 1
P( e4 | ~h ) = 0.34722

e = e1, e2, e3, e4 [a definition of "e" with "," for intersection (AND)]

Unstated assumption - conditional independence

Concept explained here:

http://pages.cs.wisc.edu/~dyer/cs540/no ... ainty.html
http://www.inf.ed.ac.uk/teaching/course ... -bayes.pdf
http://cs.wellesley.edu/~anderson/writi ... -bayes.pdf


This is an assumption, and I know that Carrier can be criticized for it (and I have done so), since it is left implicit.

The general form of the assumption of conditional independence is:

P(X,Y|Z) = P(X|Z)P(Y|Z)

The assumptions here are:

P( e1, e2, e3, e4 | h ) = P ( e1 | h ) * P( e2 | h ) * P( e3 | h ) * P( e4 | h )
P( e1, e2, e3, e4 | ~h ) = P ( e1 | ~h ) * P( e2 | ~h ) * P( e3 | ~h ) * P( e4 | ~h )

Bayesian Calculation

P( h | e ) = P( e | h ) * P( h ) / P( e )

And its corollary:

P( e ) = P( e | h ) * P( h ) + P( e | ~h ) * P( ~h )

P( h | e ) =
P( h | e1, e2, e3, e4 ) =
P( e1, e2, e3, e4 | h ) * P ( h ) / ( P( e1, e2, e3, e4 | h ) * P ( h ) + P( e1, e2, e3, e4 | ~h ) * P ( ~h ) )

Now:

P( e1, e2, e3, e4 | h ) = P ( e1 | h ) * P( e2 | h ) * P( e3 | h ) * P( e4 | h ) = 0.4608 * 0.72 * 1 * 1 = 0.331776
P( e1, e2, e3, e4 | ~h ) = P ( e1 | ~h ) * P( e2 | ~h ) * P( e3 | ~h ) * P( e4 | ~h ) = 1 * 1 * 1 * 0.34722 = 0.34722

So:

P( h | e ) = 0.331776 * 0.33333 / ( 0.331776 * 0.33333 + 0.34722 * 0.66667 ) = 0.110591 / ( 0.110591 + 0.231481 ) = 0.32330

(With a slight difference from rounding error.) Carrier's own approximation to the third decimal place reaches the same result of 32.3%.
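The whole calculation above can be reproduced in a few lines. A minimal Python sketch using the same inputs (variable names are mine):

```python
# Sketch of the calculation above, using Carrier's a fortiori
# inputs as listed in this post.
from math import prod

p_h, p_not_h = 0.33333, 0.66667

lik_h = [0.4608, 0.72, 1, 1]      # P(e1|h) ... P(e4|h)
lik_not_h = [1, 1, 1, 0.34722]    # P(e1|~h) ... P(e4|~h)

# Conditional independence: P(e|h) is the product of the P(en|h).
p_e_h = prod(lik_h)               # 0.331776
p_e_not_h = prod(lik_not_h)       # 0.34722

posterior = p_e_h * p_h / (p_e_h * p_h + p_e_not_h * p_not_h)
print(round(posterior, 3))        # 0.323
```

This matches Carrier's 32.3% to three decimal places.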

Re: Carrier's numbers and math in OHJ

Posted: Sun Mar 01, 2015 11:23 pm
by Bernard Muller
to Peter,
You are wrong.
You have created a straw man.
Your straw man does not offer a calculation that uses Bayes' theorem correctly.
Are you telling me that if Carrier had a set of three consequent odds, two being 1/1 and the third being 1/4, the overall result would not be 1/1 x 1/1 x 1/4 = 1/4 (which corresponds to a probability of 20%), according to the math he used in the calculation of consequent odds to find the overall result?

With a set of four odds, 288/625, 18/25, 1/1 & 72/25, explain to me how he arrived at an overall result of 373248/390625 if not by multiplications.
My "straw man", Carrier, did not even use Bayes' theorem, just multiplications. So it is not a matter of whether he used Bayes' theorem correctly; he simply did not use it before page 599.

When Carrier finally introduces Bayes' theorem at the bottom of page 598 (with no mention that he used it earlier in the book), it is to calculate the best odds on H (at the top of page 599) and the worst odds on H (at the top of page 600).
Here he used the odds form to calculate the final probabilities from the prior odds and the overall consequent odds. And I said he used that form of Bayes' theorem correctly.

Cordially, Bernard

Re: Carrier's numbers and math in OHJ

Posted: Mon Mar 02, 2015 12:21 am
by Peter Kirby
First, I will repeat what I've said before regarding the relationship of the "odds" column to the "probability" column:
As I've said, what Carrier is doing in the "odds" column under "consequent probabilities" is confusing. Very confusing, really.

Carrier is giving the ratio between the probability of P( en | h ) and P( en | ~h ) with these " X/Y " numbers.

And Carrier is -- for whatever reason, perhaps he knows why -- choosing to adjust each proportionally so that the larger is always 100%.

(As I've illustrated above, this doesn't seem to affect the result, but it's still odd.)

So if the "X" is bigger than or equal to the "Y", then that means that he's going to write "100%" in that column -- which he does.

But if the "X" is smaller than the "Y", then his ratio lets him calculate the value assigned in that column by dividing "X/Y" -- which he does.

For example:

Let's say a less-woolly mathematician were giving the consequent probabilities in one row as this:

0.4 ~~ for P( en | h ) ~~~~ and 0.5 ~~ for P( en | ~h )

Carrier doesn't like doing that. No idea why, but he doesn't. He wants the larger number to be 100%, always. So he multiplies them by a common factor to achieve that:

0.8 ~~ for P( en | h ) ~~~~ and 1 ~~ for P( en | ~h )

But Carrier is not done being weird. He wants to express them as a ratio. So, again, he does that.

4 : 5 (80% to the 100%) ~~ for P( en | h ) ~~~~ and 5 : 4 (100% to the 80%) ~~ for P( en | ~h )

And just to be extra confusing, Carrier decides to write that with slashes and not colons.

4/5 (80% to the 100%) ~~ for P( en | h ) ~~~~ and 5/4 (100% to the 80%) ~~ for P( en | ~h )

The only saving grace here is that Carrier demystifies all of this by writing the appropriate probability next to it...

...so long as you don't find it too weird already that he has chosen that one of the two must always be 100%...

At this point I must suppose that Carrier might have explained this somewhere in his two books, but you're right to criticize him for letting it be less than fully clear when we come to that exciting moment when all the math is revealed at the end of this book.
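The rescaling convention described above can be sketched in a few lines of Python. The function name and inputs here are mine, purely illustrative:

```python
# Sketch of the convention described above: rescale a pair of
# consequent probabilities so the larger becomes 1 (i.e. 100%),
# and express their ratio as a fraction, as Carrier's tables do.
from fractions import Fraction

def rescale_pair(p_e_h, p_e_not_h):
    """Return the pair scaled so the larger equals 1, plus the ratio."""
    scale = max(p_e_h, p_e_not_h)
    ratio = (Fraction(p_e_h).limit_denominator(10**6)
             / Fraction(p_e_not_h).limit_denominator(10**6))
    return p_e_h / scale, p_e_not_h / scale, ratio

print(rescale_pair(0.4, 0.5))  # (0.8, 1.0, Fraction(4, 5))
```

So 0.4 and 0.5 become 80% and 100%, with the ratio written 4/5 -- exactly the presentation in the table.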
Bernard Muller wrote:With a set of four odds, 288/625, 18/25, 1/1 & 72/25, explain to me how he arrived at an overall result of 373248/390625 if not by multiplications.
Richard Carrier notes that his "odds form" can cause confusion (while calling the "standard form" "scary," an opinion I don't share):

"The odds form is much simpler to use, but more confusing if you want to convert its result into a probability. The standard form is much scarier, but directly calculates the probability." (On the Historicity of Jesus, p. 598 n. 3)

The two methods of calculation arrive at the same result (in the end, and not one second sooner!) and use the same inputs applied against the same Bayes' theorem, in two of its equivalent forms.

Yes, the calculation of the arithmetical figure involved multiplication.

But: what are 288/625, 18/25, 1/1 & 72/25? What do they refer to, algebraically?

They refer to P( e1 | h ) / P( e1 | ~h ), P( e2 | h ) / P( e2 | ~h ), P( e3 | h ) / P( e3 | ~h), and P( e4 | h ) / P( e4 | ~h ), respectively.

They refer to the ratio of two different conditional probabilities; what you get by dividing them.

And what is 373248/390625? What does it refer to, algebraically?

The answer: it refers to P( e | h ) / P( e | ~h ). Again, a ratio, or division of two conditional probabilities; yes, arrived at by multiplication.

It is used to solve the odds form of Bayes' theorem (getting a ratio), P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e | h ) / P( e | ~h ).

This gives Carrier P( h | e ) / P( ~h | e ), a ratio, which is 1/2 * 373248/390625 or 373248/781250. Let's call it r.

But Carrier's not done. He has to change that ratio into a posterior probability, i.e. into P( h | e ). Let's call it x.

Fortunately, I think you understand that last step.

Because x+(1/r)x = 1, and we know r = 1/2.093 (this ratio), 3.093x = 1, x = 1/3.093, and x = 0.323 = 32.3%.

Carrier gives a long footnote on page 599 explaining the last step anyway.
Bernard Muller wrote:Are you telling me that if Carrier had a set of three consequent odds, two being 1/1 and the third being 1/4, the overall result would not be 1/1 x 1/1 x 1/4 = 1/4 (which corresponds to a probability of 20%), according to the math he used in the calculation of consequent odds to find the overall result?
Hypothetically, if we "had a set of three consequent odds, two being 1/1 and the third one being 1/4," we'd be claiming this (whether Carrier or not--in this case, it is you creating these particular inputs):

P( e' | h ) / P( e' | ~h ) = 1/1 (yes, redundant, I know)
P( e'' | h ) / P( e'' | ~h ) = 1/1
P( e''' | h ) / P( e''' | ~h ) = 1/4

In order to get posterior probabilities, we also need prior probabilities. Let them be equally probable. So we have:

P( h ) / P( ~h ) = 1/1

Now we use the odds form:

P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e | h ) / P( e | ~h )
P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e' | h ) / P( e' | ~h ) * P( e'' | h ) / P( e'' | ~h ) * P( e''' | h ) / P( e''' | ~h )
P( h | e ) / P( ~h | e ) = 1/1 * 1/1 * 1/1 * 1/4 = 1/4

Let this be r, and let P( h | e ) be x.

Because x+(1/r)x = 1, and we know r = 1/4 (this ratio), 5x = 1, x = 1/5, and x = 0.2 = 20%.
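The steps above can be sketched with exact fractions (variable names are mine):

```python
# The hypothetical above in code: multiply the prior odds by each
# likelihood ratio, then convert the posterior odds r into the
# posterior probability x = r / (1 + r). Exact fractions throughout.
from fractions import Fraction

prior_odds = Fraction(1, 1)
likelihood_ratios = [Fraction(1, 1), Fraction(1, 1), Fraction(1, 4)]

r = prior_odds
for lr in likelihood_ratios:
    r *= lr                    # r ends up 1/4

x = r / (1 + r)                # posterior probability P(h|e)
print(r, x)                    # 1/4 1/5  (i.e. 20%)
```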

So, you did go from the inputs to the result correctly. Good job!

Unfortunately, if we go back to your example:
OK, let's say you trust someone at 50% to tell the truth on a particular happening.
(let say he/she saw the suspect attacking the victim but it was from a long distance away, so the 50%)
You trust another person at 50% to tell the truth on the same matter.
(ditto)
These two persons do not know each other.
What are the chances of knowing the truth from these two persons? I say 0.50 + (0.50 x 0.50) = 0.75 => probability = 75%
Carrier would say: the odds for each is 1/1. (1/1 x 1/1) = 1/1 => probability = 50%

If we add a third person with 20% of saying the truth on the same thing (same circumstance but has his vision seriously deficient, so the 20%).
For me, that adds 5% (0.25 x 0.20) to the 75% for probability of 80%.
For Carrier 1/1 x 1/4 = 1/4 => probability of 20%
We can see that you have not understood. The inputs you use are not representative of the situation you describe.

The inputs that I used, based on your description of the above hypothetical example, are more representative of what you are describing. You intend all three witnesses to speak in favor of the hypothesis h, yet you deny that in your "inputs," which say that the events of the individual testimony of the witnesses are equally probable under h and under ~h. This is not consistent. You failed to formulate "inputs" in a remotely similar manner, when going from Mullerian mathematics to Bayesian mathematics.

Fortunately, I've already worked out a better representation of such an example.
h = testimony is accurate, e1 = first witness, e2 = second witness, e3 = third witness

P( e1 | h ) = 1
P( e1 | ~h ) = 0.5

P( e2 | h ) = 1
P( e2 | ~h ) = 0.5

P( e3 | h ) = 1
P( e3 | ~h ) = 0.8

P( h ) = 0.5
P( ~h ) = 0.5
Let's do this again in odds form.

P( e1 | h ) / P( e1 | ~h ) = 1/0.5 = 2/1
P( e2 | h ) / P( e2 | ~h ) = 1/0.5 = 2/1
P( e3 | h ) / P( e3 | ~h ) = 1/0.8 = 5/4

P( h ) / P( ~h ) = 0.5/0.5 = 1/1

Now we use the odds form:

P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e | h ) / P( e | ~h )
P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e1 | h ) / P( e1 | ~h ) * P( e2 | h ) / P( e2 | ~h ) * P( e3 | h ) / P( e3 | ~h )
P( h | e ) / P( ~h | e ) = 1/1 * 2/1 * 2/1 * 5/4 = 20/4 = 5/1

Let this be r, and let P( h | e ) be x, and let P( ~h | e ) be y.

Because x = r * y, and x + y = 1, y + r * y = 1; and since r = 5/1 (this ratio), 6y = 1, y = 1/6.
Because x = 1-y, and y = 1/6, x = 5/6 = 83.333%.

Notice that the same conclusion is reached whether you use the odds form or the standard, probability form:
P( h | e1, e2, e3 ) =
P( e1, e2, e3 | h ) * P( h ) / ( P( e1, e2, e3 | h ) * P( h ) + P( e1, e2, e3 | ~h ) * P( ~h ) )

P( e1, e2, e3 | h ) = P( e1 | h ) * P( e2 | h ) * P( e3 | h ) = 1 * 1 * 1 = 1
P( e1, e2, e3 | ~h ) = P( e1 | ~h ) * P( e2 | ~h ) * P( e3 | ~h ) = 0.5 * 0.5 * 0.8 = 0.2

P( h | e1, e2, e3 ) =
1 * 0.5 / ( 1 * 0.5 + 0.2 * 0.5 ) = 0.5 / 0.6 = 0.83333

The posterior probability of the hypothesis h, given the assumptions above, is thus 5 out of 6 or 83.333%.
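Both computations above can be checked with a short Python sketch (same inputs as in the example; variable names are mine):

```python
# The witness example above, computed in both forms with the same inputs.
from math import prod

p_h, p_not_h = 0.5, 0.5
lik_h = [1, 1, 1]                # P(e1|h), P(e2|h), P(e3|h)
lik_not_h = [0.5, 0.5, 0.8]      # P(e1|~h), P(e2|~h), P(e3|~h)

# Odds form: posterior odds = prior odds times the likelihood ratios.
r = (p_h / p_not_h) * prod(a / b for a, b in zip(lik_h, lik_not_h))
odds_result = r / (1 + r)

# Standard form, assuming conditional independence.
p_e_h = prod(lik_h)              # 1
p_e_not_h = prod(lik_not_h)      # 0.2
std_result = p_e_h * p_h / (p_e_h * p_h + p_e_not_h * p_not_h)

print(round(odds_result, 5), round(std_result, 5))  # 0.83333 0.83333
```

Both routes return 5/6, as derived above.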
Now, you can change the inputs if you like. You can even change them to whatever you want, including what you suggested. But that would just be a display of incomprehension, especially when you then compare it to the Mullerian procedure, using different inputs that have a different reading of the evidence entirely (i.e., one that takes each witness as weighty--something your inputs in this example denied to the "straw man" Bayesian comparison; indeed, you actually inverted things, using inputs that made the third witness improve the posterior probability of ~h instead of h).

Last but not least, your example only matters at all in this conversation if it helps improve your comprehension. Carrier isn't using your example or your inputs. If you want to analyze Carrier's numbers and math in OHJ, you need to analyze his inputs, and in relation to Bayes' theorem. Once you have an understanding, you will no longer have any need of these feckless arguments.

Re: Carrier's numbers and math in OHJ

Posted: Mon Mar 02, 2015 12:44 am
by Peter Kirby
For the record, I guess:

Carrier's Inputs - a fortiori ("upper bound")

P( h ) = 0.33333
P( ~h ) = 0.66667

P( e1 | h ) = 0.4608
P( e2 | h ) = 0.72
P( e3 | h ) = 1
P( e4 | h ) = 1

P( e1 | ~h ) = 1
P( e2 | ~h ) = 1
P( e3 | ~h ) = 1
P( e4 | ~h ) = 0.34722

e = e1, e2, e3, e4 [a definition of "e" with "," for intersection (AND)]

Unstated assumption - conditional independence

Concept explained here:

http://pages.cs.wisc.edu/~dyer/cs540/no ... ainty.html
http://www.inf.ed.ac.uk/teaching/course ... -bayes.pdf
http://cs.wellesley.edu/~anderson/writi ... -bayes.pdf


This is an assumption, and I know that Carrier can be criticized for it (and I have done so), since it is left implicit.

The general form of the assumption of conditional independence is:

P(X,Y|Z) = P(X|Z)P(Y|Z)

The assumptions here are:

P( e1, e2, e3, e4 | h ) = P ( e1 | h ) * P( e2 | h ) * P( e3 | h ) * P( e4 | h )
P( e1, e2, e3, e4 | ~h ) = P ( e1 | ~h ) * P( e2 | ~h ) * P( e3 | ~h ) * P( e4 | ~h )

Bayesian Calculation

P( h | e ) = P( e | h ) * P( h ) / P( e )

And its corollary:

P( e ) = P( e | h ) * P( h ) + P( e | ~h ) * P( ~h )

P( h | e ) =
P( h | e1, e2, e3, e4 ) =
P( e1, e2, e3, e4 | h ) * P ( h ) / ( P( e1, e2, e3, e4 | h ) * P ( h ) + P( e1, e2, e3, e4 | ~h ) * P ( ~h ) )

Now:

P( e1, e2, e3, e4 | h ) = P ( e1 | h ) * P( e2 | h ) * P( e3 | h ) * P( e4 | h ) = 0.4608 * 0.72 * 1 * 1 = 0.331776
P( e1, e2, e3, e4 | ~h ) = P ( e1 | ~h ) * P( e2 | ~h ) * P( e3 | ~h ) * P( e4 | ~h ) = 1 * 1 * 1 * 0.34722 = 0.34722

So:

P( h | e ) = 0.331776 * 0.33333 / ( 0.331776 * 0.33333 + 0.34722 * 0.66667 ) = 0.110591 / ( 0.110591 + 0.231481 ) = 0.32330

(With a slight difference from rounding error.) Carrier's own approximation to the third decimal place reaches the same result of 32.3%.

Postscript-- Odds Form.

Richard Carrier seems to prefer the odds form of Bayes' theorem. Some have been confused by it. First, is it equivalent to the standard form? Second, does applying the odds form arrive at the same result?

First, it is equivalent. Here is the proof.

Carrier's "Bayes' Theorem, Odds Form" gives an equation for:

P( h | e.b ) / P ( ~h | e.b )

Which is to say, the ratio of [or, the division of] the conditional probability of h under "e.b" (evidence and background) to [or, by] the conditional probability of ~h under "e.b" (evidence and background).

Recall that Bayes' Theorem itself is this:

P( A | B ) = P( B | A ) * P( A ) / P( B )

Now that implies the following two equations (by substitution):

P( h | e ) = P( e | h ) * P( h ) / P( e )
P( ~h | e ) = P( e | ~h ) * P( ~h ) / P( e )

Now if you divide the terms of the first equation by the terms of the second equation, on each side respectively, you get:

P( h | e ) / P( ~h | e ) = P( e | h ) / P( e | ~h ) * P( h ) / P( ~h ) * ( 1 / P( e ) ) / ( 1 / P( e ) )

Which simplifies to:

P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e | h ) / P( e | ~h )

Which, if you then make every probability "conditional to the background evidence," as Carrier does, by appending "b" everywhere, you get his:

P( h | e.b ) / P( ~h | e.b ) = P( h | b ) / P( ~h | b ) * P( e | h.b ) / P( e | ~h.b )


Some choose to make sure that "b" is explicit when writing Bayesian equations, some don't, and some alternate (Carrier appears to alternate).

Second, it does arrive at the same result. Here is the demonstration.

Carrier refers to expressions such as "1/2" under prior probabilities and "288/625", "18/25", "1/1", and "72/25" under consequent probabilities. But: what are 288/625, 18/25, 1/1, and 72/25? What do they refer to, algebraically?

They refer to P( e1 | h ) / P( e1 | ~h ), P( e2 | h ) / P( e2 | ~h ), P( e3 | h ) / P( e3 | ~h), and P( e4 | h ) / P( e4 | ~h ), respectively.

They refer to the ratio of two different conditional probabilities; what you get by dividing them.

And what is 373248/390625? What does it refer to, algebraically?

It refers to P( e | h ) / P( e | ~h ). Again, a ratio, or division of two conditional probabilities.

It is used to solve the odds form of Bayes' theorem (getting a ratio), P( h | e ) / P( ~h | e ) = P( h ) / P( ~h ) * P( e | h ) / P( e | ~h ).

This gives Carrier P( h | e ) / P( ~h | e ), a ratio, which is 1/2 * 373248/390625 or 373248/781250. Let's call it r.

But Carrier's not done. He has to change that ratio into a posterior probability, i.e. into P( h | e ). Let's call it x and let P( ~h | e ) be y.

Because r = x/y , y = (1/r)x, and because x + y = 1,

x + (1/r)x = 1, and since we know r = 1/2.093 (this ratio), 3.093x = 1, x = 1/3.093, and x = 0.323 = 32.3%.

Carrier gives a long footnote on page 599 of On the Historicity of Jesus explaining the last step.
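The last step can also be done exactly with rational arithmetic. A minimal Python sketch of the conversion (variable names are mine):

```python
# The conversion above with exact fractions: turn Carrier's posterior
# odds, 1/2 * 373248/390625, into the posterior probability P(h|e).
from fractions import Fraction

r = Fraction(1, 2) * Fraction(373248, 390625)   # posterior odds
x = r / (1 + r)                                 # P(h|e) = r / (1 + r)
print(float(x))                                 # ~0.3233, i.e. 32.3%
```

Here 1/x is about 3.093, which matches the "3.093x = 1" step above.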

Re: Carrier's numbers and math in OHJ

Posted: Mon Mar 02, 2015 1:16 am
by Peter Kirby
Gosh darn it. I didn't get to translate any of Harnack's Apostolikon today. :cry:

G'night.

Re: Carrier's numbers and math in OHJ

Posted: Tue Mar 03, 2015 7:15 am
by perseusomega9
What I learned in this thread is that Carrier's book will not be properly addressed as there does not exist a single biblical scholar that knows maths, let alone stats.

Carrier's numbers and math in OHJ (cont'd)

Posted: Tue Mar 03, 2015 3:31 pm
by Bernard Muller
Carrier's book doesn't rest only on his math (essentially multiplying the odds together and applying p(h) = O(h)/(O(h)+O(~h)) to find the final overall probabilities for h). It rests mostly on Carrier's inputs, which are highly controversial.
For example, that thread discussed extensively Carrier's argument against "seed of David" (Romans 1:3), and for multiple reasons there is no chance that argument is valid to any degree. And with a 0, accepting Carrier's math, the odds for "made from sperm" (OHJ p. 594) become 2/0 (best for historicity) and 1/0 (worst for historicity), which would make the overall result of all consequent & prior odds indicate a 100% probability for historicity, not only for "best odds on H" (p. 599) but also for "worst odds on H" (p. 600).

Cordially, Bernard

Re: Carrier's numbers and math in OHJ

Posted: Wed Mar 04, 2015 2:33 am
by GakuseiDon
Peter Kirby wrote:For the record, I guess:

Carrier's Inputs - a fortiori ("upper bound")

P( h ) = 0.33333
P( ~h ) = 0.66667

P( e1 | h ) = 0.4608
P( e2 | h ) = 0.72
P( e3 | h ) = 1
P( e4 | h ) = 1
<snipped>
Peter, just a note to thank you for taking the time to give those examples. I learned a lot from them (though I can't say I understand everything involved). For example, I was mystified by how Carrier was converting odds into "100%"s in one column, but your explanation makes sense. Thanks for your notes in this thread! :notworthy: