The Objective Bayesian Probability that an Unknown Positive Real Variable Is Greater Than a Known Is 1/2

DC Field | Value | Language
dc.contributor.author | Fiorillo, Christopher D. | ko
dc.contributor.author | Kim, Sunil L. | ko
dc.date.accessioned | 2021-04-26T02:50:14Z | -
dc.date.available | 2021-04-26T02:50:14Z | -
dc.date.created | 2021-04-26 | -
dc.date.issued | 2021-03 | -
dc.identifier.citation | PHILOSOPHIES, v.6, no.1 | -
dc.identifier.issn | 2409-9287 | -
dc.identifier.uri | http://hdl.handle.net/10203/282539 | -
dc.description.abstract | If there are two dependent positive real variables x1 and x2, and only x1 is known, what is the probability that x2 is larger versus smaller than x1? There is no uniquely correct answer according to "frequentist" and "subjective Bayesian" definitions of probability. Here we derive the answer given the "objective Bayesian" definition developed by Jeffreys, Cox, and Jaynes. We declare the standard distance metric in one dimension, d(A, B) ≡ |A - B|, and the uniform prior distribution, as axioms. If neither variable is known, P(x2 < x1) = P(x2 > x1). This appears obvious, since the state spaces x2 < x1 and x2 > x1 have equal size. However, if x1 is known and x2 unknown, there are infinitely more numbers in the space x2 > x1 than x2 < x1. Despite this asymmetry, we prove P(x2 < x1 | x1) = P(x2 > x1 | x1), so that x1 is the median of p(x2 | x1), and x1 is statistically independent of ratio x2/x1. We present three proofs that apply to all members of a set of distributions. Each member is distinguished by the form of dependence between variables implicit within a statistical model (gamma, Gaussian, etc.), but all exhibit two symmetries in the joint distribution p(x1, x2) that are required in the absence of prior information: exchangeability of variables, and non-informative priors over the marginal distributions p(x1) and p(x2). We relate our conclusion to physical models of prediction and intelligence, where the known 'sample' could be the present internal energy within a sensor, and the unknown the energy in its external sensory cause or future motor effect. | -
dc.language | English | -
dc.publisher | MDPI | -
dc.title | The Objective Bayesian Probability that an Unknown Positive Real Variable Is Greater Than a Known Is 1/2 | -
dc.type | Article | -
dc.identifier.scopusid | 2-s2.0-85112548302 | -
dc.type.rims | ART | -
dc.citation.volume | 6 | -
dc.citation.issue | 1 | -
dc.citation.publicationname | PHILOSOPHIES | -
dc.identifier.doi | 10.3390/philosophies6010024 | -
dc.contributor.localauthor | Fiorillo, Christopher D. | -
dc.contributor.nonIdAuthor | Kim, Sunil L. | -
dc.description.isOpenAccess | Y | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | prediction | -
dc.subject.keywordAuthor | inference | -
dc.subject.keywordAuthor | Bayesian brain | -
dc.subject.keywordAuthor | non-informative prior | -
dc.subject.keywordAuthor | Jeffreys prior | -
dc.subject.keywordAuthor | minimal information | -
dc.subject.keywordAuthor | invariance | -
dc.subject.keywordAuthor | transformation groups | -
dc.subject.keywordAuthor | median | -
dc.subject.keywordAuthor | principle of indifference | -
dc.subject.keywordPlus | PRIOR DISTRIBUTIONS | -
dc.subject.keywordPlus | PRIORS | -
dc.subject.keywordPlus | INFORMATION | -
dc.subject.keywordPlus | INVARIANCE | -
dc.subject.keywordPlus | BRAIN | -
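
The abstract states that when x1 is known, x1 is the median of the predictive distribution p(x2 | x1) and is statistically independent of the ratio x2/x1. A minimal numerical sketch of this claim follows, assuming one illustrative member of the family of models the abstract describes: an exponential likelihood with a Jeffreys 1/theta scale prior, for which the predictive density works out to p(x2 | x1) = x1 / (x1 + x2)^2. This specific model choice and the helper function sample_x2_given_x1 are assumptions made here for illustration, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_x2_given_x1(x1, n):
        """Draw x2 from the predictive p(x2 | x1) = x1 / (x1 + x2)^2.

        This predictive arises from an exponential likelihood with a Jeffreys
        1/theta prior on the scale (an illustrative model choice). Its CDF is
        F(x2) = x2 / (x1 + x2), which is inverted here for sampling.
        """
        u = rng.uniform(size=n)          # u in [0, 1)
        return u * x1 / (1.0 - u)        # inverse-CDF transform

    n = 1_000_000
    for x1 in (0.1, 1.0, 50.0):
        x2 = sample_x2_given_x1(x1, n)
        ratio = x2 / x1
        print(f"x1={x1:6.1f}  P(x2 < x1 | x1) ~ {np.mean(x2 < x1):.3f}  "
              f"median(x2/x1) ~ {np.median(ratio):.3f}")

    # Expected output: P(x2 < x1 | x1) ~ 0.5 and median(x2/x1) ~ 1 for every x1,
    # i.e. x1 is the median of p(x2 | x1) and carries no information about the
    # ratio x2/x1, consistent with the claim in the abstract.

Sampling through the inverse CDF of the predictive sidesteps the improper 1/theta prior, which cannot be sampled from directly; for every value of x1 the printed fraction should be close to 0.5 and the median ratio close to 1.
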
Appears in Collection
BiS-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
