Fragility of asymptotic agreement under Bayesian learning

Under the assumption that individuals know the conditional distributions of signals given the payoff-relevant parameters, existing results conclude that as individuals observe infinitely many signals, their beliefs about the parameters will eventually merge. We first show that these results are fragile when individuals are uncertain about the signal distributions: given any such model, vanishingly small individual uncertainty about the signal distributions can lead to substantial (non-vanishing) differences in asymptotic beliefs. We then characterize the conditions under which a small amount of uncertainty leads only to a small amount of asymptotic disagreement. According to our characterization, this is the case if the uncertainty about the signal distributions is generated by a family with "rapidly-varying tails" (such as the normal or the exponential distributions). However, when this family has "regularly-varying tails" (such as the Pareto, the log-normal, and the t-distributions), a small amount of uncertainty leads to a substantial amount of asymptotic disagreement.

Keywords: asymptotic disagreement, Bayesian learning, merging of opinions.
JEL Classifications: C11, C72, D83.
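For reference, the tail terminology in the abstract can be anchored by the classical (Karamata) notions of regular and rapid variation. The sketch below states only these standard textbook definitions; the paper adapts them, so its exact condition may differ in detail (notably, the abstract groups the log-normal and t-distributions with the Pareto).

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Classical (Karamata) tail definitions -- an assumed anchor for the
% abstract's terminology, not the paper's exact condition.
A density $f$ has \emph{regularly-varying} tails if
\[
  \lim_{x \to \infty} \frac{f(tx)}{f(x)} = t^{-\rho}
  \quad \text{for some } \rho > 0 \text{ and every } t > 0;
\]
for the Pareto density $f(x) = \alpha x^{-\alpha - 1}$ on $x \ge 1$,
the ratio equals $t^{-(\alpha + 1)}$. It has \emph{rapidly-varying}
tails if
\[
  \lim_{x \to \infty} \frac{f(tx)}{f(x)} = 0
  \quad \text{for every } t > 1,
\]
as for the exponential density $f(x) = \lambda e^{-\lambda x}$ or the
normal density.

\end{document}
```

Under the paper's adaptation of these notions, the log-normal and t-distributions fall on the regularly-varying side together with the Pareto, as the abstract indicates.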
Edition | Availability
---|---
Fragility of asymptotic agreement under Bayesian learning (2008, Massachusetts Institute of Technology, Dept. of Economics), in English |
Book Details
Edition Notes
"March 15, 2008."
Includes bibliographical references (p. 41-42).
Abstract in HTML and working paper for download in PDF available via World Wide Web at the Social Science Research Network.