2017
DOI: 10.1111/ecoj.12461
The Power of Bias in Economics Research

Abstract: We investigate two critical dimensions of the credibility of empirical economics research: statistical power and bias. We survey 159 empirical economics literatures that draw upon 64,076 estimates of economic parameters reported in more than 6,700 empirical studies. Half of the research areas have nearly 90% of their results under‐powered. The median statistical power is 18%, or less. A simple weighted average of those reported results that are adequately powered (power ≥ 80%) reveals that nearly 80% of the re…

Cited by 476 publications (507 citation statements)
References 79 publications (153 reference statements)
“…First, the median power in intelligence research seems higher than the median power estimated in neuroscience (8-31%; Button et al., 2013), psychology (between 12% and 44%; Szucs & Ioannidis, 2017; Stanley et al., 2017), behavioral ecology and animal research (13-16% for a small effect and 40-47% for a medium effect; Jennions & Moller, 2003), and economics (18%; Ioannidis, Stanley, & Doucouliagos, 2017), but slightly lower than social-personality research (50% for r = .20; Fraley & Vazire, 2014). Second, we did not find clear trends in effect sizes over time, which might indicate that the field of intelligence research is less susceptible to time-lag biases such as the decline effect or the early-extreme effect.…”
Section: Discussion
confidence: 91%
“…However, the evidence base reported in leading journals is far from ideal. Indeed, leading journals are subject to publication selection bias (leaving some of the evidence base unreported because the estimates are not statistically significant or because they do not conform to researchers' expectations), and there is a worrying inability to replicate key results [2]. The situation in other journals is no better.…”
Section: Discussion Of Pros And Cons
confidence: 99%
“…Empirical studies have adequate statistical power when they can detect a genuine empirical effect; this requires low rates of false negatives (effects that are real but are predicted as absent). A survey of 159 meta-regression analyses of some 6,700 empirical studies in economics found that such research is greatly underpowered: statistical power in economics is typically no more than 18% (compared to the ideal of 80%), with almost half of the 159 empirical economics areas surveyed having 90% or more of the reported results from underpowered studies [2]. Meta-regression analysis increases statistical power by combining the results from…”
Section: Increasing Statistical Power
confidence: 99%
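As a rough illustration of the power figures quoted above, the power of a two-sided, two-sample test can be approximated from the standardized effect size and the per-group sample size. The sketch below is a normal-approximation shortcut in Python; the function name and parameter choices are illustrative assumptions, not taken from the paper.

```python
from scipy.stats import norm

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test for a
    standardized effect size d with n_per_group observations per arm."""
    z_crit = norm.ppf(1 - alpha / 2)          # critical value, e.g. 1.96 at alpha=0.05
    ncp = d * (n_per_group / 2) ** 0.5        # noncentrality parameter of the test statistic
    # Probability the test statistic lands in either rejection region
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)
```

With a medium effect (d = 0.5) and 64 observations per group, this gives roughly the conventional 80% "adequate power" threshold; with a small effect (d = 0.2) and 50 per group, power falls below 20%, in the neighborhood of the 18% median reported for economics.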
“…Several studies from across the social sciences have documented a variety of issues in existing research, including publication bias (Sterling (1959), Rosenthal (1979), Franco, Malhotra, and Simonovits (2014)); specification search (Simmons, Nelson, and Simonsohn (2011), Humphreys, de la Sierra, and van der Windt (2013), Brodeur et al. (2016)); bias and failures to replicate (Open Science Collaboration (2015), Chang and Li (2015), Camerer et al. (2016), Ioannidis, Stanley, and Doucouliagos (2017)); and outright fraud (Simonsohn (2013) and Broockman, Kalla, and Aronow (2015)). 1 This research raises concerns about the validity of empirical work in economics and in social science more generally.…”
Section: Introduction
confidence: 99%