About this item:


Author Notes:

Correspondence: Robert H. Lyles, Department of Biostatistics and Bioinformatics, The Rollins School of Public Health of Emory University, 1518 Clifton Rd. N.E., Atlanta, GA 30322; Phone: 404-727-1310; Fax: 404-727-1370; Email: rlyles@sph.emory.edu


Research Funding:

This work was partially supported by National Institute of Environmental Health Sciences Grant 2R01-ES012458-5, an RC4 grant through the National Institute of Nursing Research (1RC4NR012527-01), National Institutes of Health Grant R01-MH079448-01, and a PHS grant (UL1 RR025008) from the Clinical and Translational Science Award Program, National Center for Research Resources, National Institutes of Health.


Keywords:

  • Absolute risk
  • Bias
  • Bootstrap
  • Logistic regression
  • Maximum likelihood
  • Odds ratio
  • Relative risk

Reducing Bias and Mean Squared Error Associated With Regression-Based Odds Ratio Estimators


Journal Title:

Journal of Statistical Planning and Inference


Volume 142, Number 12, Pages 3235-3241

Type of Work:

Article | Post-print: After Peer Review


Abstract:

Ratio estimators of effect are ordinarily obtained by exponentiating maximum-likelihood estimators (MLEs) of log-linear or logistic regression coefficients. These estimators can display marked positive finite-sample bias, however. We propose a simple correction that removes a substantial portion of the bias due to exponentiation. By combining this correction with bias correction on the log scale, we demonstrate that one achieves complete removal of second-order bias in odds ratio estimators in important special cases. We show how this approach extends to address bias in odds or risk ratio estimators in many common regression settings. We also propose a class of estimators that provide reduced mean bias and squared error, while allowing the investigator to control the risk of underestimating the true ratio parameter. We present simulation studies in which the proposed estimators are shown to exhibit considerable reduction in bias, variance, and mean squared error compared to MLEs. Bootstrapping provides further improvement, including narrower confidence intervals without sacrificing coverage.
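The paper's exact correction is not reproduced on this page, but the underlying idea of exponentiation bias can be illustrated: if the log odds ratio estimator beta-hat is approximately normal with mean beta and variance sigma-squared, then E[exp(beta-hat)] is roughly exp(beta + sigma-squared/2), so the MLE of the odds ratio is biased upward and multiplying by exp(-sigma-squared/2) removes the leading-order bias introduced by exponentiation. A minimal sketch for a 2x2 table, using Woolf's variance formula and hypothetical cell counts (this is an illustrative assumption, not the authors' estimator):

```python
import math

def odds_ratio_estimates(a, b, c, d):
    """Given 2x2 cell counts (a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls), return the MLE of the
    odds ratio and a version with the exponentiation bias reduced.

    beta_hat = log(ad / bc) is approximately N(beta, var) with
    var = 1/a + 1/b + 1/c + 1/d (Woolf's formula), so
    E[exp(beta_hat)] ~= exp(beta + var/2); exp(beta_hat - var/2)
    strips the leading bias due to exponentiation.
    """
    beta_hat = math.log((a * d) / (b * c))      # log odds ratio MLE
    var_hat = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d
    or_mle = math.exp(beta_hat)                 # naive exponentiated MLE
    or_corrected = math.exp(beta_hat - var_hat / 2.0)
    return or_mle, or_corrected

# Hypothetical counts for illustration only
or_mle, or_bc = odds_ratio_estimates(20, 80, 10, 90)
print(or_mle, or_bc)  # the corrected estimate is pulled toward 1
```

Because the variance estimate enters through exp(-var/2), the correction shrinks the naive odds ratio multiplicatively, with a larger shrinkage when cell counts are small and the log-scale variance is large.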

Copyright information:

© 2012 Elsevier B.V. All rights reserved.

This is an Open Access work distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License (http://creativecommons.org/licenses/by-nc-nd/3.0/).

