Determining Negligible Associations in Regression

DOI: 10.20982/tqmp.19.1.p059

Alter, Udi, & Counsell, Alyssa
Pages: 59-83
Keywords: Equivalence testing, negligible effect, linear regression, lack of association
Tools: R Shiny

Psychological research is rife with inappropriate conclusions of “no effect” between predictors and outcome in regression models following statistically nonsignificant results. This approach is methodologically flawed, however: failing to reject the null hypothesis with traditional, difference-based tests does not mean the null is true, and drawing conclusions this way floods the psychological literature with incorrect claims. This paper introduces a methodologically sound alternative: we demonstrate how an equivalence testing approach can be applied to multiple regression (which we refer to here as “negligible effect testing”) to evaluate whether a predictor (measured in standardized or unstandardized units) has a negligible association with the outcome. In the first part of the paper, we evaluate the performance of two equivalence-based techniques and compare them to the traditional, difference-based test via a Monte Carlo simulation study. In the second part, we use examples from the literature to illustrate how researchers can implement the recommended negligible effect testing methods in their own work using open-access, user-friendly tools (the negligible R package and a Shiny app). Finally, we discuss how to report and interpret results from negligible effect testing and provide practical recommendations for best research practices based on the simulation results. All materials, including R code, results, and additional resources, are publicly available on the Open Science Framework (OSF): https://osf.io/w96xe/.
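To make the core idea concrete, the following is a minimal illustrative sketch of an equivalence test (two one-sided tests, TOST) for a single regression slope, testing H0: |β| ≥ δ against H1: |β| < δ. This is not the paper's own implementation (that is provided in the negligible R package and Shiny app); the simulated data, the bound δ = 0.3, and the helper name `tost_slope` are hypothetical choices for illustration only.

```python
# Illustrative sketch of negligible effect (equivalence) testing for a
# simple regression slope via two one-sided t-tests (TOST).
# All names, data, and the bound delta are hypothetical examples.
import numpy as np
from scipy import stats

def tost_slope(x, y, delta):
    """Test H0: |slope| >= delta vs H1: |slope| < delta.

    Rejecting H0 (small p) supports a negligible association.
    """
    n = len(x)
    X = np.column_stack([np.ones(n), x])      # design matrix: intercept + predictor
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    df = n - 2
    sigma2 = resid @ resid / df               # residual variance
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    b = beta[1]
    # Two one-sided t-tests against the equivalence bounds -delta and +delta
    t_lower = (b + delta) / se                # H0: slope <= -delta
    t_upper = (b - delta) / se                # H0: slope >= +delta
    p_lower = stats.t.sf(t_lower, df)
    p_upper = stats.t.cdf(t_upper, df)
    p = max(p_lower, p_upper)                 # both must reject for equivalence
    return b, se, p

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = 0.02 * x + rng.normal(size=500)           # true slope near zero
b, se, p = tost_slope(x, y, delta=0.3)
print(f"slope={b:.3f}, SE={se:.3f}, TOST p={p:.4f}")
```

Note the logic the abstract describes: a nonsignificant difference-based test cannot establish “no effect,” whereas here a small TOST p-value provides positive evidence that the slope lies within the negligible interval (-δ, +δ). Choosing δ is a substantive judgment that must precede the analysis.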


© TQMP