J Appl Behav Anal. 1977 Spring; 10(1): 141–150.
PMCID: PMC1311161

Artifact, bias, and complexity of assessment: the ABCs of reliability

Abstract

Interobserver agreement (also referred to here as “reliability”) is influenced by diverse sources of artifact, bias, and complexity of the assessment procedures. The literature on reliability assessment has frequently focused on the different methods of computing reliability and the circumstances under which these methods are appropriate. Yet, the credence accorded estimates of interobserver agreement, computed by any method, presupposes eliminating sources of bias that can spuriously affect agreement. The present paper reviews evidence pertaining to various sources of artifact and bias, as well as characteristics of assessment that influence interpretation of interobserver agreement. These include reactivity of reliability assessment, observer drift, complexity of response codes and behavioral observations, observer expectancies and feedback, and others. Recommendations are provided for eliminating or minimizing the influence of these factors on interobserver agreement.
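The article itself does not prescribe a particular formula. As a minimal sketch only, the Python code below illustrates two indices commonly discussed in this literature for interval-recorded data: overall percent agreement and Cohen's kappa, a chance-corrected coefficient. The observer records shown are hypothetical and are not drawn from the article.

# Minimal sketch (not from the article): two common indices of interobserver
# agreement for interval-recorded data -- overall percent agreement and
# Cohen's kappa, which corrects for chance agreement.

from collections import Counter

def percent_agreement(obs1, obs2):
    """Proportion of intervals scored identically by the two observers."""
    if len(obs1) != len(obs2):
        raise ValueError("Observers must score the same number of intervals.")
    agreements = sum(a == b for a, b in zip(obs1, obs2))
    return agreements / len(obs1)

def cohens_kappa(obs1, obs2):
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(obs1)
    p_o = percent_agreement(obs1, obs2)
    counts1, counts2 = Counter(obs1), Counter(obs2)
    # Expected chance agreement from each observer's marginal proportions.
    p_e = sum((counts1[c] / n) * (counts2[c] / n)
              for c in counts1.keys() | counts2.keys())
    return (p_o - p_e) / (1 - p_e) if p_e < 1 else 1.0

# Hypothetical interval records: 1 = behavior scored as occurring, 0 = not.
observer_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
observer_b = [1, 0, 0, 1, 0, 0, 1, 1, 1, 1]

print(f"Percent agreement: {percent_agreement(observer_a, observer_b):.2f}")
print(f"Cohen's kappa:     {cohens_kappa(observer_a, observer_b):.2f}")

Because kappa discounts agreement expected by chance, it is typically lower than raw percent agreement; the gap between the two is one simple illustration of how a computing method can inflate or deflate the apparent reliability of the same observations.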

Full text

Full text is available as a scanned copy of the original print version; a printable copy (PDF file, 1.5M) of the complete article is also available.

Articles from Journal of Applied Behavior Analysis are provided here courtesy of Society for the Experimental Analysis of Behavior