Reliability Checks
Before publication, you usually need to provide information about the quality of your data. For this purpose, INTERACT offers the following routines for checking reliability:
Kappa: Inter-Rater Reliability - Compares any type of Event-based text codes, based on our special pair-finding routine. This routine does not strictly compare time sequences; instead, it resolves the problem of gaps and overlaps that are common in most observational data. The number of Events may differ between observer files, and you can specify both the allowed time offset and the required overlap percentage.
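The pairing-plus-Kappa idea described above can be illustrated with a minimal sketch. This is not INTERACT's actual routine: the `Event` class, the greedy `pair_events` matcher, and the `max_offset` / `min_overlap` parameters are hypothetical stand-ins for the configurable offset and overlap percentage mentioned in the text, and the Kappa computed is standard Cohen's kappa over the paired codes.

```python
from dataclasses import dataclass

@dataclass
class Event:
    start: float  # seconds
    end: float
    code: str

def overlap_pct(a: Event, b: Event) -> float:
    # Fraction of the shorter event covered by the temporal intersection.
    inter = max(0.0, min(a.end, b.end) - max(a.start, b.start))
    shorter = min(a.end - a.start, b.end - b.start)
    return inter / shorter if shorter > 0 else 0.0

def pair_events(obs1, obs2, max_offset=1.0, min_overlap=0.8):
    """Greedily match each event of observer 1 to the first unused event
    of observer 2 that lies within the offset and overlap tolerances.
    Unmatched events (gaps) are simply skipped, so the two files may
    contain different numbers of events."""
    pairs, used = [], set()
    for a in obs1:
        for i, b in enumerate(obs2):
            if i in used:
                continue
            if abs(a.start - b.start) <= max_offset and overlap_pct(a, b) >= min_overlap:
                pairs.append((a.code, b.code))
                used.add(i)
                break
    return pairs

def cohens_kappa(pairs):
    # Standard Cohen's kappa over the matched code pairs.
    codes = sorted({c for p in pairs for c in p})
    n = len(pairs)
    p_obs = sum(1 for a, b in pairs if a == b) / n
    p_exp = sum(
        (sum(1 for a, _ in pairs if a == c) / n)
        * (sum(1 for _, b in pairs if b == c) / n)
        for c in codes
    )
    return (p_obs - p_exp) / (1 - p_exp) if p_exp < 1 else 1.0
```

For example, two observer files whose events are shifted by a few tenths of a second still pair up, and only the code labels enter the Kappa computation.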