Description: Comparison of rubrics for evaluating inter-rater kappa (and intra-class correlation) coefficients.png
English: Kappa is a measure of agreement or reliability that corrects for how often ratings would be expected to agree by chance. Like a correlation coefficient, it cannot go above +1.0 or below -1.0. Several authorities have offered "rules of thumb" for interpreting the level of agreement; this figure compares several such rubrics that are widely used in psychiatry and psychology.
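The chance correction described above is, for Cohen's kappa, the formula κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each rater's marginal label frequencies. Below is a minimal sketch in Python assuming two raters labeling the same items; the function name and example data are illustrative, not taken from the figure.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of each rater's
    # marginal probability for that category.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    # Undefined when p_e == 1 (both raters always use one identical label).
    return (p_o - p_e) / (1 - p_e)

# Example: two raters classify ten items as "yes" or "no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(cohens_kappa(a, b))  # p_o = 0.8, p_e = 0.52, kappa ~ 0.58
```

Note how the chance correction pulls the value down: the raters agree on 80% of items, but because agreement near 52% would be expected by chance alone, kappa is only about 0.58, which the compared rubrics would rate anywhere from "fair" to "good".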
You are free:
to share – to copy, distribute and transmit the work
to remix – to adapt the work
Under the following conditions:
attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
share alike – If you remix, transform, or build upon the material, you must distribute your contributions under the same or compatible license as the original.