
Table 2 CCRT, inter-rater reliability

From: Core conflictual relationship theme: the reliability of a simplified scoring procedure

Rater vs rater    Number of Ratings    Wishes    Response from Others    Response from Self
1 vs 2            27                   0.32**    0.44**                  0.31**
1 vs 7            18                   0.30**    0.53**                  0.58**
2 vs 3            16                   0.26*     0.48**                  0.47**
2 vs 6            20                   0.28**    0.60**                  0.40**
2 vs 7            19                   0.29**    0.29**                  0.46**
6 vs 7            15                   0.52**    0.32*                   0.49**

  1. ** p < 0.001, * p < 0.05
  2. Mean values (range): Wishes 0.33 (0.26–0.52); Response from Others 0.44 (0.29–0.60); Response from Self 0.45 (0.31–0.58). Overall mean: 0.41. The interpretation of agreement according to Cohen’s kappa is arbitrary, but 0–0.20 is considered slight, 0.21–0.40 fair, 0.41–0.60 moderate, 0.61–0.80 substantial, and 0.81–1 almost perfect agreement, following Landis and Koch [31].
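The summary figures in note 2 follow directly from the kappa values in the table. As an illustration only (not part of the original article; the dictionary layout and the landis_koch helper are names chosen for this sketch), the following Python snippet recomputes the per-category means and ranges from the tabulated values and maps each mean onto the Landis and Koch [31] agreement bands quoted above.

```python
# Illustrative recomputation of the Table 2 summary statistics (note 2).
# Kappa values are copied from the table; names are chosen for this sketch.

KAPPAS = {
    "Wishes":               [0.32, 0.30, 0.26, 0.28, 0.29, 0.52],
    "Response from Others": [0.44, 0.53, 0.48, 0.60, 0.29, 0.32],
    "Response from Self":   [0.31, 0.58, 0.47, 0.40, 0.46, 0.49],
}

def landis_koch(kappa: float) -> str:
    """Map a Cohen's kappa value onto the Landis and Koch agreement bands."""
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

for category, values in KAPPAS.items():
    mean = sum(values) / len(values)
    print(f"{category}: mean {mean:.2f} "
          f"({min(values):.2f}-{max(values):.2f}) -> {landis_koch(mean)}")

overall = [k for values in KAPPAS.values() for k in values]
print(f"Overall mean: {sum(overall) / len(overall):.2f}")
```

Running the sketch reproduces the figures in note 2: Wishes 0.33 (fair agreement), Response from Others 0.44 and Response from Self 0.45 (both moderate), and an overall mean of 0.41.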