
  • #16
    Originally posted by Lucy Kay:
    Why does Stata only show me 4 outcomes (i.e., 1, 2, 3, 7) when I have 9 health outcomes in my survey?
    As I have already explained in #9, the outcomes are the rating categories, not the number of subjects (health outcomes). Evidently, the raters (experts) chose only 4 out of 9 possible rating categories (1, 2, 3, and 7); Stata cannot know that there are 9 rating categories if only 4 are used. By the way, kappaetc has an option to specify the rating categories. Anyway, I will use a slightly modified version of the sample data that you provided to show what Stata does; I have added one more rating category to test5.

    Code:
    clear
    input subject    test1    test2    test3    test4    test5
    1    2    3    2    3    3
    2    2    2    2    2    3
    3    2    2    2    2    4
    end
    
    kap test1-test5
    which yields

    Code:
    . kap test1-test5
    
    There are 5 raters per subject:
    
             Outcome |    Kappa          Z     Prob>Z
    -----------------+-------------------------------
                   2 |   -0.0500      -0.27    0.6079
                   3 |    0.1477       0.81    0.2092
                   4 |   -0.0714      -0.39    0.6522
    -----------------+-------------------------------
            combined |    0.0278       0.18    0.4274
    Note that there are 3 outcomes (2, 3, and 4) even though there are 5 raters and possibly many more categories to choose from. I will now replicate the kappa value reported for outcome 3.

    Code:
    recode test1-test5 (2 4 = 2)
    list
    kap test1-test5
    which yields

    Code:
    . list
    
         +-------------------------------------------------+
         | subject   test1   test2   test3   test4   test5 |
         |-------------------------------------------------|
      1. |       1       2       3       2       3       3 |
      2. |       2       2       2       2       2       3 |
      3. |       3       2       2       2       2       2 |
         +-------------------------------------------------+
    
    . kap test1-test5
    
    There are 5 raters per subject:
    
    Two-outcomes, multiple raters:
    
             Kappa        Z        Prob>Z
            -----------------------------
            0.1477       0.81      0.2092
    This is what I have also explained earlier: Stata collapses all categories but one, i.e., 2 and 4 are merged while 3 is kept, and then calculates kappa for the two remaining rating categories, 2 vs. 3. By the way, with only two categories, the numeric values of the categories are irrelevant; replacing 2 with 0 and 3 with 42 yields the exact same kappa. I hope you now have a better understanding of what Stata does.
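For anyone who wants to double-check that 0.1477 by hand, the Fleiss formulas are straightforward to apply outside Stata. Below is a small sketch in Python (purely illustrative; this is not what kap runs internally) that reproduces the value from the three-subject example:

```python
def fleiss_kappa(rows):
    """Fleiss' kappa from per-subject category counts.

    rows: one dict per subject, mapping rating category to the
    number of raters who chose it; every subject must have the
    same total number of ratings.
    """
    n = len(rows)                  # number of subjects
    r = sum(rows[0].values())      # raters per subject
    # Observed agreement: average share of agreeing rater pairs
    p_obs = sum(sum(c * (c - 1) for c in row.values())
                for row in rows) / (n * r * (r - 1))
    # Chance agreement from the overall category proportions
    cats = {c for row in rows for c in row}
    p_exp = sum((sum(row.get(c, 0) for row in rows) / (n * r)) ** 2
                for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# The recoded example: per-subject counts of categories 2 and 3
rows = [{2: 2, 3: 3},   # subject 1: ratings 2 3 2 3 3
        {2: 4, 3: 1},   # subject 2: ratings 2 2 2 2 3
        {2: 5, 3: 0}]   # subject 3: ratings 2 2 2 2 2
print(round(fleiss_kappa(rows), 4))  # 0.1477
```

The per-subject agreement shares are 0.4, 0.6, and 1.0, averaging 2/3; chance agreement from the marginal category proportions is about 0.609, and the excess-over-chance ratio gives 0.1477, matching kap's outcome-3 row.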

    Originally posted by Lucy Kay:
    [...] I want to see the kappa for agreement between the 55 expert raters for each of the 9 health outcomes
    I have explained in #13 why I do not find that useful, stated why this holds especially for Fleiss' kappa, and suggested what you could report instead. If you insist on using Stata's kap command to get Fleiss' kappa for single subjects, type

    Code:
    kap test1-test55 in 1
    to get the kappa for your first subject. For the second subject, replace in 1 with in 2, and so on. You could set up a loop but, as I have stated before, this would probably be pointless: your kappa will either be 0, in which case kap will throw an error, or it will be the same value for all 9 subjects.
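The "same value" can in fact be pinned down algebraically: for a single subject rated by r raters, the Fleiss formulas reduce to kappa = -1/(r-1) whenever the raters are not unanimous, so with 55 raters every subject yields -1/54, or about -0.0185. A quick sketch in Python (illustrative only, using the standard Fleiss observed/chance-agreement formulas, not Stata's code):

```python
from fractions import Fraction

def single_subject_fleiss(counts):
    """Fleiss' kappa for one subject, given category counts.

    Undefined (division by zero) if all raters agree, because
    chance agreement then equals 1.
    """
    r = sum(counts)
    # Observed agreement: share of agreeing rater pairs
    p_obs = Fraction(sum(c * (c - 1) for c in counts), r * (r - 1))
    # Chance agreement from this subject's category proportions
    p_exp = sum(Fraction(c, r) ** 2 for c in counts)
    return (p_obs - p_exp) / (1 - p_exp)

# Any non-unanimous split of 55 ratings gives the same kappa:
print(single_subject_fleiss([30, 25]))      # -1/54
print(single_subject_fleiss([20, 20, 15]))  # -1/54
print(float(Fraction(-1, 54)))              # about -0.0185
```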

    I feel that we keep repeating ourselves, so I will bail out at this point. All the best.



    • #17
      Okay, I hear you. So now, I am trying to report observed agreement with
      Code:
      icc test subject occasion, testvalue(0.75) level(90)
      Here is my transformed dataset, where subject is the rater (I have 55 experts rating 9 health outcomes), occasion represents each of the 9 health outcomes, and test is the score each rater gave each health outcome (on a scale from 1 to 9). I set the 0.75 threshold in the testvalue(0.75) part of the command, based on what is considered good agreement, and use level(90) because I only want a one-sided comparison of whether the agreement is above or below 0.75. Yet my code only gives me one combined ICC of 0.388.

      Code:
      . icc test subject occasion, testvalue(0.75) level(90) 
      
      Intraclass correlations
      Two-way random-effects model
      Absolute agreement
      
      Random effects: subject          Number of targets =        55
      Random effects: occasion         Number of raters  =         9
      
      --------------------------------------------------------------
                        test |        ICC       [90% Conf. Interval]
      -----------------------+--------------------------------------
                  Individual |   .3884117       .2956394    .4951765
                     Average |    .851097       .7906875      .89825
      --------------------------------------------------------------
      F test that
        ICC(1)=0.75: F(54.0, 126.0) = 0.23          Prob > F = 1.000
        ICC(k)=0.75: F(54.0, 169.3) = 1.72          Prob > F = 0.005
      
      Note: ICCs estimate correlations between individual measurements
            and between average measurements made on the same target.

      Do you know how I can get the observed agreement for each of the 9 health outcomes?

      Dataset:
      Code:
      subject    occasion    test
      1    1    6
      1    2    5
      1    3    5
      1    4    7
      1    5    7
      1    6    5
      1    7    5
      1    8    8
      1    9    7
      2    1    8
      2    2    6
      2    3    6
      2    4    8
      2    5    8
      2    6    8
      2    7    8
      2    8    7
      2    9    7
      3    1    6
      3    2    5
      3    3    5
      3    4    6
      3    5    7
      3    6    5
      3    7    4
      3    8    6
      3    9    4
      4    1    7
      4    2    6
      4    3    6
      4    4    9
      4    5    7
      4    6    7
      4    7    7
      4    8    9
      4    9    9
      5    1    6
      5    2    7
      5    3    7
      5    4    9
      5    5    9
      5    6    7
      5    7    7
      5    8    9
      5    9    9
      6    1    9
      6    2    6
      6    3    8
      6    4    9
      6    5    9
      6    6    8
      6    7    8
      6    8    8
      6    9    8
      7    1    8
      7    2    7
      7    3    5
      7    4    7
      7    5    7
      7    6    5
      7    7    5
      7    8    7
      7    9    9
      8    1    7
      8    2    2
      8    3    3
      8    4    5
      8    5    5
      8    6    6
      8    7    7
      8    8    5
      8    9    5
      9    1    6
      9    2    5
      9    3    3
      9    4    7
      9    5    8
      9    6    5
      9    7    4
      9    8    5
      9    9    6
      10    1    7
      10    2    3
      10    3    3
      10    4    3
      10    5    3
      10    6    3
      10    7    3
      10    8    3
      10    9    3
      11    1    8
      11    2    7
      11    3    7
      11    4    7
      11    5    7
      11    6    6
      11    7    6
      11    8    6
      11    9    6
      12    1    9
      12    2    7
      12    3    7
      12    4    7
      12    5    5
      12    6    9
      12    7    9
      12    8    7
      12    9    7
      13    1    9
      13    2    9
      13    3    9
      13    4    8
      13    5    8
      13    6    8
      13    7    7
      13    8    9
      13    9    9
      14    1    6
      14    2    3
      14    3    3
      14    4    7
      14    5    6
      14    6    3
      14    7    3
      14    8    6
      14    9    6
      15    1    3
      15    2    5
      15    3    5
      15    4    3
      15    5    3
      15    6    3
      15    7    5
      15    8    6
      15    9    6
      16    1    5
      16    2    5
      16    3    7
      16    4    7
      16    5    8
      16    6    8
      16    7    9
      16    8    9
      16    9    9
      17    1    8
      17    2    2
      17    3    5
      17    4    8
      17    5    5
      17    6    6
      17    7    6
      17    8    9
      17    9    8
      18    1    4
      18    2    4
      18    3    4
      18    4    7
      18    5    4
      18    6    3
      18    7    3
      18    8    7
      18    9    7
      19    1    7
      19    2    3
      19    3    3
      19    4    5
      19    5    3
      19    6    7
      19    7    1
      19    8    5
      19    9    5
      20    1    8
      20    2    7
      20    3    2
      20    4    5
      20    5    6
      20    6    8
      20    7    5
      20    8    6
      20    9    6
      21    1    9
      21    2    7
      21    3    8
      21    4    9
      21    5    9
      21    6    7
      21    7    7
      21    8    9
      21    9    7
      22    1    7
      22    2    4
      22    3    7
      22    4    9
      22    5    9
      22    6    6
      22    7    6
      22    8    9
      22    9    9
      23    1    6
      23    2    4
      23    3    4
      23    4    7
      23    5    4
      23    6    3
      23    7    5
      23    8    7
      23    9    7
      24    1    4
      24    2    4
      24    3    3
      24    4    6
      24    5    5
      24    6    2
      24    7    2
      24    8    7
      24    9    8
      25    1    9
      25    2    7
      25    3    7
      25    4    9
      25    5    9
      25    6    8
      25    7    6
      25    8    8
      25    9    8
      26    1    7
      26    2    7
      26    3    7
      26    4    7
      26    5    7
      26    6    7
      26    7    7
      26    8    9
      26    9    9
      27    1    9
      27    2    8
      27    3    3
      27    4    4
      27    5    4
      27    6    2
      27    7    3
      27    8    5
      27    9    5
      28    1    9
      28    2    5
      28    3    8
      28    4    7
      28    5    7
      28    6    6
      28    7    6
      28    8    6
      28    9    7
      29    1    5
      29    2    4
      29    3    4
      29    4    6
      29    5    3
      29    6    3
      29    7    3
      29    8    7
      29    9    7
      30    1    6
      30    2    7
      30    3    7
      30    4    7
      30    5    8
      30    6    7
      30    7    6
      30    8    7
      30    9    6
      31    1    7
      31    2    4
      31    3    3
      31    4    6
      31    5    8
      31    6    2
      31    7    1
      31    8    5
      31    9    4
      32    1    7
      32    2    6
      32    3    6
      32    4    6
      32    5    8
      32    6    5
      32    7    5
      32    8    6
      32    9    7
      33    1    9
      33    2    6
      33    3    7
      33    4    6
      33    5    9
      33    6    6
      33    7    6
      33    8    7
      33    9    7
      34    1    7
      34    2    3
      34    3    3
      34    4    6
      34    5    6
      34    6    3
      34    7    3
      34    8    6
      34    9    6
      35    1    5
      35    2    6
      35    3    7
      35    4    4
      35    5    4
      35    6    3
      35    7    5
      35    8    5
      35    9    6
      36    1    9
      36    2    6
      36    3    8
      36    4    7
      36    5    7
      36    6    7
      36    7    8
      36    8    9
      36    9    9
      37    1    6
      37    2    5
      37    3    4
      37    4    5
      37    5    5
      37    6    4
      37    7    3
      37    8    4
      37    9    4
      38    1    7
      38    2    5
      38    3    5
      38    4    5
      38    5    5
      38    6    5
      38    7    6
      38    8    5
      38    9    5
      39    1    7
      39    2    6
      39    3    6
      39    4    7
      39    5    5
      39    6    5
      39    7    5
      39    8    8
      39    9    8
      40    1    5
      40    2    5
      40    3    5
      40    4    3
      40    5    7
      40    6    5
      40    7    5
      40    8    3
      40    9    3
      41    1    7
      41    2    4
      41    3    5
      41    4    9
      41    5    8
      41    6    6
      41    7    5
      41    8    9
      41    9    9
      42    1    8
      42    2    3
      42    3    3
      42    4    4
      42    5    8
      42    6    7
      42    7    7
      42    8    5
      42    9    7
      43    1    4
      43    2    2
      43    3    2
      43    4    8
      43    5    7
      43    6    3
      43    7    2
      43    8    8
      43    9    7
      44    1    8
      44    2    7
      44    3    6
      44    4    8
      44    5    7
      44    6    5
      44    7    4
      44    8    6
      44    9    6
      45    1    7
      45    2    7
      45    3    9
      45    4    9
      45    5    7
      45    6    7
      45    7    7
      45    8    9
      45    9    9
      46    1    7
      46    2    6
      46    3    6
      46    4    8
      46    5    5
      46    6    3
      46    7    3
      46    8    5
      46    9    5
      47    1    9
      47    2    5
      47    3    2
      47    4    7
      47    5    7
      47    6    7
      47    7    6
      47    8    7
      47    9    7
      48    1    9
      48    2    5
      48    3    5
      48    4    5
      48    5    5
      48    6    7
      48    7    5
      48    8    7
      48    9    7
      49    1    6
      49    2    7
      49    3    5
      49    4    4
      49    5    7
      49    6    8
      49    7    6
      49    8    4
      49    9    4
      50    1    9
      50    2    2
      50    3    6
      50    4    7
      50    5    9
      50    6    9
      50    7    9
      50    8    7
      50    9    7
      51    1    2
      51    2    2
      51    3    5
      51    4    5
      51    5    2
      51    6    2
      51    7    2
      51    8    4
      51    9    4
      52    1    6
      52    2    6
      52    3    6
      52    4    6
      52    5    7
      52    6    6
      52    7    6
      52    8    9
      52    9    9
      53    1    8
      53    2    6
      53    3    8
      53    4    6
      53    5    8
      53    6    6
      53    7    7
      53    8    7
      53    9    8
      54    1    5
      54    2    4
      54    3    1
      54    4    7
      54    5    8
      54    6    5
      54    7    3
      54    8    7
      54    9    7
      55    1    3
      55    2    4
      55    3    5
      55    4    8
      55    5    6
      55    6    5
      55    7    6
      55    8    8
      55    9    7



        • #19
          I do not understand why you are now switching to ICC. An ICC basically decomposes the total variance into two or more parts. When there is only one target/subject/test/health outcome, there is no variance for that part and, thus, there is no ICC to be calculated. Also, you seem to have the syntax wrong. The output tells you that you have 55 targets and 9 raters. The way you describe your data, it should be the other way round.
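To make the decomposition idea concrete, here is a minimal Python sketch of one common two-way ICC, ICC(A,1) in McGraw and Wong's notation (single measures, absolute agreement), built from the ANOVA mean squares. This illustrates the principle only; it is not a replication of Stata's icc internals:

```python
def icc_a1(x):
    """ICC(A,1): two-way model, absolute agreement, single measures.

    x: one row per target, one column per rater, no missing values.
    Uses McGraw and Wong's mean-squares formulation.
    """
    n, k = len(x), len(x[0])
    grand = sum(map(sum, x)) / (n * k)
    row_means = [sum(row) / k for row in x]
    col_means = [sum(row[j] for row in x) / n for j in range(k)]
    # Mean squares: targets (rows), raters (columns), residual
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    mse = sum((x[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k)) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two raters in perfect agreement on three targets:
print(icc_a1([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

Note that with a single target (n = 1), the target mean square has zero degrees of freedom, which is exactly why no per-outcome ICC can be computed.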

          You can easily get the observed agreement with kappaetc (SSC), as I have shown. If, for whatever reason, you do not want to or cannot download additional software, you will have to calculate observed agreement yourself. I should mention that there is more than one way to define observed agreement among more than two raters. Alternatively, you might consider reporting overall agreement, which would answer the question of how much the experts agree on rating the health outcomes' importance. If you decide to stick with the ICC, make sure the syntax is the way you want it.
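For a hand calculation, one common definition of observed agreement for a single subject is the share of rater pairs that assigned the same category. A small Python sketch under that definition (which may differ from other definitions in the literature):

```python
from collections import Counter

def pairwise_percent_agreement(ratings):
    """Share of rater pairs that assigned the same category
    to one subject."""
    r = len(ratings)
    counts = Counter(ratings)
    # c raters in a category form c*(c-1) ordered agreeing pairs
    return sum(c * (c - 1) for c in counts.values()) / (r * (r - 1))

# Five raters on one subject: two say 2, three say 3
print(pairwise_percent_agreement([2, 3, 2, 3, 3]))  # 0.4
```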



          • #20
            I did that, and it gave me the same Fleiss' kappa of -0.0185 for each of the 9 health outcomes (note: I transformed the original dataset from this discussion thread, not the subsequent ones):

            Code:
            help kappaetc
            
            . use "/Users/juliadocs/Desktop/raterdata2.dta"
            
            . reshape long subject , i(test)
            (note: j = 1 2 3 4 5 6 7 8 9)
            
            Data                               wide   ->   long
            -----------------------------------------------------------------------------
            Number of obs.                       55   ->     495
            Number of variables                  10   ->       3
            j variable (9 values)                     ->   _j
            xij variables:
                     subject1 subject2 ... subject9   ->   subject
            -----------------------------------------------------------------------------
            
            . 
            . rename (test _j subject) (_j subject test)
            
            . reshape wide test , i(subject) j(_j)
            (note: j = 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 
            > 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55)
            
            Data                               long   ->   wide
            -----------------------------------------------------------------------------
            Number of obs.                      495   ->       9
            Number of variables                   3   ->      56
            j variable (55 values)               _j   ->   (dropped)
            xij variables:
                                               test   ->   test1 test2 ... test55
            -----------------------------------------------------------------------------
            
            . bysort subject , rc0 : kappaetc test1-test55
            
            --------------------------------------------------------------------------------------
            -> subject = 1
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       8
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1643         .      .       .          .          .
            Brennan and Prediger |  0.0449         .      .       .          .          .
            Cohen/Conger's Kappa |  0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0533         .      .       .          .          .
            Krippendorff's Alpha |  0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 2
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       8
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1562         .      .       .          .          .
            Brennan and Prediger |  0.0357         .      .       .          .          .
            Cohen/Conger's Kappa | -0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0430         .      .       .          .          .
            Krippendorff's Alpha | -0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 3
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       9
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1374         .      .       .          .          .
            Brennan and Prediger |  0.0295         .      .       .          .          .
            Cohen/Conger's Kappa |  0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0352         .      .       .          .          .
            Krippendorff's Alpha |  0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 4
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       7
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1690         .      .       .          .          .
            Brennan and Prediger |  0.0305         .      .       .          .          .
            Cohen/Conger's Kappa |  0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0382         .      .       .          .          .
            Krippendorff's Alpha |  0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 5
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       8
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1582         .      .       .          .          .
            Brennan and Prediger |  0.0380         .      .       .          .          .
            Cohen/Conger's Kappa |  0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0456         .      .       .          .          .
            Krippendorff's Alpha |  0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 6
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       8
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1475         .      .       .          .          .
            Brennan and Prediger |  0.0257         .      .       .          .          .
            Cohen/Conger's Kappa |  0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0317         .      .       .          .          .
            Krippendorff's Alpha |  0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 7
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       9
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1387         .      .       .          .          .
            Brennan and Prediger |  0.0311         .      .       .          .          .
            Cohen/Conger's Kappa |  0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0369         .      .       .          .          .
            Krippendorff's Alpha | -0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 8
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       7
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1670         .      .       .          .          .
            Brennan and Prediger |  0.0282         .      .       .          .          .
            Cohen/Conger's Kappa |  0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0355         .      .       .          .          .
            Krippendorff's Alpha |  0.0000         .      .       .          .          .
            ------------------------------------------------------------------------------
            
            --------------------------------------------------------------------------------------
            -> subject = 9
            
            Interrater agreement                             Number of subjects =       1
                                                            Ratings per subject =      55
                                                    Number of rating categories =       7
            ------------------------------------------------------------------------------
                                 |   Coef.  Std. Err.    t    P>|t|   [95% Conf. Interval]
            ---------------------+--------------------------------------------------------
               Percent Agreement |  0.1771         .      .       .          .          .
            Brennan and Prediger |  0.0400         .      .       .          .          .
            Cohen/Conger's Kappa | -0.0000         .      .       .          .          .
             Scott/Fleiss' Kappa | -0.0185         .      .       .          .          .
                       Gwet's AC |  0.0491         .      .       .          .          .
            Krippendorff's Alpha |  0.0000         .      .       .          .          .
             ------------------------------------------------------------------------------

            Comment


            • #21
              This is the dataset I used for the kappaetc code:

              Code:
              subject    test1    test2    test3    test4    test5    test6    test7    test8    test9    test10    test11    test12    test13    test14    test15    test16    test17    test18    test19    test20    test21    test22    test23    test24    test25    test26    test27    test28    test29    test30    test31    test32    test33    test34    test35    test36    test37    test38    test39    test40    test41    test42    test43    test44    test45    test46    test47    test48    test49    test50    test51    test52    test53    test54    test55
              1    2    3    2    3    2    3    3    3    2    3    3    3    3    2    1    2    3    2    3    3    3    3    2    2    3    3    3    3    2    2    3    3    3    3    2    3    2    3    3    2    3    3    2    3    3    3    3    3    2    3    1    2    3    2    1
              2    2    2    2    2    3    2    3    1    2    1    3    3    3    1    2    2    1    2    1    3    3    2    2    2    3    3    3    2    2    3    2    2    2    1    2    2    2    2    2    2    2    1    1    3    3    2    2    2    3    1    1    2    2    2    2
              3    2    2    2    2    3    3    2    1    1    1    3    3    3    1    2    3    2    2    1    1    3    3    2    1    3    3    1    3    2    3    1    2    3    1    3    3    2    2    2    2    2    1    1    2    3    2    1    2    2    2    2    2    3    1    2
              4    3    3    2    3    3    3    3    2    3    1    3    3    3    3    1    3    3    3    2    2    3    3    3    2    3    3    2    3    2    3    2    2    2    2    2    3    2    2    3    1    3    2    3    3    3    3    3    2    2    3    2    2    2    3    3
              5    3    3    3    3    3    3    3    2    3    1    3    2    3    2    1    3    2    2    1    2    3    3    2    2    3    3    2    3    1    3    3    3    3    2    2    3    2    2    2    3    3    3    3    3    3    2    3    2    3    3    1    3    3    3    2
              6    2    3    2    3    3    3    2    2    2    1    2    3    3    1    1    3    2    1    3    3    3    2    1    1    3    3    1    2    1    3    1    2    2    1    1    3    2    2    2    2    2    3    1    2    3    1    3    3    3    3    1    2    2    2    2
              7    2    3    2    3    3    3    2    3    2    1    2    3    3    1    2    3    2    1    1    2    3    2    2    1    2    3    1    2    1    2    1    2    2    1    2    3    1    2    2    2    2    3    1    2    3    1    2    2    2    3    1    2    3    1    2
              8    3    3    2    3    3    3    3    2    2    1    2    3    3    2    2    3    3    3    2    2    3    3    3    3    3    3    2    2    3    3    2    2    3    2    2    3    2    2    3    1    3    2    3    2    3    2    3    3    2    3    2    3    3    3    3
              9    3    3    2    3    3    3    3    2    2    1    2    3    3    2    2    3    3    3    2    2    3    3    3    3    3    3    2    3    3    2    2    3    3    2    2    3    2    2    3    1    3    3    3    2    3    2    3    3    2    3    2    3    3    3    3

              Comment


              • #22
                Sorry, I recoded the 1-9 rating scale so that ratings 1-3 were given a score of 1, 4-6 a score of 2, and 7-9 a score of 3, so I didn't confuse the number of health outcomes with the score (which is also on a 1-9 scale).

                Comment


                • #23
                  Perhaps I should have mentioned that kappaetc labels the observed agreement "Percent Agreement". It seems you now have the answer that you were looking for.

                  Note that you might want to use weights for disagreement. If you do, be sure to specify the possible rating categories in the categories() option. The help files discuss these issues.
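As an aside, the "Percent Agreement" that kappaetc reports is just the observed proportion of agreeing rater pairs. For a single subject rated by n raters it can be computed by hand; here is a minimal Python sketch of that formula (illustrative only, with made-up toy ratings, not kappaetc's implementation):

```python
# Observed ("percent") agreement for one subject rated by n raters:
# the proportion of rater pairs that assign the same category,
#   p_a = sum_k n_k * (n_k - 1) / (n * (n - 1)),
# where n_k is the number of raters choosing category k.
from collections import Counter

def percent_agreement(ratings):
    """Proportion of rater pairs assigning the same category."""
    n = len(ratings)
    counts = Counter(ratings)
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Toy example with 4 raters: of the 6 rater pairs, the 3 pairs among the
# three raters who chose category 2 agree, so p_a = 3/6 = 0.5.
print(percent_agreement([2, 2, 2, 3]))  # 0.5
```

Averaging this quantity over subjects gives the overall observed agreement.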

                  Comment


                  • #24
                    Do you know why I can't see the 95% confidence interval for the percent agreement? I tried doing
                    Code:
                    bysort subject , rc0 : kappaetc test1-test55, level (95)
                    as the help file suggests, but that doesn't give me anything.

                    Comment


                    • #25
                      Confidence intervals are a part of statistical inference. Statistical inference is used to generalize our results to a larger population. In inter-rater-agreement studies, there are potentially two larger populations: a subject universe, which in your case would consist of more health outcomes, and a rater population, which in your case would consist of other experts. Because you insist on estimating agreement for N=1 subject, your sample size of subjects is not sufficient for estimating standard errors and, hence, confidence intervals. You could get standard errors conditional on the (sample of) subject(s), using the se(conditional subjects) option. That would allow you to generalize your results to a larger rater/expert population; provided, of course, that your raters/experts represent a random sample of the rater population.
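To make the "conditional on subjects" idea concrete: with the subject(s) held fixed, all sampling variability comes from the raters. A crude way to see this (purely illustrative; kappaetc computes analytic standard errors, and the ratings below are made up) is to bootstrap over raters for a single subject:

```python
# Rough illustration of rater-only uncertainty for a fixed subject:
# resample the raters' ratings with replacement and look at the spread
# of the agreement statistic across resamples.
import random
from collections import Counter

def percent_agreement(ratings):
    """Proportion of rater pairs assigning the same category."""
    n = len(ratings)
    counts = Counter(ratings)
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

random.seed(1)
ratings = [2] * 30 + [3] * 20 + [1] * 5   # hypothetical 55 ratings, one subject
reps = [percent_agreement(random.choices(ratings, k=len(ratings)))
        for _ in range(1000)]
mean = sum(reps) / len(reps)
se = (sum((r - mean) ** 2 for r in reps) / (len(reps) - 1)) ** 0.5
print(round(se, 3))   # bootstrap SE of the agreement, raters treated as random
```

The subject never varies here, which is exactly why nothing can be said about other subjects (health outcomes).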

                      Comment


                      • #26
                        Thanks, the code worked for getting me the confidence interval. I was wondering, what do you mean by "standard errors conditional on the (sample of) subject(s)"? Why does that give me a confidence interval when my sample size was too small to get one using just this code?:
                        Code:
                        bysort subject , rc0 : kappaetc test1-test55

                        Comment


                        • #27
                          Originally posted by Lucy Kay View Post
                          Thanks, the code worked on getting me the confidence interval.
                          As I have tried to explain, there is no such thing as the confidence interval. There are two sources of variance: subjects and raters. The sample size of N=1 subject is insufficient for estimating standard errors; the sample size of N=55 raters is sufficient. Conditioning on the sample of subjects means that you cannot infer anything about agreement among raters for subjects (health outcomes) other than the one(s) that you have observed. In other words, the subject(s) is/are considered the only one(s) of interest. If the raters (experts) are a random sample from a larger rater/expert population, you can generalize agreement to that rater/expert population.
                          Last edited by daniel klein; 04 Jun 2020, 09:26.

                          Comment


                          • #28
                            Oh, I see what you mean now. Applying that option means that I am specifying that the agreement applies only to the health outcomes listed, and that it cannot be inferred for health outcomes beyond the one(s) in question. Thank you so much, I have finally figured out this part of my study and can move on. Your help has been immeasurable and I cannot thank you enough!

                            Comment


                            • #29
                              Okay, I have a question again: why is the output of the kappaetc code in the "Percent Agreement" row rather than the "Scott/Fleiss' Kappa" row?

                              Comment


                              • #30
                                kappaetc usually gives you the observed agreement along with five chance-corrected agreement coefficients. Please show the specific output you are referring to and describe what you do not understand about it.
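All of the chance-corrected coefficients share the form (p_o - p_e) / (1 - p_e) and differ only in how the expected chance agreement p_e is defined. A small Python sketch (toy data, not kappaetc's exact implementation) shows two of them; incidentally, for a single subject the Scott/Fleiss coefficient algebraically reduces to -1/(n-1), which is why every per-subject table above shows -1/54 = -0.0185:

```python
# Chance correction: coefficient = (p_o - p_e) / (1 - p_e), where the
# definition of the expected agreement p_e distinguishes the coefficients.
from collections import Counter

def chance_corrected(p_o, p_e):
    return (p_o - p_e) / (1 - p_e)

ratings = [2, 2, 2, 3]          # one subject, four raters (toy data)
n = len(ratings)
counts = Counter(ratings)
p_o = sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))  # 0.5

q = 3                                         # possible rating categories
bp = chance_corrected(p_o, 1 / q)             # Brennan-Prediger: p_e = 1/q
pi = {k: c / n for k, c in counts.items()}    # marginal proportions
fleiss_pe = sum(p ** 2 for p in pi.values())  # Scott/Fleiss: p_e = sum pi_k^2
fleiss = chance_corrected(p_o, fleiss_pe)

print(round(bp, 4), round(fleiss, 4))  # 0.25 -0.3333  (= -1/(4-1))
```

With one subject, Fleiss' p_e is computed from that subject's own marginal proportions, which forces the coefficient to -1/(n-1) no matter what the raters chose; this is one more reason per-subject kappas are not very informative.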

                                Comment
