
  • How to conduct a k-sample test of proportions

    Hello,

    I would like to know how to conduct a k-sample test of proportions in Stata. A Google search on the matter turned up only some very old posts, and none of them pointed to a program that actually does this.

    More precisely, I have a binary outcome variable y that should be affected by a treatment x.
    I want to see whether the proportion of cases in which y = 1 varies significantly across my 8 groups (including the control group). My data look like this:

    Treatment    Proportion of y = 1
    Group 1      6.8%
    Group 2      26.5%
    Group 3      25.2%
    Group 4      23.1%
    Group 5      31.4%
    Group 6      27.7%
    Group 7      28.8%
    Group 8      28.1%

    Ideally, I would like to see whether there is a significant difference in the proportion of y = 1 between each treatment and the control group, and whether there are differences among the treatments.

    Does anyone know of a simple program that could help me with this, please?

  • #2
    Do you have the denominators? If not, I know of nothing that will do this.



    • #3
      Yes, I have the denominators; I simply presented the data this way to illustrate what I want to do.



      • #4
        One conventional solution would be a Chi-Squared test from -tabulate-. Is this undesirable in your situation? Or, if your problem is that you have frequencies and no raw data, -tabi- is a solution. Regards, Mike
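
        For what it's worth, here is a minimal sketch of both suggestions. It assumes the outcome is the 0/1 variable y and the grouping variable is called treatment (names not given in the original post), and the counts fed to -tabi- are made-up placeholders, not your data.

        Code:
        // With raw data: chi-squared test of association between y and treatment.
        tabulate y treatment, chi2

        // With only a frequency table (no raw data): pass the cell counts to -tabi-.
        // The counts below are placeholders purely for illustration.
        tabi 93 74 75 \ 7 26 25, chi2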



        • #5
          Hello,

          I do have raw data. I did not know that the tabulate command could do that, thank you very much! It allowed me to see whether there was any difference at all between the treatments, which should be sufficient for now. But is there a way to get a table that shows whether each individual difference between the treatments is significant? We would then have a p-value for the difference between treatments 1 and 2, between 1 and 3, 1 and 4, 2 and 3, 2 and 4, and so on.
          I feel like the chi² test here only allows us to draw very general conclusions.

          In order to do what I want, would it be statistically correct to simply exclude all the data except the two treatments I'm interested in (using an "if" qualifier)?

          Finally, if the p-value for the chi-squared test is high, does that mean there is absolutely no effect of the treatment, or could I still find a significant difference between two particular treatments?

          I apologize in advance for the many questions, but I want to be sure that what I do makes sense.
          Last edited by Matthieu Plonquet; 20 Jul 2015, 02:37.



          • #6
            Yes, you can restrict the test to just the pair of treatments of interest. However, some people are concerned about multiple testing in a situation like this. (Googling "experimentwise Type I error" would be helpful. That might also help answer your questions about individual comparisons: it is possible to have a large overall p-value even though the p-value for an individual comparison is small.) That being said, you could test all possible pairs of treatments as follows:

            Code:
            // Assume the variables are named treatment and y.
            tab2 y treatment, chi2                   // overall test across all groups
            levelsof treatment, local(txlist)        // distinct treatment values
            local top = wordcount("`txlist'")
            local topm1 = `top' - 1
            // Run a 2 x 2 chi-squared test for every pair of treatments.
            forval i = 1/`topm1' {
                local ip1 = `i' + 1
                forval j = `ip1'/`top' {
                    local ti = word("`txlist'", `i')
                    local tj = word("`txlist'", `j')
                    tab2 y treatment if inlist(treatment, `ti', `tj'), chi2
                }
            }
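
            If experimentwise error is a worry, one simple (if conservative) option is a Bonferroni correction: compare each pairwise p-value with alpha divided by the number of comparisons, or equivalently multiply each p-value by that number. The sketch below is just one way to do that; it assumes it runs in the same do-file as the code above (so the locals txlist, top, and topm1 are still defined) and uses the fact that -tabulate, chi2- leaves the Pearson p-value in r(p).

            Code:
            local ncomp = `top' * (`top' - 1) / 2    // number of pairwise comparisons
            forval i = 1/`topm1' {
                local ip1 = `i' + 1
                forval j = `ip1'/`top' {
                    local ti = word("`txlist'", `i')
                    local tj = word("`txlist'", `j')
                    // quietly suppresses the table but still stores r(p)
                    quietly tabulate y treatment if inlist(treatment, `ti', `tj'), chi2
                    display "treatments `ti' vs `tj': p = " %6.4f r(p) ///
                        "   Bonferroni-adjusted p = " %6.4f min(1, r(p) * `ncomp')
                }
            }
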
            Regards, Mike



            • #7
              I had never heard of experimentwise Type I error, but that was interesting. I'll be sure to keep it in mind.

              Your code did exactly what I needed. Thank you very much for your help!

