  • Generating variable based on match of two other variables

    Hello,

    I have a dataset that looks as follows:
    ISIN deal_type Year
    CH0012221716 1 2000
    CH0012221716 1 2000
    US0008863096 0 2012
    JP3121980001 1 2013
    JP3121980001 0 2013
    JP3121980001 0 2013
    US0077681049 1 2015
    where deal_type = 1 indicates an M&A deal and deal_type = 0 indicates a firm divestment.

    Now I am trying to generate two variables: one that gives the frequency of M&A activity per firm (firms are identified by the ISIN variable) and per year, and a second that gives the frequency of firm divestment activity, again per firm and per year.

    Essentially, I am trying to achieve the following from the above table:

    Firm 1 (CH0012221716), year 2000: 2 M&A deals

    Firm 2 (US0008863096), year 2012: 1 firm divestment

    Firm 3 (JP3121980001), year 2013: 1 M&A deal, 2 firm divestments

    Firm 4 (US0077681049), year 2015: 1 M&A deal

    Is there a way to create two variables that could display the sum of M&A deals and divestment deals in separate rows?

    Thank you and best regards,

    Sebastian

  • #2
    Code:
    * Example generated by -dataex-. For more info, type help dataex
    clear
    input str12 isin byte deal_type int year
    "CH0012221716" 1 2000
    "CH0012221716" 1 2000
    "US0008863096" 0 2012
    "JP3121980001" 1 2013
    "JP3121980001" 0 2013
    "JP3121980001" 0 2013
    "US0077681049" 1 2015
    end
    
    bys isin year: egen MA= total(deal_type)
    bys isin year: egen divest= total(!deal_type)
    This assumes deal_type is binary and never missing. Otherwise, be specific with

    Code:
    bys isin year: egen MA= total(deal_type==1)
    The negation operator (!) serves as a shorthand for ==0; the fully explicit pair is sketched below.
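
    A minimal sketch of the explicit counts, using the same variable names as in the example above (the capture drop line is only a safeguard in case the commands above have already been run):

    Code:
    capture drop MA divest                             // harmless if the variables do not exist yet
    bys isin year: egen MA = total(deal_type==1)       // M&A deals per firm-year
    bys isin year: egen divest = total(deal_type==0)   // divestments per firm-year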

    Res.:

    Code:
    . l, sepby(isin year)
    
         +----------------------------------------------+
         |         isin   deal_t~e   year   MA   divest |
         |----------------------------------------------|
      1. | CH0012221716          1   2000    2        0 |
      2. | CH0012221716          1   2000    2        0 |
         |----------------------------------------------|
      3. | JP3121980001          0   2013    1        2 |
      4. | JP3121980001          1   2013    1        2 |
      5. | JP3121980001          0   2013    1        2 |
         |----------------------------------------------|
      6. | US0008863096          0   2012    0        1 |
         |----------------------------------------------|
      7. | US0077681049          1   2015    1        0 |
         +----------------------------------------------+


    • #3
      Thank you very much, this already helped a lot!

      Is there also a method that eliminates the doubling/tripling of entries? Eventually, the goal is to end up with the sum of M&A or divestment activity for each firm and each year without duplicate rows, because the counts need to be converted into a sequence.

      Best regards


      • #4
        Then you want collapse:

        See

        Code:
        help collapse
        Code:
        * Example generated by -dataex-. For more info, type help dataex
        clear
        input str12 isin byte deal_type int year
        "CH0012221716" 1 2000
        "CH0012221716" 1 2000
        "US0008863096" 0 2012
        "JP3121980001" 1 2013
        "JP3121980001" 0 2013
        "JP3121980001" 0 2013
        "US0077681049" 1 2015
        end
        
        gen divest=!deal_type
        collapse (sum) MA=deal_type divest, by(isin year)
        Res.:

        Code:
        . l, sep(0)
        
             +-----------------------------------+
             |         isin   year   MA   divest |
             |-----------------------------------|
          1. | CH0012221716   2000    2        0 |
          2. | JP3121980001   2013    1        2 |
          3. | US0008863096   2012    0        1 |
          4. | US0077681049   2015    1        0 |
             +-----------------------------------+
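
        Note that collapse replaces the data in memory with the summary rows. If the original deal-level rows should be kept as well, with the counts attached to each of them, one possible pattern (a sketch, not from this thread; the tempfile name counts is only illustrative) is to collapse a copy and merge the result back:

        Code:
        * assumes the raw deal-level data (as in the dataex above) are in memory
        preserve
            gen divest = !deal_type
            collapse (sum) MA=deal_type divest, by(isin year)
            tempfile counts                    // temporary file holding the firm-year counts
            save `counts'
        restore
        merge m:1 isin year using `counts', nogenerate
        Each original deal row then carries its firm-year MA and divest totals.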


        • #5
          This is exactly what I need! Thank you so much, I appreciate your help!
