Dear all,
I'm relatively new to programming in Mata.
I'm trying to solve a linear optimization problem for multiple observations (individuals). Since q.setCoefficients() apparently accepts only a row vector (not a matrix), while each individual faces different prices in the objective function, I thought I could simply loop over the individuals and save the results in a matrix. For simplicity, I kept the parameters identical across iterations in the example below.
The code works without the "for()" loop, but with the loop it aborts at "q.setCoefficients(c)" with the error "type mismatch: exp.exp: transmorphic found where struct expected".
Code:
mata:
x = J(3, 3, .)                     // one row per individual
for (i=1; i<=3; i++) {
    c = (1, 1, 0)                  // would change with each individual
    Aec = (1, 1, 1 \ 1, -1, 2)
    bec = (5 \ 8)
    lowerbd = (1, 0, 0)
    upperbd = (., 2, .)
    q = LinearProgram()
    q.setCoefficients(c)
    q.setMaxOrMin("min")
    q.setEquality(Aec, bec)
    q.setBounds(lowerbd, upperbd)
    q.optimize()
    x[i,.] = q.parameters()        // store parameters
}
x
st_matrix("coef", x)
end
I looked around in the forum and used Google for similar problems. But I was unable to come up with a solution.
Can anyone help me please? Of course, alternative solutions that allow the coefficients to vary in the objective function would be appreciated too.
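One idea I have considered but could not verify: because interactive Mata compiles the whole loop body before any of it runs, q is still transmorphic at compile time, and the member-function calls fail. Wrapping everything in a function where q is explicitly declared as a class LinearProgram scalar might avoid this. A rough sketch of what I mean (the function name solve_lp is just a placeholder I made up):

Code:
mata:
void solve_lp()
{
    class LinearProgram scalar q
    real matrix x, Aec
    real vector c, bec, lowerbd, upperbd
    real scalar i

    x = J(3, 3, .)
    q = LinearProgram()                // create the object once, with a declared type
    for (i=1; i<=3; i++) {
        c = (1, 1, 0)                  // would change with each individual
        Aec = (1, 1, 1 \ 1, -1, 2)
        bec = (5 \ 8)
        lowerbd = (1, 0, 0)
        upperbd = (., 2, .)
        q.setCoefficients(c)
        q.setMaxOrMin("min")
        q.setEquality(Aec, bec)
        q.setBounds(lowerbd, upperbd)
        q.optimize()
        x[i,.] = q.parameters()        // store the solution for individual i
    }
    st_matrix("coef", x)
}
solve_lp()
end

I am not sure whether this is the intended way to reuse the optimizer object across iterations, so corrections are welcome.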
Thank you,
Sebastian