Dear Statalisters,
I am repeatedly drawing random networks (stored in Mata as square matrices) to estimate the mean "reciprocity" index of a random weighted network with a given size (167 nodes) and density (0.7279), in order to compare it later with the value observed in an empirical network of the same dimensions.
Everybody would agree that a greater number of draws should give me a better estimate of the average random reciprocity of such networks.
However, my way of storing the result of each draw in one vector (to then pass it to Stata and study its mean and confidence interval) is not elegant at all, and it makes Mata throw an error (too many tokens, r(3000)) once I store more than about 150 results.
I know that 150 draws could be enough for what I need, since the distribution is already very close to a Normal one, but I would like to learn a proper way to do this and eventually let it run over 1,000 draws, to avoid criticism of this detail.
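(If I am not mistaken, the Monte Carlo standard error of the estimated mean shrinks with the square root of the number of draws, so going from 150 to 1,000 draws should tighten it by a factor of roughly sqrt(1000/150), about 2.6.)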
This is what I do so far (using some of the nwcommands from Thomas Grund's package):
When I set `loop' > 150, the error appears on the (long) line near the end of the code: mata R = (r_1 \ r_2 \ r_3 \ r_4 \ r_5 \ r_6 \ r_7 \ r_8 ......... )
Code:
nwclear
set more off
local size = 167
local density = 0.7279
local loop = 150
forvalues n = 1/`loop' {
    noi di `n'
    nwrandom `size', density(`density')
    nwreplace random = random*1000
    forvalues i = 1/`size' {
        replace net`i' = net`i'*runiform()
        replace net`i' = round(net`i')
    }
    nwset net*
    nwsummarize network
    nwtomata network, mat(W)
    mata W
    mata : s = sum(W)
    mata Z = W :* (W :< W') + W' :* (W' :< W)
    mata E = sum(Z)
    mata r = E/s
    mata r_`n' = r
    nwdrop random network, attr(net*)
    mata r_`n'
}
set obs `loop'
mata R = (r_1 \ r_2 \ r_3 \ r_4 \ r_5 \ r_6 \ r_7 \ r_8 ......... \ r_150)
capture drop reciprocity
mata : st_addvar("double", "reciprocity")
mata : st_store(., "reciprocity", R)
noi su reciprocity, de
noi kdensity reciprocity, normal
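For the mean and the confidence interval mentioned above, I then plan to run something like the following on the new reciprocity variable (in older Stata versions it would just be -ci reciprocity-):
Code:
ci means reciprocity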
I guess there is a better way to store all the results directly in a single Mata vector, building it iteratively (each iteration would take the previously stored vector and append one more observation), but I did not find a way to do it.
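Something along these lines is roughly what I have in mind (just a sketch of the appending idea, reusing r and `loop' from the code above; I am not sure this is the right Mata syntax):
Code:
mata R = J(0, 1, .)        // start from an empty column vector
forvalues n = 1/`loop' {
    * ... draw the random network and compute r as in the code above ...
    mata R = R \ r         // append this draw's reciprocity to R
}
* then set obs `loop' and use st_addvar()/st_store() as before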
Thanks to all,
Charlie