Part of a program I'm writing needs a parsing function that separates comma-delimited data requests. I have been able to write a function that produces a local for each request, but I understand there is a simpler way using -tokenget()- or -tokengetall()-. The trouble with that approach is that the advantage of getting back a string vector is outweighed by my not knowing how (or why) to call those functions from the containing program. I've read the -tokenget()- section of the Mata manual, but I haven't been able to reproduce the (tokens[1], tokens[2], ...) result it describes.
From the code below, the data strings I need are simply the locals displayed manually at the bottom. How can I use the token-based functions in future programs, and how do I access the results outside the function? Any other immediate thoughts? For reference, a rough sketch of what I imagine the token-based version looking like follows my code.
Code:
local abc "SMS1,SMS2"

mata:
function blstokenize(string scalar txt, real scalar snum)
{
    string scalar series, y1
    real scalar   z1, z2, ct

    ct = 0
    st_strscalar("txt", txt)
    while (ct < snum) {
        ct = 1 + ct
        z1 = strpos(st_strscalar("txt"), ",")
        if (z1 > 0) {
            // request plus its trailing comma
            st_strscalar("y1", substr(st_strscalar("txt"), 1, z1))
            z2 = z1 - 1
            // request without the comma
            st_strscalar("series", substr(st_strscalar("y1"), 1, z2))
        }
        else {
            // last request: no comma remains
            st_strscalar("y1", substr(st_strscalar("txt"), 1, strlen(st_strscalar("txt"))))
            z2 = 0
            st_strscalar("series", st_strscalar("y1"))
        }
        // post the request to a local in the calling program
        st_local("series" + strofreal(ct), st_strscalar("series"))
        // strip the request just handled from the working string
        st_strscalar("txt", ustrregexra(st_strscalar("txt"), st_strscalar("y1"), ""))
    }
}

blstokenize(st_local("abc"), 2)
display(st_local("series1"))
display(st_local("series2"))
end
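For reference, here is a minimal sketch of what I picture the token-based version looking like, pieced together from my reading of the manual. The function name -blstokenize2()- is just a placeholder, and the use of tokeninit(",") (treating the comma as the separator character) together with tokengetall() is my best guess at the intended usage, so please correct me if I've misread it.

Code:
local abc "SMS1,SMS2"

mata:
void blstokenize2(string scalar txt)
{
    transmorphic     t
    string rowvector toks
    real scalar      i

    t = tokeninit(",")          // guess: "," as the separator (wchars)
    tokenset(t, txt)
    toks = tokengetall(t)       // string rowvector, e.g. ("SMS1", "SMS2")

    // hand each element back to the calling program as a local
    for (i = 1; i <= cols(toks); i++) {
        st_local("series" + strofreal(i), toks[i])
    }
}

blstokenize2(st_local("abc"))
end

// the locals posted with st_local() should now be visible here
display "`series1'"
display "`series2'"

If the string vector is only needed inside Mata, I gather tokens(st_local("abc"), ",") would also split the string, although I believe the commas then come back as tokens of their own and would have to be dropped.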