Let there be light!
Requirements:

Code:
pip install openai

Mata file (don't forget to add your API key, then run it the same way you run do-files):

Code:
*! version 1.0.0 05sep2024
version 16.0
clear all
loc RS real scalar
loc SS string scalar
loc SM string matrix
mata:
mata set matastrict on
`SM' chatgpt(|`SM' text, `SS' cmd, `SS' model)
{
    `RS' i, j

    if (text == J(0,0,"")) {
        printf("arguments: - 1x1 or a bigger string matrix\n")
        printf("           - command: a sentence, e.g., ending in ':'\n")
        printf("           - 'chatgpt-4o-latest' (by default), etc.\n")
        exit(0)
    }
    text = ustrregexra(text, `"([^\\])(['"])"', "$1\\\$2")
    model = model != "" ? model : "chatgpt-4o-latest"
    stata("python: import sfi", 1)
    stata("python: from openai import OpenAI", 0)
    stata("python: client = OpenAI(api_key='YOUR API GOES HERE')", 0)
    for(i = 1; i <= rows(text); i++) {
        for(j = 1; j <= cols(text); j++) {
            stata("python: sfi.Macro.setLocal('msg', client.chat.completions." +
                "create(model='" + model + "',messages=[{'role' : " +
                "'user','content' : '" + cmd + " " + text[i,j] +
                "'}]).choices[0].message.content)", 0)
            text[i,j] = st_local("msg")
        }
    }
    return(text)
}
end
version 16.0: lmbuild lchatgpt.mlib, replace size(2)
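A note on the `ustrregexra` line: it backslash-escapes quotes in the input so they cannot break the single-quoted string literal inside the `stata("python: ...")` call. As a sanity check, here is the same substitution written in plain Python (a sketch for illustration; the function name `escape_quotes` is mine, not from the post):

```python
import re

def escape_quotes(text):
    # Backslash-escape any single or double quote that is not already
    # preceded by a backslash, mirroring the Mata call:
    #   ustrregexra(text, `"([^\\])(['"])"', "$1\\\$2")
    # Like the Mata pattern, it requires a preceding character, so a
    # quote at position 0 is left alone.
    return re.sub(r"([^\\])(['\"])", r"\1\\\2", text)
```

Without this step, a prompt such as `it's` would terminate the Python string early and the embedded call would fail.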
In Stata:

Code:
mata: chatgpt()
mata: st_local("msg", chatgpt("lorem ipsum", "produce a paragraph of: "))
di "`msg'"
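For reference, the round trip each matrix cell goes through is, in plain Python, roughly the following (a sketch only; `build_messages` and `ask` are my names, while the model default and the placeholder key match the Mata file above):

```python
def build_messages(cmd, text):
    # The Mata loop concatenates the command and the cell's text into
    # a single user message.
    return [{"role": "user", "content": cmd + " " + text}]

def ask(client, text, cmd, model="chatgpt-4o-latest"):
    # One chat-completions request per matrix cell, as in the Mata loop;
    # the reply text replaces the cell's contents.
    resp = client.chat.completions.create(model=model,
                                          messages=build_messages(cmd, text))
    return resp.choices[0].message.content
```

Called as `ask(OpenAI(api_key="..."), "lorem ipsum", "produce a paragraph of:")`, this is what each `stata("python: ...")` line in the wrapper performs, with the reply handed back to Stata through `sfi.Macro.setLocal`.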