Thursday, October 16, 2025

Stata commands to run ChatGPT, Claude, Gemini, and Grok


I wrote a blog post in 2023 titled A Stata command to run ChatGPT, and it remains popular. Unfortunately, OpenAI has changed the API code, and the chatgpt command in that post no longer runs. In this post, I will show you how to update the API code and how to write similar Stata commands that use Claude, Gemini, and Grok like this:

. chatgpt "Write a haiku about Stata."

Data flows with ease,
Stata charts the silent truths,
Insights bloom in code.

. claude "Write a haiku about Stata."

Here is a haiku about Stata:

Stata, my old friend
Analyzing data with ease
Insights ever found

. gemini "Write a haiku about Stata."

Commands flow so fast,
Data shaped, models defined,
Insights now appear.

. grok "Write a haiku about Stata."

Data streams unfold,
Stata weaves the threads of truth,
Insights bloom in code.

The focus of this post, like the previous one, is to demonstrate how easy it is to use the PyStata features to connect to ChatGPT and other AI tools rather than to give advice on how to use AI tools to answer Stata-specific questions. Therefore, the examples I show simply ask for a haiku about Stata. However, you can pass any request that you would find helpful in your Stata workflow.

Review of Stata/Python integration

I will assume that you are familiar with Stata/Python integration and how to write the original chatgpt command. You will want to read the blog posts below if these topics are unfamiliar.

Updating the ChatGPT command

You will need an OpenAI user account and your own OpenAI API key to use the code below. I was unable to use my old API key from 2023, and I had to create a new key.

You will also need to type shell pip install openai in the Stata Command window to install the Python package openai. You may need to use a different method to install the openai package if you are using Python as part of a platform such as Anaconda. I had to type shell pip uninstall openai to remove the old version and then type shell pip install openai to install the newer version.
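In my case, the whole update amounted to two commands typed in the Command window (shell simply hands the rest of the line to your operating system), and shell pip show openai is a quick way to confirm which version you now have:

. shell pip uninstall openai
. shell pip install openai
. shell pip show openai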

Next, we will need to replace the old Python code with newer code that uses the modern API syntax. I typed "python function to prompt chatgpt via api" into a search engine, which led me to the Developer quickstart page on the OpenAI website. Some reading followed by trial and error resulted in the Python code below. The Python function query_openai() sends the prompt through the API, uses the "gpt-4.1-mini" model, and receives the response. I did not include any options for other models, but you can change the model if you like.

The remaining Python code does three things with the response. First, it prints the response in Stata's Results window. Second, it writes the response to a file named chatgpt_output.txt. And third, it uses Stata's SFI module to pass the response from Python to a local macro in Stata. The third step works well for simple responses, but it can lead to errors for long responses that include nonstandard characters or many single or double quotes. You can place a # character at the beginning of the line "Macro.setLocal(…" to comment out that line and prevent the error.
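If you would rather keep that third step, one possible guard (my own variation, not part of the original command) is to replace the troublesome quote characters before storing the response; that is, swap the last line of query_openai() below for something like this:

    # Swap quote characters that can break Stata macros, then store the text
    reply = response.choices[0].message.content
    safe_reply = reply.replace('`', "'").replace('"', "'")
    Macro.setLocal("OutputText", safe_reply)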

You can save the code below to a file named chatgpt.ado, place the file in your personal ado-folder, and use it like any other Stata command. You can type adopath to locate your personal ado-folder.
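For example, typing adopath lists the directories Stata searches (the paths below are illustrative; yours will differ), and the (PERSONAL) entry marks your personal ado-folder:

. adopath
  [1]  (BASE)      "C:\Program Files\Stata19\ado\base/"
  [2]  (SITE)      "C:\Program Files\Stata19\ado\site/"
  [3]              "."
  [4]  (PERSONAL)  "c:\ado\personal/"
  [5]  (PLUS)      "c:\ado\plus/"
  [6]  (OLDPLACE)  "c:\ado/"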


capture program drop chatgpt
program chatgpt, rclass
    version 19.5 // (or version 19 if you do not have StataNow)
    args InputText
    display ""
    python: query_openai("`InputText'", "gpt-4.1-mini")
    return local OutputText = `"`OutputText'"'
end

python:
import os
from openai import OpenAI
from sfi import Macro

def query_openai(prompt: str, model: str = "gpt-4.1-mini") -> str:
    # Pass the input string from a Stata local macro to Python
    inputtext = Macro.getLocal('InputText')

    # Enter your API key
    client = OpenAI(api_key="PASTE YOUR API KEY HERE")

    # Send the prompt through the API and receive the response
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "user", "content": inputtext}
        ]
    )

    # Print the response in the Results window
    print(response.choices[0].message.content)

    # Write the response to a text file
    f = open("chatgpt_output.txt", "w")
    f.write(response.choices[0].message.content)
    f.close()

    # Pass the response string from Python back to a Stata local macro
    Macro.setLocal("OutputText", response.choices[0].message.content)
end
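One refinement you may want to consider (my suggestion, not part of the code above): rather than pasting your key into the ado-file, you can store it in the OPENAI_API_KEY environment variable, which the openai package reads by default, and create the client like this:

    # Read the key from the OPENAI_API_KEY environment variable so that
    # the key never appears in the ado-file itself
    client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))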

Now we can run our chatgpt command and view the response in the Results window.

. chatgpt "Write a haiku about Stata."

Data flows with ease,
Stata charts the silent truths,
Insights bloom in code.

We can type return list to view the response stored in the local macro r(OutputText).

. return list

macros:
         r(OutputText) : "Data flows with ease, Stata charts the silent truths, Insights bloom .."
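Because the response is stored in r(OutputText), we can use it in later commands. For example, display with compound double quotes prints the stored text even when it contains quotation marks:

. display `"`r(OutputText)'"'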

And we can type type chatgpt_output.txt to view the response stored in the file chatgpt_output.txt.

. type chatgpt_output.txt
Data flows with ease,
Stata charts the silent truths,
Insights bloom in code.

It worked! Let's see whether we can use a similar approach to create a Stata command for another AI model.

A Stata command to use Claude

Claude is a popular AI model developed by Anthropic. Claude includes an API interface, and you will need to set up a user account and get an API key on its website. After acquiring my API key, I typed "python function to query claude api", which led me to the Get started with Claude website. Again, some reading and trial and error led to the Python code below. You will need to type shell pip install anthropic in Stata's Command window to install the anthropic package.

Notice how similar the Python code below is to the Python code in our chatgpt command. The only major difference is the code that sends the prompt through the API and receives the response. Everything else is nearly identical.

You can save the code below to a file named claude.ado, put the file in your personal ado-folder, and use it just like any other Stata command.


capture program drop claude
program claude, rclass
    version 19.5 // (or version 19 if you do not have StataNow)
    args InputText
    display ""
    python: query_claude()
    return local OutputText = `"`OutputText'"'
end

python:
import os
from sfi import Macro
from anthropic import Anthropic

def query_claude():
    # Pass the input string from a Stata local macro to Python
    inputtext = Macro.getLocal('InputText')

    # Enter your API key
    client = Anthropic(
        api_key='PASTE YOUR API KEY HERE'
    )

    # Send the prompt through the API and receive the response
    response = client.messages.create(
        model="claude-3-haiku-20240307",
        max_tokens=1000,
        messages=[
            {"role": "user", "content": inputtext}
        ]
    )

    # Print the response to the Results window
    print(response.content[0].text)

    # Write the response to a text file
    f = open("claude_output.txt", "w")
    f.write(response.content[0].text)
    f.close()

    # Pass the response string from Python back to a Stata local macro
    Macro.setLocal("OutputText", response.content[0].text)

end

Now we can run our claude command and view the response.

. claude "Write a haiku about Stata."

Here is a haiku about Stata:

Stata, my old friend
Analyzing data with ease
Insights ever found

We can type return list to view the response stored in the local macro r(OutputText).

. return list

macros:
         r(OutputText) : "Here is a haiku about Stata: Stata, my old friend Analyzing data with ea.."

And we can type type claude_output.txt to view the response stored in the file claude_output.txt.

. type claude_output.txt
Here is a haiku about Stata:

Stata, my old friend
Analyzing data with ease
Insights ever found

You may occasionally see an error like the one below. This does not indicate a problem with your code. It is telling you that the API service or network has timed out or been interrupted. Simply wait and try again.

  File "C:UsersChuckStataAppDataLocalProgramsPythonPython313Libsite-packagesanthropic
> _base_client.py", line 1065, in request
    elevate APITimeoutError(request=request) from err
anthropic.APITimeoutError: Request timed out or interrupted. This may very well be as a result of a community timeout, 
> dropped connection, or request cancellation. See https://docs.anthropic.com/en/api/errors#long-requests
> for extra particulars.
r(7102);
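If you would rather not retry by hand, here is a minimal sketch (my own variation, not part of the claude command above) that wraps the API call in a retry loop with an increasing wait between attempts:

import time
from anthropic import Anthropic, APITimeoutError

def query_claude_with_retry(inputtext, attempts=3):
    client = Anthropic(api_key='PASTE YOUR API KEY HERE')
    for i in range(attempts):
        try:
            response = client.messages.create(
                model="claude-3-haiku-20240307",
                max_tokens=1000,
                messages=[{"role": "user", "content": inputtext}]
            )
            return response.content[0].text
        except APITimeoutError:
            time.sleep(2 ** i)  # wait 1, 2, then 4 seconds between attempts
    raise RuntimeError("The API request kept timing out; wait and try again")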

A Stata command to use Gemini

Gemini is a popular AI model developed by Google. Gemini also includes an API interface, and you will need to set up a user account and get an API key on its website. After acquiring my API key, I typed "python function to query gemini api", which led me to the Gemini API quickstart website. Again, some reading and trial and error led to the Python code below. You will need to type shell pip install -q -U google-genai in Stata's Command window to install the google-genai package.

Again, you can save the code below to a file named gemini.ado, put the file in your personal ado-folder, and use it just like any other Stata command.


capture program drop gemini
program gemini, rclass
    version 19.5 // (or version 19 if you do not have StataNow)
    args InputText
    display ""
    python: query_gemini()
    return local OutputText = `"`OutputText'"'
end

python:
import os
from sfi import Macro
from google import genai

def query_gemini():
    # Pass the input string from a Stata local macro to Python
    inputtext = Macro.getLocal('InputText')

    # Enter your API key
    client = genai.Client(api_key="PASTE YOUR API KEY HERE")

    # Send the prompt through the Gemini API and receive the response
    response = client.models.generate_content(
        model="gemini-2.5-flash", contents=inputtext
    )

    # Print the response to the Results window
    print(response.text)

    # Write the response to a text file
    f = open("gemini_output.txt", "w")
    f.write(response.text)
    f.close()

    # Pass the response string from Python back to a Stata local macro
    Macro.setLocal("OutputText", response.text)
end

Now we can run our gemini command and view the response.

. gemini "Write a haiku about Stata."

Commands flow so fast,
Data shaped, models defined,
Insights now appear.

We can type return list to view the response stored in the local macro r(OutputText).

. return list

macros:
         r(OutputText) : "Commands flow so fast, Data shaped, models defined, Insights now appea.."

And we can type type gemini_output.txt to view the response stored in the file gemini_output.txt.

. type gemini_output.txt
Commands flow so fast,
Data shaped, models defined,
Insights now appear.

A Stata command to use Grok

OK, one more just for fun. Grok is another popular AI model developed by xAI. You will need to set up a user account and get an API key on its website. After acquiring my API key, I typed "python function to query grok api", which led me to The Hitchhiker's Guide to Grok website. Again, some reading and trial and error led to the Python code below. You will need to type shell pip install xai_sdk in Stata's Command window to install the xai_sdk package.

Once again, you can save the code below to a file named grok.ado, put the file in your personal ado-folder, and use it just like any other Stata command.


capture program drop grok
program grok, rclass
    version 19.5 // (or version 19 if you do not have StataNow)
    args InputText
    display ""
    python: query_grok("`InputText'", "grok-4")
    return local OutputText = `"`OutputText'"'
end

python:
import os
from sfi import Macro
from xai_sdk import Client
from xai_sdk.chat import user, system

def query_grok(prompt: str, model: str = "grok-4") -> str:
    # Pass the input string from a Stata local macro to Python
    inputtext = Macro.getLocal('InputText')

    # Enter your API key
    client = Client(api_key="PASTE YOUR API KEY HERE")

    # Send the prompt through the Grok API and receive the response
    chat = client.chat.create(model=model)
    chat.append(user(inputtext))
    response = chat.sample()

    # Print the response to the Results window
    print(response.content)

    # Write the response to a text file
    f = open("grok_output.txt", "w")
    f.write(response.content)
    f.close()

    # Pass the response string from Python back to a Stata local macro
    Macro.setLocal("OutputText", response.content)
end

Now we can run our grok command and view the response in the Results window.

. grok "Write a haiku about Stata."

Data streams unfold,
Stata weaves the threads of truth,
Insights bloom in code.

We can type return list to view the response stored in the local macro r(OutputText).

. return list

macros:
         r(OutputText) : "Data streams unfold, Stata weaves the threads of truth,  Insights b.."

And we can type type grok_output.txt to view the results in the file grok_output.txt.

. type grok_output.txt
Data streams unfold,
Stata weaves the threads of truth,
Insights bloom in code.

Conclusion

I hope the examples above have convinced you that it is relatively easy to write and update your own Stata commands to run AI models. My examples were intentionally simple and only for educational purposes. But I am sure you can imagine many options that you could add to allow the use of other models or other kinds of prompts, such as sound or images. Perhaps some of you will be inspired to write your own commands and post them on the web.
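For example, here is a minimal sketch of how you might let the user choose the model by passing a second argument to chatgpt (it defaults to the model used above; any other name must be a model that OpenAI currently offers):

capture program drop chatgpt
program chatgpt, rclass
    version 19.5 // (or version 19 if you do not have StataNow)
    args InputText ModelName
    // Default to gpt-4.1-mini when no second argument is given
    if "`ModelName'" == "" local ModelName "gpt-4.1-mini"
    display ""
    python: query_openai("`InputText'", "`ModelName'")
    return local OutputText = `"`OutputText'"'
end

You could then type chatgpt "Write a haiku about Stata." "gpt-4.1" to query a different model, while the Python code stays the same.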

Those of you who read this post in the future may find it frustrating that the API syntax has changed again and the code above no longer works. That is the nature of using APIs. They change, and you will have to do some homework to update your code. But there are many resources available on the internet to help you update your code or write new commands. Good luck and have fun!


