OpenAI API#
The OpenAI Application Programming Interface (API) has become a de-facto standard for communicating with LLMs programmatically. Its Python client library is open source.
As the field is moving fast and APIs sometimes break, consider printing out the version of the library you used.
import openai
openai.__version__
'1.35.14'
To access LLMs, you first create a client object.
client = openai.OpenAI()
client
<openai.OpenAI at 0x2aaf62d7d90>
The API expects messages in a certain format:
my_messages = []
my_messages.append({
    "role": "user",
    "content": "What's the capital of France?"
})
my_messages
[{'role': 'user', 'content': "What's the capital of France?"}]
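The same format supports multi-turn conversations with different roles: a `"system"` message sets the assistant's behaviour, and `"user"` and `"assistant"` messages alternate for the actual dialogue. A sketch (the example contents are made up):

```python
# A conversation with multiple turns and roles: "system" configures the
# assistant, "user" and "assistant" alternate for the dialogue itself.
conversation = [
    {"role": "system", "content": "You answer in one short sentence."},
    {"role": "user", "content": "What's the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
    {"role": "user", "content": "And of Germany?"},
]

# The roles appear in the order the conversation happened:
roles = [m["role"] for m in conversation]
print(roles)  # ['system', 'user', 'assistant', 'user']
```

Sending the full list lets the model take earlier turns into account when answering the last question.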
You can send a request to the server using the chat.completions
API. If you’re planning to use ChatGPT, the available OpenAI models and their prices can be found in OpenAI’s model and pricing documentation.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=my_messages
)
response
ChatCompletion(id='chatcmpl-9mKZtUlo8l2NUTwGzF9AwXe1pcIMD', choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='The capital of France is Paris.', role='assistant', function_call=None, tool_calls=None))], created=1721305873, model='gpt-3.5-turbo-0125', object='chat.completion', service_tier=None, system_fingerprint=None, usage=CompletionUsage(completion_tokens=7, prompt_tokens=14, total_tokens=21))
The response comes in a format similar to the request that was sent. It actually contains a list of answers (choices).
response.choices
[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='The capital of France is Paris.', role='assistant', function_call=None, tool_calls=None))]
You can access the text-answer like this:
response.choices[0].message.content
'The capital of France is Paris.'
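The response is a Pydantic model, so `response.model_dump()` converts it to a plain dictionary with the same nesting as the JSON sent by the server. The sketch below navigates such a dictionary; the values are copied from the example output above:

```python
# A plain-dictionary view of the response (as e.g. response.model_dump()
# would return it), with values copied from the example output above.
response_dict = {
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "The capital of France is Paris.",
            },
        }
    ],
    "usage": {"completion_tokens": 7, "prompt_tokens": 14, "total_tokens": 21},
}

# Attribute access on the response object corresponds to key access here:
answer = response_dict["choices"][0]["message"]["content"]
print(answer)  # The capital of France is Paris.

# The usage section tells you how many tokens the request consumed,
# which is what you are billed for:
tokens = response_dict["usage"]["total_tokens"]
print(tokens)  # 21
```

Keeping an eye on `usage` is useful when estimating the cost of many requests.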
Helper functions#
When using the API, it is highly recommended to write some helper functions such as this one:
def prompt_chatgpt(message: str, model="gpt-3.5-turbo"):
    """A prompt helper function that sends a message to OpenAI
    and returns only the text response.
    """
    # convert message to the right format if necessary
    if isinstance(message, str):
        message = [{"role": "user", "content": message}]

    # set up connection to the LLM
    client = openai.OpenAI()

    # submit prompt
    response = client.chat.completions.create(
        model=model,
        messages=message
    )

    # extract answer
    return response.choices[0].message.content
This makes our life easier: we can now access the LLM like this:
prompt_chatgpt("How many o are in Woolloomoloo?")
'There are a total of six "o" in the word "Woolloomoloo."'
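The helper above forgets everything between calls. For a dialogue, you need to resend the earlier turns with every request. A hypothetical extension sketching this (the function name, the `client` parameter for injecting a pre-built client, and the history handling are assumptions, not part of the original helper):

```python
def prompt_conversation(message: str, history=None,
                        model="gpt-3.5-turbo", client=None):
    """Send a message together with earlier conversation turns and
    return the assistant's reply plus the updated history.
    """
    if history is None:
        history = []
    if client is None:
        # only create (and import) the real client when none is supplied
        import openai
        client = openai.OpenAI()

    # append the new user message to the conversation so far
    history = history + [{"role": "user", "content": message}]

    # the full history is sent with every request; the server is stateless
    response = client.chat.completions.create(model=model, messages=history)
    answer = response.choices[0].message.content

    # record the assistant's reply so the next call can refer back to it
    history = history + [{"role": "assistant", "content": answer}]
    return answer, history
```

A follow-up call would then pass the returned history back in, e.g. `answer, history = prompt_conversation("And of Germany?", history)`.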