DeepSeek endpoint#

In this notebook we will use the DeepSeek API. Before you can access it, you need to create an account and an API key. This method also uses the openai Python package; we only change the base_url (and the API key) so that requests are sent to DeepSeek instead of OpenAI.

import os
import openai
openai.__version__
'1.58.1'
def prompt_deepseek(message: str, model="deepseek-chat"):
    """A prompt helper function that sends a message to DeepSeek
    and returns only the text response.
    """
    # convert the message into the right format if necessary
    if isinstance(message, str):
        message = [{"role": "user", "content": message}]

    # set up the connection to the LLM; base_url and api_key are passed
    # to the constructor, so no OpenAI key needs to be configured
    client = openai.OpenAI(
        base_url="https://api.deepseek.com",
        api_key=os.environ.get('DEEPSEEK_API_KEY'),
    )
    response = client.chat.completions.create(
        model=model,
        messages=message
    )

    # extract the answer text
    return response.choices[0].message.content
prompt_deepseek("Hi!")
'Hello! How can I assist you today? 😊'
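Because the helper accepts either a plain string or a full messages list, you can also pass a prepared conversation, for example one that includes a system prompt. A minimal sketch of the conversion step the helper performs (pure Python, no API call; the function name to_messages is just for illustration):

```python
def to_messages(message):
    """Normalize input to the chat-completions messages format:
    a plain string becomes a single user turn; a list passes through."""
    if isinstance(message, str):
        return [{"role": "user", "content": message}]
    return message

# a bare string is wrapped as one user message ...
print(to_messages("Hi!"))

# ... while a prepared conversation, e.g. with a system prompt, is kept as-is
conversation = [
    {"role": "system", "content": "You answer in one short sentence."},
    {"role": "user", "content": "What is DeepSeek?"},
]
print(to_messages(conversation))
```

Passing such a conversation list directly to prompt_deepseek works the same way, since the isinstance check leaves lists untouched.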

Exercise#

List the models available at the DeepSeek endpoint and try them out by specifying them when calling prompt_deepseek().

client = openai.OpenAI(
    base_url="https://api.deepseek.com",
    api_key=os.environ.get('DEEPSEEK_API_KEY'),
)

print("\n".join([model.id for model in client.models.list().data]))
deepseek-chat
deepseek-reasoner