Long context prompting#

To answer questions, e.g. about the HPC system, we can provide the entire documentation as text in the prompt. In this example, we use the HPC compendium, which is licensed under CC BY 4.0 by TU Dresden ZIH.

This technique is also known as in-context learning.

from utilities import prompt_gemini
from IPython.display import display, Markdown
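
The prompt_gemini helper comes from the accompanying utilities module and is not shown here. A minimal sketch of a comparable helper, assuming the google-generativeai package and an API key stored in the GOOGLE_API_KEY environment variable, could look like this:

import os
import google.generativeai as genai

def prompt_gemini_sketch(prompt, model="gemini-1.5-flash"):
    # Hypothetical stand-in for the course's prompt_gemini helper.
    # Configure the client with an API key read from the environment.
    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
    # Send the prompt to the chosen Gemini model and return the plain-text answer.
    response = genai.GenerativeModel(model).generate_content(prompt)
    return response.text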

This is the question we aim to answer:

question = "How can I access the Jupyter Hub on the HPC system?"
# Read the full-text content of the HPC compendium 
with open('hpc_compendium_full_text.md', 'r', encoding='utf-8') as f:
    content = f.read()
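
Before passing the entire document to the model, it is worth checking how large it is, because the whole text has to fit into the model's context window. As a rough rule of thumb, one token corresponds to about four characters of English text:

# Rough size estimate; the 4-characters-per-token ratio is only an approximation.
print(f"{len(content):,} characters, roughly {len(content) // 4:,} tokens")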

response = prompt_gemini(f"""
Given a long text, answer this question:

## Question

{question}

## Long text

{content}

## Your task:

Answer the question: {question}
""", model="gemini-1.5-pro-exp-0827")

display(Markdown(response))

You can access the Jupyter Hub on the HPC system through the following URL: https://jupyterhub.hpc.tu-dresden.de. You will need an active HPC project and your ZIH credentials to log in.
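
Because the model paraphrases the documentation rather than quoting it, a quick spot-check is to verify that the address it mentions actually occurs in the provided text:

# Check whether the JupyterHub address from the answer appears in the source document.
print("jupyterhub.hpc.tu-dresden.de" in content)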

Exercise#

The example above uses Google Gemini to prompt for an answer to our question. Why do you think that is?

Hint: Try running the same code with prompt_scadsai_llm() instead.
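
Assuming prompt_scadsai_llm from the same utilities module takes the prompt string as its only required argument, the modified call could look like this:

from utilities import prompt_scadsai_llm

# Hypothetical: same prompt as above, sent via prompt_scadsai_llm instead of Gemini.
response = prompt_scadsai_llm(f"""
Given a long text, answer this question:

## Question

{question}

## Long text

{content}

## Your task:

Answer the question: {question}
""")

display(Markdown(response))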