
archaeo_super_prompt.modeling.struct_extract.language_model

Module to load the language model provider.

Functions

  • get_openai_model Return a dspy language model client bound to the OpenAI API.

  • get_ollama_model Return a dspy language model client bound to an Ollama server.

  • get_vllm_model Return a dspy language model client bound to a vLLM server.
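
All three factories return a ready-to-use dspy language model client. A minimal sketch of how they are typically consumed, assuming DSPy's standard dspy.configure and dspy.context binding APIs (the question/answer program is illustrative):

    import dspy

    from archaeo_super_prompt.modeling.struct_extract.language_model import (
        get_ollama_model,
        get_openai_model,
    )

    # Bind one provider globally for all dspy programs...
    dspy.configure(lm=get_openai_model())

    # ...or scope an alternative provider to a single block.
    with dspy.context(lm=get_ollama_model()):
        answer = dspy.Predict("question -> answer")(question="Ping?")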

get_openai_model(model_id='gpt-4.1', temperature=0.0)

Return a dspy language model client bound to the OpenAI API.

Parameters

  • model_id the identifier of the model as used in the OpenAI API; see https://dspy.ai/learn/programming/language_models/

  • temperature the sampling temperature used by the model.

Environment requirements

The OPENAI_API_KEY environment variable must be set to use the OpenAI API.
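
A minimal sketch of constructing the OpenAI-backed client; the key check and the prompt are illustrative, and dspy LM clients are assumed to be directly callable, returning a list of completions:

    import os

    from archaeo_super_prompt.modeling.struct_extract.language_model import get_openai_model

    # Fail fast if the required key is missing.
    if "OPENAI_API_KEY" not in os.environ:
        raise RuntimeError("OPENAI_API_KEY must be set to use the OpenAI API")

    lm = get_openai_model(model_id="gpt-4.1", temperature=0.0)
    print(lm("Reply with a single word."))  # a list of completion strings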

get_ollama_model(model_id='gemma3:27b', temperature=0.0)

Return a dspy language model client bound to an Ollama server.

Parameters

  • model_id the identifier of the model as served by the Ollama server; see https://dspy.ai/learn/programming/language_models/

  • temperature the sampling temperature used by the model.

Environment requirements

The OLLAMA_SERVER_BASE_URL environment variable can be set to override the default Ollama API base URL, http://localhost:11434.
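
A minimal sketch of pointing the client at a remote Ollama server through the override variable; the host below is hypothetical, and the variable is assumed to be read when the factory is called:

    import os

    from archaeo_super_prompt.modeling.struct_extract.language_model import get_ollama_model

    # Optional override; without it the client targets http://localhost:11434.
    # The host below is hypothetical.
    os.environ["OLLAMA_SERVER_BASE_URL"] = "http://ollama.internal:11434"

    lm = get_ollama_model(model_id="gemma3:27b", temperature=0.0)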

get_vllm_model(model_id='google/gemma-3-27b-it', temperature=0.0)

Return a dspy language model client bound to a vLLM server.

Parameters

  • model_id the identifier of the model as on the Hugging Face Hub; see https://dspy.ai/learn/programming/language_models/

  • temperature the sampling temperature used by the model.

Environment requirements

The VLLM_SERVER_BASE_URL environment variable can be set to override the default vLLM server base URL, http://localhost:8006/v1.
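
A minimal sketch of pointing the client at a remote vLLM server, under the same assumptions (hypothetical host, variable read at call time):

    import os

    from archaeo_super_prompt.modeling.struct_extract.language_model import get_vllm_model

    # Optional override; without it the client targets http://localhost:8006/v1.
    # The host below is hypothetical.
    os.environ["VLLM_SERVER_BASE_URL"] = "http://gpu-node.internal:8006/v1"

    lm = get_vllm_model(model_id="google/gemma-3-27b-it", temperature=0.0)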