
How to configure a model

This guide will show you how to configure custom language models.

gcop uses Promptulate, a large language model automation and autonomous language agent development framework, to drive the model. Promptulate lets you configure any language model and build LLM applications.

Model template

Here is the basic template for configuring the model:

yaml
model:
  model_name: provider/model_name, e.g. openai/gpt-4o
  api_key: your_api_key

OpenAI

If you want to use an OpenAI model, you can use the following configuration:

yaml
model:
  model_name: openai/gpt-4o
  api_key: your_api_key

Claude

yaml
model:
  model_name: claude-2
  api_key: your_api_key

Deepseek

yaml
model:
  model_name: deepseek/deepseek-chat
  api_key: your_api_key

Ollama

yaml
model:
  model_name: ollama/llama2
  api_key: your_api_key
  api_base: http://localhost:11434

OpenAI Proxy

If you want to use Zhipu GLM-4 through an OpenAI-compatible proxy, you can use the following configuration:

yaml
model:
  model_name: openai/glm-4
  api_key: your_api_key
  api_base: https://open.bigmodel.cn/api/paas/v4/

Using the openai/model_name provider format means the model is called through the OpenAI SDK against the configured api_base.
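
For reference, here is a minimal sketch (not gcop's own code) of what the openai/ prefix amounts to: the OpenAI SDK pointed at the api_base from your config. The key, base URL, and prompt below are placeholders.

python
# Minimal sketch: calling an OpenAI-compatible endpoint (here Zhipu GLM-4)
# with the OpenAI SDK, mirroring the api_key / api_base fields in the config.
from openai import OpenAI

client = OpenAI(
    api_key="your_api_key",                            # same value as api_key
    base_url="https://open.bigmodel.cn/api/paas/v4/",  # same value as api_base
)

response = client.chat.completions.create(
    model="glm-4",  # the part after the openai/ prefix
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)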

OpenRouter

yaml
model:
  model_name: openrouter/google/palm-2-chat-bison
  api_key: your_api_key

HuggingFace

yaml
model:
  model_name: huggingface/gpt2
  api_key: your_api_key

More models

gcop uses the promptulate standard to name models; see the promptulate documentation for how to write your model name.

promptulate integrates litellm's capabilities and model name standards, so if you want to use a particular model, you can look up its name on the litellm website and then use it in promptulate.
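
As a quick way to check that a model name is valid, the same provider/model_name string can be passed straight to litellm. The sketch below is an assumption-laden example, not part of gcop: it assumes litellm is installed and uses a placeholder API key.

python
# Minimal sketch: verifying a provider/model_name string with litellm directly.
from litellm import completion

response = completion(
    model="deepseek/deepseek-chat",  # same string you would put in model_name
    api_key="your_api_key",          # or set the provider's API key env var
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)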
