local-llm-server/llm_server/messages.py

# Message returned to the client when the requested backend is offline.
BACKEND_OFFLINE = 'The model you requested is not a valid choice. Please retry your query.'