pi: fix llama.cpp provider discovery with auth
Some checks failed
Build and Deploy Desktop / deploy (push) Failing after 4s
Add api, authHeader, and discovery.type fields so omp can discover models via GET /v1/models with the Bearer token.
@@ -28,6 +28,9 @@ let
     "llama.cpp" = {
       baseUrl = "https://llm.sigkill.computer";
       apiKey = lib.strings.trim (builtins.readFile ../secrets/llama_cpp_api_key);
+      api = "openai-responses";
+      authHeader = true;
+      discovery.type = "llama.cpp";
     };
   };
 };
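The discovery step the commit message describes can be sketched as follows: omp issues GET /v1/models with an `Authorization: Bearer <apiKey>` header and reads the model ids out of the OpenAI-compatible list response that llama.cpp's server returns. This is a minimal illustration only; the sample payload and model id are assumptions, not data from the real server.

```python
import json

# Hypothetical /v1/models response body, shaped like the
# OpenAI-compatible list that llama.cpp's server emits.
# The model id "example-model" is made up for illustration.
sample_response = json.dumps({
    "object": "list",
    "data": [{"id": "example-model", "object": "model"}],
})

def model_ids(body: str) -> list[str]:
    """Extract model ids from a /v1/models response body."""
    return [m["id"] for m in json.loads(body)["data"]]

print(model_ids(sample_response))  # → ['example-model']
```

With `authHeader = true`, the same Bearer token used for completions would be sent on this discovery request, which is what makes the authenticated endpoint enumerable.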