llm — lilu API


Overview

LLM client factory. Creates backend-specific clients for text generation via llama.cpp or OpenAI-compatible APIs. Exposes the chat template engine and tool registry as submodules.

Submodules

Module          Description
llm.discovery   Provider model discovery framework.
llm.llamacpp    llama.cpp completion API client.
llm.oaic        OpenAI-compatible API client.
llm.templates   Chat template engine for prompt formatting.
llm.tool_calls  Shared helpers for normalizing LLM tool-call payloads across provider wire formats (tool_calls, function, and legacy function_call), preserving backward compatibility.
llm.tools       Tool registry and execution engine for LLM function calling.
llm.utils       Shared helpers for LLM backend clients.

Functions

Name  Signature
new   new(backend, api_url, api_key) -> client, err

new(backend, api_url, api_key) -> client, err

Create a new LLM client for the specified backend.

Returns a client instance with complete, stream, chat_complete, and chat_stream methods. Backend defaults to "llamacpp"; also accepts "oaic" for OpenAI-compatible APIs. API URL and key fall back to LLM_API_URL / LLM_API_KEY environment variables.
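A minimal usage sketch of the factory and the returned client. The message-table shape and the error-handling pattern below are assumptions based on common Lua conventions, not confirmed by this page; consult llm.templates for actual prompt formatting:

```lua
local llm = require("llm")

-- Backend defaults to "llamacpp"; passing nil for api_url or api_key
-- falls back to the LLM_API_URL / LLM_API_KEY environment variables.
local client, err = llm.new("oaic", "https://api.example.com/v1", nil)
if not client then
  error(err)
end

-- Hypothetical message shape for an OpenAI-style chat request.
local reply, cerr = client:chat_complete({
  { role = "system", content = "You are a helpful assistant." },
  { role = "user",   content = "Hello!" },
})
if not reply then
  error(cerr)
end
```

The same client also exposes complete, stream, and chat_stream; the streaming variants would presumably deliver partial output incrementally, though their callback or iterator signature is not documented here.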