Small, runnable samples that talk to localaik on http://localhost:8090. Use them to sanity-check your setup or as templates for your own code.
Start localaik (pick one):

```
docker run -d -p 8090:8090 gokhalh/localaik
```

Or, from the repo root, run `make docker-up` (defaults to port 18090; set `PORT=8090` to use the examples unchanged).
Wait until the model is loaded (GET /health returns 200). The first start can take a while.
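The wait-for-health step above can be scripted. Here is a minimal sketch in Python that polls `GET /health` until it returns 200 or a timeout elapses; the URL and timeouts are illustrative defaults, not part of localaik itself.

```python
import time
import urllib.error
import urllib.request


def wait_for_health(url: str = "http://localhost:8090/health",
                    timeout: float = 300.0,
                    interval: float = 2.0) -> bool:
    """Poll the health endpoint until it returns 200 or `timeout` elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet -- the first start can take a while
        time.sleep(interval)
    return False


# Usage (with localaik running):
#   wait_for_health()  # blocks until /health returns 200, then True
```

The same loop works in shell with `curl -sf http://localhost:8090/health`; the Python version is just easier to reuse from the example scripts.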
Run an example from one of the directories listed below (each folder has its own dependencies).
Each example brings its own toolchain (Python 3, plus SDKs such as google-genai, the OpenAI client, etc.). The curl examples need only curl, but they pipe the response through `python3 -m json.tool` so the JSON is pretty-printed. Install Python 3 on your PATH, or remove the `| python3 -m json.tool` suffix and read raw JSON instead.

| Language | Gemini | OpenAI | Structured output (Gemini) |
|---|---|---|---|
| curl | curl/gemini.sh | curl/openai.sh | curl/gemini-structured.sh |
| Go | go/gemini | go/openai | go/gemini-structured |
| Python | python/gemini | python/openai | python/gemini-structured |
| JavaScript | javascript/gemini | javascript/openai | javascript/gemini-structured |
| Java | java/gemini | java/openai | java/gemini-structured |
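To show the shape of what the Gemini-column examples send, here is a minimal request builder in Python. The `/v1beta/models/{model}:generateContent` path and the `contents`/`parts` body follow the public Gemini REST convention; that localaik routes this path the same way is an assumption — check it against the curl/gemini.sh example.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8090"  # localaik default from this README


def build_gemini_request(prompt: str,
                         model: str = "localaik") -> urllib.request.Request:
    """Build (but do not send) a Gemini-style generateContent request.

    The path follows the public Gemini REST convention; confirm it
    against your localaik version before relying on it.
    """
    body = {"contents": [{"parts": [{"text": prompt}]}]}
    return urllib.request.Request(
        f"{BASE_URL}/v1beta/models/{model}:generateContent",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_gemini_request("Say hello")
# With localaik up: urllib.request.urlopen(req) sends it.
```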
- Base URL: `http://localhost:8090` for Gemini-style calls; OpenAI clients use `http://localhost:8090/v1`.
- API key: use `test` where the SDK requires one; localaik does not validate keys.
- Model id: use `localaik` where applicable; the proxy forwards to the bundled upstream model.
- Examples do not share a single lockfile. Install what you need per language (e.g. `google-genai` for Python, `google.golang.org/genai` for Go, official OpenAI packages for the OpenAI examples).
- curl-only: aside from optional `python3` for JSON formatting (see above), no extra installs.
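The OpenAI-compatible side can be sketched the same way without installing an SDK. The `/v1` prefix comes from this README; the `chat/completions` path and message shape follow the OpenAI API convention and are assumptions about localaik's compatibility layer. The `Bearer test` key is a placeholder, since localaik does not validate keys.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8090/v1"  # OpenAI-compatible prefix


def build_chat_request(prompt: str,
                       model: str = "localaik") -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat.completions request."""
    body = {"model": model,
            "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer test",  # any value works; not validated
        },
        method="POST",
    )


req = build_chat_request("Say hello")
# With localaik up: urllib.request.urlopen(req) sends it.
```

The official OpenAI SDKs do the equivalent once pointed at the `/v1` base URL, which is what the OpenAI-column examples demonstrate.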