Labs / Optional path

Bootstrap a real local model endpoint

This optional path is the non-toy entrance into the same lab journey: prove that a real local runtime answers one boring request, then rejoin the bootstrap step.

Recommended default

Do one real local probe before you build any wrappers.

The goal is not to finish local AI. The goal is to confirm that your chosen runtime exposes a real HTTP endpoint you can script against.

Artifact

probe.py speaks either Ollama-style or OpenAI-compatible local endpoints.
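A minimal sketch of what probe.py could look like, assuming an Ollama daemon on its default port 11434 or an OpenAI-compatible server (as exposed by many local runtimes) on port 8000. The model name "llama3", the ports, and the prompt are placeholders; substitute whatever your runtime actually serves.

```python
# probe.py — one boring request against a real local endpoint.
# Assumptions: Ollama on localhost:11434, or an OpenAI-compatible
# server on localhost:8000; model name "llama3" is a placeholder.
import json
import urllib.request

def build_request(style: str, model: str, prompt: str):
    """Return (url, payload) for the chosen endpoint style."""
    if style == "ollama":
        # Ollama's native generate API; stream=False yields one JSON object.
        return ("http://localhost:11434/api/generate",
                {"model": model, "prompt": prompt, "stream": False})
    if style == "openai":
        # OpenAI-compatible chat completions, served by many local runtimes.
        return ("http://localhost:8000/v1/chat/completions",
                {"model": model,
                 "messages": [{"role": "user", "content": prompt}]})
    raise ValueError(f"unknown style: {style}")

def probe(style: str, model: str, prompt: str = "Say OK.") -> str:
    url, payload = build_request(style, model, prompt)
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
    # The two styles put the generated text in different places.
    if style == "ollama":
        return body["response"]
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(probe("ollama", "llama3"))
```

If the printed reply arrives, the probe has done its job: the endpoint is real, reachable, and scriptable, and nothing else about the response matters yet.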

Boundary

This is the first optional path where the model surface is expected to be real, not simulated.

Rejoin point

Once one prompt works, return to the bootstrap step and treat that local endpoint as your starting model surface.

Companions: local hosting and model artifacts, and local hardware and runtime fit.