Artifact
probe.py speaks either Ollama-style or OpenAI-compatible local endpoints.
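A minimal sketch of what such a probe might look like, assuming the publicly documented request shapes: Ollama's native `/api/generate` endpoint versus the `/v1/chat/completions` route exposed by OpenAI-compatible servers. The function names, the default model name, and the port are illustrative assumptions, not part of probe.py itself.

```python
# Hypothetical sketch of a local-endpoint probe (not the actual probe.py).
import json
import urllib.request


def build_request(style: str, base_url: str, model: str, prompt: str):
    """Return (url, payload) for one boring prompt against a local runtime."""
    if style == "ollama":
        # Ollama's native generate endpoint; stream=False returns one JSON body.
        return (f"{base_url}/api/generate",
                {"model": model, "prompt": prompt, "stream": False})
    # OpenAI-compatible chat completions (llama.cpp server, vLLM, LM Studio, ...).
    return (f"{base_url}/v1/chat/completions",
            {"model": model, "messages": [{"role": "user", "content": prompt}]})


def probe(style: str, base_url: str, model: str, prompt: str = "Say OK.") -> str:
    """Send one request and pull the answer text out of either response shape."""
    url, payload = build_request(style, base_url, model, prompt)
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.load(resp)
    # Ollama answers in "response"; OpenAI-style answers in choices[0].message.
    return body.get("response") or body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Default Ollama port; adjust to your runtime.
    print(probe("ollama", "http://localhost:11434", "llama3"))
```

If this round trip prints any text at all, the endpoint is real and scriptable, which is the only thing this step is trying to establish.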
Labs / Optional path
This is the non-toy entrance into the same lab journey: prove that a real local runtime answers one boring request, then rejoin the bootstrap step.
Recommended default
The goal is not to finish local AI. The goal is to confirm that your chosen runtime exposes a real endpoint you can script against.
This is the first optional path where the model surface is expected to be real, not simulated.
Once one prompt works, return to the bootstrap step and treat that local endpoint as your starting model surface.
Companions: local hosting and model artifacts; local hardware and runtime fit.