Gist by mrchrisadams (0d4748ded24c22189ecb386b8b6f16c9)

Experimenting with getting a chat interface running against a local LLM (served by Ollama) using Marimo. Run with `marimo edit local-llm-with-ollama.py`.

local-llm-with-ollama.md

local-llm-with-ollama.py
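For context, a Marimo chat UI wraps a model callable: `mo.ui.chat(fn)` calls `fn` with the chat history and displays whatever string it returns. Below is a minimal sketch of such a callable that talks to Ollama's local HTTP API. It assumes Ollama is running on its default port (11434) and that a model such as `llama3.2` has already been pulled with `ollama pull llama3.2`; the helper and function names here are illustrative, not taken from the gist's own code.

```python
import json
import urllib.request

# Assumed defaults: Ollama's local chat endpoint and an example model name.
OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "llama3.2"


def to_ollama_messages(messages):
    """Convert chat-history entries (objects with .role and .content,
    as marimo's chat element passes them) into the plain dicts that
    Ollama's /api/chat endpoint expects."""
    return [{"role": m.role, "content": m.content} for m in messages]


def ollama_chat(messages, config=None):
    """A model callable suitable for mo.ui.chat: takes the chat history,
    posts it to the local Ollama server, and returns the reply text."""
    payload = json.dumps({
        "model": MODEL,
        "messages": to_ollama_messages(messages),
        "stream": False,  # ask for a single JSON response, not a stream
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Inside the notebook this would be wired up with something like `chat = mo.ui.chat(ollama_chat)`; Marimo then renders the chat widget and invokes the callable on each turn.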