async-chat-stream
This example demonstrates how to maintain a conversation history using an asynchronous Ollama client and the chat endpoint. The streaming response is written to stdout and, when --speak is passed and a supported TTS engine is available, also spoken aloud. Supported TTS engines are say on macOS and espeak on Linux.
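The flow described above can be sketched as follows. This is a minimal, illustrative sketch: the real example uses ollama.AsyncClient's streaming chat endpoint, which is stubbed here with a fake async generator so the snippet runs without a server; the function names and the sample tokens are hypothetical.

```python
import asyncio
import platform
import shutil


def tts_command():
    # Pick a TTS engine by platform, as the example does:
    # `say` on macOS, `espeak` on Linux. Returns None when
    # no supported engine is found on PATH.
    if platform.system() == "Darwin" and shutil.which("say"):
        return ["say"]
    if platform.system() == "Linux" and shutil.which("espeak"):
        return ["espeak"]
    return None


async def fake_chat_stream(messages):
    # Stand-in for the real call:
    #   async for chunk in await AsyncClient().chat(
    #       model=..., messages=messages, stream=True): ...
    # Chunks are shaped like {"message": {"content": "..."}}.
    for token in ["Hello", ", ", "world", "!"]:
        yield {"message": {"content": token}}


async def chat_once(messages, prompt):
    # Append the user turn, stream the reply to stdout token by
    # token, then record the assistant turn so the history carries
    # into the next request.
    messages.append({"role": "user", "content": prompt})
    reply = ""
    async for chunk in fake_chat_stream(messages):
        part = chunk["message"]["content"]
        print(part, end="", flush=True)
        reply += part
    print()
    messages.append({"role": "assistant", "content": reply})
    return reply


if __name__ == "__main__":
    history = []
    asyncio.run(chat_once(history, "Say hello"))
```

In the real example, each streamed chunk would additionally be piped to the selected TTS subprocess when --speak is set; the history list grows by one user and one assistant message per turn, which is what gives the model context across requests.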