Mirror of https://github.com/kharonsec/ollama-python (synced 2026-05-11 01:22:14 +02:00)
# async-chat-stream
This example demonstrates how to maintain a conversation history using an asynchronous Ollama client and the chat endpoint. The streaming response is written to `stdout`, and optionally spoken via a TTS engine when `--speak` is passed and a supported engine is available. Supported TTS engines are `say` on macOS and `espeak` on Linux.
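The core of the example can be sketched as below. This is a minimal illustration, not the example's exact code: it assumes a running Ollama server, the `ollama` package installed, and uses `llama3.2` as a placeholder model name; the `--speak` / TTS handling is omitted for brevity.

```python
import asyncio


def append_message(history, role, content):
    """Append one message to the conversation history and return it.

    Keeping every turn (user and assistant) in `history` is what gives
    the model context across turns of the conversation.
    """
    history.append({'role': role, 'content': content})
    return history


async def main():
    # Assumes `pip install ollama` and a local Ollama server.
    from ollama import AsyncClient

    client = AsyncClient()
    history = []
    while True:
        prompt = input('>>> ')
        if not prompt:
            break
        append_message(history, 'user', prompt)
        reply = ''
        # stream=True yields partial responses as they are generated,
        # so output appears on stdout token by token.
        async for part in await client.chat(
            model='llama3.2',  # placeholder model name
            messages=history,
            stream=True,
        ):
            chunk = part['message']['content']
            reply += chunk
            print(chunk, end='', flush=True)
        print()
        # Store the assistant's full reply so the next turn has context.
        append_message(history, 'assistant', reply)


if __name__ == '__main__':
    asyncio.run(main())
```

The key design point is that the assistant's streamed reply is accumulated and appended back to `history`, so each subsequent request carries the full conversation.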