# async-chat-stream

This example demonstrates how to maintain a conversation history using an asynchronous Ollama client and the chat endpoint. The streaming response is written to stdout and, when `--speak` is passed and a supported engine is available, also spoken aloud via text-to-speech (TTS). The supported TTS engines are `say` on macOS and `espeak` on Linux.
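The core loop described above can be sketched as follows. This is a minimal illustration, not the example's actual source: it assumes the `ollama` Python package and a running local Ollama server, and the model name `llama3` is a placeholder for any model you have pulled.

```python
# Sketch of an async streaming chat loop that keeps a conversation history,
# assuming the `ollama` package and a local Ollama server are available.
import asyncio


def append_turn(history, role, content):
    """Record one turn so the model sees prior context on the next request."""
    history.append({'role': role, 'content': content})
    return history


async def chat_once(client, history, user_input, model='llama3'):
    """Send the history plus one user message; stream the reply to stdout."""
    append_turn(history, 'user', user_input)
    reply = ''
    # stream=True yields partial responses as the model generates them
    async for part in await client.chat(model=model, messages=history, stream=True):
        chunk = part['message']['content']
        print(chunk, end='', flush=True)
        reply += chunk
    print()
    # Store the assistant's reply so the next request includes it as context
    append_turn(history, 'assistant', reply)
    return reply


async def main():
    from ollama import AsyncClient  # imported here so the sketch loads without the package
    client = AsyncClient()
    history = []
    while True:
        try:
            user_input = input('>>> ')
        except EOFError:
            break
        await chat_once(client, history, user_input)


if __name__ == '__main__':
    asyncio.run(main())
```

Keeping the full `history` list in each request is what gives the model memory of earlier turns; the chat endpoint itself is stateless.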
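Selecting the platform TTS engine mentioned above could look like the following sketch. The function names are hypothetical, not part of the example; it simply checks for the commands the README names (`say` on macOS, `espeak` on Linux).

```python
# Sketch of choosing and invoking a platform TTS command; function names
# are illustrative, not part of the ollama-python example itself.
import platform
import shutil
import subprocess


def find_speaker():
    """Return the available TTS command for this platform, or None."""
    system = platform.system()
    if system == 'Darwin' and shutil.which('say'):
        return 'say'
    if system == 'Linux' and shutil.which('espeak'):
        return 'espeak'
    return None


def speak(text):
    """Speak the text aloud if a supported TTS engine is installed."""
    cmd = find_speaker()
    if cmd:
        subprocess.run([cmd, text])
```

Falling back to `None` lets the caller degrade gracefully to stdout-only output when no engine is installed, matching the "if enabled and available" behavior described above.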