eliott/anything-llm
mirror of https://github.com/Mintplex-Labs/anything-llm synced 2026-04-25 17:15:37 +02:00
Commit: 990a2e85bf1d9cc3eaf324982a9445d04ea99a02
Path: anything-llm/server/utils
Latest commit: timothycarambat 3e088f22b1 "fix: Patch tiktoken method missing" (resolves #541), 2024-01-05 09:39:19 -08:00
Directory          | Last commit                                           | Date
AiProviders        | Implement AzureOpenAI model chat streaming (#518)     | 2024-01-03 16:25:39 -08:00
boot               | 523-Added support for HTTPS to Server. (#524)         | 2024-01-04 17:22:15 -08:00
chats              | Handle undefined stream chunk for native LLM (#534)   | 2024-01-04 18:05:06 -08:00
database           | Full developer api (#221)                             | 2023-08-23 19:15:07 -07:00
EmbeddingEngines   | fix: fully separate chunkconcurrency from chunk length | 2023-12-20 11:20:40 -08:00
files              | chore: Force VectorCache to always be on;             | 2023-12-20 10:45:03 -08:00
helpers            | fix: Patch tiktoken method missing                    | 2024-01-05 09:39:19 -08:00
http               | Add API key option to LocalAI (#407)                  | 2023-12-04 08:38:15 -08:00
middleware         | Create manager role and limit default role (#351)     | 2023-11-13 14:51:16 -08:00
prisma             | Add built-in embedding engine into AnythingLLM (#411) | 2023-12-06 10:36:22 -08:00
telemetry          | Replace custom sqlite dbms with prisma (#239)         | 2023-09-28 14:00:03 -07:00
vectorDbProviders  | Issue #204 Added a check to ensure that 'chunk.payload' exists and contains the 'id' property (#526) | 2024-01-04 16:39:43 -08:00