What's Changed
- Fix UI Flicker in Dashboard by @crisshaker in #10261
- Keys and tools pages: Use proper terminology for loading and no data cases by @msabramo in #10253
- adding support for cohere command-a-03-2025 by @ryanchase-cohere in #10295
- [Feat] Add GET, DELETE Responses endpoints on LiteLLM Proxy by @ishaan-jaff in #10297
- [Bug Fix] Timestamp Granularities are not properly passed to whisper in Azure by @ishaan-jaff in #10299
- Contributor PR - Support max_completion_tokens on Sagemaker (#10243) by @ishaan-jaff in #10300
- feat(grafana_dashboard): enable datasource selection via templating by @minatoaquaMK2 in #10257
- Update image_generation.md parameters by @daureg in #10312
- Update deprecation dates and prices by @o-khytrov in #10308
- Fix SSO user login - invalid token error by @krrishdholakia in #10298
- UI - Add team based filtering to models page by @krrishdholakia in #10325
- UI (Teams Page) - Support filtering by team id + team name by @krrishdholakia in #10324
- Move UI to encrypted token usage by @krrishdholakia in #10302
- add azure/gpt-image-1 pricing by @marty-sullivan in #10327
- fix(ui_sso.py): support experimental jwt keys for UI auth w/ SSO by @krrishdholakia in #10326
- UI (Keys Page) - Support cross filtering, filter by user id, filter by key hash by @krrishdholakia in #10322
- [Feat] Responses API - Add session management support for non-openai models by @ishaan-jaff in #10321
- Fix the table render on key creation by @NANDINI-star in #10224
- Internal Users: Refresh user list on create by @crisshaker in #10296
- [Docs] UI Session Logs by @ishaan-jaff in #10334
New Contributors
- @ryanchase-cohere made their first contribution in #10295
- @minatoaquaMK2 made their first contribution in #10257
- @daureg made their first contribution in #10312
- @o-khytrov made their first contribution in #10308
Full Changelog: v1.67.3.dev1...v1.67.4-nightly
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.67.4-nightly
```
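Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of a chat completions request against it (the model name, API key, and URL below are placeholder assumptions; your deployment's configured models and keys will differ):

```python
import json
import urllib.request

# Placeholder values -- assumptions for illustration, not defaults.
PROXY_URL = "http://localhost:4000/chat/completions"
API_KEY = "sk-1234"  # hypothetical virtual key

# OpenAI-style request body; "gpt-4o" stands in for any model
# configured on your proxy.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from LiteLLM proxy"}],
}

request = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

# Uncomment once the proxy is running:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```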
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 220.0 | 247.38 | 6.06 | 0.0 | 1814 | 0 | 197.54 | 2435.67 |
| Aggregated | Passed ✅ | 220.0 | 247.38 | 6.06 | 0.0 | 1814 | 0 | 197.54 | 2435.67 |