Issues: BerriAI/litellm
[Bug]: Function Calling Not Working with New o1 Model via lit... · #7292 · by mvrodrig · closed Dec 19, 2024 · 13 comments
Issues list
[Bug]: Bedrock cross-region-inference for Llama not working · bug · #7385 · opened Dec 23, 2024 by shadi-fsai
[Bug]: vertex_ai does not support parameters 'presence_penalty', 'frequency_penalty' · bug, mlops user request · #7378 · opened Dec 23, 2024 by mbukeRepo
[Bug]: New o1 model not supported by litellm? · bug · #7375 · opened Dec 23, 2024 by cpath-ukk
[Bug]: Streaming Structured Output is not working compared to native OpenAI SDK · bug, mlops user request · #7374 · opened Dec 23, 2024 by hem210
[Bug]: Key Alias Validation · bug, mlops user request · #7373 · opened Dec 23, 2024 by lokeish
[Feature]: print alert log to console? · enhancement · #7372 · opened Dec 23, 2024 by elvis-cai
[Bug]: gpt-4o-audio-preview-2024-12-17 is not available on litellm · bug · #7367 · opened Dec 22, 2024 by ahmed-mohamed-sn
[Bug]: Invalid double await in ollama embeddings in Proxy (fix in report) · bug · #7366 · opened Dec 22, 2024 by aguadoenzo
[Bug]: stream_chunk_builder does not handle o1 tool_calls properly · bug · #7364 · opened Dec 22, 2024 by iwamot
[Bug]: Vertex AI - Code Gecko stream not working · bug · #7360 · opened Dec 22, 2024 by ishaan-jaff
[Bug]: Rate Limit Errors when using with PaperQA · bug, mlops user request · #7358 · opened Dec 22, 2024 by gurugecl
[Bug]: JSON mode with Ollama assumes Function Calling · bug · #7355 · opened Dec 21, 2024 by sidjha1
[Feature]: Support system message for gemini 2.0 flash exp thinking · enhancement · #7341 · opened Dec 21, 2024 by dat-lequoc
[Feature]: Automatic Handling of Files Larger Than 20MB for Gemini API · enhancement · #7338 · opened Dec 21, 2024 by icefox57
[Bug]: Non-Admin Users able to generate keys using other users' user_id (Vulnerability) · bug · #7336 · opened Dec 21, 2024 by mirodrr2
[Bug]: Ollama as custom provider does not default to sync · bug · #7332 · opened Dec 20, 2024 by shanbady
[Bug]: Auth issues trying to run replicate model · bug · #7327 · opened Dec 20, 2024 by geekodour
[Feature]: Add Retry Logic for Guardrails, Allow Skipping Post-Call Rules, or Add a JSON Response Format Validator · enhancement · #7320 · opened Dec 20, 2024 by aleksandrphilippov
[Bug]: docker-based build for UI_BASE_PATH fails · bug · #7318 · opened Dec 19, 2024 by Jflick58
[Bug]: Some small inconsistencies found in LiteLLM_SpendLogs -> api_base · bug · #7317 · opened Dec 19, 2024 by stronk7
[Bug]: Deferred cache key calculation leads to caching failure when modifying inputs · bug · #7316 · opened Dec 19, 2024 by npt