[Bug]: stream_chunk_builder does not handle o1 tool_calls properly #7364

Open
iwamot opened this issue Dec 22, 2024 · 0 comments
Labels
bug Something isn't working

Comments

iwamot (Contributor) commented Dec 22, 2024

What happened?

When calling o1 with streaming and combining the chunks with stream_chunk_builder, the tool_calls information is missing from the rebuilt response (tool_calls is None), even though the same non-streaming call returns it correctly.

import json
import os

import litellm

os.environ['OPENAI_API_KEY'] = "sk-xxxxxxxx"

def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    if "tokyo" in location.lower():
        return json.dumps({"location": "Tokyo", "temperature": "10", "unit": "celsius"})
    elif "san francisco" in location.lower():
        return json.dumps({"location": "San Francisco", "temperature": "72", "unit": "fahrenheit"})
    elif "paris" in location.lower():
        return json.dumps({"location": "Paris", "temperature": "22", "unit": "celsius"})
    else:
        return json.dumps({"location": location, "temperature": "unknown"})

messages = [{"role": "user", "content": "What's the weather like in San Francisco, Tokyo, and Paris?"}]
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

# Without streaming
response = litellm.completion(
    model="o1-2024-12-17",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
print(response)

# With streaming
response = litellm.completion(
    model="o1-2024-12-17",
    messages=messages,
    tools=tools,
    tool_choice="auto",
    stream=True,
)
chunks = []
for chunk in response:
    chunks.append(chunk)
print(litellm.stream_chunk_builder(chunks))

Relevant log output

# Without streaming
ModelResponse(id='chatcmpl-Ah855e8il626Lt5NQtGd4QSBl1M1c', created=1734842531, model='o1-2024-12-17', object='chat.completion', system_fingerprint='fp_e6d02d4a78', choices=[Choices(finish_reason='tool_calls', index=0, message=Message(content=None, role='assistant', tool_calls=[ChatCompletionMessageToolCall(function=Function(arguments='{"location":"San Francisco, CA","unit":"fahrenheit"}', name='get_current_weather'), id='call_KFG9HpBA2j1du52mmQ4dgQOd', type='function')], function_call=None))], usage=Usage(completion_tokens=2211, prompt_tokens=85, total_tokens=2296, completion_tokens_details=CompletionTokensDetailsWrapper(accepted_prediction_tokens=0, audio_tokens=0, reasoning_tokens=2176, rejected_prediction_tokens=0, text_tokens=None), prompt_tokens_details=PromptTokensDetailsWrapper(audio_tokens=0, cached_tokens=0, text_tokens=None, image_tokens=None)), service_tier=None)

# With streaming
ModelResponse(id='chatcmpl-56efbef8-a038-48cc-86e7-1ad701012367', created=1734842576, model='o1-2024-12-17', object='chat.completion', system_fingerprint=None, choices=[Choices(finish_reason='tool_calls', index=0, message=Message(content='', role='assistant', tool_calls=None, function_call=None))], usage=Usage(completion_tokens=2339, prompt_tokens=85, total_tokens=2424, completion_tokens_details=None, prompt_tokens_details=None))
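If the streamed chunks do carry the tool_call deltas (as the non-streaming response suggests they should), they can be merged by hand as a workaround. A minimal sketch, continuing from the `chunks` list collected above and assuming each chunk exposes OpenAI-style `choices[0].delta.tool_calls` fragments; this is roughly the accumulation stream_chunk_builder is expected to perform:

# Hypothetical workaround: accumulate tool_call fragments across chunks,
# assuming OpenAI-style delta.tool_calls entries (index, id, function).
merged = {}
for chunk in chunks:
    if not chunk.choices:
        # e.g. a trailing usage-only chunk has no choices
        continue
    delta = chunk.choices[0].delta
    for tc in getattr(delta, "tool_calls", None) or []:
        entry = merged.setdefault(tc.index, {"id": None, "name": "", "arguments": ""})
        if tc.id:
            entry["id"] = tc.id  # the id arrives once, on the first fragment
        if tc.function and tc.function.name:
            entry["name"] += tc.function.name
        if tc.function and tc.function.arguments:
            entry["arguments"] += tc.function.arguments  # arguments stream as JSON slices
print(merged)

Each fragment contributes either the call id/name or a slice of the arguments JSON, so per-index string concatenation reconstructs the full call.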

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.55.9

Twitter / LinkedIn details

@iwamot / https://www.linkedin.com/in/iwamot/

iwamot added the bug label Dec 22, 2024