
TypeError: Client.__init__() got an unexpected keyword argument 'proxies' [Google Colab] #294

Closed
fm1320 opened this issue Dec 10, 2024 · 2 comments · Fixed by #308
Assignees
Labels
bug Something isn't working, either in /adalflow, /tutorials, or /use cases...

Comments

fm1320 (Collaborator) commented Dec 10, 2024

Bug description

TypeError: Client.__init__() got an unexpected keyword argument 'proxies'

In Colab notebooks, the adalflow library's OpenAIClient still ends up passing the `proxies` keyword argument when initializing the OpenAI client from the openai library. The underlying problem is the new version of the httpx library, which openai and other clients use as their HTTP transport: newer httpx releases removed the `proxies` argument from `Client.__init__`, so the forwarded argument is rejected with a TypeError.

The immediate workaround is to downgrade httpx, but doing only that causes dependency conflicts, so several packages need to be pinned to compatible versions together.
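The failure mode can be sketched with stand-in classes (illustrative names, not the real httpx/openai classes): a base client layer that blindly forwards keyword arguments into a transport class that no longer accepts `proxies`, which mirrors the call chain in the traceback below.

```python
# Illustrative stand-ins, not the real httpx/openai classes: a transport
# client that (like newer httpx) no longer accepts `proxies`, and a base
# client that (like older openai builds) still forwards it via **kwargs.

class TransportClient:
    """Mimics the newer httpx.Client signature, with `proxies` removed."""
    def __init__(self, timeout=None, limits=None, follow_redirects=True):
        self.timeout = timeout
        self.limits = limits
        self.follow_redirects = follow_redirects

class BaseAPIClient(TransportClient):
    """Mimics an API-client layer that forwards all keyword arguments."""
    def __init__(self, **kwargs):
        kwargs.setdefault("follow_redirects", True)
        super().__init__(**kwargs)  # `proxies` leaks through to the transport

try:
    BaseAPIClient(proxies={"https://": "http://proxy:8080"})
except TypeError as exc:
    print(exc)  # prints the "unexpected keyword argument 'proxies'" message
```

The same shape explains why the error surfaces only after upgrading httpx: the forwarding layer is unchanged, but the receiving signature underneath it got stricter.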

Bug Hypothesis: Dependency Conflict Resolution in Different Environments

Observed Behavior:

  • Local environment: Code works fine with:

    • anyio 4.4.0
    • httpx 0.27.0
    • httpx-sse 0.4.0
    • jupyter-server 2.14.2
    • jupyter-server-terminals 0.5.2
  • Colab environment: Code fails with dependency conflict requiring manual downgrade

    • Error indicates jupyter-server 1.24.0 requiring anyio < 4.0

Root Cause Analysis:
The core issue appears to be version mismatching between environments. While the local setup uses a newer jupyter-server (2.14.2) that's compatible with anyio 4.x, Colab is locked to an older jupyter-server version (1.24.0) that explicitly requires anyio < 4.0.

Supporting Evidence:

  1. Local environment shows newer versions working harmoniously (jupyter-server 2.14.2)
  2. Colab environment errors specifically mention jupyter-server 1.24.0's constraint
  3. The fix (downgrading anyio and then installing httpx) works consistently in Colab

Conclusion:
This isn't actually a bug in the code, but rather an environment-specific dependency constraint. The local environment's newer jupyter-server version allows for newer anyio versions, while Colab's older jupyter-server forces us to maintain compatibility with older anyio versions.
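The constraint clash described above can be checked mechanically. A stdlib-only sketch (version bounds taken from this issue; `parse_version` is a simplified stand-in for a real parser such as `packaging.version`):

```python
def parse_version(v):
    """Naive parse: '4.4.0' -> (4, 4, 0); ignores pre-release suffixes."""
    parts = []
    for piece in v.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def satisfies(installed, lower=None, upper=None):
    """True if lower <= installed < upper (either bound may be None)."""
    iv = parse_version(installed)
    if lower is not None and iv < parse_version(lower):
        return False
    if upper is not None and iv >= parse_version(upper):
        return False
    return True

# Local environment: anyio 4.4.0 violates jupyter-server 1.24.0's "anyio < 4.0"
print(satisfies("4.4.0", upper="4.0"))                  # conflicts on Colab
# An anyio in the ">=3.1.0,<4.0" range (as pinned below) does satisfy it:
print(satisfies("3.7.1", lower="3.1.0", upper="4.0"))
```

This is why the fix has to pin anyio first: any anyio 4.x satisfies openai's needs locally but can never coexist with Colab's preinstalled jupyter-server 1.24.0.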

The fix is to run these commands after installing adalflow:


!pip uninstall httpx anyio -y
!pip install "anyio>=3.1.0,<4.0"
!pip install httpx==0.24.1
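After running the pins, a quick sanity check with the stdlib `importlib.metadata` (a sketch; it only reports whatever happens to be installed) confirms the resulting versions:

```python
from importlib import metadata

def installed_version(pkg):
    """Return the installed version string for pkg, or None if absent."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

# In a fixed Colab session, anyio should report a 3.x version and
# httpx should report 0.24.1.
for pkg in ("anyio", "httpx", "httpx-sse", "openai", "adalflow"):
    print(f"{pkg}: {installed_version(pkg) or 'not installed'}")
```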

What version are you seeing the problem on?

Name: adalflow
Version: 0.2.6
Summary: The Library to Build and Auto-optimize LLM Applications
Home-page: https://github.com/SylphAI-Inc/AdalFlow
Author: Li Yin
Author-email: li@sylphai.com
License: MIT
Location: /usr/local/lib/python3.10/dist-packages
Requires: backoff, boto3, botocore, colorama, diskcache, jinja2, jsonlines, nest-asyncio, numpy, python-dotenv, pyyaml, tiktoken, tqdm
Required-by: 


Name: openai
Version: 1.57.1
Summary: The official Python library for the openai API
Home-page: https://github.com/openai/openai-python
Author: 
Author-email: OpenAI <support@openai.com>
License: Apache-2.0
Location: /usr/local/lib/python3.10/dist-packages
Requires: anyio, distro, httpx, jiter, pydantic, sniffio, tqdm, typing-extensions
Required-by:

How to reproduce the bug

import adalflow as adal

# Requires OPENAI_API_KEY to be set in the environment; init_sync_client
# raises a ValueError before the TypeError can surface otherwise.
def use_llm():
  openai_llm = adal.Generator(
      model_client = adal.OpenAIClient(),
      model_kwargs = {"model":"gpt-3.5-turbo", "temperature":0.5, "max_tokens":100},
  )


use_llm()

Error messages and logs

 [<ipython-input-18-75dddbbab067>](https://localhost:8080/#) in use_llm()
      3 def use_llm():
      4   openai_llm = adal.Generator(
----> 5       model_client = adal.OpenAIClient(),
      6       model_kwargs = {"model":"gpt-3.5-turbo", "temperature":0.5, "max_tokens":100},
      7   )

[/usr/local/lib/python3.10/dist-packages/adalflow/utils/lazy_import.py](https://localhost:8080/#) in __call__(self, *args, **kwargs)
    119         log.debug(f"Creating class instance: {self.class_}")
    120         # normal class initialization
--> 121         return self.class_(*args, **kwargs)
    122 
    123 

[/usr/local/lib/python3.10/dist-packages/adalflow/components/model_client/openai_client.py](https://localhost:8080/#) in __init__(self, api_key, chat_completion_parser, input_type)
    135         super().__init__()
    136         self._api_key = api_key
--> 137         self.sync_client = self.init_sync_client()
    138         self.async_client = None  # only initialize if the async call is called
    139         self.chat_completion_parser = (

[/usr/local/lib/python3.10/dist-packages/adalflow/components/model_client/openai_client.py](https://localhost:8080/#) in init_sync_client(self)
    146         if not api_key:
    147             raise ValueError("Environment variable OPENAI_API_KEY must be set")
--> 148         return OpenAI(api_key=api_key)
    149 
    150     def init_async_client(self):

[/usr/local/lib/python3.10/dist-packages/openai/_client.py](https://localhost:8080/#) in __init__(self, api_key, organization, project, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
    121             base_url = f"https://api.openai.com/v1"
    122 
--> 123         super().__init__(
    124             version=__version__,
    125             base_url=base_url,

[/usr/local/lib/python3.10/dist-packages/openai/_base_client.py](https://localhost:8080/#) in __init__(self, version, base_url, max_retries, timeout, transport, proxies, limits, http_client, custom_headers, custom_query, _strict_response_validation)
    855             max_retries=max_retries,
    856             custom_query=custom_query,
--> 857             custom_headers=custom_headers,
    858             _strict_response_validation=_strict_response_validation,
    859         )

[/usr/local/lib/python3.10/dist-packages/openai/_base_client.py](https://localhost:8080/#) in __init__(self, **kwargs)
    753         kwargs.setdefault("limits", DEFAULT_CONNECTION_LIMITS)
    754         kwargs.setdefault("follow_redirects", True)
--> 755         super().__init__(**kwargs)
    756 
    757 

TypeError: Client.__init__() got an unexpected keyword argument 'proxies'

Environment

  • Google colab notebook

More info

The problem affects at least, but is not limited to, the OpenAI and Groq clients.

@fm1320 fm1320 added the bug Something isn't working, either in /adalflow, /tutorials, or /use cases... label Dec 10, 2024
@fm1320 fm1320 self-assigned this Dec 10, 2024
@fm1320 fm1320 closed this as completed Dec 10, 2024
@fm1320 fm1320 reopened this Dec 16, 2024
fm1320 (Collaborator, Author) commented Dec 16, 2024

Will start working on a PR for this

@fm1320 fm1320 changed the title TypeError: Client.__init__() got an unexpected keyword argument 'proxies' TypeError: Client.__init__() got an unexpected keyword argument 'proxies' [Google Colab] Dec 17, 2024
fm1320 added a commit that referenced this issue Dec 17, 2024
fm1320 added a commit that referenced this issue Dec 17, 2024
@fm1320 fm1320 mentioned this issue Dec 17, 2024
fm1320 (Collaborator, Author) commented Dec 17, 2024

Fixed in #308.
