
Add MagenticOne API #4782

Open · gagb wants to merge 23 commits into main

Conversation

@gagb (Collaborator) commented on Dec 21, 2024

Enables a simple API:

import asyncio
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.teams.magentic_one import MagenticOne
from autogen_agentchat.ui import Console

async def example_usage_hil():
    client = OpenAIChatCompletionClient(model="gpt-4o")
    # to enable human-in-the-loop mode, set hil_mode=True
    m1 = MagenticOne(client=client, hil_mode=True)
    task = "Write a Python script to fetch data from an API."
    result = await Console(m1.run_stream(task=task))
    print(result)


if __name__ == "__main__":
    asyncio.run(example_usage_hil())

@husseinmozannar (Contributor) left a comment

Nice!

Let's create a doc/blog that has contents similar to the m1 package readme https://github.com/microsoft/autogen/tree/main/python/packages/autogen-magentic-one

Let's also add a hil_mode (human-in-the-loop mode, default False) where we optionally add the user proxy as part of the agent team, like the m1 example https://github.com/microsoft/autogen/blob/main/python/packages/autogen-magentic-one/examples/example.py
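
For illustration, one way hil_mode could be wired up is to conditionally append a user proxy to the team's participant list, as in the sketch below. The exact agent set, base class, and constructor signatures are assumptions made to keep the example self-contained, not the implementation in this PR.

from autogen_agentchat.agents import UserProxyAgent
from autogen_agentchat.teams import MagenticOneGroupChat
from autogen_core.models import ChatCompletionClient
from autogen_ext.agents.web_surfer import MultimodalWebSurfer


class MagenticOne(MagenticOneGroupChat):
    def __init__(self, client: ChatCompletionClient, hil_mode: bool = False):
        # The real team also includes a file surfer, coder, and executor;
        # only the web surfer is shown here to keep the sketch short.
        agents = [MultimodalWebSurfer("WebSurfer", model_client=client)]
        if hil_mode:
            # Human-in-the-loop: the user joins the team as one more agent.
            agents.append(UserProxyAgent("User"))
        super().__init__(participants=agents, model_client=client)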

@gagb (Collaborator, Author) commented on Dec 22, 2024

Added:

  • hil_mode
  • caution statement
  • architecture section
  • Logging
  • Azure instructions
  • BING API KEY instructions

Do you think logging, Azure, and Bing are necessary? This is what the docs currently look like:

(screenshots of the rendered docs)

@husseinmozannar marked this pull request as ready for review on December 22, 2024, 02:46
@husseinmozannar (Contributor) commented:

Logging, Azure instructions, and BING API KEY instructions are no longer necessary! So I think this is good to go.

@husseinmozannar (Contributor) commented:

I pushed a small commit to disable OCR in the WebSurfer; I am only seeing refusals for OCR, so it is a wasted API call.
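
For reference, the change is roughly along the lines of the snippet below; treat it as a sketch, since the use_ocr flag name on MultimodalWebSurfer is an assumption about how the option is exposed rather than the exact commit.

from autogen_ext.agents.web_surfer import MultimodalWebSurfer
from autogen_ext.models.openai import OpenAIChatCompletionClient

client = OpenAIChatCompletionClient(model="gpt-4o")
# Build the web surfer with OCR turned off, skipping the extra model call
# that was only producing refusals.
web_surfer = MultimodalWebSurfer("WebSurfer", model_client=client, use_ocr=False)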

@husseinmozannar (Contributor) commented:

@gagb Maybe also point to this API in the readme of the magentic-one package and say this is a preferred way to use magentic-one. I wonder if it's also worth a reference in the main readme of autogen or in other prominent places

"""

def __init__(self, client: OpenAIChatCompletionClient):
def __init__(self, client: OpenAIChatCompletionClient, hil_mode: bool = False):
Collaborator:

Use the autogen_core.models.ChatCompletionClient base class instead of the concrete class for the argument type.

@gagb (Collaborator, Author) replied on Dec 22, 2024:

I did this because we only tested with OpenAI. Do you still think I should change it?

@ekzhu (Collaborator) replied on Dec 22, 2024:

Ah, I see. Though this client has already been used by many for Ollama models. Also, there is AzureOpenAIChatCompletionClient, which we also tested. So I think we should still update it to use the base class.

Instead of relying on the type, I think we need to validate the model capabilities in the constructor -- unless that has already been done in the base class. Sorry, I'm on mobile, so it's a bit hard to switch pages here.

We can also validate the model name and raise a warning if the model is something we haven't tested.
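
A minimal sketch of that suggestion, assuming the client exposes a model_info mapping with function_calling, vision, and family entries; those field names, the GPT-4o check, and the messages are illustrative assumptions, not the validation that landed in this PR.

import warnings

from autogen_core.models import ChatCompletionClient


class MagenticOne:
    def __init__(self, client: ChatCompletionClient, hil_mode: bool = False) -> None:
        self._validate_client(client)
        # ... build the agent team as before ...

    def _validate_client(self, client: ChatCompletionClient) -> None:
        info = client.model_info  # assumed capability metadata on the client
        # Hard requirements: Magentic-One relies on tool calling and vision.
        if not info.get("function_calling", False):
            raise ValueError("MagenticOne requires a model client with function calling support.")
        if not info.get("vision", False):
            raise ValueError("MagenticOne requires a model client with vision support.")
        # Soft check: warn, rather than fail, on model families we have not tested.
        if "gpt-4o" not in str(info.get("family", "")):
            warnings.warn("MagenticOne has only been tested with GPT-4o; other models may behave unexpectedly.")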

@gagb (Collaborator, Author) replied:

Okay good point. I will do this.

@gagb (Collaborator, Author) replied:

Done.

  1. I added a private method to the class that validates the client.

  2. I had to expose the _create_args in BaseOpenAIChatCompletion to achieve this.

@ekzhu (Collaborator) commented on Dec 22, 2024

@gagb Maybe also point to this API in the readme of the magentic-one package and say this is a preferred way to use magentic-one. I wonder if it's also worth a reference in the main readme of autogen or in other prominent places

I agree. I think we should update the MagenticOne page to point to this API.

Would be good to have this team as a preset in Studio also. @victordibia

@gagb (Collaborator, Author) commented on Dec 23, 2024

What if we added a simple CLI that looks like this if you install autogen_ext?

pip install autogen_ext
m1 "find me a nice shwarma place in my neighborhood"

@gagb mentioned this pull request on Dec 23, 2024
@ekzhu (Collaborator) commented on Dec 23, 2024

What if we added a simple CLI that looks like this if you install autogen_ext?

pip install autogen_ext

m1 "find me a nice shwarma place in my neighborhood"

This rocks!!
