
Commit

Merge branch 'microsoft:main' into main
Josephrp authored Sep 22, 2024
2 parents 979d112 + 6c9d9d8 commit 36eef96
Showing 44 changed files with 3,279 additions and 1,204 deletions.
76 changes: 36 additions & 40 deletions README.md
@@ -5,27 +5,49 @@

<img src="https://microsoft.github.io/autogen/img/ag.svg" alt="AutoGen Logo" width="100">


[![PyPI version](https://badge.fury.io/py/pyautogen.svg)](https://badge.fury.io/py/pyautogen)
[![Build](https://github.com/microsoft/autogen/actions/workflows/python-package.yml/badge.svg)](https://github.com/microsoft/autogen/actions/workflows/python-package.yml)
![Python Version](https://img.shields.io/badge/3.8%20%7C%203.9%20%7C%203.10%20%7C%203.11%20%7C%203.12-blue)
[![Downloads](https://static.pepy.tech/badge/pyautogen/week)](https://pepy.tech/project/pyautogen)

![Python Version](https://img.shields.io/badge/3.8%20%7C%203.9%20%7C%203.10%20%7C%203.11%20%7C%203.12-blue) [![PyPI version](https://img.shields.io/badge/PyPI-v0.2.34-blue.svg)](https://pypi.org/project/pyautogen/)
[![NuGet version](https://badge.fury.io/nu/AutoGen.Core.svg)](https://badge.fury.io/nu/AutoGen.Core)


[![Downloads](https://static.pepy.tech/badge/pyautogen/week)](https://pepy.tech/project/pyautogen)
[![Discord](https://img.shields.io/discord/1153072414184452236?logo=discord&style=flat)](https://aka.ms/autogen-dc)

[![Twitter](https://img.shields.io/twitter/url/https/twitter.com/cloudposse.svg?style=social&label=Follow%20%40pyautogen)](https://twitter.com/pyautogen)

</div>

# AutoGen

[📚 Cite paper](#related-papers).
<!-- <p align="center">
<img src="https://github.com/microsoft/autogen/blob/main/website/static/img/flaml.svg" width=200>
<br>
</p> -->
AutoGen is an open-source programming framework for building AI agents and facilitating cooperation among multiple agents to solve tasks. AutoGen aims to streamline the development and research of agentic AI, much like PyTorch does for deep learning. It offers features such as agents that can interact with each other, support for various large language models (LLMs) and tool use, autonomous and human-in-the-loop workflows, and multi-agent conversation patterns.

> [!IMPORTANT]
> **Note for contributors and users**: [microsoft/autogen](https://aka.ms/autogen-gh) is the official repository of the AutoGen project; it is under active development and maintenance under the MIT license. We welcome contributions from developers and organizations worldwide. Our goal is to foster a collaborative and inclusive community where diverse perspectives and expertise can drive innovation and enhance the project's capabilities. We acknowledge the invaluable contributions from our existing contributors, as listed in [contributors.md](./CONTRIBUTORS.md). Whether you are an individual contributor or represent an organization, we invite you to join us in shaping the future of this project. For further information please also see [Microsoft open-source contributing guidelines](https://github.com/microsoft/autogen?tab=readme-ov-file#contributing).
>
> -_Maintainers (Sept 6th, 2024)_

![AutoGen Overview](https://github.com/microsoft/autogen/blob/main/website/static/img/autogen_agentchat.png)

- AutoGen enables building next-gen LLM applications based on [multi-agent conversations](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat) with minimal effort. It simplifies the orchestration, automation, and optimization of complex LLM workflows, maximizing the performance of LLMs and overcoming their weaknesses. A minimal conversation sketch is shown after this list.
- It supports [diverse conversation patterns](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat#supporting-diverse-conversation-patterns) for complex workflows. With customizable and conversable agents, developers can use AutoGen to build a wide range of conversation patterns that vary in conversation autonomy, the number of agents, and agent conversation topology.
- It provides a collection of working systems of varying complexity. These systems span a [wide range of applications](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat#diverse-applications-implemented-with-autogen) from various domains, demonstrating how AutoGen can easily support diverse conversation patterns.
- AutoGen provides [enhanced LLM inference](https://microsoft.github.io/autogen/docs/Use-Cases/enhanced_inference#api-unification). It offers utilities like API unification and caching, and advanced usage patterns such as error handling, multi-config inference, and context programming.
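
As a rough illustration of the multi-agent conversation pattern above, here is a minimal sketch using the `pyautogen` package. The model name and API-key handling are placeholders; adapt them to your own configuration.

```python
import os

from autogen import AssistantAgent, UserProxyAgent

# Placeholder LLM configuration; assumes an OpenAI API key is set in the environment.
llm_config = {"config_list": [{"model": "gpt-4", "api_key": os.environ.get("OPENAI_API_KEY")}]}

assistant = AssistantAgent("assistant", llm_config=llm_config)
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",      # fully autonomous; use "ALWAYS" for a human-in-the-loop workflow
    code_execution_config=False,   # local code execution disabled for this sketch
)

# The user proxy opens a two-agent conversation with the assistant.
user_proxy.initiate_chat(assistant, message="Summarize the key ideas behind multi-agent conversation.")
```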

AutoGen was created out of collaborative [research](https://microsoft.github.io/autogen/docs/Research) from Microsoft, Penn State University, and the University of Washington.

<p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
<a href="#readme-top" style="text-decoration: none; color: blue; font-weight: bold;">
↑ Back to Top ↑
</a>
</p>



## News
<details>

<summary>Expand</summary>

:fire: June 6, 2024: WIRED publishes a new article on AutoGen: [Chatbot Teamwork Makes the AI Dream Work](https://www.wired.com/story/chatbot-teamwork-makes-the-ai-dream-work/) based on an interview with [Adam Fourney](https://github.com/afourney).

:fire: June 4th, 2024: Microsoft Research Forum publishes a new update and video on [AutoGen and Complex Tasks](https://www.microsoft.com/en-us/research/video/autogen-update-complex-tasks-and-agents/) presented by [Adam Fourney](https://github.com/afourney).
@@ -38,7 +60,7 @@

:fire: May 11, 2024: [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation](https://openreview.net/pdf?id=uAjxFFing2) received the best paper award at the [ICLR 2024 LLM Agents Workshop](https://llmagents.github.io/).

:fire: Apr 26, 2024: [AutoGen.NET](https://microsoft.github.io/autogen-for-net/) is available for .NET developers!
:fire: Apr 26, 2024: [AutoGen.NET](https://microsoft.github.io/autogen-for-net/) is available for .NET developers! Thanks [XiaoYun Zhang](https://www.linkedin.com/in/xiaoyun-zhang-1b531013a/)

:fire: Apr 17, 2024: Andrew Ng cited AutoGen in [The Batch newsletter](https://www.deeplearning.ai/the-batch/issue-245/) and [What's next for AI agentic workflows](https://youtu.be/sal78ACtGTc?si=JduUzN_1kDnMq0vF) at Sequoia Capital's AI Ascent (Mar 26).

@@ -73,33 +95,7 @@
:fire: FLAML supports Code-First AutoML & Tuning – Private Preview in [Microsoft Fabric Data Science](https://learn.microsoft.com/en-us/fabric/data-science/). -->

<p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
<a href="#readme-top" style="text-decoration: none; color: blue; font-weight: bold;">
↑ Back to Top ↑
</a>
</p>

## What is AutoGen

AutoGen is an open-source programming framework for building AI agents and facilitating cooperation among multiple agents to solve tasks. AutoGen aims to streamline the development and research of agentic AI, much like PyTorch does for Deep Learning. It offers features such as agents capable of interacting with each other, facilitates the use of various large language models (LLMs) and tool use support, autonomous and human-in-the-loop workflows, and multi-agent conversation patterns.

We welcome contributions from developers and organizations worldwide. Our goal is to foster a collaborative and inclusive community where diverse perspectives and expertise can drive innovation and enhance the project's capabilities. We acknowledge the invaluable contributions from our existing contributors, as listed in [contributors.md](./CONTRIBUTORS.md). Whether you are an individual contributor or represent an organization, we invite you to join us in shaping the future of this project. For further information please also see [Microsoft open-source contributing guidelines](https://github.com/microsoft/autogen?tab=readme-ov-file#contributing).

![AutoGen Overview](https://github.com/microsoft/autogen/blob/main/website/static/img/autogen_agentchat.png)

- AutoGen enables building next-gen LLM applications based on [multi-agent conversations](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat) with minimal effort. It simplifies the orchestration, automation, and optimization of a complex LLM workflow. It maximizes the performance of LLM models and overcomes their weaknesses.
- It supports [diverse conversation patterns](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat#supporting-diverse-conversation-patterns) for complex workflows. With customizable and conversable agents, developers can use AutoGen to build a wide range of conversation patterns concerning conversation autonomy,
the number of agents, and agent conversation topology.
- It provides a collection of working systems with different complexities. These systems span a [wide range of applications](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat#diverse-applications-implemented-with-autogen) from various domains and complexities. This demonstrates how AutoGen can easily support diverse conversation patterns.
- AutoGen provides [enhanced LLM inference](https://microsoft.github.io/autogen/docs/Use-Cases/enhanced_inference#api-unification). It offers utilities like API unification and caching, and advanced usage patterns, such as error handling, multi-config inference, context programming, etc.

AutoGen is created out of collaborative [research](https://microsoft.github.io/autogen/docs/Research) from Microsoft, Penn State University, and the University of Washington.

<p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
<a href="#readme-top" style="text-decoration: none; color: blue; font-weight: bold;">
↑ Back to Top ↑
</a>
</p>
</details>

## Roadmaps

2 changes: 1 addition & 1 deletion autogen/agentchat/contrib/agent_builder.py
@@ -103,7 +103,7 @@ class AgentBuilder:
"""

AGENT_NAME_PROMPT = """# Your task
Suggest no more then {max_agents} experts with their name according to the following user requirement.
Suggest no more than {max_agents} experts with their name according to the following user requirement.
## User requirement
{task}
Empty file.
24 changes: 24 additions & 0 deletions autogen/agentchat/contrib/graph_rag/document.py
@@ -0,0 +1,24 @@
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class DocumentType(Enum):
    """
    Enum of supported document types.
    """

    TEXT = auto()
    HTML = auto()
    PDF = auto()


@dataclass
class Document:
    """
    A wrapper of an input document (text, HTML, or PDF) used as a data source for graph building.
    """

    doctype: DocumentType
    data: Optional[object] = None
    path_or_url: Optional[str] = ""
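
A small usage sketch of the dataclass above, assuming the package layout introduced in this commit; the file path is a placeholder.

```python
from autogen.agentchat.contrib.graph_rag.document import Document, DocumentType

# Wrap a local PDF so it can later be passed to a graph query engine's init_db().
doc = Document(doctype=DocumentType.PDF, path_or_url="./data/example.pdf")
```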
51 changes: 51 additions & 0 deletions autogen/agentchat/contrib/graph_rag/graph_query_engine.py
@@ -0,0 +1,51 @@
from dataclasses import dataclass, field
from typing import List, Optional, Protocol

from .document import Document


@dataclass
class GraphStoreQueryResult:
    """
    A wrapper of graph store query results.

    answer: human-readable answer to the question/query.
    results: intermediate results for the question/query, e.g. node entities.
    """

    answer: Optional[str] = None
    results: list = field(default_factory=list)


class GraphQueryEngine(Protocol):
    """An abstract base class that represents a graph query engine on top of an underlying graph database.

    This interface defines the basic methods for graph RAG.
    """

    def init_db(self, input_doc: Optional[List[Document]] = None):
        """
        This method initializes the graph database with the input documents or records.

        Usually, it takes the following steps:
        1. connect to a graph database.
        2. extract graph nodes and edges based on the input data, the graph schema, etc.
        3. build indexes, etc.

        Args:
            input_doc: a list of input documents that are used to build the graph in the database.

        Returns: GraphStore
        """
        pass

    def add_records(self, new_records: List) -> bool:
        """
        Add new records to the underlying database and add them to the graph if required.
        """
        pass

    def query(self, question: str, n_results: int = 1, **kwargs) -> GraphStoreQueryResult:
        """
        This method transforms a question in string format into a database query and returns the result.
        """
        pass
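
To make the protocol concrete, here is a hypothetical in-memory engine that satisfies the interface. It is not part of this commit; it keeps document text in a plain list instead of a real graph database, purely to illustrate the three methods.

```python
from typing import List, Optional

from autogen.agentchat.contrib.graph_rag.document import Document
from autogen.agentchat.contrib.graph_rag.graph_query_engine import GraphStoreQueryResult


class InMemoryQueryEngine:
    """Toy engine: stores document text in memory instead of a graph database."""

    def __init__(self):
        self._records: List[str] = []

    def init_db(self, input_doc: Optional[List[Document]] = None):
        # "Build" the graph by storing the raw data of each input document.
        self._records = [str(doc.data) for doc in (input_doc or [])]

    def add_records(self, new_records: List) -> bool:
        self._records.extend(str(r) for r in new_records)
        return True

    def query(self, question: str, n_results: int = 1, **kwargs) -> GraphStoreQueryResult:
        # Naive keyword match standing in for a real graph traversal.
        hits = [r for r in self._records if question.lower() in r.lower()][:n_results]
        return GraphStoreQueryResult(answer=hits[0] if hits else None, results=hits)
```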
56 changes: 56 additions & 0 deletions autogen/agentchat/contrib/graph_rag/graph_rag_capability.py
@@ -0,0 +1,56 @@
from autogen.agentchat.contrib.capabilities.agent_capability import AgentCapability
from autogen.agentchat.conversable_agent import ConversableAgent

from .graph_query_engine import GraphQueryEngine


class GraphRagCapability(AgentCapability):
    """
    A graph RAG capability uses a graph query engine to give a conversable agent the graph RAG ability.

    An agent class with graph RAG capability could
    1. create a graph in the underlying database with input documents.
    2. retrieve relevant information based on messages received by the agent.
    3. generate answers from the retrieved information and send messages back.

    For example,
        graph_query_engine = GraphQueryEngine(...)
        graph_query_engine.init_db([Document(doc1), Document(doc2), ...])

        graph_rag_agent = ConversableAgent(
            name="graph_rag_agent",
            max_consecutive_auto_reply=3,
            ...
        )
        graph_rag_capability = GraphRagCapability(graph_query_engine)
        graph_rag_capability.add_to_agent(graph_rag_agent)

        user_proxy = UserProxyAgent(
            name="user_proxy",
            code_execution_config=False,
            is_termination_msg=lambda msg: "TERMINATE" in msg["content"],
            human_input_mode="ALWAYS",
        )
        user_proxy.initiate_chat(graph_rag_agent, message="Name a few actors who've played in 'The Matrix'")

        # ChatResult(
        #     chat_id=None,
        #     chat_history=[
        #         {'content': 'Name a few actors who've played in \'The Matrix\'', 'role': 'graph_rag_agent'},
        #         {'content': 'A few actors who have played in The Matrix are:
        #             - Keanu Reeves
        #             - Laurence Fishburne
        #             - Carrie-Anne Moss
        #             - Hugo Weaving',
        #          'role': 'user_proxy'},
        #         ...)
    """

    def __init__(self, query_engine: GraphQueryEngine):
        """
        Initialize graph RAG capability with a graph query engine.
        """
        ...

    def add_to_agent(self, agent: ConversableAgent): ...
2 changes: 1 addition & 1 deletion autogen/oai/anthropic.py
@@ -314,7 +314,7 @@ def oai_messages_to_anthropic_messages(params: Dict[str, Any]) -> list[dict[str,
last_tool_result_index = -1
for message in params["messages"]:
if message["role"] == "system":
params["system"] = message["content"]
params["system"] = params.get("system", "") + (" " if "system" in params else "") + message["content"]
else:
# New messages will be added here, manage role alternations
expected_role = "user" if len(processed_messages) % 2 == 0 else "assistant"
2 changes: 1 addition & 1 deletion dotnet/eng/MetaInfo.props
@@ -1,7 +1,7 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<VersionPrefix>0.1.0</VersionPrefix>
<VersionPrefix>0.2.1</VersionPrefix>
<Authors>AutoGen</Authors>
<PackageProjectUrl>https://microsoft.github.io/autogen-for-net/</PackageProjectUrl>
<RepositoryUrl>https://github.com/microsoft/autogen</RepositoryUrl>
@@ -0,0 +1,37 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Connect_To_OpenAI_o1_preview.cs

using AutoGen.Core;
using OpenAI;

namespace AutoGen.OpenAI.Sample;

public class Connect_To_OpenAI_o1_preview
{
public static async Task RunAsync()
{
#region create_agent
var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new InvalidOperationException("Please set environment variable OPENAI_API_KEY");
var openAIClient = new OpenAIClient(apiKey);

// until 2024/09/12
// openai o1-preview doesn't support systemMessage, temperature, maxTokens, streaming output
// so in order to use OpenAIChatAgent with o1-preview, you need to set those parameters to null
var agent = new OpenAIChatAgent(
chatClient: openAIClient.GetChatClient("o1-preview"),
name: "assistant",
systemMessage: null,
temperature: null,
maxTokens: null,
seed: 0)
// by using RegisterMiddleware instead of RegisterStreamingMiddleware
// it turns an IStreamingAgent into an IAgent and disables streaming
.RegisterMiddleware(new OpenAIChatRequestMessageConnector())
.RegisterPrintMessage();
#endregion create_agent

#region send_message
await agent.SendAsync("Can you write a piece of C# code to calculate the 100th Fibonacci number?");
#endregion send_message
}
}
@@ -335,7 +335,10 @@ private IEnumerable<ChatRequestMessage> ProcessToolCallMessage(IAgent agent, Too

var toolCall = message.ToolCalls.Select((tc, i) => new ChatCompletionsFunctionToolCall(tc.ToolCallId ?? $"{tc.FunctionName}_{i}", tc.FunctionName, tc.FunctionArguments));
var textContent = message.GetContent() ?? string.Empty;
var chatRequestMessage = new ChatRequestAssistantMessage(textContent) { Name = message.From };

// don't include the name field when it's tool call message.
// fix https://github.com/microsoft/autogen/issues/3437
var chatRequestMessage = new ChatRequestAssistantMessage(textContent);
foreach (var tc in toolCall)
{
chatRequestMessage.ToolCalls.Add(tc);
16 changes: 8 additions & 8 deletions dotnet/src/AutoGen.OpenAI/Agent/OpenAIChatAgent.cs
@@ -34,7 +34,7 @@ public class OpenAIChatAgent : IStreamingAgent
{
private readonly ChatClient chatClient;
private readonly ChatCompletionOptions options;
private readonly string systemMessage;
private readonly string? systemMessage;

/// <summary>
/// Create a new instance of <see cref="OpenAIChatAgent"/>.
@@ -50,9 +50,9 @@ public class OpenAIChatAgent : IStreamingAgent
public OpenAIChatAgent(
ChatClient chatClient,
string name,
string systemMessage = "You are a helpful AI assistant",
float temperature = 0.7f,
int maxTokens = 1024,
string? systemMessage = "You are a helpful AI assistant",
float? temperature = null,
int? maxTokens = null,
int? seed = null,
ChatResponseFormat? responseFormat = null,
IEnumerable<ChatTool>? functions = null)
@@ -75,7 +75,7 @@ public OpenAIChatAgent(
ChatClient chatClient,
string name,
ChatCompletionOptions options,
string systemMessage = "You are a helpful AI assistant")
string? systemMessage = "You are a helpful AI assistant")
{
this.chatClient = chatClient;
this.Name = name;
@@ -124,7 +124,7 @@ private IEnumerable<ChatMessage> CreateChatMessages(IEnumerable<IMessage> messag
});

// add system message if there's no system message in messages
if (!oaiMessages.Any(m => m is SystemChatMessage))
if (!oaiMessages.Any(m => m is SystemChatMessage) && systemMessage is not null)
{
oaiMessages = new[] { new SystemChatMessage(systemMessage) }.Concat(oaiMessages);
}
@@ -192,8 +192,8 @@ private ChatCompletionOptions CreateChatCompletionsOptions(GenerateReplyOptions?
}

private static ChatCompletionOptions CreateChatCompletionOptions(
float temperature = 0.7f,
int maxTokens = 1024,
float? temperature = 0.7f,
int? maxTokens = 1024,
int? seed = null,
ChatResponseFormat? responseFormat = null,
IEnumerable<ChatTool>? functions = null)
@@ -322,7 +322,10 @@ private IEnumerable<ChatMessage> ProcessToolCallMessage(IAgent agent, ToolCallMe

var toolCallParts = message.ToolCalls.Select((tc, i) => ChatToolCall.CreateFunctionToolCall(tc.ToolCallId ?? $"{tc.FunctionName}_{i}", tc.FunctionName, tc.FunctionArguments));
var textContent = message.GetContent() ?? null;
var chatRequestMessage = new AssistantChatMessage(toolCallParts, textContent) { ParticipantName = message.From };

// Don't set participant name for assistant when it is tool call
// fix https://github.com/microsoft/autogen/issues/3437
var chatRequestMessage = new AssistantChatMessage(toolCallParts, textContent);

return [chatRequestMessage];
}
