feat(ai-proxy): added AWS IAM cloud identity instructions
  • Loading branch information
tysoekong committed Sep 23, 2024
1 parent 1b12f67 commit bf05c2f
Showing 2 changed files with 210 additions and 12 deletions.
@@ -26,11 +26,11 @@ LLM-based services using those same methods.

Kong's AI Gateway currently supports the following cloud authentication:

| AI-Proxy Advanced LLM Provider | Cloud Provider | Type |
|--------------------------------|-------------------------------------------------|-----------------------------------------|
| `azure` (Kong Enterprise Only) | Azure OpenAI | [Entra / Managed Identity Authentication](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) |
| `gemini` | Gemini Enterprise (on Vertex or Workspace) | [GCP Service Account](https://cloud.google.com/iam/docs/service-account-overview) |
| `bedrock` | AWS Bedrock Converse-API | [AWS IAM Identity](https://docs.aws.amazon.com/IAM/latest/UserGuide/id.html) |
| AI-Proxy Advanced LLM Provider | Cloud Provider | Type |
|--------------------------------------------|-------------------------------------------------|-----------------------------------------|
| `azure` ('{{site.ee_product_name}}' Only) | Azure OpenAI | [Entra / Managed Identity Authentication](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) |
| `gemini` | Gemini Enterprise (on Vertex or Workspace) | [GCP Service Account](https://cloud.google.com/iam/docs/service-account-overview) |
| `bedrock` | AWS Bedrock Converse-API | [AWS IAM Identity](https://docs.aws.amazon.com/IAM/latest/UserGuide/id.html) |

## Azure OpenAI (Kong Enterprise Only)


@@ -283,4 +283,103 @@ config:
auth:
use_gcp_service_account: true
gcp_service_account_json: '{vault://gcp/VERTEX_SERVICE_ACCOUNT_JSON}'
```
```
## AWS Bedrock

When hosting your LLMs with the [AWS Bedrock Converse API](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html) on a business or enterprise plan, and running them through AI Proxy, you can use an [IAM Identity](https://docs.aws.amazon.com/IAM/latest/UserGuide/id.html) assigned to a running EC2 instance, an EKS deployment, or an ECS deployment, or rely on the [AWS CLI](https://aws.amazon.com/cli/) credential context on the local machine. How you do this depends on where and how you are running {{site.base_gateway}}.

### Prerequisites

You must be running a {{site.ee_product_name}} instance.

Ensure that the EC2 instance, EKS deployment, ECS deployment, or similar workload has been assigned the IAM principal,

configurable from the AWS IAM portal.

If the role requires crossing permission boundaries, ensure that the correct assume-role policy is applied.

Assign the correct permissions to the identity's IAM policy:

* `bedrock:InvokeModel`
* `bedrock:InvokeModelWithResponseStream`

scoped to the resource ARNs corresponding to the models that Kong is allowed to call on the user's behalf.
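As an illustration, a minimal IAM policy granting these two actions might look like the following. The region and model in the resource ARN are placeholders; substitute the ARNs of the models you actually want Kong to call:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:eu-west-1::foundation-model/amazon.titan-text-express-v1"
    }
  ]
}
```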

### Configuring the AI Proxy Advanced Plugin to use AWS IAM

When running Kong inside your AWS subscription, AI Proxy Advanced can usually detect the designated IAM principal automatically, based on the assigned identity.

Kong uses the same **authentication credentials chain** as most AWS SDKs (and the AWS CLI). See the [Java credentials chain](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/credentials-chain.html) precedence order for an example.
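To make that precedence concrete, here is a rough, hypothetical sketch (not Kong's actual resolution code) of how such a chain resolves credentials: environment variables win, then the shared credentials file, and only then the instance or container role:

```python
import os
from configparser import ConfigParser
from pathlib import Path

def resolve_aws_credentials(env=None, credentials_path="~/.aws/credentials"):
    """Resolve credentials in roughly the order the AWS SDK default chain uses."""
    env = os.environ if env is None else env
    # 1. Environment variables take precedence when both are set.
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return {"source": "env",
                "access_key_id": env["AWS_ACCESS_KEY_ID"],
                "secret_access_key": env["AWS_SECRET_ACCESS_KEY"]}
    # 2. Fall back to the [default] profile in the shared credentials file.
    path = Path(credentials_path).expanduser()
    if path.exists():
        parser = ConfigParser()
        parser.read(path)
        if parser.has_section("default"):
            return {"source": "file",
                    "access_key_id": parser["default"]["aws_access_key_id"],
                    "secret_access_key": parser["default"]["aws_secret_access_key"]}
    # 3. Otherwise, on EC2/EKS/ECS the SDK fetches short-lived credentials
    #    for the attached IAM role from the metadata service (omitted here).
    return {"source": "instance-role"}

creds = resolve_aws_credentials(env={"AWS_ACCESS_KEY_ID": "AKIAEXAMPLE",
                                     "AWS_SECRET_ACCESS_KEY": "example"})
print(creds["source"])  # → env
```

This is why setting `AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY` on the Kong workload overrides any instance role that happens to be attached.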

#### AWS IAM Identity

To use an AWS-assigned IAM Identity, set up your plugin config like this example:

<!-- vale off-->
{% plugin_example %}
plugin: kong-inc/ai-proxy-advanced
name: ai-proxy
config:
route_type: "llm/v1/chat"
logging:
log_statistics: true
log_payloads: false
model:
provider: "bedrock"
name: "amazon.titan-text-express-v1"
targets:
- route
- consumer_group
- global
formats:
- konnect
- curl
- yaml
- kubernetes
- terraform
{% endplugin_example %}
<!--vale on -->

In most workloads, this is **zero-configuration**: you should not need to supply the Kong AI Proxy plugin with any credentials or Bedrock-specific configuration. Kong finds the correct IAM credentials automatically, upon **first invocation of the model**.

#### Environment variables

You can also specify your own AWS IAM credentials. Set these environment variables in the Kong workload or deployment configuration:
```sh
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=eu-west-1
```

Or set them directly in the plugin configuration:

```yaml
config:
auth:
    aws_access_key_id: 'AKIA...'
aws_secret_access_key: '...'
options:
bedrock:
aws_region: 'eu-west-1'
```

Or, more securely, use a vault reference to a backend such as AWS Secrets Manager:

```yaml
config:
auth:
    aws_access_key_id: 'AKIA...'
aws_secret_access_key: '{vault://aws/BEDROCK_SECRET_ACCESS_KEY}'
options:
bedrock:
aws_region: 'eu-west-1'
```
111 changes: 105 additions & 6 deletions app/_hub/kong-inc/ai-proxy/how-to/_cloud-provider-authentication.md
@@ -26,11 +26,11 @@ LLM-based services using those same methods.

Kong's AI Gateway currently supports the following cloud authentication:

| AI-Proxy LLM Provider | Cloud Provider | Type |
|--------------------------------|-------------------------------------------------|-----------------------------------------|
| `azure` (Kong Enterprise Only) | Azure OpenAI | [Entra / Managed Identity Authentication](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) |
| `gemini` | Gemini Enterprise (on Vertex or Workspace) | [GCP Service Account](https://cloud.google.com/iam/docs/service-account-overview) |
| `bedrock` | AWS Bedrock Converse-API | [AWS IAM Identity](https://docs.aws.amazon.com/IAM/latest/UserGuide/id.html) |
| AI-Proxy LLM Provider | Cloud Provider | Type |
|--------------------------------------------|-------------------------------------------------|-----------------------------------------|
| `azure` ('{{site.ee_product_name}}' Only) | Azure OpenAI | [Entra / Managed Identity Authentication](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) |
| `gemini` | Gemini Enterprise (on Vertex or Workspace) | [GCP Service Account](https://cloud.google.com/iam/docs/service-account-overview) |
| `bedrock` | AWS Bedrock Converse-API | [AWS IAM Identity](https://docs.aws.amazon.com/IAM/latest/UserGuide/id.html) |

## Azure OpenAI (Kong Enterprise Only)


@@ -283,4 +283,103 @@ config:
auth:
use_gcp_service_account: true
gcp_service_account_json: '{vault://gcp/VERTEX_SERVICE_ACCOUNT_JSON}'
```
```
## AWS Bedrock

When hosting your LLMs with the [AWS Bedrock Converse API](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html) on a business or enterprise plan, and running them through AI Proxy, you can use an [IAM Identity](https://docs.aws.amazon.com/IAM/latest/UserGuide/id.html) assigned to a running EC2 instance, an EKS deployment, or an ECS deployment, or rely on the [AWS CLI](https://aws.amazon.com/cli/) credential context on the local machine. How you do this depends on where and how you are running {{site.base_gateway}}.

### Prerequisites

You must be running a {{site.ee_product_name}} instance.

Ensure that the EC2 instance, EKS deployment, ECS deployment, or similar workload has been assigned the IAM principal,

configurable from the AWS IAM portal.

If the role requires crossing permission boundaries, ensure that the correct assume-role policy is applied.

Assign the correct permissions to the identity's IAM policy:

* `bedrock:InvokeModel`
* `bedrock:InvokeModelWithResponseStream`

scoped to the resource ARNs corresponding to the models that Kong is allowed to call on the user's behalf.
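As an illustration, a minimal IAM policy granting these two actions might look like the following. The region and model in the resource ARN are placeholders; substitute the ARNs of the models you actually want Kong to call:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:eu-west-1::foundation-model/amazon.titan-text-express-v1"
    }
  ]
}
```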

### Configuring the AI Proxy Plugin to use AWS IAM

When running Kong inside your AWS subscription, AI Proxy can usually detect the designated IAM principal automatically, based on the assigned identity.

Kong uses the same **authentication credentials chain** as most AWS SDKs (and the AWS CLI). See the [Java credentials chain](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/credentials-chain.html) precedence order for an example.
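To make that precedence concrete, here is a rough, hypothetical sketch (not Kong's actual resolution code) of how such a chain resolves credentials: environment variables win, then the shared credentials file, and only then the instance or container role:

```python
import os
from configparser import ConfigParser
from pathlib import Path

def resolve_aws_credentials(env=None, credentials_path="~/.aws/credentials"):
    """Resolve credentials in roughly the order the AWS SDK default chain uses."""
    env = os.environ if env is None else env
    # 1. Environment variables take precedence when both are set.
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return {"source": "env",
                "access_key_id": env["AWS_ACCESS_KEY_ID"],
                "secret_access_key": env["AWS_SECRET_ACCESS_KEY"]}
    # 2. Fall back to the [default] profile in the shared credentials file.
    path = Path(credentials_path).expanduser()
    if path.exists():
        parser = ConfigParser()
        parser.read(path)
        if parser.has_section("default"):
            return {"source": "file",
                    "access_key_id": parser["default"]["aws_access_key_id"],
                    "secret_access_key": parser["default"]["aws_secret_access_key"]}
    # 3. Otherwise, on EC2/EKS/ECS the SDK fetches short-lived credentials
    #    for the attached IAM role from the metadata service (omitted here).
    return {"source": "instance-role"}

creds = resolve_aws_credentials(env={"AWS_ACCESS_KEY_ID": "AKIAEXAMPLE",
                                     "AWS_SECRET_ACCESS_KEY": "example"})
print(creds["source"])  # → env
```

This is why setting `AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY` on the Kong workload overrides any instance role that happens to be attached.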

#### AWS IAM Identity

To use an AWS-assigned IAM Identity, set up your plugin config like this example:

<!-- vale off-->
{% plugin_example %}
plugin: kong-inc/ai-proxy
name: ai-proxy
config:
route_type: "llm/v1/chat"
logging:
log_statistics: true
log_payloads: false
model:
provider: "bedrock"
name: "amazon.titan-text-express-v1"
targets:
- route
- consumer_group
- global
formats:
- konnect
- curl
- yaml
- kubernetes
- terraform
{% endplugin_example %}
<!--vale on -->

In most workloads, this is **zero-configuration**: you should not need to supply the Kong AI Proxy plugin with any credentials or Bedrock-specific configuration. Kong finds the correct IAM credentials automatically, upon **first invocation of the model**.

#### Environment variables

You can also specify your own AWS IAM credentials. Set these environment variables in the Kong workload or deployment configuration:
```sh
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=eu-west-1
```

Or set them directly in the plugin configuration:

```yaml
config:
auth:
    aws_access_key_id: 'AKIA...'
aws_secret_access_key: '...'
options:
bedrock:
aws_region: 'eu-west-1'
```

Or, more securely, use a vault reference to a backend such as AWS Secrets Manager:

```yaml
config:
auth:
    aws_access_key_id: 'AKIA...'
aws_secret_access_key: '{vault://aws/BEDROCK_SECRET_ACCESS_KEY}'
options:
bedrock:
aws_region: 'eu-west-1'
```
