Merge pull request #25 from mineiros-io/zied/issue-23
feat: add support for cloud_storage_config in subscriptions
zied-elouaer authored Jul 19, 2024
2 parents 3a8c486 + f97e1c6 commit e9a1a86
Showing 3 changed files with 121 additions and 2 deletions.
README.md: 50 changes (49 additions, 1 deletion)
@@ -528,7 +528,11 @@ See [variables.tf] and [examples/] for details and use-cases.
- [**`use_topic_schema`**](#attr-subscriptions-push_config-bigquery_config-use_topic_schema): *(Optional `bool`)*<a name="attr-subscriptions-push_config-bigquery_config-use_topic_schema"></a>
When `true`, use the topic's schema as the columns to write to in BigQuery, if it exists. Only one of `use_topic_schema` and `use_table_schema` can be set.
- [**`use_table_schema`**](#attr-subscriptions-push_config-bigquery_config-use_table_schema): *(Optional `bool`)*<a name="attr-subscriptions-push_config-bigquery_config-use_table_schema"></a>
When `true`, use the BigQuery table's schema as the columns to write to in BigQuery. Messages must be published in JSON format. Only one of `use_topic_schema` and `use_table_schema` can be set.
- [**`write_metadata`**](#attr-subscriptions-push_config-bigquery_config-write_metadata): *(Optional `bool`)*<a name="attr-subscriptions-push_config-bigquery_config-write_metadata"></a>
@@ -538,6 +542,50 @@ See [variables.tf] and [examples/] for details and use-cases.
When `true` and `use_topic_schema` is `true`, any fields that are a part of the topic schema that are not part of the BigQuery table schema are dropped when writing to BigQuery. Otherwise, the schemas must be kept in sync and any messages with extra fields are not written and remain in the subscription's backlog.
- [**`service_account_email`**](#attr-subscriptions-push_config-bigquery_config-service_account_email): *(Optional `string`)*<a name="attr-subscriptions-push_config-bigquery_config-service_account_email"></a>
The service account to use to write to BigQuery. If not specified, the Pub/Sub service agent, service-{project_number}@gcp-sa-pubsub.iam.gserviceaccount.com, is used.
- [**`cloud_storage_config`**](#attr-subscriptions-cloud_storage_config): *(Optional `object(cloud_storage_config)`)*<a name="attr-subscriptions-cloud_storage_config"></a>
If delivery to Cloud Storage is used with this subscription, this field is used to configure it. Only one of `push_config`, `bigquery_config`, and `cloud_storage_config` can be set; they cannot be combined. If all three are empty, the subscriber will pull and acknowledge messages using API methods. A usage sketch follows this file's diff.
The `cloud_storage_config` object accepts the following attributes:
- [**`bucket`**](#attr-subscriptions-cloud_storage_config-bucket): *(**Required** `string`)*<a name="attr-subscriptions-cloud_storage_config-bucket"></a>
User-provided name for the Cloud Storage bucket. The bucket must be created by the user.
The bucket name must not include a prefix such as `gs://`.
- [**`filename_prefix`**](#attr-subscriptions-cloud_storage_config-filename_prefix): *(Optional `string`)*<a name="attr-subscriptions-cloud_storage_config-filename_prefix"></a>
User-provided prefix for Cloud Storage filenames.
- [**`filename_suffix`**](#attr-subscriptions-cloud_storage_config-filename_suffix): *(Optional `string`)*<a name="attr-subscriptions-cloud_storage_config-filename_suffix"></a>
User-provided suffix for Cloud Storage filenames. Must not end in `/`.
- [**`max_duration`**](#attr-subscriptions-cloud_storage_config-max_duration): *(Optional `string`)*<a name="attr-subscriptions-cloud_storage_config-max_duration"></a>
The maximum duration that can elapse before a new Cloud Storage file is created.
Minimum 1 minute, maximum 10 minutes, default 5 minutes. May not exceed the subscription's acknowledgement deadline.
A duration in seconds with up to nine fractional digits, ending with `s`. Example: `"3.5s"`.
- [**`max_bytes`**](#attr-subscriptions-cloud_storage_config-max_bytes): *(Optional `number`)*<a name="attr-subscriptions-cloud_storage_config-max_bytes"></a>
The maximum number of bytes that can be written to a Cloud Storage file before a new file is created.
Minimum 1 KB, maximum 10 GiB. The `max_bytes` limit may be exceeded when messages are larger than the limit.
- [**`avro_config`**](#attr-subscriptions-cloud_storage_config-avro_config): *(Optional `object(avro_config)`)*<a name="attr-subscriptions-cloud_storage_config-avro_config"></a>
If set, message data will be written to Cloud Storage in Avro format.
The `avro_config` object accepts the following attributes:
- [**`write_metadata`**](#attr-subscriptions-cloud_storage_config-avro_config-write_metadata): *(Optional `bool`)*<a name="attr-subscriptions-cloud_storage_config-avro_config-write_metadata"></a>
When `true`, write the subscription name, `messageId`, `publishTime`, attributes, and `orderingKey` as additional fields in the output.
- [**`iam`**](#attr-subscriptions-iam): *(Optional `list(iam)`)*<a name="attr-subscriptions-iam"></a>
List of IAM access roles to grant to a set of identities on the subscription.
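For context, a minimal usage sketch of the new attribute is shown below. The module source, version, project, topic name, and bucket name are placeholders, and the exact shape of the `subscriptions` variable may differ from this sketch; see [variables.tf] and [examples/] for the authoritative interface.

```hcl
# Hypothetical example: export messages from a subscription to a
# user-created Cloud Storage bucket, written as Avro files.
module "pubsub" {
  source  = "mineiros-io/pubsub/google" # placeholder source and version
  version = "~> 0.2"

  name    = "example-topic"
  project = "my-project-id"

  subscriptions = [
    {
      name = "example-gcs-export"

      # Only one of push_config, bigquery_config, and cloud_storage_config
      # may be set on a single subscription.
      cloud_storage_config = {
        bucket          = "example-export-bucket" # no "gs://" prefix
        filename_prefix = "pubsub/"
        filename_suffix = ".avro"
        max_duration    = "300s"    # 1m..10m, must not exceed the ack deadline
        max_bytes       = 10485760  # 10 MiB; allowed range is 1 KB..10 GiB

        avro_config = {
          write_metadata = true # include message metadata fields in the output
        }
      }
    }
  ]
}
```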
README.tfdoc.hcl: 72 changes (71 additions, 1 deletion)
@@ -653,7 +653,14 @@ section {
attribute "use_topic_schema" {
type = bool
description = <<-END
When `true`, use the topic's schema as the columns to write to in BigQuery, if it exists. Only one of `use_topic_schema` and `use_table_schema` can be set.
END
}

attribute "use_table_schema" {
type = bool
description = <<-END
When `true`, use the BigQuery table's schema as the columns to write to in BigQuery. Messages must be published in JSON format. Only one of `use_topic_schema` and `use_table_schema` can be set.
END
}

@@ -670,6 +677,69 @@ section {
When `true` and `use_topic_schema` is `true`, any fields that are a part of the topic schema that are not part of the BigQuery table schema are dropped when writing to BigQuery. Otherwise, the schemas must be kept in sync and any messages with extra fields are not written and remain in the subscription's backlog.
END
}

attribute "service_account_email" {
type = string
description = <<-END
The service account to use to write to BigQuery. If not specified, the Pub/Sub service agent, service-{project_number}@gcp-sa-pubsub.iam.gserviceaccount.com, is used.
END
}
}
}

attribute "cloud_storage_config" {
type = object(cloud_storage_config)
description = <<-END
If delivery to Cloud Storage is used with this subscription, this field is used to configure it. Only one of `push_config`, `bigquery_config`, and `cloud_storage_config` can be set; they cannot be combined. If all three are empty, the subscriber will pull and acknowledge messages using API methods.
END

attribute "bucket" {
type = string
required = true
description = <<-END
User-provided name for the Cloud Storage bucket. The bucket must be created by the user.
The bucket name must not include a prefix such as `gs://`.
END
}
attribute "filename_prefix" {
type = string
description = <<-END
User-provided prefix for Cloud Storage filenames.
END
}
attribute "filename_suffix" {
type = string
description = <<-END
User-provided suffix for Cloud Storage filenames. Must not end in `/`.
END
}
attribute "max_duration" {
type = string
description = <<-END
The maximum duration that can elapse before a new Cloud Storage file is created.
Minimum 1 minute, maximum 10 minutes, default 5 minutes. May not exceed the subscription's acknowledgement deadline.
A duration in seconds with up to nine fractional digits, ending with `s`. Example: `"3.5s"`.
END
}
attribute "max_bytes" {
type = number
description = <<-END
The maximum number of bytes that can be written to a Cloud Storage file before a new file is created.
Minimum 1 KB, maximum 10 GiB. The `max_bytes` limit may be exceeded when messages are larger than the limit.
END
}
attribute "avro_config" {
type = object(avro_config)
description = <<-END
If set, message data will be written to Cloud Storage in Avro format.
END

attribute "write_metadata" {
type = bool
description = <<-END
When `true`, write the subscription name, `messageId`, `publishTime`, attributes, and `orderingKey` as additional fields in the output.
END
}
}
}

subscriptions.tf: 1 change (1 addition, 0 deletions)
@@ -24,6 +24,7 @@ module "subscription" {
retry_policy         = try(each.value.retry_policy, null)
push_config          = try(each.value.push_config, null)
bigquery_config      = try(each.value.bigquery_config, null)
cloud_storage_config = try(each.value.cloud_storage_config, null)

iam = try(each.value.iam, [])

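The nested subscription module's implementation is not part of this diff. As a rough illustration only, the sketch below shows one common way such an optional object is mapped onto the provider's `google_pubsub_subscription` resource with a `dynamic` block; the variable names and the module's actual wiring are assumptions.

```hcl
# Hypothetical excerpt of a subscription module: forward an optional
# cloud_storage_config object to the google_pubsub_subscription resource.
resource "google_pubsub_subscription" "subscription" {
  name  = var.name
  topic = var.topic

  dynamic "cloud_storage_config" {
    # Render the block only when the object was provided.
    for_each = var.cloud_storage_config != null ? [var.cloud_storage_config] : []

    content {
      bucket          = cloud_storage_config.value.bucket
      filename_prefix = try(cloud_storage_config.value.filename_prefix, null)
      filename_suffix = try(cloud_storage_config.value.filename_suffix, null)
      max_duration    = try(cloud_storage_config.value.max_duration, null)
      max_bytes       = try(cloud_storage_config.value.max_bytes, null)

      dynamic "avro_config" {
        for_each = try(cloud_storage_config.value.avro_config, null) != null ? [cloud_storage_config.value.avro_config] : []

        content {
          write_metadata = try(avro_config.value.write_metadata, null)
        }
      }
    }
  }
}
```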
