
Problem with library: com.azure.cosmos.spark:azure-cosmos-spark_3-3_2-12:4.23.0 #486

Open
Spoccia opened this issue Nov 29, 2023 · 0 comments

Spoccia commented Nov 29, 2023

Our team is developing Databricks notebooks using Spark and Scala. We are working on inserting data into collections in Cosmos DB.

Following multiple guides, we set the configuration fields:

"spark.cosmos.throughputControl.globalControl.container" = <the throughput control collection>
"spark.cosmos.throughputControl.targetThroughputThreshold" = 0.2

to limit the use of RUs in Cosmos DB, but we are noticing that after multiple executions the library does not appear to honor the limit we set.

In other words, the limitation seems to be ignored: RU usage can grow up to 100%. We are opening this issue at Microsoft's suggestion.
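For reference, here is a minimal sketch of the write configuration we would expect to enforce the limit, based on the azure-cosmos-spark connector's documented throughput-control options. The endpoint, key, database, container, and dispatcher names are placeholders, and `spark.cosmos.throughputControl.enabled` plus the `globalControl.database` setting are included on the assumption that the connector requires them alongside the two options quoted above:

```scala
// Hedged sketch: Cosmos DB write with global throughput control.
// All <...> values are placeholders, not our real account details.
val cosmosCfg = Map(
  "spark.cosmos.accountEndpoint" -> "https://<account>.documents.azure.com:443/",
  "spark.cosmos.accountKey"      -> "<account-key>",
  "spark.cosmos.database"        -> "<database>",
  "spark.cosmos.container"       -> "<target-container>",
  // Throughput control presumably must be explicitly enabled; setting only
  // the globalControl.container and targetThroughputThreshold may not suffice.
  "spark.cosmos.throughputControl.enabled"                   -> "true",
  "spark.cosmos.throughputControl.name"                      -> "WriteLimiter",
  "spark.cosmos.throughputControl.targetThroughputThreshold" -> "0.2",
  "spark.cosmos.throughputControl.globalControl.database"    -> "<database>",
  "spark.cosmos.throughputControl.globalControl.container"   -> "ThroughputControl"
)

// df is an existing DataFrame prepared in the notebook.
df.write
  .format("cosmos.oltp")
  .options(cosmosCfg)
  .mode("APPEND")
  .save()
```

With this configuration, our understanding is that the connector should keep consumption near 20% of the container's provisioned RUs, which is not the behavior we observe.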

Thanks for your feedback

1 participant