When a data attribute exceeds a configurable threshold (e.g. 100KB), iWF can store it in an external store such as S3 instead of writing the value into Temporal history, keeping only the keys and the S3 objectIDs in history.
With only the keys and S3 objectIDs stored in Temporal history, the iWF server loads the values from S3 before sending them to the application, and writes them back to S3 on updates. As an optimization, the server could also load from S3 lazily, only when the application tries to read an attribute.
This is possible because the iWF server workflow never actually reads the values of the data attributes (DAs) -- they are opaque to the iWF server.
Offloading large data attributes to S3 makes it much easier for users to work with large datasets, and makes using Cadence/Temporal more cost effective.
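The offloading logic described above could be sketched roughly as follows. This is a minimal illustration only, not the actual iWF implementation: the `BlobStore` interface, the `storedAttr` envelope, and the object naming scheme are all hypothetical, and an in-memory map stands in for S3.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// BlobStore abstracts an external object store such as S3 (hypothetical interface).
type BlobStore interface {
	Put(objectID string, value []byte) error
	Get(objectID string) ([]byte, error)
}

// memStore is an in-memory stand-in for S3, for illustration only.
type memStore struct{ data map[string][]byte }

func (m *memStore) Put(k string, v []byte) error { m.data[k] = v; return nil }
func (m *memStore) Get(k string) ([]byte, error) { return m.data[k], nil }

// storedAttr is what would be written into Temporal history:
// either the inline value, or only the S3 objectID when offloaded.
type storedAttr struct {
	Inline   json.RawMessage `json:"inline,omitempty"`
	ObjectID string          `json:"objectId,omitempty"`
}

// threshold is configurable per the proposal; 100KB used as the example value.
const threshold = 100 * 1024

// encode keeps small values inline and offloads large ones to the blob store.
func encode(store BlobStore, key string, value []byte) (storedAttr, error) {
	if len(value) <= threshold {
		return storedAttr{Inline: value}, nil
	}
	objectID := "iwf-da/" + key // hypothetical object naming scheme
	if err := store.Put(objectID, value); err != nil {
		return storedAttr{}, err
	}
	return storedAttr{ObjectID: objectID}, nil
}

// decode returns the value, fetching from the blob store when it was offloaded.
// This fetch could be deferred until the application reads the attribute (lazy loading).
func decode(store BlobStore, attr storedAttr) ([]byte, error) {
	if attr.ObjectID == "" {
		return attr.Inline, nil
	}
	return store.Get(attr.ObjectID)
}

func main() {
	store := &memStore{data: map[string][]byte{}}

	small, _ := encode(store, "smallDA", []byte(`"hello"`))
	large, _ := encode(store, "largeDA", make([]byte, 200*1024))
	fmt.Println(small.ObjectID == "", large.ObjectID != "") // small stays inline, large is offloaded

	v, _ := decode(store, large)
	fmt.Println(len(v)) // full value recovered from the store
}
```

Because the server treats the values as opaque bytes, the envelope swap happens entirely at the serialization boundary; neither the workflow logic nor the application code needs to know whether a given attribute was offloaded.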