This repository has been archived by the owner on Nov 11, 2022. It is now read-only.

Since porting to 2.1.0, Dataflow is leaving Datasets/Tables behind in BigQuery #609

Open
polleyg opened this issue Oct 9, 2017 · 9 comments

Comments

@polleyg

polleyg commented Oct 9, 2017

Since porting to 2.1.0, Dataflow has been leaving datasets/tables behind in BigQuery when the pipeline is cancelled or when it fails. We were on 1.8.0/1.9.0 prior to this, and we never saw this before. We skipped 2.0.0, so we're unsure which version actually introduced it.

I cancelled a job (2017-10-08_18_35_30-13495977675828673253), and it left behind a dataset and table in BigQuery:

[screenshot: the leftover dataset and table in BigQuery]

  1. Why is Dataflow now creating datasets and tables in BigQuery for temporary use?
  2. Why did Dataflow not delete these temp datasets and tables when the job was cancelled or when it failed?
@nguyent

nguyent commented Oct 10, 2017

I haven't seen anything like this in 2.0.0; we run batch jobs on a daily basis and have restarted our streaming pipelines a few times now.

Is this in streaming, batch, or both?
What is the delta between job cancellation time and table creation/update time?
Is there a reproducible case?

@polleyg
Author

polleyg commented Oct 13, 2017

Only seen it in batch so far, and cannot reproduce yet.

@jamespercy

jamespercy commented Dec 12, 2017

Still happening in 2.2.0 templated batch jobs on our side. We're currently managing it with cleanup scripts, but it's a PITA.
I was wondering if it might be worth putting an expiry on those datasets to auto-clean them up.
Determining the right expiry might be difficult depending on how long a batch can run, but a day seems safe. At that rate, jobs running every hour would leave at most 24 temp datasets behind at any time.
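The cleanup scripts mentioned above boil down to a small filtering step: list the datasets in the project, keep the ones matching the temp naming convention, and delete any older than a day. Here is a sketch of just that filtering logic in Java; the `temp_dataset_` prefix and the 24-hour cutoff are assumptions, so check what your jobs actually create before wiring this to a real delete call.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class TempDatasetFilter {
    // Hypothetical naming convention for Dataflow temp datasets -- verify against your project.
    static final String TEMP_PREFIX = "temp_dataset_";
    // Matches the one-day TTL discussed in this thread.
    static final Duration MAX_AGE = Duration.ofHours(24);

    /**
     * Given dataset IDs and their creation times (e.g. from a BigQuery list call),
     * returns the IDs that look like stale Dataflow temp datasets, sorted for stable output.
     */
    static List<String> staleTempDatasets(Map<String, Instant> datasetCreationTimes, Instant now) {
        return datasetCreationTimes.entrySet().stream()
            .filter(e -> e.getKey().startsWith(TEMP_PREFIX))
            .filter(e -> Duration.between(e.getValue(), now).compareTo(MAX_AGE) > 0)
            .map(Map.Entry::getKey)
            .sorted()
            .collect(Collectors.toList());
    }
}
```

A real script would feed this from a datasets.list API call and delete each returned ID; keeping the filter pure makes it easy to test before pointing it at production.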

@polleyg
Author

polleyg commented Dec 12, 2017

I was just thinking about this today because it happened yet again. Agreed, auto-expiry on the datasets makes sense.

@jamespercy

So I did a little investigation, and it looks like that's actually implemented... not sure why it's still happening, though.

https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryQuerySource.java

LOG.info("Creating temporary dataset {} for query results", tableToExtract.getDatasetId());
tableService.createDataset(
    tableToExtract.getProjectId(),
    tableToExtract.getDatasetId(),
    location,
    "Temporary tables for query results of job " + bqOptions.getJobName(),
    // Set a TTL of 1 day on the temporary tables, which ought to be enough in all cases:
    // the temporary tables are used only to immediately extract them into files.
    // They are normally cleaned up, but in case of job failure the cleanup step may not run,
    // and then they'll get deleted after the TTL.
    24 * 3600 * 1000L /* 1 day */);

I think I'll try to do a bit more debugging of my own... P.S. is this the correct forum to be discussing this?
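One possible explanation for the lingering datasets (an assumption, not confirmed anywhere in this thread): the TTL in the snippet above is applied as the dataset's *default table expiration*, so BigQuery expires the tables inside after a day but never deletes the dataset container itself. A toy model of that behavior:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

public class ExpiryModel {
    // The 24 * 3600 * 1000L ms from the Beam snippet, i.e. one day.
    static final Duration DEFAULT_TABLE_TTL = Duration.ofDays(1);

    final Map<String, Instant> tables = new HashMap<>(); // table name -> creation time
    boolean datasetExists = true;

    void createTable(String name, Instant createdAt) {
        tables.put(name, createdAt);
    }

    /** Models BigQuery's background expiry sweep: removes expired tables, never the dataset. */
    void sweep(Instant now) {
        tables.values().removeIf(
            created -> Duration.between(created, now).compareTo(DEFAULT_TABLE_TTL) > 0);
        // datasetExists is deliberately left untouched -- the empty dataset lingers.
    }
}
```

If this model is right, it would match the later report in this thread that the table disappears after a day while the dataset remains.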

@lukecwik
Contributor

[email protected] is a good place; you can also open a tracking issue on https://issues.apache.org/jira/projects/BEAM so people can follow the bug.

@rumeshkrish

I am also facing this issue. When a job failed, I observed that the table got deleted after one day, but the dataset still remained. Can we have an option to clean up the temp dataset and tables immediately if the job fails?
The method cleanupTempResource(options.as(BigQueryOptions.class)) is responsible for cleaning up the temp dataset and tables, but it only executes when the job succeeds, inside the public List<BoundedSource<T>> split(long desiredBundleSizeBytes, PipelineOptions options) method. On error we should also clean up, controlled by a pipeline option.

Does anyone have a better idea?
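The clean-up-on-failure idea above can be sketched as a try/finally wrapper around the pipeline run, so the cleanup step executes even when the job throws. A minimal sketch, where the `Job` and `Cleanup` interfaces are hypothetical stand-ins for `pipeline.run().waitUntilFinish()` and the temp-dataset deletion:

```java
public class CleanupOnFailure {
    interface Job { void run() throws Exception; }
    interface Cleanup { void cleanup(); }

    /** Runs the job and always invokes cleanup afterwards, even when the job throws. */
    static void runWithCleanup(Job job, Cleanup cleanup) throws Exception {
        try {
            job.run(); // e.g. pipeline.run().waitUntilFinish() in a real Beam program
        } finally {
            cleanup.cleanup(); // e.g. delete the temp dataset and its tables
        }
    }
}
```

This only helps when the submitting process survives long enough to reach the finally block; a crash of the launcher itself would still need the TTL as a backstop.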

@lobdellb

I have this as well, Python SDK version 2.27 running on Google Dataflow. See attached.

It would be nice if the tables would at least expire automatically, or if the temp dataset name were configurable. Or something else.

[two screenshots from 2021-05-19 attached]

@kennknowles
Contributor

Check out https://beam.apache.org/community/contact-us/ for ways to reach the Beam community with bug reports and questions.
