
Beam leaving temporary datasets in BigQuery


We've recently enabled two Beam batch jobs in production, running daily, and have noticed a large number of temporary datasets being left behind in BigQuery (see attached). These jobs both read from and write to BigQuery, and we're using Beam 2.4.0. The jobs run as templates (with `withTemplateCompatibility()` when reading).

A similar issue has been reported at

The cleanup code to remove the temporary datasets does appear to be there in the BigQuery connector, but I'm not seeing its log output in my job, so presumably it isn't being called?

Nothing else obvious in the logs.

Any ideas or suggestions on how to track this issue down?
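In the meantime we're considering sweeping the leftovers up ourselves. Below is a rough sketch using the google-cloud-bigquery Python client; the `temp_dataset_` prefix, the two-day age cutoff, and the project id are all assumptions on my part — check them against the actual dataset names in your project before deleting anything:

```python
from datetime import datetime, timedelta, timezone

def is_stale_temp_dataset(dataset_id, created, now=None,
                          prefix="temp_dataset_",
                          max_age=timedelta(days=2)):
    """Return True if a dataset looks like a leftover Beam temp dataset
    that is old enough to delete.

    The prefix and age cutoff are assumptions; verify them against the
    dataset names you actually see in BigQuery before deleting.
    """
    now = now or datetime.now(timezone.utc)
    return dataset_id.startswith(prefix) and (now - created) > max_age

# Cleanup loop (requires google-cloud-bigquery; project id is hypothetical):
#
# from google.cloud import bigquery
# client = bigquery.Client(project="my-project")
# for ds in client.list_datasets():
#     created = client.get_dataset(ds.reference).created
#     if is_stale_temp_dataset(ds.dataset_id, created):
#         client.delete_dataset(ds.reference, delete_contents=True)
```

Obviously this is a workaround, not a fix — I'd still like to understand why Beam's own cleanup isn't running.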


Attachment: Screen Shot 2018-05-31 at 10.38.15.png
Description: PNG image