Basically, that's what I'm trying to do. I'm building a pipeline that has a refreshing multimap side input (BQ schemas), which I then apply to the main stream of data (records that are ultimately saved to the corresponding BQ table).
My job, although streaming in nature, runs in the global window, and I want to unit test both that the side input refreshes and that the updates are successfully applied.
The way I understand it, the side collection is refreshed before it is accessed, so by the time it is accessed it already contains the final (updated) snapshot of the schemas. Is that true? If so, how can I simulate that synchronisation in a test? I'm advancing processing time, as I thought that would be the way to go, but evidently something is wrong there.