I'm using a GCP Dataflow batch job to process data read from a table. The input is table data that I fetch with a query in Java.
After processing, when I try to insert the rows into BigQuery (say, 10 complete records), the following happens: if one record fails on insert (for example, a datatype mismatch with the table schema, not a processing error), that record is dropped with the exact error reason, which is expected. The problem is that this also causes one or more additional, otherwise-valid records to be dropped, seemingly at random.
The error reason on those extra records is "stopped". I assumed this was caused by inadequate error handling on my side, but I couldn't get the code to work.
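For context on what I think is happening: BigQuery's streaming inserts documentation says that when one row in an `insertAll` request is invalid, the other rows in that same request are returned with reason "stopped", meaning they were not inserted and can be retried as-is. Since Dataflow batches rows into shared insert requests, one bad row can make its batch-mates show up as "stopped". A minimal, purely illustrative Java sketch of that batch semantics (the validity check here is a made-up stand-in for a schema/datatype mismatch):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Simulation of BigQuery insertAll batch semantics: when one row in a
// request is invalid, the remaining rows in that same request are returned
// with reason "stopped" (they were not inserted and are safe to retry as-is).
public class StoppedRowsDemo {
    // Returns a map of row -> error reason; valid rows in a clean batch get no entry.
    static Map<String, String> insertBatch(List<String> rows) {
        Map<String, String> errors = new LinkedHashMap<>();
        // Hypothetical validity rule standing in for a datatype check.
        boolean batchHasInvalidRow = rows.stream().anyMatch(r -> !r.matches("\\d+"));
        for (String row : rows) {
            if (!row.matches("\\d+")) {
                errors.put(row, "invalid");  // the row that actually failed
            } else if (batchHasInvalidRow) {
                errors.put(row, "stopped");  // collateral: same request, not attempted
            }
        }
        return errors;
    }

    public static void main(String[] args) {
        // One bad row ("abc") makes the valid rows in its batch come back "stopped".
        System.out.println(insertBatch(List.of("1", "abc", "3")));
        // → {1=stopped, abc=invalid, 3=stopped}
    }
}
```

If that is the cause, "stopped" rows are not really errors, just rows that were never attempted. In Beam's Java SDK, I believe `BigQueryIO` treats them as retriable under `InsertRetryPolicy.retryTransientErrors()`, and genuinely failed rows can be routed to a dead-letter path via `WriteResult.getFailedInsertsWithErr()` (with `withExtendedErrorInfo()` on the write); I'd appreciate confirmation that this is the right approach.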
Do let me know if you have any suggestions.