I am having a major issue with Spring Cloud Data Flow running jobs with properties carried over from previous job executions.
Does anyone know how to disable task property caching in Spring Cloud Data Flow (or whether that is even possible)?
Let's say a user configures their batch job in a custom UI, which creates a schedule using the Java DSL:
import org.springframework.cloud.dataflow.rest.client.dsl.task.TaskSchedule;

Map<String, String> scheduleProperties = Map.of(
"a", "value of a property");
TaskSchedule.builder(dataFlowOperations)
.scheduleName("my-job-schedule1")
.task(task)
.build()
.schedule(cronExpression, scheduleProperties);
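(For what it's worth, this is how I verify afterwards which arguments a launch actually received; a minimal sketch, assuming the executionList() and getArguments() accessors on the REST client:)

import org.springframework.cloud.dataflow.rest.client.DataFlowOperations;

// Print the arguments recorded for each task execution,
// to see where the stale property shows up.
dataFlowOperations.taskOperations().executionList()
.forEach(e -> System.out.println(e.getTaskName() + " -> " + e.getArguments()));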
Then the configuration changes, and now I want to execute the job with property "b" instead of "a":
import org.springframework.cloud.dataflow.rest.client.dsl.task.TaskSchedule;

// unschedule my-job-schedule1 first
TaskSchedule.builder(dataFlowOperations)
.scheduleName("my-job-schedule1")
.task(task)
.build()
.unschedule();

Map<String, String> scheduleProperties = Map.of(
"b", "value of b property");
TaskSchedule.builder(dataFlowOperations)
.scheduleName("my-job-schedule2")
.task(task)
.build()
.schedule(cronExpression, scheduleProperties);
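One workaround I am considering is keeping the old key in the map and explicitly overriding it, so a cached value cannot win; a minimal sketch (the empty-string override is just my idea, not documented behavior):

Map<String, String> scheduleProperties = Map.of(
"a", "", // explicitly overrides whatever was cached for "a"
"b", "value of b property");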
It is unacceptable that a brand-new schedule picks up properties I never gave it: the job still ran with the "a" property even though the composed task runner only received the "b" property. I even use a RunIdIncrementer for each job execution.
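For completeness, the incrementer is wired into the job definition like this; a minimal sketch of the standard Spring Batch 4 setup (bean and step names are illustrative):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.context.annotation.Bean;

@Bean
public Job myJob(JobBuilderFactory jobs, Step myStep) {
// RunIdIncrementer adds a fresh run.id parameter on every launch,
// so each execution is treated as a new job instance.
return jobs.get("myJob")
.incrementer(new RunIdIncrementer())
.start(myStep)
.build();
}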
Thanks for any advice.