
java - Spring Cloud Dataflow task properties caching - Stack Overflow


I am having a major issue with Spring Cloud Dataflow: it runs jobs with properties carried over from previous job executions.

Does anyone know how to disable task property caching in Spring Cloud Dataflow (or whether that is even possible)?

Let's say a user configures their batch job in a custom UI, which creates a schedule using the Java DSL:

import org.springframework.cloud.dataflow.rest.client.dsl.task.TaskSchedule;

Map<String, String> scheduleProperties = Map.of(
    "a", "value of a property"
);

TaskSchedule.builder(dataFlowOperations)
        .scheduleName("my-job-schedule1")
        .task(task)
        .build()
        .schedule(cronExpression, scheduleProperties);

Then the configuration changes, and I now want to execute the job with parameter "b" instead of "a":

import org.springframework.cloud.dataflow.rest.client.dsl.task.TaskSchedule;

// unschedule my-job-schedule1

Map<String, String> scheduleProperties = Map.of(
    "b", "value of b property"
);

TaskSchedule.builder(dataFlowOperations)
        .scheduleName("my-job-schedule2")
        .task(task)
        .build()
        .schedule(cronExpression, scheduleProperties);

It is unacceptable that I create a brand-new schedule and still receive properties I never gave it: even though the composed task runner was only passed the "b" property, the execution still picks up the old ones. I even use a RunIdIncrementer for each job execution.

Thanks for any advice.
