Need assistance on Application integration filterClause

Hi everyone,

I've successfully created a Workday integration connector and have managed to retrieve historical data using a while loop task. However, I have a question: does the integration connector automatically ingest CDC (Change Data Capture) data into my destination, or do I need to set up filter clauses?

Could someone please clarify this for me? Thanks!

@Madhuvandhini @rohitjangid 

1 ACCEPTED SOLUTION

Hi @purna05 , I can see that the variable "file_path" is getting populated. Can you check the value of "connectorInputPayload" and make sure that the values are getting populated correctly?

[screenshot]


Hi @purna05 ,

Sorry, I didn't completely understand the question.
1. Are you looking for a Workday trigger in Application Integration to listen to Workday for CDC?
2. Do you want to perform a mutation operation in Workday?

Could you please explain the use case in detail?

Hi @gowthamnagarajn 

"Are you looking for workday trigger in Application Integration to listen to Workday for CDC?"

1. I created an Integration connection for Workday (source).
2. I created an Integration Connector for GCS (destination).
3. I created an application integration with an API Trigger, the Workday connector, a Data Mapping task, and the GCS connector.
4. When I trigger it, I get only 1000 records even though I have 10000 records, and I want to configure it to pick up CDC-updated records from Workday.

Any approach to solve this?

I am trying to understand a bit more.

You have a workflow with an API trigger, a Workday connector task, and a GCS connector task. You read 1000 records from Workday and add them to GCS. Is my understanding correct? If it is a List Entities operation, you can specify the page size, filter, and next-page token to paginate and get more records.

May I know what you mean by CDC-updated records? What is the exact Workday entity/action you use in your workflow?
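To illustrate the pagination loop described above, here is a minimal, hedged JavaScript sketch. `listEntities` is a stand-in for the connector's List Entities call; in a real integration the loop would be a While Loop task driven by the connector's page-size and next-page-token variables, not plain code.

```javascript
// Stand-in for the connector's List Entities call (assumption, for
// illustration only): returns one page of records plus a token for
// the next page, or null when there are no more pages.
function listEntities(allRecords, pageSize, pageToken) {
  const start = pageToken ? parseInt(pageToken, 10) : 0;
  const page = allRecords.slice(start, start + pageSize);
  const next = start + pageSize < allRecords.length
    ? String(start + pageSize)
    : null;
  return { records: page, nextPageToken: next };
}

// Loop until the next-page token comes back null, collecting every page.
function fetchAll(allRecords, pageSize) {
  const collected = [];
  let token = null;
  do {
    const resp = listEntities(allRecords, pageSize, token);
    collected.push(...resp.records);
    token = resp.nextPageToken;
  } while (token !== null);
  return collected;
}
```

With a page size of 200 and 10000 source records, the loop runs 50 times and returns all 10000 records instead of stopping at the first page.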

@purna05 Were you able to set up the workflow successfully?

Hi @gowthamnagarajn ,

Still blocked with an error. I'm new to these connectors and Application Integration. I need to create a folder structure based on the date, but I'm unable to create a variable for the date.


Example: if I trigger my integration today, it needs to go into the specified bucket > specified folder > a year folder based on the date (2024) > month folder (April) > date folder > File.json.

Any support is appreciated. Thanks for reaching out on this chat.

Hi @purna05 

While we don't have a built-in date variable, you can achieve the same thing using a JavaScript task.

You can create a new string variable `file_path`, then add a JavaScript task with the following code.

```
function executeScript(event) {
  const months = ["January", "February", "March", "April", "May", "June",
                  "July", "August", "September", "October", "November", "December"];

  // Build the date components for today's run.
  const today = new Date();
  const year = today.getFullYear().toString();
  const month = months[today.getMonth()];
  const day = today.getDate().toString();

  // Compose bucket/folder/year/month/day/File.json.
  const basePath = "specified/bucket/specified_folder";
  const filePath = `${basePath}/${year}/${month}/${day}/File.json`;

  // Publish the result into the integration variable `file_path`.
  event.setParameter('file_path', filePath);
}
```
 
This will take today's date and generate the file path in the `file_path` variable. You can then use this variable later in your flow. Let me know if you face any other issues.

Hi Anshjain,

Your help has been very beneficial for my current work. I tried creating some JavaScript code, but I ran into an error that I don't understand. Unfortunately, I'm not very skilled in JavaScript. Could you assist me with this issue? It would be very helpful if we could resolve it over a Google Meet session so I can understand it better.

[screenshots]

Hi Purna, the JavaScript task is generating the "file_path" variable correctly, as we can see in the execution logs in the screenshot.

The error says that your GCS connector task does not have its input parameters set.

So you'll have to create a mapping for the connector using our Data Mapping task -> https://cloud.google.com/application-integration/docs/data-mapping-overview

For GCS, depending on what you are trying to achieve, you'll need to set the required parameters -> https://cloud.google.com/integration-connectors/docs/connectors/cloudstorage/configure#before-you-be...
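As a rough sketch of that mapping step, a JavaScript task could also assemble the connector's input payload before the GCS task runs. The field name `ObjectName` and the bucket value below are illustrative assumptions, not the connector's documented schema; check the GCS connector reference linked above for the actual required fields.

```javascript
// Hedged sketch: copy the computed file_path into the connector's
// input payload. Field and variable names here are assumptions.
function executeScript(event) {
  const filePath = event.getParameter('file_path');

  // Start from the existing payload if one was already mapped.
  const payload = event.getParameter('connectorInputPayload') || {};

  payload.ObjectName = filePath; // assumed object-name field
  payload.Bucket = 'my-bucket';  // assumed bucket field and value

  event.setParameter('connectorInputPayload', payload);
}
```

The same result can be achieved entirely in the Data Mapping task; the script form is shown only because this thread already uses a JavaScript task for `file_path`.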

 

Hi @anshjain 
I created a Data Mapping task and connected it with the GCS connector. It runs successfully, but the file is loaded into the bucket itself, not the specified folder, and no date folders are created. I think the JavaScript variable was not applied. Can we connect and solve this together? I appreciate your support and help.

[screenshot]

Files are getting into the bucket; see the image below.

[screenshot]

But I specified a path with year, month, and day folders, and no files were in those folders.

[screenshot]

 

Hi @purna05 , I can see that the variable "file_path" is getting populated. Can you check the value of "connectorInputPayload" and make sure that the values are getting populated correctly?

[screenshot]

Hi @anshjain 

Thank you very much! The JavaScript you provided works perfectly, and I'm able to create the current-date folder structure. But I have another small requirement. I'm getting data from the Workday web service and it has 524 records. I set listEntityPageSize to 200 and configured the data mapping as shown below. I got 3 files in the bucket: the 1st with 200 records, the 2nd with 200, and the 3rd with 124. But I want all 524 records in a single output file. Is that possible?

[screenshot]

[screenshot]
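One possible answer to the single-file question, sketched in JavaScript under assumed variable names: instead of writing a file per page, accumulate each page into a JSON list variable inside the while loop, then make one GCS connector call after the loop that writes the combined array. `connectorOutputPayload` (the current page) and `all_records` (the accumulator) are placeholder names, not guaranteed to match your integration's variables.

```javascript
// Hedged sketch: run this JavaScript task once per loop iteration to
// append the current page of records to an accumulator variable.
function executeScript(event) {
  const page = event.getParameter('connectorOutputPayload') || []; // assumed: current page
  const all = event.getParameter('all_records') || [];             // assumed: accumulator

  event.setParameter('all_records', all.concat(page));
}
```

After the while loop ends, a single GCS connector task would then write `all_records` (for example, serialized with `JSON.stringify`) to one object, giving one file with all 524 records.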