During API-based data fetching, you sometimes need to pass a date sequence in the request body. For example, use date values in the request body to fetch daily data from the 1st of the current month to the task execution date.
In this case, you need to customize a date sequence starting from the first day of the month and ending with the task execution date, as shown in the following figure.
While you can create date sequences using SQL statements, database-specific syntax makes this approach non-universal. This document describes how to create date sequences using the Spark SQL operator, which applies to all databases.
1. Use a Spark SQL operator and the built-in parameter ${cyctime} to set the start time and end time. The start time defaults to the first day of the month, and the end time defaults to the task execution date.
2. Use another Spark SQL operator to create a date sequence between the start time and the end time, and output the dates in the sequence as parameter values.
3. Use a Loop Container node to loop through dates (the parameter values).
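The three steps above can be sketched in plain Python. This is only a minimal illustration of the logic, not the tool's implementation; the example ${cyctime} value is an assumption, and the dateFrom/dateTo field names follow the request-parameter description later in this document.

```python
from datetime import date, timedelta

# Minimal sketch of the three steps above. The ${cyctime} value is a
# made-up example; in the tool it is filled in at task execution time.
cyctime = "2024-05-03 12:00:00"

# Step 1: start time = first day of the month, end time = execution date.
s_date = date.fromisoformat(cyctime[:7] + "-01")   # 2024-05-01
e_date = date.fromisoformat(cyctime[:10])          # 2024-05-03

# Step 2: build the date sequence between start and end (inclusive).
dates = [s_date + timedelta(days=i) for i in range((e_date - s_date).days + 1)]

# Step 3: loop through the dates, building one request body per day.
for d in dates:
    body = {"dateFrom": d.isoformat(), "dateTo": d.isoformat()}
    print(body)
```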
This example illustrates the process by fetching data from an API on a daily basis.
Request Parameter Description:
- dateFrom: Start time
- dateTo: End time

You are advised to fetch data on a daily basis, that is, to set the start time and the end time to the same day.
Create a scheduled task, add a Data Transformation node, and drag a Spark SQL operator onto the Data Transformation editing page. Input the query statement to set the start and end time of the date sequence, as shown in the following figure.
The query statement is as follows.
SELECT CONCAT(LEFT('$cyctime', 7), '-01') AS s_date, LEFT('$cyctime', 10) AS e_date
Click Data Preview to view the start and end time of the output date sequence, as shown in the following figure.
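As a quick cross-check, the substring logic of the statement above can be reproduced in plain Python. The sample ${cyctime} value is an assumption for illustration; the tool fills it in at run time.

```python
# Assumed example value of the built-in parameter ${cyctime}.
cyctime = "2024-05-17 00:00:00"

# LEFT('$cyctime', 7) keeps "YYYY-MM"; appending "-01" yields the
# first day of the month. LEFT('$cyctime', 10) keeps "YYYY-MM-DD".
s_date = cyctime[:7] + "-01"
e_date = cyctime[:10]

print(s_date, e_date)  # 2024-05-01 2024-05-17
```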
Add a Parameter Output operator and output the start time and the end time as parameter values, so they can later be used to create a continuous date sequence, as shown in the following figure.
Add a Data Transformation node, and drag a Spark SQL operator onto the Data Transformation editing page. Input the query statement to create a date sequence, as shown in the following figure.
SELECT explode(sequence(to_date('$s_date'), to_date('$e_date'), interval 1 day)) AS date
Click Data Preview to view the created date sequence, as shown in the following figure.
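In the statement above, sequence(...) produces an array of dates and explode turns that array into one row per date. A rough Python analogue, for understanding only (the sample dates are assumptions):

```python
from datetime import date, timedelta

def date_sequence(start: date, end: date) -> list:
    """Rough analogue of Spark SQL's
    explode(sequence(start, end, interval 1 day)): one date per row."""
    return [start + timedelta(days=i) for i in range((end - start).days + 1)]

seq = date_sequence(date(2024, 5, 1), date(2024, 5, 3))
print([d.isoformat() for d in seq])
# ['2024-05-01', '2024-05-02', '2024-05-03']
```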
Add a Parameter Output operator and output the dates in the generated date sequence as parameter values, so that a Loop Container node can later loop through them, as shown in the following figure.
Add a Loop Container node and drag in a Data Transformation node, as shown in the following figure.
Click Data Transformation to enter the Data Transformation node and drag in an API Input operator. Enter the API address and, in Body, use the date sequence parameter ${loop_container} output in the "Creating a Date Sequence" section, so that data is fetched from the API sequentially on a daily basis, as shown in the following figure.
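For example, the Body might reference the parameter like this. This is a hypothetical request body: the exact field names depend on the target API, though dateFrom/dateTo here match the request-parameter description earlier in this document.

```json
{
  "dateFrom": "${loop_container}",
  "dateTo": "${loop_container}"
}
```

On each iteration, the Loop Container replaces ${loop_container} with the current date, so both fields refer to the same day.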
After parsing and processing the data, use a DB Table Output operator to output it to the specified database, as shown in the following figure.
Click Save, click the Loop Container node, and tick the date sequence parameter. The Loop Container node will sequentially loop through its values on a daily basis and pass the values to the API Input operator for data fetching, as shown in the following figure.
Add and configure a Notification node to notify specified users. You can use the date sequence parameter to pass the current loop date to the notification content, as shown in the following figure.
Click Run to run the task. The data fetched from the API on a daily basis according to the date sequence is displayed. Meanwhile, the specified users are notified of the result of each loop, as shown in the following figure.