# Workflows

The *Workflows* menu allows you to schedule various tasks and execute them at a specific date and time. This enables different Extractors and Transformers to be chained together and run in sequence.

## Creating a Workflow

To create a new Workflow, go to '*Data Pipelines*' > '*Workflows*', then click the green button labeled '*Create Workflow*':

![Creating a Workflow](https://3420689551-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LHEKskLK6aXinV75Knl%2F-Lyn4cvujEe7nfPU6k2b%2F-Lyn5h-9Dfqkf7Hjh_QQ%2Fimage.png?alt=media\&token=ad9565b3-4661-4e0c-9feb-70d8fead14a2)

* Provide a meaningful **Name** for the Workflow
* Optionally, provide a detailed **Description**
* The **Start date** of the Workflow, when it will run the first time
* The interval: **Hourly**, **Daily** or **Monthly**
* At what time should the Workflow start
* Provide an interval value: e.g. enter **2** to run every second hour / day / month, depending on your configured interval
* **Add** a new step in your Workflow
* Set the **Type** of the step using the drop down menu
* Provide the option that goes with the selected **Type**. This can be the name of your Extractor, Transformer, Report or another item
* Depending on the selected step **Type**, you can provide a date offset. This value is applied during execution of that step. Typically this is used as a **From** date offset (e.g. **-1** for yesterday)
* A **To** date offset can be provided for some step types (e.g. **0** for today)
* In case multiple steps need to be executed in parallel, you may choose to clear the **Wait** checkbox
* Additional arguments can be sent to the step. This applies only to some **Extractors** and when executing a custom **Command**
* You can delete a step using the red minus button
* To view historical Workflow results, click the **Status** tab
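To make the date-offset behaviour concrete, here is a minimal sketch of how a **From** offset of **-1** and a **To** offset of **0** could resolve at run time. This assumes the offsets are applied relative to the execution date and a *yyyyMMdd* date format (GNU `date` syntax is used here purely for illustration; the actual resolution is handled internally by Exivity):

```shell
# Illustrative only: resolve day offsets relative to the execution date
FROM_DATE="$(date -d '-1 day' +%Y%m%d)"   # From offset -1 -> yesterday
TO_DATE="$(date -d '0 day' +%Y%m%d)"      # To offset 0 -> today
echo "From: $FROM_DATE  To: $TO_DATE"
```

Running a Workflow daily with these offsets would therefore always process yesterday's data.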

### Special Workflow Steps

Apart from adding *Extractor*, *Transformer* and *Report* steps, there are two additional *Workflow* step types:

* Core
* Execute Command

#### Core

The *Core* step allows you to run a few predefined API calls. Currently these are the following:

![Core API calls using workflow step](https://3420689551-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LHEKskLK6aXinV75Knl%2F-LNPqyyO-B4xsVN2yGxj%2F-LNPrWL8Oh3tUQA67IZK%2Fimage.png?alt=media\&token=d72aaa5c-fe37-4707-ae30-a663f9e26e6d)

* *Run garbage collector*
  * Cleans up the server getPrices cache table and Redis cache
* *Purge cache*
  * Unprepares any prepared reports. Use with caution
* *Refresh budgets*
  * Evaluates all configured budgets
* *Send heartbeat*
  * Sends an API heartbeat request. Reserved for future use

{% hint style="danger" %}
**Purge cache** should be used with caution, as it will unprepare **all** available reports. This means that none of your reports will return any data until you have prepared them again.
{% endhint %}

#### Execute Command

The *Execute Command* step enables you to execute an external command, such as a script:

![Call external commands or scripts](https://3420689551-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LHEKskLK6aXinV75Knl%2F-Lhtxxm9bITzQXOC661o%2F-LNPtR2AV9DTUYWeRoNY%2Fimage.png?generation=1561119429777641\&alt=media)

As an example, you could run a PowerShell script to obtain data from a special data source that Exivity Extractors are not able or allowed to connect to. This script could be executed in the following manner:

```
powershell.exe "D:\script\special.ps1" ((get-date).addDays(-1)).ToString("""yyyyMMdd""")
```

The above command calls the PowerShell executable to run the *special.ps1* script, with a dynamically generated parameter that is evaluated at run time. This particular example always provides yesterday's date in *yyyyMMdd* format as a parameter to the *special.ps1* script. Many other variations and scripting languages are possible. Feel free to experiment.
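On a Linux server, the same pattern could be reproduced with a shell script. The sketch below is illustrative only: the script path */opt/scripts/special.sh* is hypothetical, and GNU `date` syntax is assumed:

```shell
# Build yesterday's date in yyyyMMdd format at run time (GNU date assumed)
YESTERDAY="$(date -d '-1 day' +%Y%m%d)"
echo "$YESTERDAY"

# An Execute Command step could then pass it to a script, e.g.:
# /bin/bash /opt/scripts/special.sh "$YESTERDAY"
```

The key idea is identical to the PowerShell example: the argument is computed each time the Workflow step runs, so the script always receives the current date minus one day.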

