Deployments

Configuring Deployments in Daana

Once you've completed your data mappings and model configurations, it's time to deploy and execute your workflows. The Configure Deployment screen controls how Daana pushes data pipelines to your target environment and lets you set up scheduled or manual data loads.

1. Installation: Deploy Workflow

The Deploy Workflow button pushes all metadata (from your model and mappings) into your target environment. This step is necessary after any changes to your model or mappings. Daana will automatically detect the changes and update the necessary parts of the deployment. Think of this as the "installation" step that prepares your pipelines for execution.

2. Execution: Manual and Automated

In the Execute section, you can either trigger a manual load of your data into the target warehouse or configure automated executions.

  • Manual Execution: Pressing the Execute Workflow button will load new data from the source into the target warehouse, making it immediately queryable.

  • Automation: If you want to schedule regular data loads, you can use the Automation feature. Enabling this will give you two options:

    • UI Scheduler: Set up the schedule through a user-friendly interface by specifying either an exact time or a frequency (e.g., run every X hours, or at a specific time on selected days).

    • Cron Tab: For advanced users who prefer finer control over scheduling, you can enter a cron expression directly.
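The exact cron dialect Daana accepts isn't documented here, but most schedulers use the standard five-field crontab format (minute, hour, day-of-month, month, day-of-week). A few illustrative expressions:

```cron
# Every day at 02:00
0 2 * * *

# Every 6 hours, on the hour
0 */6 * * *

# 08:30 on weekdays (Mon-Fri)
30 8 * * 1-5
```

If Daana's cron field behaves differently (e.g., supports seconds or named schedules like @daily), follow the hints shown in the UI.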

3. Advanced Batch Config

For cases where you need more specific control over how batches of data are processed, you can configure the Advanced Batch Config section. This is particularly useful when dealing with data lakes where batch IDs or load timestamps (e.g., load_ts) are used to signify when data was loaded into the environment.

Batch Column

The Batch Column field defines how the data is grouped for batch processing. For example, in a Sales table, the Order Date could be used as a batch column to group sales transactions by date for daily processing.
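Daana's internal batching mechanics aren't documented here, but the idea of a batch column can be sketched in plain Python: rows that share the same value in the batch column (a hypothetical `order_date` field in this sketch) form one processing batch.

```python
from collections import defaultdict

# Hypothetical sales rows; "order_date" plays the role of the batch column.
rows = [
    {"order_id": 1, "order_date": "2024-05-01", "amount": 120.0},
    {"order_id": 2, "order_date": "2024-05-01", "amount": 75.5},
    {"order_id": 3, "order_date": "2024-05-02", "amount": 200.0},
]

def group_into_batches(rows, batch_column):
    """Group rows by the value of the batch column, one batch per distinct value."""
    batches = defaultdict(list)
    for row in rows:
        batches[row[batch_column]].append(row)
    return dict(batches)

batches = group_into_batches(rows, "order_date")
# Two batches here: one per distinct order_date, ready for daily processing.
```

In a data-lake setting, a load timestamp column such as `load_ts` would be used the same way, so each batch corresponds to one load of data into the environment.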

Read Logic

The Read Logic field allows you to define how Daana reads data from your source systems. This logic specifies the filtering, transformation, and retrieval methods that Daana should use when loading data into your target environment. An example might be querying a database to extract data for a specific time period (e.g., sales data for the last month).
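Daana's read-logic syntax isn't specified here, but in many pipeline tools it amounts to a SQL-style predicate such as `WHERE order_date >= CURRENT_DATE - INTERVAL '30' DAY`. A minimal Python sketch of the same idea, using hypothetical row and column names:

```python
from datetime import date, timedelta

def read_last_n_days(rows, date_column, n, today=None):
    """Keep only rows whose date column falls within the last n days."""
    cutoff = (today or date.today()) - timedelta(days=n)
    return [r for r in rows if r[date_column] >= cutoff]

rows = [
    {"order_id": 1, "order_date": date(2024, 5, 1)},
    {"order_id": 2, "order_date": date(2024, 4, 1)},
]

# With "today" pinned to 2024-05-10, only the May order is within the last 30 days.
recent = read_last_n_days(rows, "order_date", 30, today=date(2024, 5, 10))
```

The key design point is that the read logic lives with the deployment configuration rather than in the target warehouse, so changing the extraction window doesn't require touching the model or mappings.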


By configuring deployment in Daana, you ensure that your data pipelines are fully integrated into your environment and set up for both manual and scheduled execution. This gives you flexibility in how and when data is processed, whether you need immediate results or regular batch updates.