A pipeline in Nekt is a data workflow that moves data through the platform:
- From the source to the Catalog, when a source extraction happens
- From the Catalog back to the Catalog going through a transformation, when a transformation is executed
- From the Catalog to a Destination, when we're loading data into an external tool
In other words, pipelines are the structure that handles data movement and ensures it happens in an orchestrated, monitored way.
Configuring pipelines
Every time you create a source, a transformation, or a destination, the platform will guide you through configuring its pipeline's execution - when it will run, how the data will be synced, and so on. You don't need to worry about it.
As you grow your workspace, you’ll probably start orchestrating pipelines. For example:
When the source X extraction finishes, I want to trigger the transformation Y, and once its execution is done, send the transformed data to the destination Z.
This is the orchestration of source, transformation, and destination pipelines, meaning their executions will be connected.
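The chained execution described above can be sketched as a simple dependency chain: each step runs only after the previous one finishes successfully. This is an illustrative Python sketch, not Nekt's actual API; the pipeline names (source X, transformation Y, destination Z) and the step functions are assumptions taken from the example.

```python
# Illustrative sketch of orchestrated pipelines (hypothetical, not Nekt's API):
# each step triggers the next only after it completes successfully.

def extract_source_x():
    print("Extracting from source X into the Catalog")
    return True  # success

def transform_y():
    print("Running transformation Y on Catalog data")
    return True

def load_destination_z():
    print("Loading transformed data into destination Z")
    return True

def run_orchestrated(steps):
    """Run steps in order; stop the chain as soon as one fails."""
    for step in steps:
        if not step():
            print(f"{step.__name__} failed; downstream steps skipped")
            return False
    return True

run_orchestrated([extract_source_x, transform_y, load_destination_z])
```

The key property of orchestration is captured in `run_orchestrated`: downstream pipelines never run against stale or missing data, because a failure anywhere in the chain stops everything after it.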
Controlling pipelines
Pipelines are controlled by triggers (scheduled, event-based, or manual) that determine when they will be executed.
Every time a pipeline is executed, it creates a Run. You can then monitor the success of a run through its status and logs.
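The relationship between triggers, pipelines, and runs can be modeled roughly as below. This is a minimal illustrative sketch, assuming a simple `Run` record with a status and a log list; the class and field names are hypothetical and do not reflect Nekt's internal data model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical model of a pipeline Run (not Nekt's actual data model):
# each execution produces a Run whose status and logs can be monitored.

@dataclass
class Run:
    pipeline: str
    trigger: str                      # "scheduled", "event-based", or "manual"
    status: str = "running"
    logs: list = field(default_factory=list)

    def log(self, message):
        """Append a timestamped log line to this run."""
        stamp = datetime.now(timezone.utc).isoformat()
        self.logs.append(f"{stamp} {message}")

    def finish(self, success):
        """Mark the run as finished with a terminal status."""
        self.status = "success" if success else "failed"

run = Run(pipeline="source-x-extraction", trigger="scheduled")
run.log("extraction started")
run.finish(success=True)
print(run.status)
```

One run per execution means that monitoring and retrying are scoped to a single, well-defined unit of work rather than to the pipeline definition itself.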