Monitoring and alerting parameters

Learn how to add monitoring and alerting information to your data operation configurations.

The monitoring and alerting parameters are defined in a JSON object added at the root level of the JSON data operation configuration. It contains the following sections:

  • Monitoring parameters: General information about the criticality of the data operation.

  • Alerting parameters: One or several alert messages and channels, describing how to alert the right people and through which system or application.

Currently, you can only send one alert message, and only by email. We are considering adding more alert systems, such as PagerDuty, Datadog, or a generic webhook. Feel free to suggest your preferred alerting platform.

Example

Here is an example in a TTT configuration file:

{
	"configuration_type": "table-to-table",
	"configuration_id": "000001_append_some_data",
	"short_description": "Append some data to a partitioned table",
	"account": "000099",
	"environment": "DEV",
	"activated": true,
	"archived": false,
	"start_date": "2023, 1, 23",
	"schedule_interval": "*/5 * * * *",
	"max_active_runs": 1,
	"task_concurrency": 3,
	"default_gcp_project_id": "my-project",
	"default_bq_dataset": "my_dataset",
	"default_write_disposition": "WRITE_TRUNCATE",
	"direct_execution": true,
	"task_dependencies": [
		"create_my_data_table >> merge_table_with_last_data"
	],
	"workflow": [
		{
			"task_type": "create_gbq_table",
			"id": "create_my_data_table",
			"short_description": "Create the destination table with partitioning on date and clustering",
			"bq_table": "my_data",
			"ddl_file": "my_data.json",
			"force_delete": false
		},
		{
			"task_type": "run_gbq_script",
			"id": "merge_table_with_last_data",
			"sql_file": "merge_table_with_last_data.sql"
		}
	],
  "monitoring": { 
    "impact": 2, 
    "urgency": 2, 
    "alert_enabled": true,
    "alert_status": ["FAILED","NO_MATCH"], 
    "alert_environment": ["PROD","DEV"],
    "alert_info": "Put here information about the alert", 
    "alert": { 
      "email" :  { 
          "email_from": "alert@brand.com", 
          "email_to": "toto@tailer.ia;titi@tailer.ia", 
          "email_reply_to": "support@brand.com", 
          "email_subject": "Data Operation Alert : @configuration_id has just failed", 
          "email_body_type": "txt",
          "email_body": "Type : @configuration_type\nID : @configuration_id\nImpact : @impact\nEnvironnement : @environnement"
        }
      }
  }
}

Global monitoring parameters

General parameters about the monitoring.

Parameter
Description

impact

type: integer

optional

'Impact' is an ITIL measure of the extent of the incident and of the potential damage it may cause before it can be resolved.

If not specified, the default value will be 2.

urgency

type: integer

optional

'Urgency' is an ITIL measure of how quickly a resolution of the incident is required.

If not specified, the default value will be 2.

alert_enabled

type: boolean

mandatory

Flag used to enable/disable the execution of the alerting (i.e. send an alert to a recipient when the run failed)

alert_status (beta)

type: array

optional

Specifies the run statuses that will trigger the alert.

Possible values: FAILED, SUCCESS, NO_MATCH, CHECKED. Default value: "FAILED"

alert_environment

type: array

mandatory

Specifies the environments that will trigger the alert.

Possible values: PROD, PREPROD, STAGING, DEV.

alert_info

type: string

optional

Short information describing the alert. You can refer to it as a variable in your alert message (e.g. in an email) with the @alert_info variable.

alert

type: array of maps

optional

List of alert messages the data operation will trigger if it fails.

Check the section below for detailed information on their parameters.
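For instance, a minimal monitoring block that only sets the mandatory parameters and relies on the defaults (impact and urgency default to 2, and alerts trigger on FAILED runs) could look like this; the recipient address is just an example:

```json
"monitoring": {
  "alert_enabled": true,
  "alert_environment": ["PROD"],
  "alert": {
    "email": {
      "email_to": "data-team@example.com"
    }
  }
}
```

Since no subject or body is specified, Tailer's default email template is used.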

Alert parameters

An alert can trigger different types of messages. Currently, only an email alert can be sent.

For each alert message, parameters will differ depending on the message type.

Email alert

An email alert sends an email with the specified parameters each time the data operation fails in the specified environments.

By default, Tailer provides an email template with detailed information, so you don't have to fill in all the parameters (subject, body, etc.). But you can also personalize all the parameters to define precisely how the alert email should look. For that, you can use the alert variables described below.

Parameter
Description

email_from

type: string

optional

Surcharge the "email from" attribute.

Default value: "no_reply@tailer.ai"

email_to

type: string

mandatory

List of email recipients.

You can specify more than one recipient by separating the email addresses with a semicolon (;).

Example: "steve@apple.com;support@amazon.com"

email_reply_to

type: string

optional

Surcharge the "email reply to" attribute.

Default value: "no_reply@tailer.ai"

email_subject

type: string

optional

Subject of the triggered alert email.

Default value: "TAILER RUN ALERT: @jobid FAILED at @execution_date"

You can personalize the subject with the alert variables described below.

email_body_type

type: string

optional

Format type of the email body.

Default value: "html"

Other possible value: "txt"

email_body

type: string

optional

Body of the alert email.

Default value: "See attached html file below"

You can personalize the body with the alert variables described below. Be careful to use the right body format according to the email_body_type parameter.
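For example, an email alert fully personalized with an HTML body might look like the following (addresses and wording are illustrative):

```json
"email": {
  "email_from": "alert@brand.com",
  "email_to": "steve@apple.com;support@amazon.com",
  "email_reply_to": "support@brand.com",
  "email_subject": "TAILER RUN ALERT: @configuration_id is @status",
  "email_body_type": "html",
  "email_body": "<p>Run <b>@run_id</b> of <b>@configuration_id</b> ended with status <b>@status</b> at @execution_date.</p><p>@alert_info</p>"
}
```

Because email_body_type is "html", the body is written as HTML markup; with "txt", it would be plain text using \n line breaks instead.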

Alert message variables

Alert messages can be personalized with alert message variables. These variables are replaced with their actual values during the data operation run.

Variable
Description

@url_to_tailer_studio_run_id

Tailer Studio URL for the run

@url_to_tailer_studio_conf_id

Tailer Studio URL for the run's configuration

@account

Account of the data operation's run

@environment

Environment of the data operation's run

@status

Status of the data operation's run

@configuration_id

Configuration ID of the data operation

@configuration_type

Configuration type of the data operation's run

@execution_date

Execution date of the data operation's run

@short_description

Short description of the data operation

@job_id

Job ID of the data operation

@run_id

Run ID of the data operation's run

@updated_by

Email address of the user who last updated the configuration

@update_date

Date and time of the last configuration update

@impact

Monitoring impact level parameter

@urgency

Monitoring urgency level parameter

@alert_info

Alert info set in the monitoring parameter
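As an illustration, a subject template combining several of these variables:

```json
"email_subject": "[@environment] @configuration_type @configuration_id: @status"
```

With the example configuration above, this could render as something like "[DEV] table-to-table 000001_append_some_data: FAILED".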
