event_to_usage

Overview

The event_to_usage statement generates a new CSV file with usage records from START/STOP/UPDATE events in a source DSET.

Syntax

event_to_usage from source.alias to file { options }

See the Details section for a description of the options.

Details

This statement produces usage records from events in the source DSET. Three types of events are supported:

  • START event - marks the start of consumption

  • STOP event - marks the end of consumption

  • UPDATE event - marks a change in consumption attributes, such as quantity

A usage record is created in each of the following situations:

  • from a START event to the first matching STOP event

  • from a START event to the first matching UPDATE event

  • from an UPDATE event to the first matching STOP event

  • from an UPDATE event to the first matching UPDATE event

  • from the beginning of the day to a STOP event (if consumption started on a previous day)

  • from the beginning of the day to an UPDATE event (if consumption started on a previous day)

  • from a START event to the end of the day (if no matching STOP/UPDATE event is found)

  • from an UPDATE event to the end of the day (if no matching STOP/UPDATE event is found)

  • for the whole day (if consumption started on a previous day and there was no STOP or UPDATE event during the processing day)

Event B is considered a match for event A if it happened after event A and has matching key fields.
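
As a hypothetical illustration (the column names and values are examples only, chosen to mirror the Examples section below), consider the following events for a single resource during one processing day:

time                 state    prev_state  resource_id  quantity
2020-01-09 09:00:00  STARTED  STOPPED     vm-1         2
2020-01-09 12:00:00  STARTED  STARTED     vm-1         4
2020-01-09 15:00:00  STOPPED  STARTED     vm-1         4

The second event is an UPDATE (state = 'STARTED' with prev_state = 'STARTED') that changes the quantity. This produces two usage records: one from 09:00 to 12:00 with quantity 2 (START to first matching UPDATE), and one from 12:00 to 15:00 with quantity 4 (UPDATE to first matching STOP).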

Options

Several options control the behaviour of this statement:

  • start_event (Mandatory) - SQL condition for START events (see note below)

  • stop_event (Mandatory) - SQL condition for STOP events (see note below)

  • update_event (Optional) - SQL condition for UPDATE events (see note below). If not specified, only START and STOP events are used

  • key_columns (Mandatory) - comma-separated list of columns used to match events. The values in all of these columns must match in order for two events to match

  • timestamp_column (Mandatory) - the column containing the event timestamp

  • epoch_date (Optional) - date in YYYYMMDD format on which the data was initially loaded (see Initial data load)

Event conditions are valid SQL conditions which are copied verbatim into the query's WHERE clause; they can therefore be any legal SQL expression, such as "state = 'started' AND (prev_state = 'stopped' OR prev_state IS NULL)". Remember to use SQL-standard single quotes for string literals.

Initial data load

If the epoch_date option is specified, it is possible to perform an initial data load, i.e. to load consumptions that are already running on a specific date. Transcript performs the following checks:

  • no events have been loaded yet for the specified DSET

  • the processing date matches the specified epoch_date

If either of these checks fails, Transcript stops with an error.
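
For example, an initial data load of consumptions that were already running on 9 January 2020 could look like the following minimal sketch (the DSET, file and column names are illustrative and mirror the Examples section below):

event_to_usage from usage.events to "exported/usage.csv" {
      start_event = "state = 'STARTED'"
      stop_event = "state = 'STOPPED'"
      key_columns = resource_id
      timestamp_column = time
      epoch_date = 20200109
}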

Data integrity checks

It is very important to load events in the correct order, so Transcript performs the following checks:

  • data for the same day cannot be loaded twice for the specified DSET

  • there can be no gaps in the processed dates (except for the initial data load)

  • there is a matching START event before each STOP/UPDATE event (this check is ignored in permissive mode)

If any of these checks fails, Transcript stops with an error.

If data for a specific day needs to be re-processed, the event-related state in the database must be rolled back to the preceding date, after which the data for all subsequent days must be processed again in the correct order.

Examples

Only START and STOP events, single-column key:

event_to_usage from usage.events to "exported/usage.csv" {
      start_event = "state = 'STARTED' AND (prev_state = 'STOPPED' OR prev_state IS NULL)"
      stop_event = "state = 'STOPPED' AND prev_state = 'STARTED'"
      key_columns = resource_id
      timestamp_column = time
      epoch_date = 20200109
}

START, STOP and UPDATE events, complex key:

event_to_usage from usage.events to "exported/usage.csv" {
      start_event = "state = 'STARTED' AND (prev_state = 'STOPPED' OR prev_state IS NULL)"
      update_event = "state = 'STARTED' AND prev_state = 'STARTED'"
      stop_event = "state = 'STOPPED' AND prev_state = 'STARTED'"
      key_columns = resource_type,resource_id
      timestamp_column = time
}
