Annotations, Rules, and Reports

Data review and anomaly detection through annotations, rules, and reports.

Annotations

Annotations are a core feature in Sift for data review and anomaly detection. They enable users to attach meaningful, time-bound information to a run, asset, or channel. An annotation captures user-generated insights or observations, facilitating analysis and collaboration.

The following image illustrates what a data-review annotation looks like in Sift:

Annotation Example

Annotations can be:

  • Assigned to team members
  • Associated with various states to track their progress during the review process
  • Created manually using the Sift UI or Sift API
  • Generated automatically through rules

Rules

Rules automate data analysis for live or historical data streams. Built using the Common Expression Language (CEL), rules enable the definition of logical conditions that operate on data from one or more channels.

Key features of Sift rules:

  • Built-in functions for defining complex logic.
  • Automated actions, including generating annotations or sending notifications, when conditions evaluate to true.
  • Versioning, to track rule changes over time and facilitate improvements.
  • Previews, for testing rules using existing run data before applying them.
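A sketch of how a rule and its preview fit together, in plain Python: a named condition is evaluated over existing samples, and the preview reports where it would fire. The channel names, thresholds, and rule name are invented for illustration, and a Python lambda stands in for an actual CEL expression (shown in a comment).

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Rule:
    """A named condition over channel values; fires when it evaluates true."""
    name: str
    # Stand-in for a CEL expression such as
    # "temperature > 100.0 && pressure < 50.0":
    condition: Callable[[dict[str, float]], bool]


def preview(rule: Rule, samples: list[dict[str, float]]) -> list[int]:
    """Return the sample indices where the rule would fire: a dry run
    over existing data, analogous to previewing a rule against a run."""
    return [i for i, s in enumerate(samples) if rule.condition(s)]


overheat = Rule(
    name="overheat-while-depressurized",
    condition=lambda s: s["temperature"] > 100.0 and s["pressure"] < 50.0,
)

run_data = [
    {"temperature": 95.0, "pressure": 60.0},
    {"temperature": 105.0, "pressure": 45.0},  # fires
    {"temperature": 110.0, "pressure": 55.0},
]
print(preview(overheat, run_data))  # [1]
```

In Sift itself, a firing rule can then trigger the automated actions listed above, such as generating an annotation over the matching time window.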

Rules can be created and managed through the Sift UI or the Sift API. The following image demonstrates rule creation and previewing in the Sift UI:

Rule Example

Reports

Reports provide a consolidated overview of multiple rules configured for an asset, facilitating efficient data review. Reports include:

  • A summary of all configured rules for a given run.
  • Annotations generated by these rules.
  • The originating rule for each annotation.
  • The current status of annotations.
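The consolidation a report performs can be sketched as a simple group-by: annotations are bucketed by their originating rule, with a status count per rule. The rule names and statuses below are hypothetical sample data, not output from Sift.

```python
from collections import Counter

# Hypothetical rule-generated annotations for one run: each record carries
# the originating rule and the annotation's current review status.
annotations = [
    {"rule": "overheat", "status": "open"},
    {"rule": "overheat", "status": "resolved"},
    {"rule": "low-voltage", "status": "open"},
]


def summarize(records: list[dict[str, str]]) -> dict[str, Counter]:
    """Group annotation statuses by originating rule, mirroring the
    per-rule breakdown a report presents."""
    report: dict[str, Counter] = {}
    for r in records:
        report.setdefault(r["rule"], Counter())[r["status"]] += 1
    return report


print(summarize(annotations))
```

Grouping by originating rule is what lets a reviewer spot, at a glance, which rule is producing the most open annotations for a run.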

This consolidated view makes it easier to identify trends and anomalies across your data. The following image illustrates a report in the Sift UI:

Report Example
