
Data Review Workflow

Overview of the data review workflow and its main steps.

Overview

The Data Review Workflow outlines the complete process for creating, managing, and evaluating rules for data analysis and validation. This system enables users to:

  1. Create and Configure Rules
  • Set rule applicability by specifying assets
  • Define input channels and expressions for analysis
  • Preview rule behavior using historical data
  • Add contextual channels for better insight
  • Configure annotation settings for violations
  2. Organize Rules into Templates
  • Create templates with descriptive names and tags
  • Add relevant rules to templates
  • Save and manage template changes
  3. Generate and Review Reports
  • Generate reports for specific runs using selected templates
  • Monitor report execution status
  • Review individual rule evaluation results
  4. Manage Results and Violations
  • Assign reviewers to annotations
  • Add comments and tag specific users
  • Receive notifications for assignments
  • Document comprehensive violation dispositions
  • Update rules to address false positives

This workflow provides a systematic approach to data validation, review, and quality assurance, with built-in features for collaboration and continuous improvement.

Create a Rule

  1. Specify rule applicability

    1. Specify Assets: Select which Assets should have the rule applied

    2. Specify Asset tags: If an Asset has one of the specified tags, it will also have the rule applied

      1. To add an Asset tag:

        1. Search for the Asset you want to add a tag to and open it in Explore

        2. Navigate to the Asset Details tab

        3. Click the Edit icon and add tags

  2. Select required input channels

    1. The Input Channels search allows users to search for and select any of the channels on any of the applicable Assets identified in Step 1.
  3. Write rule expression

    1. The Expression Syntax sidebar on the right of the page contains the available functions

      1. For complete documentation and examples of each function, see the Calculated Channels support doc.
    2. If the Input Channels aren't found for all applicable Assets, a warning message will appear below the Expression field

      1. This will NOT block updating the Rule, allowing for cases where an Asset has not yet seen an instance of the channel but will in the future.
  4. Preview Rule behavior: Rules can be previewed on historical Runs of telemetry from applicable Assets.

    1. When you click the '+' button to select Runs, all Runs from the Rule's specified Assets are available to filter and select.

    2. Select one or more and click “Preview Runs”

    3. Users can navigate to the respective Runs in the Explore view to verify the behavior is as expected.

    4. If the Input Channels are not all found for a given Run, a warning icon will be displayed on the preview.

  5. Add contextual channels for additional insight into Rule failures

    1. Contextual channels allow Rule editors to provide insight to reviewers during the data review process.
    2. Users can select either channels or Views to add for context.
  6. Configure annotation settings

    1. Select either Data Review or Phase annotation
    2. Specify whether there should be a user assigned to the generated Annotations by default.
  7. Save your changes; a log of which fields changed from the previous version (if any) is autogenerated, and you can optionally add Version Notes.
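The rule-expression semantics from steps 2–4 can be sketched as follows. This is a simplified, illustrative stand-in, not Sift's evaluator: the `$1`/`$2` channel-reference style reflects the Expression Syntax sidebar, but the `evaluate_rule` helper and its per-sample evaluation are assumptions made for this sketch.

```python
def evaluate_rule(expression, channels):
    """Return the sample indices where the rule expression fires.

    `channels` maps references like "$1" to equal-length lists of
    samples; the expression is evaluated once per sample.
    """
    refs = sorted(channels)          # e.g. ["$1", "$2"]
    n = len(channels[refs[0]])
    violations = []
    for i in range(n):
        # Substitute each channel reference with its sample value.
        expr = expression
        for ref in refs:
            expr = expr.replace(ref, repr(channels[ref][i]))
        if eval(expr):  # acceptable for a sketch; never eval untrusted input
            violations.append(i)
    return violations

# Example: temperature exceeds 80 while the motor-on flag is set.
samples = {
    "$1": [75.0, 82.5, 90.1, 78.0],  # temperature channel
    "$2": [1, 1, 0, 1],              # motor-on flag channel
}
print(evaluate_rule("$1 > 80 and $2 == 1", samples))  # → [1]
```

Previewing a Rule on historical Runs (step 4) is conceptually this evaluation applied to each selected Run's telemetry.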

Add Rule to Template

  1. Create a new template with a descriptive name

  2. Add relevant tags and description

  3. Click “+ Edit Rules” to search for Rules to add to the Template

  4. Select appropriate rules

  5. Save template changes
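The steps above can be modeled roughly as a named, tagged collection of rules. The `Rule` and `ReportTemplate` classes below are hypothetical illustrations of that structure, not the Sift API:

```python
from dataclasses import dataclass, field

@dataclass
class Rule:
    name: str
    expression: str

@dataclass
class ReportTemplate:
    name: str                                  # descriptive name (step 1)
    description: str = ""                      # step 2
    tags: list = field(default_factory=list)   # step 2
    rules: list = field(default_factory=list)  # steps 3-4

    def add_rule(self, rule: Rule) -> None:
        self.rules.append(rule)

template = ReportTemplate(
    name="Thermal checks",
    description="Overtemperature rules for motor assets",
    tags=["thermal", "motor"],
)
template.add_rule(Rule("overtemp", "$1 > 80"))
print(len(template.rules))  # → 1
```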

Generate a Report

  1. Click "Create Report" button

  2. Select the target run for analysis, then click “Select Report Template”

  3. Choose a report template and start report generation

  4. Monitor report execution status on the Reports for Runs page

  5. Review individual rule evaluation statuses in the report
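Monitoring report execution (step 4) amounts to polling until the rule evaluations finish. A minimal sketch, assuming a hypothetical `get_status` callable and a `RUNNING`/`FINISHED` status vocabulary (both assumptions, not Sift's actual states):

```python
import time

def wait_for_report(get_status, poll_interval=0.0, timeout=10.0):
    """Poll until the report leaves the RUNNING state, then return its status."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status != "RUNNING":
            return status
        time.sleep(poll_interval)
    raise TimeoutError("report did not finish in time")

# Simulated status sequence for the example.
statuses = iter(["RUNNING", "RUNNING", "FINISHED"])
print(wait_for_report(lambda: next(statuses)))  # → FINISHED
```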

Assign Results

  1. Navigate to Annotations tab in left sidebar

  2. Select annotation for assignment

  3. Use right sidebar to assign reviewer and add comments

  4. Tag specific users with @<user email> in comments

  5. Tagged users will receive notifications via the top-right notifications icon
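The `@<user email>` mention syntax from step 4 can be extracted from comment text with a simple pattern. The regex below is our own illustration of that syntax, not Sift's implementation:

```python
import re

# Matches "@" followed by an email-shaped token, capturing the email.
MENTION = re.compile(r"@([\w.+-]+@[\w-]+\.[\w.-]+)")

def mentioned_users(comment: str) -> list:
    """Return the email addresses tagged in a comment."""
    return MENTION.findall(comment)

comment = "Please review this spike, @jane.doe@example.com"
print(mentioned_users(comment))  # → ['jane.doe@example.com']
```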

Disposition Violations

  1. Open relevant annotation

  2. Add a comprehensive comment that includes:

    • Root cause analysis
    • Assessment of behavior acceptability
    • Links to related issues or tickets

For detailed guidance on dispositioning annotations, refer to the Annotations documentation.

Update Rule

If investigation reveals a false positive:

  1. Modify the rule to account for the identified edge case
  2. This prevents similar issues in future evaluations
