Overview
Sift provides official client libraries for Python, Rust, and Go to simplify integration with its APIs. In this tutorial, you will use the Sift Python client library to stream telemetry from a Python script into Sift. You will run a simple example that simulates a robotic vehicle reporting both velocity and temperature in real time, and see how that telemetry appears in Sift. By the end of this tutorial, you will understand how to install the client library, configure authentication, define telemetry signals, and stream time series data into Sift using Python.
Prerequisites
- Basic understanding of how Assets, Channels, and Runs relate to each other in Sift
- Working knowledge of Python and basic familiarity with async/await
- Familiarity with Explore v2 (Beta)
- A Sift API key and access to your Sift gRPC and REST URLs
Scenario
In this tutorial, you will simulate a simple system that produces telemetry. Imagine a robotic vehicle reporting both its current speed and its internal temperature every half second. Instead of connecting to real hardware, the script generates mock velocity and temperature values and streams them to Sift in real time. Each time you run the script, it creates a new Asset and starts a new session for collecting telemetry. The script sends velocity and temperature measurements every 0.5 seconds. You will view this telemetry in Sift and see how multiple Channels are grouped within a single Run and, more importantly, learn how to structure a Python script using the Sift Python client library to stream telemetry into Sift.
Step 1: Install the Sift client library
Before we install anything, let’s create a clean Python environment for this tutorial. This helps avoid dependency conflicts and keeps your setup isolated. In your terminal, create and activate a virtual environment, then install the Sift Python client library with streaming support along with python-dotenv, which we will use to securely load authentication values from a .env file.
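For example, on macOS or Linux the setup might look like this (on Windows, the activation command differs slightly):

```bash
# Create and activate an isolated environment for this tutorial
python3 -m venv .venv
source .venv/bin/activate

# Install the Sift client library with streaming support, plus python-dotenv
pip install "sift-stack-py[sift-stream]" python-dotenv
```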
We install sift-stack-py[sift-stream] instead of just sift-stack-py because streaming ingestion requires additional optional dependencies.
- The sift-stream extra installs the gRPC streaming components needed to open and maintain a real-time ingestion stream.
- If you install only sift-stack-py, the core client will work for non-ingestion APIs, but streaming ingestion will not be available. Using pip install "sift-stack-py[sift-stream]" ensures that streaming support is included.
Step 2: Configure authentication
Now that your environment is ready, we need to tell the script how to connect to your Sift environment. The Sift Python client library requires three values: a Sift API key, the Sift gRPC URL, and the Sift REST URL. Instead of placing these directly in the script, we will store them in a .env file. This keeps credentials secure and makes the script easier to reuse.
Create a new file in the same directory as your script and name it .env.
In the .env file, add the following values, replacing them with the ones from your Sift environment.
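For example, assuming the script reads the variable names shown below (the names are illustrative; use whatever names your script expects), the .env file might look like this:

```
# .env — placeholder values; replace with the API key and URLs from your Sift environment
SIFT_API_KEY=your-api-key
SIFT_GRPC_URL=your-sift-grpc-url
SIFT_REST_URL=your-sift-rest-url
```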
Step 3: Create a Python script to stream telemetry using streaming ingestion
Now that your environment is ready and authentication is configured, let’s create a script that sends telemetry to Sift. This script will simulate a system that reports velocity and temperature measurements every 0.5 seconds and streams them to Sift indefinitely (until you stop the script with Ctrl + C). Create a file named stream.py in your project directory and paste the following content into it:
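The listing below is a structural sketch rather than a verbatim, tested script: the class names (SiftConnectionConfig, SiftClient, FlowConfig, ChannelConfig, IngestionConfigCreate, RunCreate, as_flow) come from this tutorial, but the import paths, constructor parameters, and streaming method calls are assumptions for illustration and may need to be adjusted to match the current sift-stack-py API. The environment variable names match the .env example from Step 2.

```python
# stream.py — structural sketch only.
# Import paths, parameter names, and streaming calls below are assumptions and
# may differ from the actual sift-stack-py API; adjust to match your installed version.
import asyncio
import os
import random
from datetime import datetime, timezone

from dotenv import load_dotenv

# Assumed import locations for the classes referenced in this tutorial.
from sift_client import SiftClient, SiftConnectionConfig            # assumption
from sift_client.ingestion import (                                  # assumption
    ChannelConfig,
    FlowConfig,
    IngestionConfigCreate,
    RunCreate,
)


async def main() -> None:
    # Load SIFT_API_KEY, SIFT_GRPC_URL, and SIFT_REST_URL from the .env file.
    load_dotenv()

    # Configure authentication and create the client (parameter names assumed).
    connection = SiftConnectionConfig(
        api_key=os.environ["SIFT_API_KEY"],
        grpc_url=os.environ["SIFT_GRPC_URL"],
        rest_url=os.environ["SIFT_REST_URL"],
    )
    client = SiftClient(connection)

    # Define the telemetry schema: one flow containing two channels.
    flow_config = FlowConfig(
        name="vehicle_state",
        channels=[
            ChannelConfig(name="velocity", unit="m/s", data_type="double"),
            ChannelConfig(name="temperature", unit="C", data_type="double"),
        ],
    )

    # Create the ingestion context: tie the schema to an Asset and group data under a Run.
    ingestion_config = IngestionConfigCreate(asset_name="robot_vehicle", flows=[flow_config])
    run = RunCreate(name=f"robot_vehicle_{datetime.now(timezone.utc):%Y%m%d_%H%M%S}_run")

    # Open a streaming ingestion session and send a timestamped flow every 0.5 seconds
    # (method names assumed).
    async with client.open_ingestion_stream(ingestion_config, run) as stream:
        print(f"Streaming telemetry for run {run.name} — press Ctrl + C to stop")
        while True:
            timestamp = datetime.now(timezone.utc)
            velocity = 5.0 + random.uniform(-0.5, 0.5)        # mock velocity in m/s
            temperature = 35.0 + random.uniform(-1.0, 1.0)    # mock temperature in °C

            # as_flow() builds a flow whose values match the schema defined above
            # (argument names assumed).
            flow = flow_config.as_flow(
                timestamp=timestamp,
                velocity=velocity,
                temperature=temperature,
            )
            await stream.send(flow)                            # method name assumed
            print(f"{timestamp.isoformat()} velocity={velocity:.2f} temperature={temperature:.2f}")
            await asyncio.sleep(0.5)


if __name__ == "__main__":
    asyncio.run(main())
```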
Step 4: Run the script and view streamed data in Sift
In this step, you will execute the Python script from your terminal and then open Sift to observe the newly created Asset, Run, and Channels. As the script executes, you should see velocity and temperature measurements updating live in the Sift interface (Explore).
- In your virtual environment, execute the script with:
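```bash
# From the project directory, with your virtual environment active.
# Use python3 instead of python if that is how Python is invoked on your system.
python stream.py
```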
As the script executes, you should see output similar to the following, with a new line appearing every 0.5 seconds:
- In Sift, locate the Run name or description field and enter the exact Run name shown in the terminal (for example, robot_vehicle_…_…_run).
- In the Runs table, click the Run name (for example, robot_vehicle_…_…_run).
- Click Explore 2.
- Click Live.
- In the Channels tab, click the following Channels:
- temperature
- velocity
Step 5: Understand the ingestion workflow
The ingestion process follows a clear sequence: define the telemetry structure first, then send timestamped data that conforms to that structure. The steps below summarize how telemetry is structured, configured, and streamed to Sift.
Configure authentication
In this script, authentication is configured using SiftConnectionConfig, and a SiftClient is created to communicate with your Sift environment.
Define the telemetry schema
A FlowConfig defines the telemetry schema, and each ChannelConfig declares an individual signal with its name, unit, and data type.
Create the ingestion context
An IngestionConfigCreate associates that schema with an Asset, and a RunCreate defines the session that will group all incoming telemetry.
Open a streaming ingestion session and send timestamped flows
Once this ingestion context is established, the script opens a streaming ingestion session over gRPC and begins sending timestamped flows in real time. Each flow includes a timestamp and structured Channel values created using flow_config.as_flow(), ensuring the data matches the defined schema. These flows are transmitted over the open stream and immediately appear in Sift under the specified Run.
Conclusion
In this tutorial, you learned how to stream live telemetry to Sift using the Sift Python client library. You configured authentication, defined a telemetry schema using FlowConfig and ChannelConfig, created an Asset and Run, and opened a streaming ingestion session to send timestamped flows in real time. You also saw how streamed data immediately appears in Sift and how Channels are organized within a Run.
With this foundation, you can adapt the same ingestion pattern to real systems instead of simulated data. By defining clear flow schemas and using the streaming ingestion API correctly, you can integrate Sift into production pipelines and continuously stream structured time series telemetry into your environment.