
Event Frames: One of the Most Brilliant Ideas in Industrial Data — And Why They Matter Even More in the AI Era

Jeff Tao

March 5, 2026

Visualize and compare multiple events with a single click in TDengine

Over the years, one concept in the PI System has always impressed me: Event Frames. For those who work with industrial operational data, the idea is simple but powerful.

Instead of looking only at raw time-series signals, Event Frames convert continuous data streams into discrete operational events. An Event Frame has a start time, a stop time, a duration, associated attributes, and relationships with other events (parent/child). For example:

  1. A production shift may contain multiple batches.
  2. Each batch may contain multiple process phases.
  3. Each phase may generate alarms or abnormal events.

This structure turns raw sensor signals into operational context. And once you have events, many questions become easy to answer:

  1. How many downtime events occurred last week?
  2. Which batches consumed the most energy?
  3. How long did compressor surge events last?
  4. What happened before a machine failure?
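Once events exist as discrete records, questions like these reduce to counting and aggregating. A sketch over hypothetical downtime events (illustrative in-memory data, not a real TDengine query):

```python
from datetime import datetime, timedelta

# Hypothetical downtime events from last week: (start, stop) pairs
downtime = [
    (datetime(2026, 2, 23, 8, 0),  datetime(2026, 2, 23, 8, 45)),
    (datetime(2026, 2, 25, 14, 10), datetime(2026, 2, 25, 14, 22)),
    (datetime(2026, 2, 27, 3, 5),  datetime(2026, 2, 27, 4, 0)),
]

# "How many downtime events occurred last week?" -> a count
count = len(downtime)

# "How long did they last in total?" -> a sum of durations
total = sum((stop - start for start, stop in downtime), timedelta())

print(count, total)
```

Contrast this with answering the same questions from raw sensor signals, which would require re-detecting every downtime interval from scratch each time.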

In many ways, Event Frames convert signals into stories about operations.

The Challenge with Modern Data Infrastructure

Today, many companies are adopting modern streaming infrastructure such as Spark or Flink for real-time analytics. These systems are extremely powerful, but they were designed primarily for data engineers, not OT engineers.

In theory, you can implement event detection using stream processing frameworks. But in practice, it often requires:

  1. writing streaming jobs in Java or Scala
  2. implementing state machines
  3. managing distributed state and timers
  4. maintaining complex pipelines
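To make the friction concrete, here is a toy version of the state machine such a streaming job must implement: open an event when a signal crosses a threshold, close it when the signal drops back, and keep only events above a minimum duration. This is a simplified single-process sketch; a production streaming job must additionally handle distributed state, timers, late data, and restarts:

```python
from datetime import datetime, timedelta

def detect_events(samples, threshold, min_duration):
    """Toy event detector over (timestamp, value) samples."""
    events, start = [], None
    for ts, value in samples:
        if value > threshold and start is None:
            start = ts                       # rising edge: open an event
        elif value <= threshold and start is not None:
            if ts - start >= min_duration:   # falling edge: close and filter
                events.append((start, ts))
            start = None
    return events

# Hypothetical signal sampled once per second
base = datetime(2026, 3, 5, 10, 0, 0)
samples = [(base + timedelta(seconds=i), v)
           for i, v in enumerate([1, 5, 6, 2, 7, 7, 7, 1])]

events = detect_events(samples, threshold=4, min_duration=timedelta(seconds=3))
print(events)  # the 2-second excursion is filtered out; the 3-second one remains
```

Even this toy omits most of the hard parts, which is exactly why a simple OT rule can balloon into hundreds of lines inside a Spark or Flink job.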

What was once a simple rule in an OT system can easily turn into hundreds of lines of code. For engineers in operations or manufacturing, that creates friction. The technology becomes powerful — but harder to use.

Why Event Frames Matter Even More in the AI Era

In the AI age, raw time-series data alone is rarely enough. AI systems work best when data is structured and contextualized. Instead of feeding a model with millions of raw signals like:

  1. temperature(t)
  2. pressure(t)
  3. vibration(t)

it is often much more valuable to provide structured operational events like:

  1. Event: Compressor Surge
  2. Start: 10:23:15
  3. Duration: 12 seconds
  4. Severity: High
  5. Associated Equipment: Compressor-7
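An event like this maps naturally onto a flat, structured record that an ML pipeline or AI agent can consume directly. A sketch, with illustrative field names:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timedelta

@dataclass
class OperationalEvent:
    """Structured event record, per the compressor surge example."""
    name: str
    start: datetime
    duration: timedelta
    severity: str
    equipment: str

surge = OperationalEvent(
    name="Compressor Surge",
    start=datetime(2026, 3, 5, 10, 23, 15),
    duration=timedelta(seconds=12),
    severity="High",
    equipment="Compressor-7",
)

# A flat dict like this is what a model or agent would ingest,
# instead of millions of raw temperature/pressure/vibration samples
record = asdict(surge)
print(record["equipment"], record["duration"].total_seconds())
```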

Events become the building blocks for operational intelligence.

They are useful for:

  1. root cause analysis
  2. anomaly detection
  3. batch comparison
  4. predictive maintenance
  5. training machine learning models
  6. AI agents that reason about operations

In other words, Event Frames act as a bridge between raw data and operational knowledge.

This is why I believe the concept is even more important today than when it was first introduced.

Where Traditional Historians Fall Behind

PI System did an excellent job introducing Event Frames and making them accessible to OT engineers. But many traditional historian architectures were not designed for the modern AI and data ecosystem.

Today we need systems that can combine:

  1. high-performance time-series storage
  2. real-time stream processing
  3. contextualized asset models
  4. event generation
  5. open data access for AI

Start/Stop triggers can be configured manually or automatically by AI in TDengine

How We Approached This in TDengine

At TDengine, we took a slightly different approach. TDengine includes a built-in stream processing engine and a GUI for configuring rules (expressions), which allows the system to generate Event Frames directly from streaming time-series data. Furthermore, with the help of an LLM, TDengine can even generate rules or detect anomalies automatically based on the operational context. In many cases, users can simply describe what they want in natural language, and the system translates that intent into the underlying rules.

The key design principle is simple: OT engineers should not need to write streaming code.

Users define rules and logic at the operational level, while the system handles:

  1. state management
  2. event detection
  3. event lifecycle
  4. data storage and indexing

The user experience is similar to what engineers are familiar with in PI System, but built on modern infrastructure that is AI-ready from the ground up. In addition, you can visualize and analyze events from TDengine IDMP with a single click.

Align start times and even normalize event durations across different events in TDengine

Looking Forward

Industrial data platforms are evolving rapidly. But sometimes the most powerful ideas are not the newest ones. Event Frames are one of those ideas.

They transform raw sensor signals into meaningful operational events — something both engineers and AI systems can understand. As industrial AI continues to develop, I believe event-centric operational data models will become even more important.

In the future, AI agents will not just analyze time-series signals. They will reason about events, processes, and operational context.

And that journey starts with turning data into events.

  • Jeff Tao

    With over three decades of hands-on experience in software development, Jeff has had the privilege of spearheading numerous ventures and initiatives in the tech realm. His passion for open source, technology, and innovation has been the driving force behind his journey.

As one of the core developers of TDengine, he is deeply committed to pushing the boundaries of time-series data platforms. His mission is crystal clear: to architect a high-performance, scalable solution in this space and make it accessible, valuable, and affordable for everyone, from individual developers and startups to industry giants.