Over the years, one concept in the PI System has always impressed me: Event Frames. For those who work with industrial operational data, the idea is simple but powerful.
Instead of looking only at raw time-series signals, Event Frames convert continuous data streams into discrete operational events. An Event Frame has a start time, an end time, a duration, associated attributes, and relationships to other events (parent/child). For example:
- A production shift may contain multiple batches.
- Each batch may contain multiple process phases.
- Each phase may generate alarms or abnormal events.
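As a concrete illustration, an Event Frame can be modeled as a small tree-structured record. The names below are my own for the sketch, not PI System's API:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class EventFrame:
    """A detected operational event: a named time window with context."""
    name: str
    start: datetime
    end: datetime
    attributes: dict = field(default_factory=dict)
    children: list = field(default_factory=list)  # nested events (e.g. batch -> phases)

    @property
    def duration_s(self) -> float:
        return (self.end - self.start).total_seconds()

# A batch contains a phase, which contains an alarm event.
alarm = EventFrame("High Temperature Alarm",
                   datetime(2024, 5, 1, 10, 23, 15),
                   datetime(2024, 5, 1, 10, 23, 27),
                   attributes={"severity": "High"})
phase = EventFrame("Heating Phase",
                   datetime(2024, 5, 1, 10, 0), datetime(2024, 5, 1, 11, 0),
                   children=[alarm])
batch = EventFrame("Batch-42",
                   datetime(2024, 5, 1, 9, 30), datetime(2024, 5, 1, 12, 0),
                   children=[phase])
```

The parent/child links are what let a single shift explode into batches, phases, and alarms without losing the connection between them.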
This structure turns raw sensor signals into operational context. And once you have events, many questions become easy to answer:
- How many downtime events occurred last week?
- Which batches consumed the most energy?
- How long did compressor surge events last?
- What happened before a machine failure?
In many ways, Event Frames convert signals into stories about operations.
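Once events exist as records, questions like these reduce to simple filters and aggregations. A sketch over a hypothetical list of event records:

```python
from datetime import datetime, timedelta

# Hypothetical event records produced by an event-framing system.
events = [
    {"type": "downtime", "start": datetime(2024, 5, 6, 8, 0),  "duration_s": 1800},
    {"type": "downtime", "start": datetime(2024, 5, 8, 14, 0), "duration_s": 600},
    {"type": "surge",    "start": datetime(2024, 5, 7, 10, 23), "duration_s": 12},
]

week_start = datetime(2024, 5, 6)
week_end = week_start + timedelta(days=7)

# "How many downtime events occurred last week?"
downtime = [e for e in events
            if e["type"] == "downtime" and week_start <= e["start"] < week_end]
n_downtime = len(downtime)

# "How long did surge events last?" (total seconds across all surge events)
surge_seconds = sum(e["duration_s"] for e in events if e["type"] == "surge")
```

Answering the same questions directly from raw sensor samples would require re-detecting every event on every query.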
The Challenge with Modern Data Infrastructure
Today, many companies are adopting modern streaming infrastructure such as Spark or Flink for real-time analytics. These systems are extremely powerful, but they were designed primarily for data engineers, not OT engineers.
In theory, you can implement event detection using stream processing frameworks. But in practice, it often requires:
- writing streaming jobs in Java or Scala
- implementing state machines
- managing distributed state and timers
- maintaining complex pipelines
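To make the friction concrete: even a simple rule like "open an event when a value crosses a threshold, close it when it recovers" needs stateful code. A single-stream toy version, without the distributed state, checkpointing, and timer management a real Spark or Flink job must also handle, might look like:

```python
def detect_events(samples, high=80.0, low=75.0):
    """Threshold-with-hysteresis event detection over (time, value) samples.

    Opens an event when the value rises above `high`, closes it when it
    falls below `low`. This per-key state is exactly what stream frameworks
    force you to manage explicitly (and durably) at scale.
    """
    events, open_event = [], None
    for t, value in samples:
        if open_event is None and value > high:
            open_event = {"start": t, "peak": value}       # transition: idle -> active
        elif open_event is not None:
            open_event["peak"] = max(open_event["peak"], value)
            if value < low:                                # transition: active -> idle
                open_event["end"] = t
                events.append(open_event)
                open_event = None
    return events

frames = detect_events([(0, 70), (1, 85), (2, 90), (3, 74), (4, 70)])
```

In a historian this rule is a one-line expression; in a general-purpose streaming job, the state machine above is only the starting point.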
What was once a simple rule in an OT system can easily turn into hundreds of lines of code. For engineers in operations or manufacturing, that creates friction. The technology becomes powerful — but harder to use.
Why Event Frames Matter Even More in the AI Era
In the AI age, raw time-series data alone is rarely enough. AI systems work best when data is structured and contextualized. Instead of feeding a model with millions of raw signals like:
- temperature(t)
- pressure(t)
- vibration(t)
it is often much more valuable to provide structured operational events like:
- Event: Compressor Surge
  - Start: 10:23:15
  - Duration: 12 seconds
  - Severity: High
  - Associated Equipment: Compressor-7
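Deriving such a record from raw samples is straightforward once the event window is known. A sketch, where the field names and the severity rule are illustrative assumptions, not a real platform's schema:

```python
# Raw vibration samples for Compressor-7: (seconds since 10:23:00, mm/s).
window = [(15, 9.1), (18, 11.4), (21, 12.8), (24, 10.2), (27, 7.9)]

peak = max(v for _, v in window)
event = {
    "event": "Compressor Surge",
    "start": "10:23:15",
    "duration_s": window[-1][0] - window[0][0],
    # Illustrative severity rule: peak vibration above 10 mm/s counts as "High".
    "severity": "High" if peak > 10.0 else "Normal",
    "equipment": "Compressor-7",
}
```

A model (or an AI agent) consuming `event` sees one meaningful record instead of five raw samples.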
Events become the building blocks for operational intelligence.
They are useful for:
- root cause analysis
- anomaly detection
- batch comparison
- predictive maintenance
- training machine learning models
- AI agents that reason about operations
In other words, Event Frames act as a bridge between raw data and operational knowledge.
This is why I believe the concept is even more important today than when it was first introduced.
Where Traditional Historians Fall Behind
PI System did an excellent job introducing Event Frames and making them accessible to OT engineers. But many traditional historian architectures were not designed for the modern AI and data ecosystem.
Today we need systems that can combine:
- high-performance time-series storage
- real-time stream processing
- contextualized asset models
- event generation
- open data access for AI
How We Approached This in TDengine
At TDengine, we took a slightly different approach. TDengine includes a built-in stream processing engine and a GUI for configuring the rules (expressions), which allows the system to generate Event Frames directly from streaming time-series data. Furthermore, with the help of LLMs, TDengine can even generate rules or detect anomalies automatically based on the operational context. In many cases, users can simply describe what they want in natural language, and the system translates that intent into the underlying rules.
The key design principle is simple: OT engineers should not need to write streaming code.
Users define rules and logic at the operational level, while the system handles:
- state management
- event detection
- event lifecycle
- data storage and indexing
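The division of labor can be pictured like this: the user supplies only declarative start/end conditions, while the engine owns the state machine. This is an illustrative sketch of the principle, not TDengine's actual rule syntax or API:

```python
# User side: an OT engineer declares only the rule's conditions.
rule = {
    "name": "Compressor Surge",
    "start_when": lambda s: s["vibration"] > 10.0,   # condition that opens the event
    "end_when": lambda s: s["vibration"] < 8.0,      # condition that closes it
}

def run_rule(rule, stream):
    """Engine side: tracks open/closed state so the user never has to."""
    events, current = [], None
    for sample in stream:
        if current is None and rule["start_when"](sample):
            current = {"name": rule["name"], "start": sample["ts"]}
        elif current is not None and rule["end_when"](sample):
            current["end"] = sample["ts"]
            events.append(current)
            current = None
    return events

stream = [{"ts": 0, "vibration": 5.0}, {"ts": 1, "vibration": 12.0},
          {"ts": 2, "vibration": 11.0}, {"ts": 3, "vibration": 6.0}]
detected = run_rule(rule, stream)
```

Everything below the `rule` definition is the platform's responsibility; the engineer never touches it.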
The user experience is similar to what engineers are familiar with from PI System, but built on modern infrastructure that is AI-ready from the ground up. In addition, you can visualize and analyze events in TDengine IDMP with a single click.
Looking Forward
Industrial data platforms are evolving rapidly. But sometimes the most powerful ideas are not the newest ones. Event Frames are one of those ideas.
They transform raw sensor signals into meaningful operational events — something both engineers and AI systems can understand. As industrial AI continues to develop, I believe event-centric operational data models will become even more important.
In the future, AI agents will not just analyze time-series signals. They will reason about events, processes, and operational context.
And that journey starts with turning data into events.