Traditional Industries in an AI World

Jeff Tao

This article was originally published in Forbes.

The emergence of ChatGPT in the public eye has brought new life to the field of artificial intelligence (AI). As AI technology enters all industries, it becomes a part of our work and lives, ushering in a new industrial revolution. While jobs will be lost, new opportunities will be created for those who work with AI.

Companies in traditional industries, such as energy and manufacturing, are even more anxious about an AI-oriented future than those in the IT sector. They want to know how they can use AI technologies to reduce costs and increase efficiency.

Having observed this anxiety over the past year, I've realized that the path toward AI takes work for traditional industries. But the main question determining whether these industries can enjoy the benefits of AI quickly, at low cost and with low risk, is one of data infrastructure.

Eliminating Data Silos

AI is based on big data: Only with an extensive enough sample size can models be trained and business decisions be data-driven. For traditional industries, however, big data is easier said than done.

The IT and OT infrastructures in these industries are uneven at best, and many companies lag behind internet firms in digital transformation. Many energy companies still rely on PI System, often older versions that include end-of-life packages like PI ProcessBook. Because PI System is particularly stable and meets their business requirements, they have had no reason to explore alternatives.

However, these legacy systems are not interconnected — they are data silos. An energy company typically has dozens to hundreds of power plants, each often using different versions or even different software systems. Due to mergers and acquisitions, different vendors are often used at different sites, making a unified data infrastructure a high-cost and high-risk proposition.

Before AI technology can be implemented in traditional industries, data silos must be eliminated, and the data contained in different systems must be centralized. Data centralization is no easy task at these sites that typically run both legacy and modern equipment, spanning a wide range of industrial data protocols.

Centralization is not simply the combination of data, but a process involving cleaning and processing data before sending it to a unified platform. This process quickly becomes messy and time-consuming and does not produce direct financial benefits. But if you can’t centralize your data, forget about AI — you won’t be able to use it.
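
As a minimal sketch of what that cleaning and processing step involves, the Python example below maps readings from two differently structured site systems onto one unified schema before ingestion; all tags, field names and units are hypothetical, not taken from any particular protocol or product:

```python
from datetime import datetime, timezone

# Hypothetical raw records as two sites might report them: one legacy
# system uses epoch seconds and Fahrenheit, the other ISO 8601 strings
# and Celsius. Field names are illustrative, not from any real protocol.
legacy_record = {"tag": "PLANT_A.TURBINE_1.TEMP", "ts": 1700000000, "val": 212.0, "unit": "F"}
modern_record = {"point": "plant_b/turbine_4/temp", "time": "2023-11-14T22:13:20+00:00", "value": 100.0, "unit": "C"}

def normalize_legacy(rec):
    """Map a legacy-style record onto the unified schema (Celsius, UTC)."""
    return {
        "site": rec["tag"].split(".")[0].lower(),
        "metric": rec["tag"].split(".")[-1].lower(),
        "ts": datetime.fromtimestamp(rec["ts"], tz=timezone.utc),
        "value": (rec["val"] - 32) * 5 / 9 if rec["unit"] == "F" else rec["val"],
    }

def normalize_modern(rec):
    """Map a modern-style record onto the same unified schema."""
    return {
        "site": rec["point"].split("/")[0],
        "metric": rec["point"].split("/")[-1],
        "ts": datetime.fromisoformat(rec["time"]),
        "value": rec["value"],
    }

# After normalization, every record looks the same no matter which silo
# produced it, and a single platform can ingest the whole fleet.
for row in (normalize_legacy(legacy_record), normalize_modern(modern_record)):
    print(row)
```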

Creating a Data Sharing Platform

To use data is to share it with applications, both internal and external. For example, electric vehicles collect massive amounts of data, ranging from motor and battery information to user behavior. Electric vehicle companies use this data internally to determine how to improve their products and experiences.

They also share this data with third parties, such as battery and motor manufacturers, for further analysis. In some cases, sharing with regulators may be necessary as well. Only with a powerful data-sharing platform is it possible to fulfill these requirements.

Sharing data brings privacy and security questions into the picture. Stakeholders must have access only to the specific data shared with them. For user privacy reasons, some raw data may not be suitable for sharing, requiring processing before third parties can view it. And for information security purposes, a data owner must control the time period during which specific data can be accessed and used.
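
One hedged sketch of how those three rules (scoped access, masking of sensitive fields, and time-limited grants) can be expressed in code follows; the grant structure and field names are invented for illustration, not drawn from any specific platform:

```python
from datetime import datetime, timezone

def make_grant(allowed_metrics, mask_fields, valid_from, valid_until):
    """Build a filter enforcing one stakeholder's data grant: only the
    listed metrics, only inside the validity window, and with
    privacy-sensitive fields removed before anything leaves the platform."""
    def apply(records, now=None):
        now = now or datetime.now(timezone.utc)
        if not (valid_from <= now <= valid_until):
            return []  # grant expired or not yet active: share nothing
        shared = []
        for rec in records:
            if rec["metric"] not in allowed_metrics:
                continue  # stakeholders see only the data shared with them
            shared.append({k: v for k, v in rec.items() if k not in mask_fields})
        return shared
    return apply

# Example: a battery supplier may read battery metrics for one quarter,
# but never the vehicle identifier (a stand-in for user-identifying data).
battery_grant = make_grant(
    allowed_metrics={"battery_voltage", "battery_temp"},
    mask_fields={"vehicle_id"},
    valid_from=datetime(2024, 1, 1, tzinfo=timezone.utc),
    valid_until=datetime(2024, 3, 31, tzinfo=timezone.utc),
)

records = [
    {"metric": "battery_voltage", "value": 396.2, "vehicle_id": "EV-1042"},
    {"metric": "gps_position", "value": "40.7,-74.0", "vehicle_id": "EV-1042"},
]
# Prints only the battery reading, with vehicle_id stripped out.
print(battery_grant(records, now=datetime(2024, 2, 1, tzinfo=timezone.utc)))
```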

Data sharing must also be a flexible process that can work with many applications and support the constant introduction of new applications and partners. It cannot be a process requiring difficult configuration or developer involvement; systems administrators must be able to enable sharing quickly and without significant effort upon receiving a request. Real-time sharing must be possible, in addition to sharing data as a scheduled task.
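
To show those two delivery modes side by side, here is a deliberately simple in-memory stand-in for such a sharing layer in Python; a production platform would use message queues or its own subscription mechanism, but the shape of the interface is the point:

```python
class SharingBus:
    """A toy stand-in for a platform's sharing layer: partners subscribe
    for real-time delivery, or pull accumulated batches on a schedule."""

    def __init__(self):
        self.subscribers = []  # callbacks for real-time sharing
        self.buffer = []       # records awaiting the next scheduled export

    def subscribe(self, callback):
        # Enabling a new partner is one call, not a development project.
        self.subscribers.append(callback)

    def publish(self, record):
        self.buffer.append(record)
        for callback in self.subscribers:
            callback(record)   # real-time: delivered as the record arrives

    def scheduled_export(self):
        # Scheduled: a cron job or scheduler would ship this batch.
        batch, self.buffer = self.buffer, []
        return batch

bus = SharingBus()
bus.subscribe(lambda rec: print("real-time ->", rec))
bus.publish({"metric": "battery_temp", "value": 31.5})
print("scheduled ->", bus.scheduled_export())
```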

In addition to data centralization, enterprises interested in AI enablement need a strong, secure and flexible data-sharing platform.

The Benefits of Open Systems

With data centralization and sharing, enterprises can create in-house AI applications that enable better anomaly detection, predictive maintenance and more. In addition, these applications can work on an enterprise level instead of being restricted to a single plant or factory, offering a big-picture view for leadership.
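
As a flavor of the simplest such application, here is a rolling z-score anomaly detector in Python; the window size and threshold are illustrative, and a real system would run this over the centralized feed from every site:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(values, window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away from
    the rolling mean of the previous `window` readings."""
    history = deque(maxlen=window)
    anomalies = []
    for i, v in enumerate(values):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(v - mu) / sigma > threshold:
                anomalies.append((i, v))
        history.append(v)
    return anomalies

# With centralized data, the same detector can sweep every plant's
# sensors in one pass instead of running one-off scripts per site.
readings = [100.0] * 30 + [100.2, 180.0, 99.8]  # one obvious spike
print(detect_anomalies(readings))  # -> [(31, 180.0)]
```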

For most traditional enterprises, however, developing such applications entirely in-house is out of the question. Their best option may be to interconnect their own data platforms with industry-leading third-party AI applications. Third-party cloud services are another way to reduce time to market and costs. Cloud services also typically offer usage- or time-based billing, making it easier to test a system without committing to significant expenses.

Because the quality of the services offered by AI providers can vary widely, enterprises must select open data platforms that can provide data to any application via standard interfaces. Open data systems give you options for AI applications; you can try out multiple applications until you find the best one for your scenario.
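
In practice, a standard interface usually means something like plain SQL or REST. The sketch below uses an in-memory SQLite database as a stand-in for the open platform, with invented table and column names, to show why this matters: both candidate AI applications read through the same query, so swapping vendors never touches the data layer:

```python
import sqlite3

# An in-memory SQLite database stands in for the open platform; any
# engine that speaks plain SQL fills the same role. Names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (site TEXT, metric TEXT, ts TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?, ?)",
    [("plant_a", "temp", "2024-01-01T00:00:00Z", 98.7),
     ("plant_b", "temp", "2024-01-01T00:00:00Z", 101.2)],
)

# The standard interface: one SQL query that any vendor's app can issue.
STANDARD_QUERY = "SELECT site, ts, value FROM readings WHERE metric = ?"

def vendor_a_app(rows):
    return f"vendor A scored {len(rows)} rows"

def vendor_b_app(rows):
    return f"vendor B scored {len(rows)} rows"

# Both candidate applications consume the identical interface, so trying
# one and then the other requires no change to the data platform itself.
rows = conn.execute(STANDARD_QUERY, ("temp",)).fetchall()
for app in (vendor_a_app, vendor_b_app):
    print(app(rows))
```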

The Future of AI

Despite the hype, we can't expect AI to replace traditional industries like energy, manufacturing and mining, just as the internet didn't replace them 20 years ago. These industries will continue to grow, with AI giving them new vitality and increased efficiency. Enterprises that do not embrace this AI-enabled future will likely be unable to compete, and eventually they will lose the market.

There are still areas in which AI is not ready for traditional industries. First, in industries with small profit margins, AI may be considered too expensive because of its compute-intensive nature. Second, ChatGPT and similar technologies are trained on historical data, whereas real industrial scenarios need real-time data analytics. There are still challenges for AI in the industrial sector and much room for improvement, but that's for AI researchers to work out.

We must take action now to avoid being rendered obsolete by the emergence of AI. Centralize data from all sites and create an open, secure and flexible data-sharing platform that can seamlessly integrate with third-party AI tools and services. The future belongs to those who are AI-ready. Are you?

Jeff Tao

With over three decades of hands-on experience in software development, Jeff has had the privilege of spearheading numerous ventures and initiatives in the tech realm. His passion for open source, technology, and innovation has been the driving force behind his journey.

As one of the core developers of TDengine, he is deeply committed to pushing the boundaries of time-series data platforms. His mission is crystal clear: to architect a high-performance, scalable solution in this space and make it accessible, valuable, and affordable for everyone, from individual developers and startups to industry giants.