How to Unlock Value from Industrial Data with AI and ML Technology

Jeff Tao

This article was originally published in Forbes Technology Council.

As artificial intelligence (AI) becomes more deeply ingrained in business processes across industries, the accessibility of data is developing into an increasingly important factor in determining whether enterprises can adapt to and grow with the new technological landscape. Although AI and machine learning (ML) technology can drive operational efficiency and enable business insights on a scale barely imaginable just a decade ago, the output of AI and ML applications and algorithms is only as good as the input — and that input is data.

To ensure that their applications are provided with the best data possible, enterprises should first break down data silos that exist in their organization by means of a strong platform for data centralization. Such a platform ingests and stores data from different sources into a single, unified system that retains the context of the data and implements good governance through processes such as data cleaning and extract, transform and load (ETL).

However, data centralization and governance are only the beginning of the type of modern data strategy that enterprises need to achieve digital transformation and reap the benefits of AI. In fact, an even more important element is data sharing.

The Importance of Sharing

In an industrial data context, sharing refers not only to enabling internal and external stakeholders to access data critical to their job functions but also to distributing data among various systems and applications. In the past, exporting datasets as CSV files and emailing them to operators or analysts may have been the only feasible method of sharing data, but these manual operations are no longer acceptable for enterprises undergoing digital transformation.

For enterprises looking to put the concepts of the Industrial Internet of Things (IIoT) and Industry 4.0 into practice, their data solutions need strong support for real-time data sharing. One traditional option for data sharing offered by platforms such as Snowflake and AVEVA Data Hub is based on views. By creating views, you can easily control the granularity of the data you want to share and implement the principle of least privilege, ensuring that team members who work with data have access to only the data they need while keeping sensitive data private.
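To make view-based sharing concrete, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for a platform such as Snowflake; the table, view, and column names are hypothetical. The view exposes only the columns an analyst needs, so access can be granted to the view rather than the underlying table:

```python
import sqlite3

# In-memory database standing in for an industrial data platform.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sensor_readings (
        site TEXT, device_id TEXT, ts TEXT,
        temperature REAL, operator_notes TEXT  -- treated as sensitive
    )
""")
conn.execute(
    "INSERT INTO sensor_readings VALUES "
    "('plant_a', 'dev1', '2024-01-01T00:00:00', 72.5, 'internal only')"
)

# A view that shares only non-sensitive columns for a single site;
# least privilege is enforced by granting access to the view alone.
conn.execute("""
    CREATE VIEW plant_a_temperatures AS
    SELECT device_id, ts, temperature
    FROM sensor_readings
    WHERE site = 'plant_a'
""")

rows = conn.execute("SELECT * FROM plant_a_temperatures").fetchall()
print(rows)  # operator_notes never appears in the shared view
```

On a production platform the same idea applies: the view definition controls granularity (columns, rows, aggregation level), and the access-control layer determines who may query it.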

Although views are an essential component of data sharing, you’ll also want to consider data historians that provide data sharing based on a publish-subscribe (pub-sub) model. In this model, you create topics that define the data to be shared, and your applications subscribe to the topics relevant to them. Streaming data is then pushed to subscribing applications in real time. This method of proactively sending data to applications is a good fit for AI software that needs the latest data at all times.
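The pub-sub flow described above can be sketched in a few lines of Python. This is an illustrative in-process broker, not the API of any particular historian; the topic name and message shape are assumptions:

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Minimal pub-sub broker: applications subscribe to topics and
    have new data points pushed to them as soon as they are published."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: dict) -> None:
        # Push to every subscriber immediately; a real platform would
        # deliver over the network with buffering, retries, and auth.
        for callback in self._subscribers[topic]:
            callback(message)

received = []
broker = Broker()
broker.subscribe("plant_a/temperature", received.append)
broker.publish("plant_a/temperature", {"device_id": "dev1", "value": 72.5})
print(received)
```

The key property for AI workloads is the direction of data flow: the application registers interest once and then receives updates passively, rather than polling for new data.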

Selecting a Data Sharing Platform

Considering the importance of data sharing to digital transformation, data platforms that offer strong sharing capabilities are a must for forward-looking industrial enterprises. Here are a few key items to keep in mind when upgrading your data infrastructure.

  • Consider a data platform that provides subscription-based sharing in addition to traditional views. By using views to share data with team members and subscriptions to share data with applications, your applications get real-time data pushed to them as soon as it is available, ensuring that computations reflect the latest situation in your organization while analysts and managers can work with relevant data provided in a familiar format.
  • Ensure that your data platform is secure and complies with relevant regulations. Data security and privacy are of the utmost importance when it comes to sharing data. Your data platform must implement best practices for security such as role-based access control (RBAC), assigning privileges based on user groups instead of named users, and defining an expiration date for shared data. In addition, it’s a good idea to choose solution vendors that have completed widely accepted security certifications such as SOC 2.
  • Insist on systems that are easy to use. For digital transformation to succeed, all team members must be on board. If your data platform is complex and doesn’t make data sharing an easy process, some staff may be reluctant to share their data or even cause potential security issues by incorrectly setting privileges. A strong data-sharing platform must be simple but powerful and use familiar interfaces and languages to receive company-wide support.
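The group-based, expiring grants recommended in the second point can be sketched as a small access check. All names here (the grant table, groups, and resource) are hypothetical, and a real platform would store grants in its catalog rather than in code:

```python
from datetime import date

# Hypothetical grant table: privileges are assigned to groups, never
# to named users, and every grant carries an expiration date.
GRANTS = {
    "analysts": {"resource": "plant_a_temperatures", "action": "read",
                 "expires": date(2025, 12, 31)},
}
USER_GROUPS = {"alice": ["analysts"], "bob": []}

def is_allowed(user: str, resource: str, action: str, today: date) -> bool:
    """A user is allowed if any of their groups holds a matching,
    unexpired grant for the resource and action."""
    for group in USER_GROUPS.get(user, []):
        grant = GRANTS.get(group)
        if (grant and grant["resource"] == resource
                and grant["action"] == action
                and today <= grant["expires"]):
            return True
    return False

print(is_allowed("alice", "plant_a_temperatures", "read", date(2025, 6, 1)))  # True
print(is_allowed("alice", "plant_a_temperatures", "read", date(2026, 1, 1)))  # False: grant expired
print(is_allowed("bob", "plant_a_temperatures", "read", date(2025, 6, 1)))    # False: no group grant
```

Managing access at the group level keeps the grant table small and makes onboarding and offboarding a matter of group membership rather than per-user privilege edits.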

In the new data economy, sharing is the key to success. By implementing a data historian platform that makes sharing data simple and secure, you can make the road to digital transformation a smoother one for your entire enterprise.

  • Jeff Tao

    With over three decades of hands-on experience in software development, Jeff has had the privilege of spearheading numerous ventures and initiatives in the tech realm. His passion for open source, technology, and innovation has been the driving force behind his journey.

    As one of the core developers of TDengine, he is deeply committed to pushing the boundaries of time series data platforms. His mission is crystal clear: to architect a high-performance, scalable solution in this space and make it accessible, valuable, and affordable for everyone, from individual developers and startups to industry giants.