
The key to delivering advanced analytics from OT historians

Torey Penrod-Cambra
Torey Penrod-Cambra is the Chief Communications Officer of HighByte, focused on the company's messaging strategy, market presence, and ability to operationalize. Her areas of responsibility include marketing, public relations, analyst relations, investor relations, and people operations. Torey applies an analytical, data-driven approach to marketing that reflects her academic achievements in both chemistry and ethics. Torey received a Bachelor of Arts in Chemistry from Miami University in Oxford, Ohio and completed post-graduate studies in Bioethics and Health Law at the University of Pittsburgh.
The real value of Industry 4.0 is realized when manufacturers unlock the power of analytics. With the addition of artificial intelligence (AI) and machine learning (ML), organizations can transform raw data into predictive, meaningful insights.

But manufacturers must first overcome a structural barrier to connect their process historians with their analytical applications. This is where HighByte Intelligence Hub comes into play. The latest release of the Intelligence Hub extends connectivity from enterprise systems to historian and time-series database applications, including PI System and InfluxDB. It removes a major disconnect between operational technologies (OT) and the business systems where organizational leaders access the information to make strategic decisions.
 

What are process historians?

Process historians date back to the 1980s, when OSIsoft released its PI System. (OSIsoft was acquired by AVEVA in 2020.) You can think of process historians as a type of automated recordkeeper for your plant floor. They collect time-series data for a wide range of events or conditions, such as temperature, OEE, pressure, and the on/off status of motors. This information is valuable for the operations team but lacks context, because the historian's primary function is to collect process control data at a high rate and log it in a time-series database.
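To make that concrete, here is a minimal Python sketch of the kind of raw records a historian logs. The tag names, timestamps, and values are purely illustrative, not from any real system: plenty of high-rate process values, but no business context attached.

```python
# A minimal sketch of the kind of records a process historian logs.
# Tag names, timestamps, and values are illustrative only.
from datetime import datetime, timezone

raw_samples = [
    # (tag, timestamp, value) -- high-rate process values with no business context
    ("Line1.Mixer.Temperature", datetime(2024, 5, 1, 8, 0, 0, tzinfo=timezone.utc), 72.4),
    ("Line1.Mixer.Pressure",    datetime(2024, 5, 1, 8, 0, 0, tzinfo=timezone.utc), 1.8),
    ("Line1.Mixer.MotorOn",     datetime(2024, 5, 1, 8, 0, 0, tzinfo=timezone.utc), 1),
]

for tag, ts, value in raw_samples:
    print(f"{ts.isoformat()}  {tag} = {value}")
```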
 

How do analytical systems present information?

On the analytics side, business analysts use applications that often require data in a tabular format built on a time cycle or an event, depending on what they're analyzing. They retrieve both streaming data for real-time analytics and historical data for machine learning. Since analytics is a more iterative process, analysts need the flexibility to extract data and run it through an analytical model again and again; it may take many iterations before a strong correlation is identified. In contrast, the historian side must be very stable for data collection. In many cases, the historian is a validated, critical system.
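As a sketch of the reshaping analysts typically need, the pandas example below pivots historian-style rows into a tabular, one-row-per-time-cycle format. The tag and column names are assumptions for the example, not a specific product schema.

```python
# A minimal sketch, using pandas, of reshaping historian-style time-series
# records into the tabular format most analytics tools expect.
# Tag and column names are illustrative assumptions.
import pandas as pd

samples = pd.DataFrame(
    {
        "timestamp": pd.to_datetime(
            ["2024-05-01 08:00", "2024-05-01 08:00", "2024-05-01 08:01", "2024-05-01 08:01"]
        ),
        "tag": ["Mixer.Temperature", "Mixer.Pressure", "Mixer.Temperature", "Mixer.Pressure"],
        "value": [72.4, 1.8, 73.1, 1.9],
    }
)

# Pivot: one row per time cycle, one column per tag -- the shape an analyst can
# feed into a correlation study or an ML training run.
tabular = samples.pivot(index="timestamp", columns="tag", values="value")
print(tabular)
```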
 

How does the use case impact the data?

Analytical use cases of industrial data span the business, including process control, asset maintenance, quality, supply chain, and research and development. Historian data may have the proper context for analyzing process control if you are using an asset framework, but the other use cases require that data to be restructured and combined with MES, CMMS, QMS, BMS, ERP, and other systems to create a usable data set for analytics.
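The sketch below, again with hypothetical column names and batch IDs, shows the kind of join involved: restructured historian aggregates combined with MES quality context to form an analytics-ready data set.

```python
# A minimal sketch of combining restructured historian data with MES context
# to build an analytics data set. Columns and batch IDs are hypothetical --
# the point is the join, not a specific schema.
import pandas as pd

historian = pd.DataFrame(
    {
        "batch_id": ["B-101", "B-102"],
        "avg_temperature": [72.8, 74.2],
        "max_pressure": [1.9, 2.3],
    }
)

mes = pd.DataFrame(
    {
        "batch_id": ["B-101", "B-102"],
        "product": ["SKU-A", "SKU-A"],
        "quality_result": ["pass", "fail"],
    }
)

# Joining on a shared key (here a batch ID) adds the business context the
# historian alone does not carry.
analytics_ready = historian.merge(mes, on="batch_id")
print(analytics_ready)
```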
 

How does the Intelligence Hub close the OT/analytics gap?

With version 2.2, we’ve added bi-directional connectivity to PI System. This means the Intelligence Hub can pull data from PI points, assets, or event frames. From there, we clean, standardize, and structure the data, merge it with data from other systems, and then publish it to a file for analytics bulk ingest or stream it directly to a cloud, data lake, or analytics platform. At the heart of the Intelligence Hub is a built-in modeling engine, which gives you the ability to structure data based on your specific needs (such as asset models or product models) and then push that information to the applications that need it. You can also integrate third-party sensor data from the cloud back to the factory and write the modeled data directly into the PI asset framework. All the system connections, data structuring and transformation, and data flow automation are configured through a browser-based user interface that requires no custom code and provides a single application for managing your data pipelines.
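The Intelligence Hub performs this structuring through its configuration interface rather than code, so the short Python sketch below is only meant to illustrate the general shape of a modeled, merged payload that such a pipeline might publish. The model, asset, and field names are assumptions for the example.

```python
# Illustration only: the Intelligence Hub is configured through its UI, not code.
# This sketch shows the general shape of a modeled payload combining process
# values with business context. Field names are assumptions.
import json

modeled_asset = {
    "model": "Mixer",
    "assetId": "Line1-Mixer-01",
    "timestamp": "2024-05-01T08:00:00Z",
    "process": {                     # values sourced from historian/PI points
        "temperature": 72.4,
        "pressure": 1.8,
        "motorOn": True,
    },
    "context": {                     # context merged from MES/CMMS/ERP-type systems
        "workOrder": "WO-5521",
        "product": "SKU-A",
    },
}

# Serialize once, then either write to a file for bulk ingest or stream the
# same payload to a cloud endpoint, data lake, or analytics platform.
print(json.dumps(modeled_asset, indent=2))
```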
 
This hub-and-spoke configuration moves data directly to the point of use instead of sending it through multiple layers of applications found in the Purdue Model of industrial communications. The key benefit here is faster access to information because the data isn’t unnecessarily filtered through so many systems.  
 
The Intelligence Hub offers a completely codeless environment, so you don't need to know how to write REST APIs or program connections to OPC servers. It's all done through a web-based user interface, which frees the IT team to focus on strategic projects rather than writing code.
 
Want to read more on this topic? Learn about the latest release of HighByte Intelligence Hub and see how to tap into historical data and accelerate analytics in this post by my colleague Aron Semle.

Get started today!

Join the free trial program to get hands-on access to all the features and functionality within HighByte Intelligence Hub and start testing the software in your unique environment.
