
Data standards won’t solve your interoperability problems

John Harrington
John Harrington is the Chief Product Officer of HighByte, focused on defining the company’s business and product strategy. His areas of responsibility include product management, customer success, partner success, and go-to-market strategy. John is passionate about delivering technology that improves productivity and safety in manufacturing and industrial environments. John received a Bachelor of Science in Mechanical Engineering from Worcester Polytechnic Institute and a Master of Business Administration from Babson College.
The efforts of standards organizations like OPC Foundation, Eclipse Foundation (Sparkplug), ISA, CESMII, and MTConnect represent a significant step forward for the advancement of Industry 4.0 in manufacturing.
 
But industry standards only go so far. Businesses need data to tell the story of what is happening, why it is happening, and how to fix it. Multiple pieces of information must be assembled with other pieces of information from other sources to tell the use case story—just like words must be combined into sentences and sentences combined to form stories. Data standards can’t tell the use case story—they can only provide a dictionary.
 
Standardizing device-level data into structures is key, but it's only the beginning. Data standards alone will not solve your interoperability problems because they don't provide the use-case context you need to make strategic decisions. Here are four key reasons why you still need an Industrial DataOps solution like the Intelligence Hub—even with the introduction or evolution of new standards.

1. You’re dealing with machine and vendor variability.

Standards bodies are made up of vendors and users in an industry. As a standard is defined, variances are allowed to accommodate vendor machines with unique capabilities, limitations, or use cases. While the intent is flexibility, the result is often ambiguity: vendors typically implement the same standard slightly differently, and historically they have refined their systems and changed their data models over time to suit their own needs.
 
As a result, even minor variations in data sets require human intervention to link these machines to other systems on the network and to automate dashboards or analytics.
 
The Intelligence Hub enables codeless connections to a wide range of sources, including equipment or controllers, smart devices, sensors, and systems. If the input data is standardized, it can easily be passed through or combined with other data with no additional effort. However, if it's not standardized, it can be modeled and transformed to the governed data standard for the use case.
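To make the idea of mapping non-standardized inputs to a governed model concrete, here is a minimal Python sketch. The vendor names, tag names, and field names are illustrative assumptions, not HighByte APIs; this only shows the shape of the transformation, which the Intelligence Hub performs without code.

```python
# Hypothetical sketch: two vendors expose the same physical quantities under
# different tag names; a mapping table normalizes both to one governed model.
VENDOR_MAPPINGS = {
    "vendor_a": {"SpindleSpd": "spindle_speed_rpm", "Temp1": "temperature_c"},
    "vendor_b": {"spindle.rpm": "spindle_speed_rpm", "temp_celsius": "temperature_c"},
}

def to_governed_model(vendor: str, raw_tags: dict) -> dict:
    """Rename vendor-specific tags to the governed field names."""
    mapping = VENDOR_MAPPINGS[vendor]
    return {mapping[tag]: value for tag, value in raw_tags.items() if tag in mapping}

a = to_governed_model("vendor_a", {"SpindleSpd": 1200, "Temp1": 41.5})
b = to_governed_model("vendor_b", {"spindle.rpm": 1185, "temp_celsius": 40.9})
# Both payloads now share the same schema and can be compared or combined directly.
```

Once both machines emit the same governed schema, downstream dashboards and analytics no longer need per-vendor logic.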
 

2. You’re viewing individual data with no relationship context.

Think about a manufacturing line with multiple machines. The machine standards address the data for each machine independently, not the combination of machines or any custom automation connecting them. When analyzing operational metrics, bottlenecks, or quality root causes for a production line, specific information from each machine, test stand, and sensor should ideally be assembled into a single payload for that line.
 
The Intelligence Hub can assemble large payloads of data from multiple machines, consolidate them, and then publish to the target system. Models in the Intelligence Hub can correlate the data by logical use case. In a factory, this is typically machinery, process, and product, but it could also be sustainability or energy consumption. This systematic approach of building data models for the use case greatly accelerates the use of this information by line-of-business users who are less familiar with the machines and line layouts.
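As a rough illustration of a line-level payload, the Python sketch below combines already-modeled per-machine data into one document. The line ID, machine names, and fields are hypothetical examples, not a HighByte schema.

```python
# Hypothetical sketch: assembling per-machine payloads into one payload that
# describes the whole production line, ready to publish to a target system.
import json
from datetime import datetime, timezone

def assemble_line_payload(line_id: str, machine_payloads: dict) -> str:
    """Wrap already-modeled machine data in a single line-level JSON payload."""
    payload = {
        "line": line_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "machines": machine_payloads,
    }
    return json.dumps(payload)

doc = assemble_line_payload("line_7", {
    "filler":  {"state": "running", "rate_upm": 120},
    "capper":  {"state": "running", "rate_upm": 118},
    "labeler": {"state": "blocked", "rate_upm": 0},
})
```

A consumer of this payload can reason about the line as a unit (here, the blocked labeler is the bottleneck) without knowing each machine's native interface.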
 

3. You’re looking at more than just device data.

You can’t make strategic decisions if you’re not linking your machine data to other systems across your organization. This includes your enterprise applications, such as your ERP system, and your manufacturing systems (e.g., SCADA, MES, Historian, QMS, LIMS, and CMMS).
 
The Intelligence Hub can connect to virtually any system in your organization and combine information from these systems with machine data. For example, your MES provides context on a particular batch. In the Intelligence Hub, you can combine that information with device data and quality system data into a single payload and send it to the Cloud for analysis or dashboarding. 
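The enrichment step described above can be sketched in a few lines of Python. The batch, product, and quality fields are illustrative assumptions; the point is only that business context and device data end up in one payload.

```python
# Hypothetical sketch: enriching raw device data with MES batch context and
# quality results before sending a single payload to the cloud.
def enrich_with_context(device_data: dict, mes_batch: dict, quality: dict) -> dict:
    """Combine device, MES, and quality data into one contextualized record."""
    return {
        "batch_id": mes_batch["batch_id"],
        "product": mes_batch["product"],
        "device": device_data,
        "quality": quality,
    }

record = enrich_with_context(
    {"spindle_speed_rpm": 1185, "temperature_c": 40.9},
    {"batch_id": "B-1042", "product": "SKU-77"},
    {"defects": 0, "inspected": 144},
)
```

With the batch ID attached, a cloud dashboard can slice device behavior by product or batch rather than by raw tag names.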
 

4. You’re not getting the data you need, when you need it.

Information overload is a real problem in the Industry 4.0 world. Industrial data is nearly infinite in both the volume of data values and the frequency at which they can be acquired: modern PLCs can expose hundreds of thousands of data points and can collect sensor data at sub-millisecond frequencies. Understanding what is needed—and when—is critical. Sometimes data is needed at a cyclic rate, such as once per second. Other times, you may need an event-based feed to flag events such as cell production completion, defects, or machine performance issues. By defining the desired data payload and its event or frequency, you make decision-making more efficient and minimize cloud costs, because you only store and process the data you need.
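The difference between cyclic and event-based delivery can be sketched as follows. This is a minimal, hypothetical Python example (the field name `cell_state` and the publisher callback are assumptions): samples arrive at the fast acquisition rate, but only state changes are published onward.

```python
# Hypothetical sketch: publish on an event (a state change) rather than on
# every acquisition cycle, so only the payloads you need reach the cloud.
class EventPublisher:
    def __init__(self, publish):
        self._publish = publish      # callback that forwards a payload downstream
        self._last_state = None

    def on_sample(self, sample: dict) -> None:
        """Called at the fast acquisition rate; publishes only on state change."""
        state = sample.get("cell_state")
        if state != self._last_state:
            self._last_state = state
            self._publish(sample)

sent = []
pub = EventPublisher(sent.append)
for s in [{"cell_state": "running"}, {"cell_state": "running"},
          {"cell_state": "running"}, {"cell_state": "complete"}]:
    pub.on_sample(s)
# Four samples arrive, but only two are published: the initial "running"
# state and the change to "complete".
```

Filtering at the edge like this is what keeps cloud storage and processing costs proportional to the decisions you need to make, not to the raw acquisition rate.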
 
The Intelligence Hub can assemble data from multiple sources into a single payload, perform any aggregations and transformations, and send it to a cloud data lake or other system at the desired rate or on the desired event.
 

Wrap Up

Standardized models are important to our industry because they provide a baseline data set to work with, i.e., the dictionary. While data standards are no substitute for telling the use-case-driven story with contextualized, intelligent insights that drive strategic decision-making, they do help expedite data modeling when paired with the Intelligence Hub. The Intelligence Hub allows data standards to deliver the value that’s been elusive since these standards bodies began to form in the 1990s. As you digitally transform to Industry 4.0, use your data to tell stories that solve business problems.
 
If you want to learn more about data modeling best practices and pitfalls, check out the Data Modeling Guidebook we’ve created. The guidebook includes a review of the ISA-95 specification and demonstrates how it can be applied within an Industrial DataOps solution like the Intelligence Hub.

Get started today!

Join the free trial program to get hands-on access to all the features and functionality within HighByte Intelligence Hub and start testing the software in your unique environment.
