Guidelines for Ensuring Data Quality in Process Control Systems

With recent advances in analytical instrumentation, process plants are making increasing use of analyzers to improve the operational efficiency of their process control systems. The benefit these analyzers bring, however, depends on how well the control system understands and trusts the data it receives from them.

The information received from a process analyzer forms a data hierarchy. The analytical device can have one or more sub-controllers; each sub-controller manages one or more streams; and each stream being analyzed yields one or more measured component values.

The data hierarchy makes the association between these sets of data explicit. For example, if a stream is disabled, it is immediately obvious which component values are affected. Similarly, when a sub-controller is turned off, the hierarchy defines which streams may be affected.
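The hierarchy described above can be sketched in code. This is a minimal illustration, not a vendor API; all class and field names (Analyzer, SubController, Stream, Component) are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    value: float = float("nan")

@dataclass
class Stream:
    name: str
    online: bool = True
    components: list = field(default_factory=list)

@dataclass
class SubController:
    name: str
    enabled: bool = True
    streams: list = field(default_factory=list)

@dataclass
class Analyzer:
    sub_controllers: list = field(default_factory=list)

    def affected_components(self, stream_name):
        """Components whose values become suspect when the named stream is disabled."""
        return [c.name
                for sc in self.sub_controllers
                for s in sc.streams if s.name == stream_name
                for c in s.components]
```

With such a structure, disabling a stream lets the control system enumerate exactly which component values to distrust, rather than guessing.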

For engineers, understanding data quality starts with the communication between the analyzer and the process control system. Once no major communication faults are present, engineers should examine the analyzer status to confirm the analyzer is running properly and that no error condition has occurred. For a stream 1 component value to be considered valid, the analyzer must be running with no faults and stream 1 must be set to online status. Engineers should then ask a further series of questions:

What happens to the component values during a maintenance cycle? And,

Should the process control system hold the last good value while the analyzer's validation cycle is being performed?
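The checks above form a cascade: communications first, then analyzer status, then stream status, and only then is the component value trusted. A minimal sketch, assuming the status names are illustrative:

```python
def component_quality(comms_ok, analyzer_fault, stream_online):
    """Classify a component value by walking the quality cascade in order.

    Each check is only meaningful if the checks above it passed, which is
    why the tests are ordered and return early.
    """
    if not comms_ok:
        return "BAD_COMMS"       # no trustworthy data of any kind
    if analyzer_fault:
        return "BAD_ANALYZER"    # link is up, but the device reports a fault
    if not stream_online:
        return "STREAM_OFFLINE"  # device is healthy, this stream is not measured
    return "GOOD"
```

The ordering matters: a stream-offline flag received over a faulty link tells the engineer nothing, so the communication check must come first.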

Holding the last value during a calibration cycle is one way of handling the analytical results from a gas chromatograph. The other option is to set the values to not-a-number (NaN) while a calibration is in progress; this forces the control algorithms to re-initialise when the analyzer returns to normal operation. The data-quality logic should also check for a timeout, so that a stale value is not trusted indefinitely.
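Both handling options, plus the timeout, can be expressed in a few lines. This is a sketch under assumed names; the 120-second staleness limit is an arbitrary illustrative value, not a recommendation.

```python
import math

STALE_AFTER_S = 120.0  # assumed timeout; tune to the analyzer's cycle time

def published_value(last_value, age_s, calibrating, hold_last):
    """Value the control system should see for one component.

    last_value  -- most recent measurement from the analyzer
    age_s       -- seconds since that measurement was received
    calibrating -- True while a calibration cycle is in progress
    hold_last   -- policy choice: hold the last value during calibration,
                   or publish NaN to force controller re-initialisation
    """
    if calibrating:
        return last_value if hold_last else float("nan")
    if age_s > STALE_AFTER_S:
        return float("nan")  # timeout: the value is too old to trust
    return last_value
```

Publishing NaN rather than a frozen number makes the bad-quality state explicit: any control algorithm consuming the value can detect it with `math.isnan` and re-initialise cleanly.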

Engineers can check analyzer data quality in two steps. The first step is to verify communications across all the analytical equipment, which is required to confirm that the devices are operating normally. Engineers should have a clear understanding of the different analyzer modes, fault signals, and error conditions.

The second step is to verify the data against reasonable criteria. An instrument may report concentrations outside some process limits, so the logic should confirm that the data falls within a range that makes physical sense for the process. There are also physical limitations on how fast a value can change, so a rate-of-change alarm can be implemented as well.
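The second step can be sketched as a pair of checks, a range check and a rate-of-change check. The limits below are illustrative assumptions, not values from the source; real limits come from the process design.

```python
LOW_LIMIT, HIGH_LIMIT = 0.0, 100.0  # assumed process limits (e.g. mol %)
MAX_RATE = 5.0                      # assumed max plausible change per sample

def validate(value, previous_value):
    """Return the list of alarms raised by this sample (empty list = OK)."""
    alarms = []
    # Range check: the reading must make physical sense for the process.
    if not (LOW_LIMIT <= value <= HIGH_LIMIT):
        alarms.append("OUT_OF_RANGE")
    # Rate-of-change check: the process cannot move faster than physics allows.
    if previous_value is not None and abs(value - previous_value) > MAX_RATE:
        alarms.append("RATE_OF_CHANGE")
    return alarms
```

Returning a list of named alarms, rather than a single pass/fail flag, lets the control system log which criterion failed and respond differently to an implausible jump than to an out-of-range reading.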