Efficiency advantages

Through data-driven modelling, diagnostic data can be used to replace inefficient ‘time-based’ calibration and maintenance schedules with ‘condition-based’ monitoring (CBM) systems. These can remotely determine facility process conditions, instrument calibration validity and even measurement uncertainty, without unnecessary manual intervention. CBM can also uncover hidden trends and process value correlations that were previously undetectable. The information generated through CBM can be used to predict component failure, detect calibration drift, reduce unscheduled downtime, and ultimately provide a framework for in-situ device calibration and verification.
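
To make the distinction concrete, the short Python sketch below contrasts a fixed time-based trigger with a condition-based one driven by a diagnostic reading. The interval, threshold and function names are illustrative assumptions, not values from any real system.

```python
# Minimal sketch: time-based vs condition-based calibration triggers.
# All names and thresholds here are illustrative assumptions.

from datetime import date, timedelta

CALIBRATION_INTERVAL = timedelta(days=365)   # fixed schedule (assumed)
DRIFT_THRESHOLD_PCT = 0.5                    # deviation deemed significant (assumed)

def time_based_due(last_calibrated: date, today: date) -> bool:
    """Recalibrate purely on elapsed time, regardless of condition."""
    return today - last_calibrated >= CALIBRATION_INTERVAL

def condition_based_due(reference_reading: float, live_reading: float) -> bool:
    """Recalibrate only when diagnostic data indicates meaningful drift."""
    deviation_pct = abs(live_reading - reference_reading) / reference_reading * 100
    return deviation_pct > DRIFT_THRESHOLD_PCT

# Example: the meter is 14 months past calibration but still reading within
# tolerance, so a CBM system would defer the intervention that a fixed
# schedule would force.
print(time_based_due(date(2022, 1, 1), date(2023, 3, 1)))   # True
print(condition_based_due(100.0, 100.2))                    # False
```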

When specifying such a system with an engineering consultancy, it is important to realize that there is no true ‘one size fits all’ solution. Every facility is different, from its mechanical build (e.g., pipe bends, valve positions etc.) to its instrumentation (e.g., IO count, number of pressure/temperature sensors etc.), not to mention the subtle variations in the digital data obtainable from the same type of instrument supplied by different manufacturers.

Defining the objective(s) of the CBM system is therefore a crucial first step, as this will determine which modelling techniques are used and allow realistic milestones for the system’s development to be agreed.

A first-pass data collection and integration stage should be undertaken. At this point, multiple sources of data are standardized into a single database from which the model will read. This can be time-consuming, as most plants were not designed with CBM in mind.
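
As a minimal sketch of what this integration step can look like in practice, assuming time-stamped CSV exports and the pandas library (the file and column names below are hypothetical):

```python
# Sketch of a first-pass data integration step. File names, column names and
# the 1-minute resampling interval are illustrative assumptions.

import pandas as pd

def load_source(path: str, timestamp_col: str, value_cols: list[str]) -> pd.DataFrame:
    """Read one data source and normalize it to a UTC time index."""
    df = pd.read_csv(path, parse_dates=[timestamp_col])
    df = df.set_index(timestamp_col).tz_localize("UTC")
    return df[value_cols]

# Two hypothetical sources with different native sample rates.
historian = load_source("historian_export.csv", "Time", ["flow_rate", "pressure"])
diagnostics = load_source("meter_diagnostics.csv", "ts", ["gain", "snr"])

# Resample both onto a common 1-minute grid, then join into the single
# standardized table the model will read from.
combined = historian.resample("1min").mean().join(
    diagnostics.resample("1min").mean(), how="outer"
)
combined.to_parquet("standardized_plant_data.parquet")
```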

With the data standardized, ‘exploratory data analysis’ (EDA) becomes possible, during which the data science team, in collaboration with experienced plant engineers and operators, will seek to uncover patterns and correlations within the data and align them with real-world occurrences.
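
A sketch of a simple EDA pass, reusing the hypothetical standardized table from above, might look as follows. Summary statistics, correlation matrices and rolling trends are typical first tools, though the actual analysis will depend on the facility:

```python
# Illustrative EDA pass over the standardized table, assuming pandas.
# Column and file names carry over from the hypothetical integration step.

import pandas as pd

data = pd.read_parquet("standardized_plant_data.parquet")

# Summary statistics highlight sensors with suspicious ranges or data gaps.
print(data.describe())

# A correlation matrix is a common first look for process-value relationships,
# e.g. a pressure/flow correlation that weakens when a valve passes.
print(data.corr(numeric_only=True).round(2))

# Rolling statistics can surface slow trends that spot checks miss.
print(data["flow_rate"].rolling("24h").mean().tail())
```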

After these initial steps have been completed, model development can begin. This is an iterative process, especially when the model is being developed using live data. For example, if the CBM system is to detect unwanted gas entrainment in a fluid flow scenario (which can affect flow meter operation and calibration), then such a situation has to occur first for the model to learn the patterns present in the data.
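
The article does not specify a particular algorithm, but one common pattern for this kind of task is an unsupervised anomaly detector trained on known-good operating data, as in the illustrative scikit-learn sketch below. The synthetic data and the choice of detector are assumptions:

```python
# Illustrative sketch only: one way to flag anomalous operating patterns
# (such as those caused by gas entrainment) is an unsupervised detector
# trained on data from known-good operation. The actual technique used in
# any given CBM system may differ.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic stand-in for healthy single-phase flow: flow rate, pressure, density.
healthy = rng.normal(loc=[100.0, 5.0, 850.0], scale=[1.0, 0.05, 2.0], size=(5000, 3))

detector = IsolationForest(contamination="auto", random_state=0).fit(healthy)

# Gas entrainment typically drops apparent density and adds noise; as the
# article notes, the model has to have seen such data before it can label
# it reliably.
entrained = rng.normal(loc=[100.0, 5.0, 790.0], scale=[5.0, 0.3, 15.0], size=(5, 3))

print(detector.predict(entrained))   # -1 = anomalous, 1 = normal
```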

The accuracy of data-driven modelling improves with more data, so historical data can play a vital part in the process. While end-users will naturally expect their CBM system to act on live data, data scientists can use a plant’s archived data to effectively backdate the model’s awareness of facility operation with information on previous faults and instrument calibration drift. The model will then be able to flag similar events should they occur again in the future.
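
For illustration only, the sketch below shows one way archived, engineer-labelled data could be used to train such a model. The labels, features and classifier are assumptions, not a description of any specific system:

```python
# Sketch of "backdating" a model with archived data, assuming historical
# periods can be labelled as normal, drifting or faulty by plant engineers.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic archive: feature rows (e.g. flow, pressure, diagnostic gain) with
# labels 0 = normal, 1 = calibration drift, 2 = fault. Real features and
# labels would come from the plant historian and maintenance records.
X = rng.normal(size=(10_000, 3))
y = rng.integers(0, 3, size=10_000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

model = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_train, y_train)

# Once trained on the archive, the same model can score live data and flag
# a recurrence of a previously seen fault signature.
print(model.score(X_test, y_test))
```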

Taking this one step further, incorporating measurement uncertainty analysis into CBM improves predictive performance even when the live data departs from the initial training data. To date, however, little consideration has been given to quantifying the uncertainty of such models’ prediction outputs. It is vital that the uncertainties associated with this type of in-situ verification method are quantified and traceable to the appropriate national flow measurement standards.
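
One of several possible ways to attach an uncertainty estimate to a model’s output is a Gaussian process, which widens its confidence band as queries move away from the training data. The sketch below is purely illustrative and uses synthetic calibration-style data:

```python
# Sketch of attaching an uncertainty estimate to a model prediction using a
# Gaussian process (ensembles or Bayesian methods are alternatives). The
# data and kernel choice are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

# Synthetic calibration-style data: meter error (%) as a function of flow rate.
flow = np.sort(rng.uniform(10, 100, size=40)).reshape(-1, 1)
error = 0.02 * (flow.ravel() - 50) / 50 + rng.normal(scale=0.01, size=40)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(flow, error)

# Predictions far from the training range carry wider uncertainty bands,
# which is the behaviour needed when live data departs from training data.
# The expanded uncertainty uses a coverage factor of k=2, as is conventional
# in measurement uncertainty practice.
query = np.array([[55.0], [150.0]])
mean, std = gp.predict(query, return_std=True)
for q, m, s in zip(query.ravel(), mean, std):
    print(f"flow {q:6.1f}: predicted error {m:+.3f} ± {2 * s:.3f} (k=2)")
```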

Research in this field is therefore underway at TÜV SÜD National Engineering Laboratory, the UK’s Designated Institute for fluid flow measurement. In this controlled environment, every instrument’s measurement uncertainty is understood and accounted for in the CBM model. The end goal of the research is to obtain data-driven models that are highly generalizable and capable of interpreting live field datasets. Quantifying the uncertainty associated with model outcomes will create a national standard for remote flow meter calibration, with the knowledge and experience of current practices built in.

Where CBM was perhaps once viewed by end-users as a ‘nice to have’, the need for cost savings and increased operational efficiency has never been more pressing. Successfully implemented and well-trained CBM systems can help operators realize these goals. The traditional approach to flow meter calibration has been to cease operation on an oil rig, remove the meter and ship it to an appropriate calibration facility. With CBM, however, the model can provide information on meter condition based on live and historical data. Moreover, if an operator observes that the flow rate data has begun to deviate, a well-trained CBM system may be able to inform them that the problem does not lie with a drifting meter calibration but can instead be attributed to erosion of the meter by previously undetected particles in the flow, and that immediate intervention is required to prevent permanent damage.
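
As a toy illustration of how a system might flag such a developing deviation, the sketch below applies a simple CUSUM-style accumulator to synthetic flow readings. Real systems would combine several diagnostic channels to distinguish drift from erosion, and all thresholds here are assumed:

```python
# Toy sketch of flagging a developing deviation in live flow data with a
# CUSUM-style accumulator. Thresholds are illustrative; separating drift
# from erosion would additionally need diagnostic channels (e.g. a changing
# meter factor versus changing signal characteristics).

import numpy as np

def cusum_alarm(readings, target, slack=0.1, threshold=2.0):
    """Return the index at which cumulative deviation first exceeds threshold."""
    high = low = 0.0
    for i, x in enumerate(readings):
        high = max(0.0, high + (x - target - slack))
        low = max(0.0, low + (target - x - slack))
        if high > threshold or low > threshold:
            return i
    return None

rng = np.random.default_rng(3)
stable = rng.normal(100.0, 0.1, size=200)                                # healthy period
eroding = 100.0 + 0.02 * np.arange(100) + rng.normal(0, 0.1, size=100)   # slow upward drift

# The alarm fires shortly after the drift begins, well before it is obvious
# in a single spot reading.
print(cusum_alarm(np.concatenate([stable, eroding]), target=100.0))
```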

This early knowledge of a developing issue can save operators hundreds of thousands of pounds in replacement meter costs and manual fault diagnosis time. As more operators upgrade their systems and processes to take advantage of digital instrumentation, the relevance and uptake of CBM models will continue to grow to the point where they are no longer a novelty but a necessity.


DR GORDON LINDSAY
Dr Gordon Lindsay is the Head of ‘Digital Services’ at TÜV SÜD National Engineering Laboratory. The company is a global centre of excellence for flow measurement and fluid flow systems and is the UK’s Designated Institute for Flow and Density Measurement, with responsibility for providing the UK’s physical flow and density measurement standards. TÜV SÜD National Engineering Laboratory is a trading name of TUV SUD Ltd, a company of the TÜV SÜD Group, an international service organization. For further information please visit: www.tuvsud.com/en-gb/nel