Pop go the analytics

New approaches are allowing data analytics to transform upstream oil and gas. By Duncan Irving

When it comes to the comprehensive exploitation of data, the upstream oil and gas industry is a decade behind industries where tight margins in customer- and logistics-driven value chains have led to massive investment in data mining and cross-functional analytics. In the current economic climate, the oil and gas industry is now beginning to organise itself so it can tap into the wealth of insights that lies within its myriad databases, file stores, archives and operational systems.

So far, achieving the right mix of analytics expertise, database software, hardware and domain knowledge has proved elusive for many companies. Unlike its downstream counterpart, the upstream industry has focused on deterministic models of data analysis that lead to a particular outcome, when it really needs to move towards data-driven and ‘probabilistic’ approaches based on collaboration. These methods show not only what could happen but how likely each outcome is.

A reason for the industry’s sluggishness in analytics lies in the way data flows are often compartmentalised, so that, for instance, geophysicists have one stream while production data is sent off in another direction. Only recently have conventional workflows been challenged by new business processes and economic drivers.

By storing information in silos, the industry makes it very hard to obtain the massive data sets drawn from all parts of a business that are required for analytics. It is only the use of such data volumes that allows an organisation to harness the analytics-driven power of correlation and comparison.

Collaboration
Data volume is one aspect, but if analytics are to support operations, they also need to be approached collaboratively, which allows both long-term study and real-time analysis as data is generated.

Again, within much of the industry, the compartmentalised approach has stood in the way. One department often has to ask another to run analysis, which slows everything down. Furthermore, it can lead to time-wasting duplication of effort, as one team may already be undertaking the work its colleagues have requested.

However, as this inefficiency has become more apparent, so attitudes have changed. The more forward-thinking in the industry understand that the best environment in which to conduct analytics is a ‘data lake’ into which all the data is pooled. Not only does this avoid costly duplication, it also gives reassurance about the quality of the analytics, as everyone is looking at the same core data. The entire undertaking then becomes a joint effort, which has incalculable value in itself.

Data integration
A data lake may be where the initiative begins, but its secrets are only brought to the surface through successful data integration. In the oil and gas industry this should include data from supply chain logistics and enterprise resource planning systems, as well as production. All this can be put together into one effective platform, provided the integration system can recognise the relationships between all the data types and supply the context.
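
As an illustration of the kind of relationship-aware integration described above, the sketch below joins small, invented production, logistics and ERP extracts on a shared well identifier and date so that each record carries its business context. It uses Python with pandas; the table and column names are assumptions made for the example, not drawn from any particular platform.

```python
import pandas as pd

# Tiny in-memory stand-ins for extracts from production, logistics and ERP
# systems; in practice these would be pulled from the underlying databases.
production = pd.DataFrame({
    "well_id": ["W1", "W1", "W2"],
    "date": pd.to_datetime(["2015-03-01", "2015-03-02", "2015-03-01"]),
    "oil_bbl": [1200, 1150, 800],
})
logistics = pd.DataFrame({
    "well_id": ["W1", "W2"],
    "date": pd.to_datetime(["2015-03-01", "2015-03-01"]),
    "consumables_cost": [5400, 3100],
})
erp = pd.DataFrame({
    "well_id": ["W1"],
    "date": pd.to_datetime(["2015-03-02"]),
    "work_order": ["pump maintenance"],
})

# Join the three sources on the shared keys (well and date) so that every
# production record is enriched with its logistics and ERP context.
integrated = (production
              .merge(logistics, on=["well_id", "date"], how="left")
              .merge(erp, on=["well_id", "date"], how="left"))
print(integrated)
```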

Then, once the lake has been assembled, the data can be subjected to thorough analysis using tools that have the science ‘baked-in’, such as reservoir flow modelling software, as well as more generic visual analytics tools that have become so valuable in other industries.

A case in point
A good example of innovative industry practice using analytics was one project’s close examination of well log data from a large proportion of the UKCS (United Kingdom Continental Shelf) data set. The aim was to reveal more effective drilling methods and eliminate costly and time-wasting problems such as stuck pipe and tripping.

The task, completed by consultants, successfully collated hundreds of well logs and then searched them for correlations. The analysis also included the written logs made by drillers, looking for words like ‘stuck’ and noting the time at which they were written, so they could be matched against the data logs.
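
A greatly simplified sketch of that text-and-time matching is given below: it scans free-text driller remarks for words such as ‘stuck’ and aligns each flagged remark with the nearest sensor readings by timestamp. The example is written in Python with pandas, and the data, column names and keyword list are invented purely for illustration.

```python
import pandas as pd

# Hypothetical free-text driller remarks and a matching drilling sensor log.
remarks = pd.DataFrame({
    "timestamp": pd.to_datetime(["2015-06-01 02:10", "2015-06-01 09:45"]),
    "remark": ["pipe stuck while pulling out", "routine connection"],
})
sensors = pd.DataFrame({
    "timestamp": pd.to_datetime(["2015-06-01 02:00", "2015-06-01 02:15",
                                 "2015-06-01 09:30", "2015-06-01 10:00"]),
    "hookload_klbs": [310, 365, 295, 300],
    "torque_kftlb": [18, 27, 15, 16],
})

# Flag remarks containing keywords of interest (case-insensitive).
keywords = "stuck|tight hole|overpull"
flagged = remarks[remarks["remark"].str.contains(keywords, case=False)]

# Align each flagged remark with the nearest-in-time sensor reading, so the
# written event can be compared with what the measurements were doing.
events = pd.merge_asof(flagged.sort_values("timestamp"),
                       sensors.sort_values("timestamp"),
                       on="timestamp", direction="nearest",
                       tolerance=pd.Timedelta("15min"))
print(events)
```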

The deployment of analytics allowed data from thousands of wells to be processed in a single afternoon and uncovered many unexpected relationships, such as those resulting from lithology (when drilling through different types of rock) or lurking in the seismic data.

Without the use of analytics, most of this would not have been obvious, even to someone experienced in the oil and gas field.

Once these relationships between data have been established, they can be used in strategic decision-making. For example, drillers, who are constantly searching out small improvements in efficiency, can be provided with information on how one well compares with others. Use of the analytical insights allows them to avoid drilling in a certain type of rock or, fed live, enables them to optimise their work.

Return on investment
More operationalised decision-making is now required to ensure that effective well planning and interventions are delivered – both onshore and offshore. Operating companies are making the investment in permanent reservoir monitoring systems and downhole sensors to collect and process the data, giving right-time insights into reservoir behaviour.

However, no operating company can yet integrate these science-rich insights, along with technical information around drilling histories and flow behaviour, to provide statistically robust guidance on the quality and reliability of any given intervention plan.

Faced with this challenge, Teradata and a major upstream operator decided to run analytics on full pre-stack seismic data as well as the post-stack, and to integrate it with flow models and production histories to generate just such a multi-domain view. The next logical step would be to process data as soon as it is gathered on a ship, to understand in real time the quality of a re-survey and how data quality may affect the reliability of the data for the next round of well interventions. This would enable the vessel to reshoot a line of seismic should a problem be found with the results.

Using analytics, this project has already made it possible to make quick comparisons between the latest seismic data and data from the previous survey. The analytics show up differences in the seismic recording itself, where, for example, the results might have been affected by different wave conditions or by the presence of another vessel nearby while the last survey was being shot.
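
One common way to quantify such survey-to-survey differences is a repeatability measure such as normalised RMS (NRMS) computed between corresponding traces of the baseline and repeat surveys. The article does not say which measure the project used, so the sketch below, in Python with NumPy and using synthetic traces, is purely illustrative.

```python
import numpy as np

# Synthetic stand-ins for corresponding traces from a baseline survey and a
# repeat (monitor) survey, with small acquisition differences added.
rng = np.random.default_rng(0)
n_traces, n_samples = 200, 1000
baseline = rng.standard_normal((n_traces, n_samples))
monitor = baseline + 0.1 * rng.standard_normal((n_traces, n_samples))

def nrms(a, b):
    """Normalised RMS difference per trace: 200 * rms(a - b) / (rms(a) + rms(b))."""
    rms = lambda x: np.sqrt(np.mean(x ** 2, axis=-1))
    return 200.0 * rms(a - b) / (rms(a) + rms(b))

scores = nrms(baseline, monitor)

# Traces with unusually high NRMS flag parts of the survey where recording
# conditions (weather, a nearby vessel) may have degraded repeatability.
threshold = np.percentile(scores, 95)
suspect = np.where(scores > threshold)[0]
print(f"median NRMS: {np.median(scores):.1f}%  suspect traces: {suspect[:10]}")
```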

This means that when the company contemplates a high-value intervention, it is able to avoid the costly mistake of basing its decision on unreliable data about a particular section of the oil field.

And once it understands how all the data is connected, it becomes feasible to run deeper analysis and predict significant factors affecting the field, such as downhole pressure or water saturation levels.

As an operator organises its data more intelligently, so it can deploy analytics to generate insights by comparing patterns and trends – something performed effectively in many other industries. Within the oil industry it is now possible to use data processing similar to that of a mobile phone app which recognises music by comparing it with the recordings in its memory.

It is a question of taking time series data, which reveals how something has changed, and comparing it with patterns stored in a computer’s memory. This technique will reveal whether something similar has happened before.
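
The sketch below illustrates that idea in miniature: a newly observed window of time series data is normalised and compared against a small library of stored historical patterns, and the closest match is reported. The patterns and the incoming signal are synthetic placeholders, and plain Euclidean distance stands in for whatever matching method a production system would actually use.

```python
import numpy as np

def zscore(x):
    """Normalise a window so that shape, not absolute level, drives the match."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / (x.std() + 1e-9)

def best_match(window, library):
    """Return the index and distance of the stored pattern closest to the window."""
    w = zscore(window)
    dists = [np.linalg.norm(w - zscore(p)) for p in library]
    i = int(np.argmin(dists))
    return i, dists[i]

# Stored patterns from past behaviour (e.g. pressure or vibration snippets).
library = [np.sin(np.linspace(0, 4 * np.pi, 100)),   # oscillation
           np.linspace(0, 1, 100),                   # steady rise
           np.exp(-np.linspace(0, 5, 100))]          # decay

# Newly observed window of data: here, a noisy rising trend.
incoming = np.linspace(0, 1, 100) + 0.05 * np.random.default_rng(1).standard_normal(100)

idx, dist = best_match(incoming, library)
print(f"closest stored pattern: {idx} (distance {dist:.2f})")
```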

This could, for example, be used to monitor equipment vibration data and look out for trends that led to problems in the past. Collecting all that equipment data from an organisation’s SCADA (Supervisory Control and Data Acquisition) systems and combining it with enterprise-wide operational and historical data is beyond the capabilities of the SCADA architecture. Exposing these systems to a business-wide analytical platform for further processing can then point to the root of a problem, and indicate what is likely to happen next by putting together a ‘likelihood pathway’.
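
Building on the same matching idea, the sketch below shows one simple way such a ‘likelihood pathway’ could be assembled: find the historical vibration windows most similar to the current one and report how often each outcome followed them. The data, outcome labels and nearest-neighbour approach are assumptions made for illustration, not a description of any particular SCADA system or analytics product.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical library of past vibration windows, each labelled with the
# outcome that was observed within the following 24 hours.
outcomes = ("ok", "bearing_wear", "pump_trip")
history = [(rng.standard_normal(50) + i % 3, outcomes[i % 3]) for i in range(300)]

# Latest vibration window from the SCADA feed (offset chosen so that it
# resembles the "bearing_wear" examples in the library).
current = rng.standard_normal(50) + 1.0

def likelihood_pathway(window, history, k=25):
    """Outcome frequencies among the k historical windows most similar to this one."""
    ranked = sorted(history, key=lambda item: np.linalg.norm(window - item[0]))
    labels = [label for _, label in ranked[:k]]
    values, counts = np.unique(labels, return_counts=True)
    return dict(zip(values, counts / k))

for outcome, probability in likelihood_pathway(current, history).items():
    print(f"{outcome}: {probability:.0%}")
```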

It is substantial leaps in insight and efficiency such as these that are convincing more operators in the upstream oil and gas industry that they must break down the barriers between departments so they can fully exploit the big data sets they possess.

Only then can they obtain the greater insight, faster decision-making and improved ROI that advanced analytics can undoubtedly bring.

In fact, the entire upstream oil and gas industry needs to wake up to the enormous value it can very rapidly unlock by deploying advanced analytics techniques on the mass of data it constantly generates.

TERADATA
Duncan Irving is Oil and Gas Practice Lead, Teradata. Teradata, the big data analytics and marketing applications company, helps companies get more value from data than any other company. Teradata’s leading portfolio of big data analytic solutions, integrated marketing applications, and services can help organisations gain a sustainable competitive advantage with data.

For further information please visit: teradata.com