Analytics technology to study oil and gas data

11 April 2018



Oil and gas pipelines generate colossal amounts of potentially invaluable data. We look into how operators are deploying sophisticated analytics technology to use this information to improve safety and efficiency.


Big data is nothing new where oil and gas are concerned. If you take the term simply to mean ‘a large quantity and variety of data’, then the industry has been dealing with it for many years. However, it is only relatively recently that companies have begun to exploit the available information to the full.

For many of them, this has meant investing in sophisticated analytics software that can put the data to good use. While the possibilities have been discussed for some time, it took the global downturn of the past few years – along with the associated cost-cutting measures – to spur a shift towards digital.

According to a 2015 survey by Accenture and Microsoft, involving oil and gas industry professionals worldwide, the next few years are likely to see a wave of investments in big data and the industrial internet of things (IoT). Despite tighter budgets, only 20% of respondents said they planned to spend less on digital over the coming three to five years, with 62% saying they would spend more.

The rationale, by and large, had less to do with cost-cutting and more to do with operational efficiency. Around 90% of respondents said analytics capabilities would add business value, with a similar proportion extolling the benefits of mobile technologies, industrial IoT and automation.

Faster and better

“Oil and gas industry leaders continue to look to digital technology as a way of addressing some of the key challenges the industry faces today in this lower crude oil-price cycle,” says Rich Holsman, global head of digital in Accenture’s energy industry group. “Making the most of big data, IoT and automation are the next big opportunities for energy and oilfield services companies, and many are already starting work in these areas.”

“Mobility and digital technology is gaining traction as oil and gas players learn to use it to make faster and better decisions, from the field to the front office,” adds Craig Hodges, general manager of the Gulf Coast District at Microsoft.

As you might expect in such an information-driven industry, big data has a wide array of uses, including detecting hydrocarbon deposits, managing wells more efficiently, and optimising oil and gas production.

However, one of the main applications is pipeline management, with operators turning to digital technology to gauge the state of their networks in real time.

Currently, the global transportation pipeline infrastructure stretches over 3.5 million kilometres, with around two thirds of that in the US. This network is ageing and corroded in parts, meaning leaks are an ever-present concern and high levels of maintenance are required.

Even a small failure can be catastrophic. Take the incident in mid-2015, when a tiny crack caused about 35,000 barrels of emulsion to spill in northern Alberta. According to the operator, the leak could have happened any time during the two weeks before detection. Although there was a monitoring system in place, it clearly did not work as planned.

Intelligent technology can provide a more efficient way to evaluate what’s happening inside the pipeline.

“Over the past few years, the industry, and TransCanada, has identified significant opportunities dealing with interacting pipeline integrity threats,” says Mark Cooper, a spokesperson for TransCanada. “By bringing more data into a centralised environment, the number of opportunities to use this information for new insights on assets continues to grow.”

TransCanada has been one of the early movers in this field. In 2016, the company acquired Columbia Pipeline Group, making it one of the largest natural gas transmission companies in North America. With the acquisition came Columbia’s Enterprise Analytics platform, designed to monitor assets and mitigate unplanned failures.

The platform is demonstrably effective. Since Columbia signed its first enterprise agreement in 2011, its asset availability has surged from 85% to 98%. On top of this, TransCanada has invested heavily in its geospatial information system and pipeline integrity applications.

“TransCanada’s suite is a combination of commercial off-the-shelf (COTS) tools and custom-developed analytics solutions that help it manage pipeline integrity,” says Cooper. “The company has found value in adding data to a centralised environment and using new and existing tools to run enhanced analytics in support of realising its goal of zero pipeline safety incidents.”

The Enterprise Analytics platform, created in partnership with software vendor OSIsoft, ‘listens’ to the assets’ data stream. It uses sophisticated algorithms to detect anomalies, and can alert technicians to anything unusual well before a failure occurs.
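The article does not say which algorithms the platform uses, but a common baseline for anomaly detection on a sensor stream is a rolling z-score: flag any reading that deviates sharply from the recent history of that sensor. A minimal illustrative sketch in Python (the window size, threshold and signal below are invented for illustration, not TransCanada's or OSIsoft's actual method):

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations
    away from the mean of the previous `window` readings."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                alerts.append((i, value))  # raise the alert before a failure
        recent.append(value)
    return alerts

# A steady alternating pressure signal with one spike at index 30
signal = [100.0, 100.5] * 15 + [140.0] + [100.0] * 10
print(detect_anomalies(signal))  # → [(30, 140.0)]
```

In a real deployment the alert would be routed to a technician rather than returned from a function, and each sensor type would need its own window and threshold.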

The information itself is collected via a network of sensors, which take many forms and support a range of business processes. All modern oil and gas networks incorporate sensors to monitor pressure, temperature, density, flow and compressor condition, among other things. The difficulty lies in deriving intelligence from the data, which, owing to its sheer volume, can be hard to parse.
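One common first step in taming that volume is to roll raw readings up into per-sensor summaries over fixed time windows, so downstream analytics work on aggregates rather than on every sample. A hypothetical sketch (the sensor names and readings are invented):

```python
from collections import defaultdict

# Each reading: (sensor_id, timestamp_seconds, value). Names are illustrative.
readings = [
    ("pressure_ps1", 0, 101.2), ("pressure_ps1", 30, 101.4),
    ("pressure_ps1", 65, 99.8), ("temp_cs4", 10, 55.0),
    ("temp_cs4", 70, 56.2),
]

def aggregate(readings, window_s=60):
    """Roll raw readings up into per-sensor, per-window min/max/mean."""
    buckets = defaultdict(list)
    for sensor, ts, value in readings:
        buckets[(sensor, ts // window_s)].append(value)
    return {
        key: {"min": min(vs), "max": max(vs), "mean": sum(vs) / len(vs)}
        for key, vs in buckets.items()
    }

summary = aggregate(readings)
print(summary[("pressure_ps1", 0)])  # min/max/mean for the first minute
```

The same idea scales out in practice via stream-processing or historian software rather than an in-memory dictionary.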

Safer and more efficient

The Enterprise Analytics platform, however, is able to sift through that information quickly and reliably. By locating small problems before they become big ones, the software enables the company to take early corrective measures. This means lower failure and downtime rates, and less time spent on maintenance.

As a corollary, it has saved the company money. Matt Parks, director of reliability, puts the estimate at more than $7 million in cost avoidance on US Gas Operations East assets for 2016 alone. The programme, deployed on 97% of Columbia Pipeline compression assets, is now preparing for a roll-out in the US West pipelines and may eventually be used across TransCanada.

“This is a situation where the safety and business case aligns,” says Cooper. “TransCanada prides itself on being a responsible pipeline operator that puts safety of the communities in which it lives, works and operates above all.

“Continuously improving in this space is a means by which it can continue to exceed the safety expectations of its public stakeholders and regulators.”

TransCanada is not the only company to have embraced technology of this kind. In fact, companies that fail to do so risk being left behind. A 2014 study by Bain found that organisations with better analytics capabilities were twice as likely to be in the top quartile of financial performance in their industry, five times more likely to make decisions faster than their peers and three times more likely to execute decisions as planned. While this study covered a range of industries, its implications for the upstream sector are significant: Bain has estimated that analytics could help oil and gas companies increase production by up to 8%.

In 2013, Shell announced its plan to become “the most innovative energy company in the field of digital”, including a focus on big data for maintenance.

As of 2016, the company had saved over $1 million by using IoT devices to monitor its Nigerian oilfields. The system chosen – a random-phase, multiple-access connectivity platform by Ingenu – allowed Shell to collect field data without the need for complex infrastructure. Shell said this would lead to safer and more efficient oilfield operations.

Likewise, BP is working with GE, and its Predix software, to make its oil wells smarter. The latest technology deployed is the Plant Operations Adviser, which is designed to improve reliability and prevent unplanned downtime. The system provides access to a number of live data feeds, and includes a real-time facility threat display. It brings together big data, cloud hosting and analytics.

“BP gravitates towards new technology, especially digital, and that makes working with it exciting,” says Lorenzo Simonelli, president and CEO of GE Oil & Gas.

“GE is taking a big step forward together during this time of digital transformation,” he continues, “deploying what it has co-created over the past year to drive new kinds of productivity improvements.”

GE has also co-developed a Predix Platform-powered intelligent pipeline system with Accenture. Its predictive analytics capabilities should enable issues that would once have taken months to fix to be resolved in days.

“There’s integration, in real time, of the best, accurate data,” says Mauricio Palomino, senior solutions architect with GE. “All of the systems of record are integrated, in real time, to help operators calculate risk. GE helps operators identify conditions that might happen in the pipeline, and helps them optimise operations in the field alongside their other activities. Pipeline management becomes a very proactive process, prioritising different challenges.”

One of the system’s key benefits is that it brings data sources together. Many oil companies still work in a siloed way, making it difficult to gain a single view of what’s going on across their operations. GE’s ‘intelligent pipelines’ allow silos to ‘talk’.
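The article gives no detail on how GE's system joins those silos, but the idea of a single view can be illustrated as a key-based merge of records from two hypothetical sources, keyed on pipeline segment (all field names and values below are invented):

```python
# Illustrative records from two "silos"; schemas are assumptions.
sensor_alerts = [
    {"segment": "A-12", "alert": "pressure drop"},
    {"segment": "B-07", "alert": "flow anomaly"},
]
inspections = [
    {"segment": "A-12", "last_inspected": "2016-03-01", "corrosion": "moderate"},
    {"segment": "C-03", "last_inspected": "2017-08-15", "corrosion": "low"},
]

def unified_view(alerts, inspections):
    """Merge both sources by segment so operators see one record per segment."""
    by_segment = {rec["segment"]: dict(rec) for rec in inspections}
    for alert in alerts:
        rec = by_segment.setdefault(alert["segment"], {"segment": alert["segment"]})
        rec.setdefault("alerts", []).append(alert["alert"])
    return by_segment

view = unified_view(sensor_alerts, inspections)
print(view["A-12"])  # live alert and inspection history in one record
```

The value of the merged record is that an operator weighing a live pressure alert on segment A-12 also sees, in the same view, that the segment already has moderate corrosion on file.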

The future possibilities are certainly exciting. As oil and gas companies get better at harnessing information, big data will help them not just with threat monitoring but also with making informed decisions. TransCanada has said that its software is not just about anomaly detection; it is also about understanding the nuances of its fleet as a whole.

“Through machine learning – a method using complex models and algorithms that can learn and make predictions on data – data can be integrated, key performance indicators can be compared to identify inefficient performance, and operational intelligence can be used to make data-driven decisions for our compression fleet,” said Keary Rogers, TransCanada manager of core reliability, in a press release.
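The KPI comparison Rogers describes can be illustrated with a toy example: compute a fuel-per-throughput figure for each compressor unit and flag any unit well above the fleet median (the unit names, numbers and tolerance are invented; a real KPI would be derived from sensor history):

```python
from statistics import median

# Hypothetical per-unit figures, not real fleet data.
fleet = {
    "unit_01": {"fuel": 520.0, "throughput": 100.0},
    "unit_02": {"fuel": 510.0, "throughput": 98.0},
    "unit_03": {"fuel": 700.0, "throughput": 95.0},  # burns fuel inefficiently
    "unit_04": {"fuel": 505.0, "throughput": 101.0},
}

def flag_inefficient(fleet, tolerance=1.2):
    """Flag units whose fuel-per-throughput KPI exceeds 1.2x the fleet median."""
    kpis = {unit: d["fuel"] / d["throughput"] for unit, d in fleet.items()}
    baseline = median(kpis.values())
    return sorted(unit for unit, kpi in kpis.items() if kpi > tolerance * baseline)

print(flag_inefficient(fleet))  # → ['unit_03']
```

Comparing against the fleet median rather than a fixed limit means the benchmark adapts as the fleet's overall performance shifts, which is the point of making the comparison data-driven.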

Cooper adds that TransCanada continues to invest heavily in research and development. “Some investment in this space focuses on improved analytical models, and also includes the development of new sensors that generate an increasing volume of data that is used to inform pipeline maintenance decisions,” he says.

“TransCanada will continue to focus on the future of tools developed specifically for the pipeline industry while also using other available analytical tools.”

As cost-cutting measures continue throughout the industry, operators will need to be smart with what they have. Big data can help them eke out as much efficiency as possible from their infrastructure, while reducing the risk their assets will fail.

Tapping into pipeline data is the key to improving work practices.

