Predict Failure versus Predictive Maintenance


In a recent post by ARC Advisory Group, Peter Reynolds notes that 80% of assets fail randomly despite being supported by programs designed for asset maintenance and reliability. Only 3-5% of maintenance performed is predictive. The vast majority of maintenance is either break-fix or executed based on the OEM’s asset maintenance schedule – needed or not.

A broad set of factors drives asset performance, including variability in process conditions and flow outside the asset itself, factors which previously may not have been considered relevant to determining asset condition. With advanced analytics, the compute power is now available to combine asset health, asset condition, and process variables to determine an asset's true risk of failure.
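To make that concrete, here is a minimal sketch in Python of how condition readings and process variables might be combined to estimate failure risk. The file names, feature names, and 30-day horizon are hypothetical, for illustration only:

```python
# Minimal, hypothetical sketch: combine asset condition readings with
# process variables and let a model estimate each asset's risk of failure.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Assumed historical records: condition + process variables, labeled with
# whether the asset failed within the following 30 days.
history = pd.read_csv("asset_history.csv")
features = ["vibration_rms", "bearing_temp", "oil_particle_count",   # condition
            "upstream_flow_rate", "line_pressure"]                   # process
X_train, X_test, y_train, y_test = train_test_split(
    history[features], history["failed_within_30d"], test_size=0.2)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Estimated probability of failure for the assets being monitored today.
current = pd.read_csv("current_readings.csv")
current["risk_of_failure"] = model.predict_proba(current[features])[:, 1]
print(current.sort_values("risk_of_failure", ascending=False).head())
```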

More importantly, machine learning provides a means to see beyond the conventionally understood states that lead to asset failure. These machine learning models require an understanding of the operating and failure-mode states of the assets involved. As Reynolds points out, this probably means working with operating personnel, not maintenance personnel, to develop the models. That marks a change from condition-based maintenance and from less sophisticated predictive models.

Using sophisticated machine learning models, asset managers can know that a given asset will ride through a rough spot rather than fail as condition monitoring or prognostic models might have predicted, and will in fact go on to operate longer. This suggests that the P-F curve in ARC's post could look more like a sine wave than a gradual drop-off. The key is to have confidence in the algorithm's prediction that failure is not actually imminent. Only the right set of machine learning analytics can predict into the future without a loss of confidence.

Predictive and prescriptive analytics will indeed drive the next wave of improvements in asset performance. But only the right algorithms will provide the highest return on investment for those seeking lasting improvements in asset performance.

 

Image copyright: frimerke / 123RF Stock Photo (http://www.123rf.com/profile_frimerke)


Re-imagining the Future of Asset Maintenance


Asset failure, or more accurately, avoiding asset failure, is big business, as it should be. For asset-intensive industries, asset failure can mean revenue loss, customer dissatisfaction, brand degradation, even regulatory fines. So improving the means by which asset failure is avoided is as important as the asset's day-to-day production.

Many companies continue to take a break/fix approach to asset repair, or practice cyclical preventative maintenance, where pre-set characteristics of general asset types determine when maintenance is performed. Some are considering Condition-Based Maintenance (CBM), where a parameter of an asset is monitored and repair is performed when that parameter indicates a problem or imminent failure, based on statistical models for that type of asset. But greater business benefits are achieved with Predictive and Prescriptive Maintenance, often powered by machine learning, which look at the state of each individual asset, predict the probability of failure into the future, and optimize maintenance and repair schedules based on those predictions along with other constraints.
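To illustrate the prescriptive step, here is a small sketch that ranks hypothetical assets by expected avoided cost per crew hour and fills a limited maintenance window. The figures, asset names, and greedy heuristic are purely illustrative; a real prescriptive system would use a proper optimizer and many more constraints:

```python
# Illustrative only: given per-asset failure probabilities (e.g., from a
# predictive model), pick the work orders with the highest expected
# avoided cost that fit within the available crew hours.
assets = [
    # (asset_id, prob_failure_90d, cost_of_failure, repair_hours)
    ("pump-17",   0.42, 180_000, 6),
    ("valve-03",  0.08,  25_000, 2),
    ("motor-51",  0.31,  90_000, 4),
    ("feeder-09", 0.12, 140_000, 8),
]
crew_hours_available = 10

# Rank by expected avoided cost per crew hour (a simple greedy heuristic).
ranked = sorted(assets, key=lambda a: (a[1] * a[2]) / a[3], reverse=True)

schedule, hours_used = [], 0
for asset_id, p_fail, cost, hours in ranked:
    if hours_used + hours <= crew_hours_available:
        schedule.append(asset_id)
        hours_used += hours

print("Maintain this cycle:", schedule)   # ['pump-17', 'motor-51']
```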

ARC Advisory Group recently updated its Maintenance Maturity Model to note the availability and benefits of these more sophisticated analytic approaches. They noted that moving from preventive maintenance to predictive and prescriptive models can deliver 50 percent savings in labor and materials, which has a ripple effect from improvements in shipping times to customer satisfaction. They also observe that new technologies in the industrial internet of things (IIoT) enable inexpensive, real-time asset monitoring. Measuring vibration, heat, lubricants, and other asset conditions in real time is essential for the enterprise to adopt Predictive and Prescriptive Maintenance. Creating a 'digital twin', or software model of the asset, gives analytics software a basis to compare ideal and observed measurements.

Doesn't CBM provide many of the same benefits? Perhaps to a lesser extent, but there is no reason to settle for CBM. In CBM, analytics examines the current state of the asset and raises an alarm for likely asset failure. However, not every condition that appears to be heading toward failure actually will fail in that specific asset, and true asset maintenance optimization can occur only when an enterprise can reliably tell the difference. Avoiding unnecessary maintenance extends asset life at a fraction of the cost.

Predictive maintenance powered by machine learning should allow you to 'see over the hill,' beyond the current condition of each asset, to determine its most probable outcome. The combination of machine learning and IIoT could prove to be the missing link in smart and effective asset maintenance.

 

 

Image courtesy 123RF: http://www.123rf.com/profile_wi6995


DistribuTECH 2017 – Serious Networking for Energy Nerds


The annual DistribuTECH conference is right around the corner, this year at the San Diego Convention Center from January 31 – February 2.  With over 11,000 attendees from 78 countries and over 500 exhibiting companies, DistribuTECH is the place to be for those even mildly interested in energy transmission and distribution.

Space-Time Insight will be there, this year hosting pre-scheduled meetings in room 3946. (Schedule your meeting here.)

You can also see Space-Time Insight's advanced analytics in action on the exhibit floor in our partners' booths.

Partner               Booth   Demo
Siemens               3113    Asset Intelligence integrated with Siemens Spectrum Power
Sentient Energy       1025    Distribution Intelligence integrated with Sentient AMPLE Platform
Live Data Utilities   2352    Distribution Intelligence integrated with Live Data RTI Platform

Visit our partners and see the future of advanced analytics for the internet of things today.  Register for DistribuTECH or download a free exhibit hall pass.

 


CrateDB SQL Database Puts IoT and Machine Data to Work


Space-Time Insight joined Crate.io in its launch of CrateDB 1.0, an open source SQL database that enables real-time analytics for machine data applications. We make extensive use of machine learning and streaming analytics, and CrateDB is particularly well-suited to the geospatial and temporal data we work with, including support for distributed joins. It allows us to write sensor data at more than 200,000 rows per second and to query terabytes of data. Typical relational databases can't handle anywhere near the rate of ingestion that Crate can.

Crate handles and queries geospatial and temporal data particularly well. We also get image (BLOB) and text support, which is important for our IoT solutions, as they are often used to capture images on mobile devices in the field and provide two-way communication between people and machines. Crate is also microservice-ready — we’ve Dockerized our IoT cloud service, for example.
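For a sense of what this looks like in practice, here is a minimal sketch that writes and queries sensor data over CrateDB's SQL interface using its Python client (the `crate` package). The connection URL, table, and column names are assumptions for illustration, not our production schema:

```python
# Hypothetical example: store geo-tagged, time-stamped sensor readings in
# CrateDB and run a combined temporal + geospatial query.
from crate import client

conn = client.connect("http://localhost:4200")   # assumed local CrateDB node
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS sensor_readings (
        sensor_id STRING,
        ts        TIMESTAMP,
        location  GEO_POINT,
        value     DOUBLE
    )
""")

# Batched inserts are how high ingest rates are typically achieved.
cur.executemany(
    "INSERT INTO sensor_readings (sensor_id, ts, location, value) VALUES (?, ?, ?, ?)",
    [
        ("meter-001", "2017-01-15T10:00:00Z", [-122.4194, 37.7749], 42.7),
        ("meter-002", "2017-01-15T10:00:00Z", [-121.8863, 37.3382], 39.1),
    ],
)

# Recent readings within roughly 10 km of a point of interest.
cur.execute("""
    SELECT sensor_id, ts, value
    FROM sensor_readings
    WHERE ts >= '2017-01-15T00:00:00Z'
      AND distance(location, 'POINT(-122.4194 37.7749)') < 10000
    ORDER BY ts DESC
""")
print(cur.fetchall())
```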

Finally, our SI Studio platform uses Java and SQL and expects an SQL interface, so choosing Crate made integration straightforward and allowed us to leverage existing internal skill sets.

Read more at Crate.io, Space-Time Insight and The Register.


Machine Learning Analytics


Machine learning is all the rage, with business leaders scrambling to understand how it can benefit their organizations, and for some, even what machine learning is.  One thing is clear: the onslaught of data from the internet of things has made quickly scaling machine learning and advanced analytics the key to optimizing enterprise decision-making, operations, and logistics.

An enterprise-grade machine learning solution begins with three core capabilities:

  1. predictions without relying on knowledge of past events
  2. analysis and visualization of time series data
  3. optimized decision-making under uncertain conditions.

With these, an enterprise can put its data to work to improve operations and planning.
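As one small, hypothetical illustration of the first capability, an unsupervised model can flag unusual behaviour in sensor time series without any labeled history of past failures. The data source and column names below are made up:

```python
# Sketch: flag anomalous readings in a sensor time series without relying
# on labeled examples of past events, using an unsupervised model.
import pandas as pd
from sklearn.ensemble import IsolationForest

readings = pd.read_csv("meter_readings.csv", parse_dates=["timestamp"])
readings = readings.set_index("timestamp").sort_index()

# Simple time-series features: the raw value plus its change since the last reading.
readings["delta"] = readings["kwh"].diff().fillna(0)

model = IsolationForest(contamination=0.01, random_state=0)
readings["anomaly"] = model.fit_predict(readings[["kwh", "delta"]])  # -1 = outlier

print(readings[readings["anomaly"] == -1].head())
```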


Handy resources to learn more about machine learning:

State of Enterprise Machine Learning

Major Roadblocks on the Path to Machine Learning

Mainstreaming Machine Learning


National Grid Webinar: Answering Your Questions


Recently, David Salisbury, Head of Network Engineering at National Grid, and Neil Barry, Senior Director EMEA at Space-Time Insight, presented the webinar “How Analytics Helps National Grid Make Better Decisions to Manage an Aging Network“, hosted by Engerati. [Listen to the recording here.] Unfortunately, not all of the submitted questions could be answered in the time allotted, so responses are provided in this post.

How were pdf data sources incorporated into your analytics? How will that be kept up to date?

To correct a point from the webinar discussion, pdf data sources were not analysed in the valves and pipeline use cases. For the corrosion use case, data from pdf reports was manually rekeyed into the analytics solution.

 

Are there mechanisms built into the system that facilitate data verification and data quality monitoring?

In the general case, metrics were computed for data completeness (e.g., of the desired data, how much was actually available) and confidence (e.g., how recent was the data we used). For the corrosion use case, there are checks for data consistency and completeness.  For pipelines and valves, these metrics have not yet been fully configured.
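As a generic illustration only (not National Grid's actual calculation), completeness and recency-based confidence metrics of this kind might look something like this:

```python
# Hypothetical sketch of the two metric types described above.
from datetime import datetime, timezone

def completeness(record: dict, desired_fields: list) -> float:
    """Fraction of the desired fields actually present in a record."""
    present = sum(1 for f in desired_fields if record.get(f) is not None)
    return present / len(desired_fields)

def confidence(last_updated: datetime, half_life_days: float = 365.0) -> float:
    """Score that decays as the data ages; 1.0 means updated today."""
    age_days = (datetime.now(timezone.utc) - last_updated).days
    return 0.5 ** (age_days / half_life_days)

record = {"wall_thickness_mm": 11.2, "coating_condition": None, "soil_resistivity": 48.0}
print(completeness(record, ["wall_thickness_mm", "coating_condition", "soil_resistivity"]))  # ≈ 0.67
print(confidence(datetime(2015, 6, 1, tzinfo=timezone.utc)))
```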

 

Could you describe how this helps with the audit trail?  As the system changes, the current snapshot is updated.  How do you show the status at a certain point in the past when a decision was made?

For the corrosion use case, the history is stored and accessible, providing an audit trail. The foundation analytics does offer a ‘time slider’ that delivers animated time series data, making it easy for the user to go back in time.  However, this is not currently configured for National Grid.

 

Please provide specific examples of how decisions were made based on analytics and demonstration of analytics/predictive analysis

David described an example at around the eight minute mark into the webinar – budgets used to be set locally, but the insight from analytics might show that a particular type of problem is located in a specific geographic area. This can help with decisions around investment and risk.

 

How have you defined Asset Health? What data is required to assess?

Models for asset health were agreed upon by National Grid and Space-Time Insight during the implementation process. For pipelines, as was mentioned in the webinar, two of the data sets are Close Interval Potential Survey (CIPS) and Inline Inspection (ILI). For valves, a number of data sets are used, including test results and work orders.

 

Did you look at techniques to predict issues based on historical data…so you can target risk areas?

This has not been implemented by National Grid.  However, the product software has the capability to predict the probability of failure and the criticality of that failure, as one example.

 

Has Space Time insight worked on developing a situational intelligence tool for electric distribution and/or transmission applications? Similar to the gas transmission monitoring developed for National Grid?

Yes, Space-Time Insight offers an asset intelligence solution for electricity transmission and distribution utilities.  More information is available online.


How To Not Be Stubborn In 2020


I pulled the following two predictions about analytics and decision making from a recent list of 100 predictions by Gartner analysts (subscription required):

  • By 2018, decision optimization will no longer be a niche discipline; it will become a best practice in leading organizations to address a wide range of complex business decisions.
  • Through 2020, over 95% of business leaders will continue to make decisions using intuition, instead of probability distributions, and will significantly underestimate risks as a result.

Apparently most of us will refuse to get the message about optimizing decisions, even after years of tools and best practices being in place. In Gartner’s 2020, we’re all still stubborn foot-draggers.

In my experience, predictions like these often require a grain of salt. Generalizations such as “over 95% of business leaders” at “leading organizations” who “significantly underestimate risk” lack the mathematical precision necessary to inspire confidence and change behavior.

Predictions like this often contain a grain of truth, as well. We frequently prefer our personal comfort zone, resist change, suffer from confirmation bias, and respect the confines of our organization's formal and informal culture.

Keep in mind that being stubborn can quickly lead to being history. Accenture CEO Pierre Nanterme notes that half of the companies in the Fortune 500 disappeared between 2000 and 2015. Why? New business models based on digital technologies, including decision optimization. The rapid pace of change and disruption will continue, and increase.

So, how do you avoid becoming a historical footnote by 2020?

  • Start with the end in mind. Decision optimization starts with the BI dashboards that (I hope) you are using today, and extends to advanced analytics that include prediction, simulation, and prescription. Knowing where you’re heading helps you plan a route and schedule for reaching your destination.
  • Start small. You won't get to optimal decisions immediately. Identifying what decisions you can automate helps you pinpoint feasible projects with measurable ROI. Chances are, regardless of how digital your industry is now, there is low-hanging fruit to be picked.
  • Start now. Start this quarter or this month or this week, or even today. With hosted and cloud solutions, you don’t need to complete a big IT project before you can start improving decision making through analytics. In fact, you don’t have time for the typical enterprise project that requires years.

The year 2020 may seem like a long way off.  In truth, it’s 12 calendar quarters away. That’s not long. Start now and you’ll be 12 quarters ahead of some other stubborn dog.


How digital is your industry?



Jeff Bezos, founder and CEO of Amazon, famously wrote in 1997 that it was “Day One of the Internet.” Now nearly 20 years later, he still feels that we’re at Day One, and early in the morning to boot. How can that be, given how pervasive and transformative digital technology seems these days?

This Harvard Business Review video describes a McKinsey Global Institute survey about just how digital various industries are today.

The survey examined 27 digital characteristics about assets, usage, and labor across more than 20 industry sectors. It uncovered plenty of room to bring digital technology and approaches into vast areas of our economy.

A few sectors, such as IT, media, finance, and professional services, are heavily digital. Other industries, such as real estate and chemicals, have adopted an ad hoc or tactical approach to digital, but have not converted their value and supply chains to digital from end to end. There remain large portions of the economy, including key functions such as government, healthcare, construction, and agriculture, that have very little digitization.

The video points out that the Industrial Internet of Things gives capital intensive industries such as utilities,  manufacturing, and transportation significant opportunity for improvement by connecting physical assets to data and communications networks.

The video also notes that possibly the biggest area for improvement lies in providing workers with digital tools and processes. Providing analytics across the enterprise, instead of keeping it the sole province of IT or data science, is an empowering step industries can take for more productive workers.

Many emerging and developing countries have skipped telephone landlines and moved directly to mobile technology. Can the digital laggards of the economy similarly leapfrog previous digital stages and move directly to end-to-end digital processes with connected, digital assets and advanced analytics?

To my mind, situational intelligence can help government leapfrog from laggard to leader. Many government applications center on documents, payments, and professional services, tasks that are already heavily digitized in other sectors. Government also involves a lot of transportation and real estate functions, sectors that are ahead of government digitally and poised to benefit from the Industrial Internet of Things.

(Image: goodluz / 123RF Stock Photo)


Improving Your Operations with Data Visualization Applications


Operational and Cost Efficiency

Visual analytics is an effective way to understand the stories in your data and take appropriate actions. Effective data visualizations powered by visual analytics enable you to easily and interactively dive deep into your data to identify opportunities to improve efficiency and productivity, cut costs, and optimize resources, all of which are at the crux of increasing operational efficiency. Hence operations leaders want to quickly understand what is in their data so they can initiate and monitor improvement programs as well as make the best possible decisions for the types of situations they regularly face.

But there's a small catch: while it's tempting to believe that putting powerful data visualization authoring software in the hands of business users will result in useful solutions, this is rarely a recipe for success. Creating effective data visualizations requires expertise, because visualization authoring software does not magically surface insights on its own. Human expertise is needed to build the functionality that surfaces insights and conveys them intuitively, so that they are actionable for making operational improvements.

Basic tables and charts are easy to create; solving problems and enabling the data discovery that leads to root causes and opportunities to improve operations is an entirely different matter. Spreadsheets and most visualization software make it easy to create pretty charts and to combine tables and charts into dashboards, but they do not fully meet the needs of operations leaders. Let's face it: if spreadsheets alone were sufficient, you'd already have all you need to effectively run your business operations.

Questions that you should ask yourself are:

  • Do one or more simple visualizations, or dashboards containing multiple simple visualizations, solve real business operations problems?
  • Do the visualizations surface buried insights and make them readily comprehensible and actionable?
  • Is it possible to clearly visualize activity on a map and see changes unfold over time?
  • Is it possible to synchronize components within a dashboard so they update when data is selected, filtered, and sorted?
  • How easily can these capabilities be implemented, if at all?

The answers to these questions expose the need for functionality that transcends data visualizations alone; application software is required to deliver it, or to be specific, data visualization applications. This is one of the reasons that expertise is required: applications must be implemented to fully deliver on the promise of visual analytics. That expertise includes art, skill, and knowledge that typical operations personnel do not possess. Business users rarely understand their organization's data landscape or how to model and transform their data for visualization, and they often don't have the time to learn and hone the expertise needed to implement data visualization applications, regardless of how simple modern data visualization development tools are to use.

Full service offerings that use best-of-breed visual analytics are a great way to obtain the needed combination of expertise and visual analytics to enable you to achieve the objective set forth in this post – to improve all aspects of operational efficiency.


UK Smart Meter Roll Out: It’s All About The Data



Despite some 2.6 million smart meters already being installed in the UK, it is the data infrastructure that is delaying the further roll out of smart meters, according to a recent BBC article. This IT project is necessary to support the volume of data anticipated from the smart meter roll out being pushed by the government.

From the chart below you can see how many meters have been installed since 2012. Higher volumes of data are already being collected, which reinforces the need for this important IT project to be up and running as soon as possible.

[Chart: UK smart meter installations since 2012]

(Chart and data available from the UK Department of Energy & Climate Change)

With news that the data infrastructure launch is pushed back until the autumn, what impact will this have?

How much data will smart meters generate?

A quick calculation: with one reading per month from each of the roughly 53 million potential smart meters across the UK, there would be around 53 million reads per month. With smart meters that record data every 15 minutes, that becomes 96 reads per meter per day from 53 million meters, resulting in thousands of times more data being generated. This is obviously a rough estimate, but it gives an indication of what the energy companies would be dealing with, and it doesn't include status messages from the meters, which add to the mass of data being generated.
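For those who want to check the arithmetic, here is the back-of-the-envelope version. The meter count and reading frequency are the assumptions stated above, not official figures:

```python
# Rough comparison of monthly reads: manual readings vs 15-minute smart meter data.
meters = 53_000_000                        # potential smart meters across the UK
manual_reads_per_month = meters * 1        # one reading per meter per month

reads_per_meter_per_day = 24 * 60 // 15    # a reading every 15 minutes = 96 per day
smart_reads_per_month = meters * reads_per_meter_per_day * 30

print(f"{manual_reads_per_month:,} reads per month today")
print(f"{smart_reads_per_month:,} reads per month with smart meters")
print(f"growth factor: {smart_reads_per_month / manual_reads_per_month:,.0f}x")  # 2,880x
```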

Why is this so important, if smart meters are just about making billing automated and putting an end to manual meter reading? There is a lot more value within meter readings and status messages beyond billing.

The benefits of smart meters are clear for consumers: tracking how much energy you are using, monitoring the effect of changes that you have made to your energy consumption, and receiving accurate bills without having to submit a meter reading.

When applied properly, data helps energy companies manage supply and demand much more easily. Energy companies benefit from analysing the data collected from smart meters to enable new rates and business models, implement demand response programs, better manage solar panels, and improve support for electric vehicles, to mention but a few.

To benefit from the thousands-fold growth in meter data, energy companies need analytics that locate the problems and opportunities hidden inside this massive amount of data. Smart meter analytics must be intelligent enough to do the heavy lifting for users, not just make it easier somehow for users to browse among millions of meters. Increasingly, analytics for this size of data set needs the intelligence and autonomy to make decisions independently.

Once the IT infrastructure is in place, the UK energy companies can start pursuing the new value within smart meter data, analysing it to make better business decisions. All 53 million UK meters likely won't be changed out by 2020, but that shouldn't stop UK energy providers from using the smart meter data they already have, or will have soon.

(Image courtesy of rido / 123RF Stock Photo)
