Machine Learning Analytics


Machine learning is all the rage, with business leaders scrambling to understand how it can benefit their organizations and, for some, even what machine learning is. One thing is clear: the onslaught of data from the Internet of Things has made quickly scaling machine learning and advanced analytics the key to optimizing enterprise decision-making, operations, and logistics.

An enterprise-grade machine learning solution begins with three core capabilities:

  1. predictions without relying on knowledge of past events
  2. analysis and visualization of time series data
  3. optimized decision-making under uncertain conditions

With these, an enterprise can put its data to work to improve operations and planning.


Handy resources to learn more about machine learning:

State of Enterprise Machine Learning

Major Roadblocks on the Path to Machine Learning

Mainstreaming Machine Learning


National Grid Webinar: Answering Your Questions


Recently David Salisbury, Head of Network Engineering for National Grid, and Neil Barry, Senior Director EMEA at Space-Time Insight, presented the webinar “How Analytics Helps National Grid Make Better Decisions to Manage an Aging Network“, hosted by Engerati. [Listen to the recording here.] Unfortunately, there was not enough time to answer all of the submitted questions, so responses to the remainder are provided in this post.

How were PDF data sources incorporated into your analytics? How will they be kept up to date?

To correct the discussion in the webinar: PDF data sources were not analysed in the valve and pipeline use cases. For the corrosion use case, data from PDF reports was manually rekeyed into the analytics solution.


Are there mechanisms built into the system that facilitate data verification and data quality monitoring?

In the general case, metrics were computed for data completeness (e.g., of the desired data, how much was actually available) and confidence (e.g., how recent was the data we used). For the corrosion use case, there are checks for data consistency and completeness.  For pipelines and valves, these metrics have not yet been fully configured.
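To make those two metrics concrete, here is a minimal sketch of how completeness and confidence might be computed. The record structure, field names, and half-life weighting are illustrative assumptions, not details of the National Grid implementation:

```python
from datetime import datetime, timezone

def completeness(records, required_fields):
    """Fraction of the desired fields that are actually populated across records."""
    total = len(records) * len(required_fields)
    filled = sum(1 for r in records for f in required_fields if r.get(f) is not None)
    return filled / total if total else 0.0

def confidence(records, timestamp_field="last_inspected", half_life_days=365):
    """Score recency: a record loses half its weight every half_life_days."""
    now = datetime.now(timezone.utc)
    scores = []
    for r in records:
        ts = r.get(timestamp_field)
        if ts is None:
            scores.append(0.0)  # a missing timestamp counts as zero confidence
        else:
            age_days = (now - ts).days
            scores.append(0.5 ** (age_days / half_life_days))
    return sum(scores) / len(scores) if scores else 0.0
```

Both functions return a score between 0 and 1, which makes them easy to track on a dashboard and to compare across the corrosion, pipeline, and valve use cases.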


Could you describe how this helps with the audit trail?  As the system changes, the current snapshot is updated.  How do you show the status at a certain point in the past when a decision was made?

For the corrosion use case, the history is stored and accessible, providing an audit trail. The foundational analytics platform does offer a ‘time slider’ that animates time series data, making it easy for the user to go back in time. However, this is not currently configured for National Grid.


Please provide specific examples of how decisions were made based on analytics, and a demonstration of analytics/predictive analysis.

David described an example at around the eight-minute mark of the webinar – budgets used to be set locally, but insight from analytics might show that a particular type of problem is concentrated in a specific geographic area. This can help with decisions around investment and risk.


How have you defined Asset Health? What data is required to assess it?

Models for asset health were agreed upon by National Grid and Space-Time Insight during the implementation process. For pipelines, as was mentioned in the webinar, two of the data sets are Close Interval Potential Survey (CIPS) and Inline Inspection (ILI). For valves, a number of data sets are used, including test results and work orders.


Did you look at techniques to predict issues based on historical data…so you can target risk areas?

This has not been implemented by National Grid.  However, the product software has the capability to predict the probability of failure and the criticality of that failure, as one example.


Has Space-Time Insight worked on developing a situational intelligence tool for electric distribution and/or transmission applications, similar to the gas transmission monitoring developed for National Grid?

Yes, Space-Time Insight offers an asset intelligence solution for electricity transmission and distribution utilities.  More information is available online.


How To Not Be Stubborn In 2020


I pulled the following two predictions about analytics and decision making from a recent list of 100 predictions by Gartner analysts (subscription required):

  • By 2018, decision optimization will no longer be a niche discipline; it will become a best practice in leading organizations to address a wide range of complex business decisions.
  • Through 2020, over 95% of business leaders will continue to make decisions using intuition, instead of probability distributions, and will significantly underestimate risks as a result.

Apparently most of us will refuse to get the message about optimizing decisions, even after years of tools and best practices being in place. In Gartner’s 2020, we’re all still stubborn foot-draggers.

In my experience, predictions like these often require a grain of salt. Generalizations such as “over 95% of business leaders” at “leading organizations” who “significantly underestimate risk” lack the mathematical precision necessary to inspire confidence and change behavior.

Predictions like these often contain a grain of truth as well. We frequently prefer our personal comfort zone, resist change, suffer from confirmation bias, and respect the confines of our organization’s formal and informal culture.

Keep in mind that being stubborn can quickly lead to being history. Accenture CEO Pierre Nanterme notes that half of the companies in the Fortune 500 disappeared between 2000 and 2015. Why? New business models based on digital technologies, including decision optimization. The rapid pace of change and disruption will only continue and increase.

So, how do you avoid becoming a historical footnote by 2020?

  • Start with the end in mind. Decision optimization starts with the BI dashboards that (I hope) you are using today, and extends to advanced analytics that include prediction, simulation, and prescription. Knowing where you’re heading helps you plan a route and schedule for reaching your destination.
  • Start small. You won’t get to optimal decisions immediately. Identifying what decisions you can automate helps you pinpoint feasible projects with measurable ROI. Chances are, regardless of how digital your industry is now, there is low-hanging fruit to be picked.
  • Start now. Start this quarter or this month or this week, or even today. With hosted and cloud solutions, you don’t need to complete a big IT project before you can start improving decision making through analytics. In fact, you don’t have time for the typical enterprise project that requires years.

The year 2020 may seem like a long way off.  In truth, it’s 12 calendar quarters away. That’s not long. Start now and you’ll be 12 quarters ahead of some other stubborn dog.


How digital is your industry?


Jeff Bezos, founder and CEO of Amazon, famously wrote in 1997 that it was “Day One of the Internet.” Now nearly 20 years later, he still feels that we’re at Day One, and early in the morning to boot. How can that be, given how pervasive and transformative digital technology seems these days?

This Harvard Business Review video describes a McKinsey Global Institute survey about just how digital various industries are today.

The survey examined 27 digital characteristics about assets, usage, and labor across more than 20 industry sectors. It uncovered plenty of room to bring digital technology and approaches into vast areas of our economy.

A few sectors, such as IT, media, finance, and professional services, are heavily digital. Other industries, such as real estate and chemicals, have adopted an ad hoc or tactical approach to digital but have not converted their value and supply chains to digital from end to end. Large portions of the economy, including key functions such as government, healthcare, construction, and agriculture, still have very little digitization.

The video points out that the Industrial Internet of Things gives capital intensive industries such as utilities, manufacturing, and transportation significant opportunity for improvement by connecting physical assets to data and communications networks.

The video also notes that possibly the biggest area for improvement lies in providing workers with digital tools and processes. Providing analytics across the enterprise, instead of keeping it the sole province of IT or data science, is an empowering step industries can take for more productive workers.

Many emerging and developing countries have skipped telephone landlines and moved directly to mobile technology. Can the digital laggards of the economy similarly leapfrog previous digital stages and move directly to end-to-end digital processes with connected, digital assets and advanced analytics?

To my mind, situational intelligence can help government leapfrog from laggard to leader. Many government applications center on documents, payments, and professional services, tasks that are already heavily digitized in other sectors. Government also involves a lot of transportation and real estate functions, sectors that are ahead of government digitally and poised to benefit from the Industrial Internet of Things.

(Image: goodluz / 123RF Stock Photo)


Improving Your Operations with Data Visualization Applications


Operational and Cost Efficiency

Visual analytics is an effective way to understand the stories in your data and take appropriate actions. Effective data visualizations powered by visual analytics enable you to easily and interactively dive deep into your data to identify opportunities to improve efficiency and productivity, cut costs, and optimize resources, all of which are at the crux of increasing operational efficiency. Hence operations leaders want to quickly understand what is in their data so they can initiate and monitor improvement programs, as well as make the best possible decisions for the types of situations they regularly face.

But there’s a small catch – while it’s tempting to believe that putting powerful data visualization authoring software in the hands of business users will result in useful solutions, this is rarely a recipe for success. Creating effective data visualizations requires expertise: visualization authoring software does not magically surface insights on its own. Human expertise is required to implement the type of functionality that surfaces insights and conveys them intuitively, so they are actionable for making operational improvements.

Basic tables and charts are easy to create; however, solving problems and enabling data discovery that leads to root causes and opportunities to improve operations is an entirely different matter. Spreadsheets and most visualization software make it easy to create pretty charts and to combine tables and charts into dashboards, but they do not fully meet the needs of operations leaders. Let’s face it: if spreadsheets alone were sufficient, you’d have all you need to effectively run your business operations.

Questions that you should ask yourself are:

Do one or more simple visualizations or dashboards containing multiple simple visualizations solve real business operations problems?

Do the visualizations surface buried insights and make them readily comprehensible and actionable?

Is it possible to clearly visualize activity on a map and see changes unfold over time?

Is it possible to synchronize components within a dashboard so they update when data is selected, filtered, and sorted?

Of these capabilities, how easily can they be implemented (if at all)?

The answers to these questions expose the necessity for specific functionality that transcends data visualizations alone; application software is required to deliver such functionality – to be specific, data visualization applications. This is one of the reasons that expertise is required: applications must be implemented to fully deliver on the promise of visual analytics. That expertise spans art, skill, and knowledge that typical operations personnel do not possess. Business users rarely understand their organization’s data landscape or how to model and transform their data for visualization, and they often don’t have the time to learn and hone the expertise needed to implement data visualization applications, regardless of how simple modern data visualization development tools are to use.
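To make the synchronization question above concrete, here is a minimal sketch of the kind of application logic involved. It is a generic publish-and-redraw pattern, not the Space-Time Insight implementation; the class and field names are hypothetical:

```python
class Dashboard:
    """Cross-filtering: every registered view redraws when shared filters change."""
    def __init__(self):
        self.views = []
        self.filters = {}

    def register(self, view):
        self.views.append(view)

    def apply_filter(self, field, value):
        self.filters[field] = value  # a selection, filter, or sort updates shared state...
        for v in self.views:         # ...and every view (chart, table, map) redraws in sync
            v.render(self.filters)

class View:
    def __init__(self, name):
        self.name = name
    def render(self, filters):
        print(f"{self.name} redrawn with {filters}")

dash = Dashboard()
dash.register(View("asset map"))
dash.register(View("work order table"))
dash.apply_filter("region", "north")  # both views update together
```

Even this toy example hints at why a dashboard with synchronized components is application software rather than a pile of standalone charts.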

Full service offerings that use best-of-breed visual analytics are a great way to obtain the needed combination of expertise and visual analytics to enable you to achieve the objective set forth in this post – to improve all aspects of operational efficiency.


UK Smart Meter Roll Out: It’s All About The Data


Despite some 2.6 million smart meters already installed in the UK, it is the data infrastructure that is delaying the further roll out of smart meters, according to a recent BBC article. This IT project is necessary to support the volume of data anticipated from the government-backed smart meter roll out.

From the chart below you can see how many meters have been installed since 2012. Higher volumes of data are already being collected, which reinforces the need for this important IT project to be up and running as soon as possible.

[Chart: smart meters installed in the UK since 2012]

(Chart and data available from the UK Department of Energy & Climate Change)

With news that the data infrastructure launch is pushed back until the autumn, what impact will this have?

How much data will smart meters generate?

As a quick calculation: with one manual read per month from the potential 53 million meters across the UK, there would be around 53 million reads per month. By contrast, smart meters that record data every 15 minutes produce 96 reads a day, so 53 million meters would generate thousands of times more data. This is obviously a rough estimate, but it gives an indication of what the energy companies would be dealing with. It also excludes status messages from the meters, which add to the mass of data being generated.
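A back-of-the-envelope version of that calculation, using only the figures from this post, looks like this:

```python
METERS = 53_000_000      # potential smart meters across the UK
READS_PER_DAY = 96       # one read every 15 minutes
DAYS_PER_MONTH = 30

monthly_manual = METERS                                  # one read per meter per month
monthly_smart = METERS * READS_PER_DAY * DAYS_PER_MONTH  # 15-minute interval data

print(f"Manual reads:  {monthly_manual:,} per month")
print(f"Smart reads:   {monthly_smart:,} per month")
print(f"Growth factor: {monthly_smart // monthly_manual:,}x")  # 2,880x
```

That growth factor of roughly 2,880 is where the “thousands of times more data” figure comes from, before counting any status messages.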

Why is this so important, if smart meters are just about making billing automated and putting an end to manual meter reading? There is a lot more value within meter readings and status messages beyond billing.

The benefits of smart meters are clear for consumers: tracking how much energy you are using, monitoring the effect of changes that you have made to your energy consumption, and receiving accurate bills without having to submit a meter reading.

When applied properly, data helps energy companies manage supply and demand much more easily. Energy companies benefit from analysing the data collected from smart meters to enable new rates and business models, implement demand response programs, better manage solar panels, and improve support for electric vehicles, to mention but a few.

To benefit from the thousands-fold growth in meter data, energy companies need analytics that locate the problems and opportunities hidden inside this massive amount of data. Smart meter analytics must be intelligent enough to do the heavy lifting for users, not just make it easier somehow for users to browse among millions of meters. Increasingly, analytics for this size of data set needs the intelligence and autonomy to make decisions independently.

Once the IT infrastructure is in place, the UK energy companies can start pursuing the new value within smart meter data, analysing it to make better business decisions. All 53 million UK meters likely won’t be changed out by 2020, but that shouldn’t stop UK energy providers from using the smart meter data they already have, or will have soon.

(Image courtesy of rido / 123RF Stock Photo)


Can Analytics Make Large Power Transformers Immortal?


Large power transformers (LPTs) are the workhorses of the North American electric transmission grid, and many have lived past their life expectancy. The U.S. Department of Energy reports that the average LPT is 40 years old, and 70 percent of LPTs are 25 years or older. These assets are becoming a weak link in the chain of networked transmission assets and may be subject to catastrophic failure, including from severe weather.

If transmission systems fail, large-scale outages can occur. According to a different Department of Energy report, 85 percent of U.S. outages affecting 10,000 customers or more in 2015 were caused by weather or by asset failure. These outages cause customers economic harm and cost transmission organizations lost revenue and damaged reputations. Regulators are increasingly focused on loss-of-load probability and loss-of-load hours, both key reliability measures.

The Department of Energy also reports that LPTs cost up to $7.5 million each, weigh up to 400 tons, and take up to 18 months to procure and install. The money and time required go up significantly if new engineering is needed. The cost of LPTs accounts for 15-50 percent of total transmission capital expenditures.

For all these reasons, there is an urgent need to understand the operational contingency of heavily loaded LPTs to manage and reduce outages, and to bridge the time until critical LPTs can be replaced.

With analytics you can make the most of what you have while planning for new assets. And with an 18-month delivery cycle, utilities need to start that analysis now.

The growing array of smart, connected devices in the transmission system generates large silos of data. That data can be useful in maintaining safe, reliable, affordable and sustainable transmission operations, but it cannot be correlated, analyzed and applied in a timely manner without advanced visual analytics.

Recently, Siemens and Space-Time Insight announced a partnership in part to tackle the issue of large power transformers.

Situational intelligence provides a number of ways to apply advanced analytics to silos of data for managing the current population of LPTs more effectively. Consider three scenarios:

  • By correlating and analyzing the health of LPTs along identified transmission corridors with demand forecasts and power dispatch schedules, transmission operators are able to prioritize the delivery of power using assets that are relatively healthier than other assets. This helps organizations increase grid reliability and make the most of their current assets.
  • By correlating weather forecasts, LPT health and forecasted energy demand, analytics gives transmission operators advanced warning of weather impacts on transmission assets so that they can respond accordingly to avoid outages and asset damage.
  • By forecasting the impact of removing some LPTs from service, analytics gives transmission planners better insight for planning and executing outages necessary for LPT maintenance, repair and replacement, potentially extending the life of these essential assets.
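As a sketch of how the first scenario might be scored, consider ranking LPTs by a simple composite of health, forecast load, and weather exposure. The fields and weighting below are illustrative assumptions, not a production model:

```python
def dispatch_priority(transformers):
    """Rank LPTs so relatively healthy units carry load first.

    Each record uses hypothetical fields:
      health       - 0.0 (failing) to 1.0 (like new)
      forecast_mw  - forecast load on the corridor the unit serves
      weather_risk - 0.0 (benign) to 1.0 (severe weather expected)
    """
    def risk(t):
        # Unhealthy, heavily loaded units facing bad weather score highest risk.
        return (1.0 - t["health"]) * t["forecast_mw"] * (1.0 + t["weather_risk"])
    return sorted(transformers, key=risk)

fleet = [
    {"id": "LPT-07", "health": 0.85, "forecast_mw": 400, "weather_risk": 0.1},
    {"id": "LPT-12", "health": 0.40, "forecast_mw": 380, "weather_risk": 0.6},
]
print([t["id"] for t in dispatch_priority(fleet)])  # ['LPT-07', 'LPT-12']
```

A real implementation would correlate many more data sets, but the principle is the same: turn silos of asset, demand, and weather data into a single actionable ranking.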

With the scale of the power grid and the past deficit of investment in transmission infrastructure, we will be dealing with aged LPTs for many years. Analytics gives us tools to make the most of the assets we currently have, but it won’t make LPTs immortal.


Analytics Makes Water Conservation More Precise—and Fun!

Watering sidewalks wastes water.

While human populations shift, migrate and grow, the amount of fresh water remains constant. (The only way to make large quantities of fresh water is desalination, which is energy-intensive and practical only near the ocean.) To serve more people from the same supply of water, communities increasingly are turning to conservation measures.

In times of drought, communities employ water usage restrictions such as banning car washing and limiting lawn watering to certain days of the week or month. These rules rely on observable behaviors that aren’t easily enforced unless the utility employs smart meters or water cops—or both.

Analytics provide water utilities with unique ways to encourage participation in conservation without smart meters and water cops. Here are a couple of examples.

Pinpoint water conservation opportunities

Some utilities offer financial incentives such as loans or rebates for people willing to take the conservation step of removing their lawn. According to the U.S. Environmental Protection Agency, landscape irrigation nationwide accounts for nearly one-third of all residential water use, totaling nearly 9 billion gallons per day. Which customers are likely to participate?

A first step for utilities is identifying the areas where such a lawn removal program can be successful. Geo-spatial analytics identify portions of the utility service territory where home footprint is small relative to lot size, which shows customers with a potentially high percentage of irrigable land. Locating such lots in areas with less tree cover shows yards that may require more watering in hot months. Sorting a list of these houses by the size of their seasonally-adjusted water bill shows you customers who might be motivated to lower their water bills by replacing some grass with hardy, native perennial shrubs and ground covers.

You might want to filter out houses with pools and hot tubs, to prevent those large water uses from skewing your results. Also, you could focus on households without children living at home. These households might lack built-in labor for mowing and be paying a lawn care company as well as high water bills.
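A rough sketch of this screening, using a hypothetical parcel-and-billing extract (all field names and thresholds are illustrative):

```python
import pandas as pd

parcels = pd.DataFrame({
    "account":        ["A1", "A2", "A3"],
    "lot_sqft":       [9000, 6000, 12000],
    "home_sqft":      [1800, 3200, 2000],
    "tree_cover_pct": [10, 45, 15],
    "has_pool":       [False, False, True],
    "summer_bill":    [240.0, 95.0, 310.0],
})

candidates = parcels[
    (parcels["home_sqft"] / parcels["lot_sqft"] < 0.3)  # small footprint, large irrigable share
    & (parcels["tree_cover_pct"] < 20)                  # sunnier yards need more watering
    & ~parcels["has_pool"]                              # pools and hot tubs would skew results
].sort_values("summer_bill", ascending=False)           # biggest bills = most motivated

print(candidates["account"].tolist())  # best outreach prospects first
```

The output is a prioritized outreach list for the lawn removal program, built entirely from data the utility and the county likely already hold.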

Make conservation a game

Gamification applies concepts such as skills, challenges, points, and rankings to non-game contexts such as conservation. Gamification brings out the competitor in us and can make conservation, if not fun, then at least a little more rewarding and less boring.

One simple conservation game shows you how your water use compares with houses geographically close to you and with households that resemble yours in terms of lot size, people in the household, and number of bathrooms. Improving your standing relative to those like you earns you points and entitles you to prizes.

Another, more active game may be to progress through levels of conservation skill, knowledge and savings by accepting certain challenges. Challenges could include doing a home water audit using the utility’s online tools, installing free low-flow showerheads from the utility, or cashing in rebates on low-flow toilets. You receive points as you complete challenges and lower your consumption. Analytics show you how you compare with your neighbors who are and aren’t playing the game. Achieving certain levels in the game might entitle you to prizes such as gift cards or bill rebates.
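Behind the scenes, the scoring for games like these can be quite simple. Here is a minimal sketch; the peer-group definition and point values are invented for illustration:

```python
def peer_group(customer, customers, tol=0.25):
    """Households like yours: same bathrooms and occupants, lot size within 25%."""
    return [
        c for c in customers
        if c["id"] != customer["id"]
        and c["bathrooms"] == customer["bathrooms"]
        and c["occupants"] == customer["occupants"]
        and abs(c["lot_sqft"] - customer["lot_sqft"]) <= tol * customer["lot_sqft"]
    ]

def score(customer, peers, challenges_done):
    """Points for beating the peer median, plus a bonus per completed challenge."""
    usage = sorted(p["monthly_gallons"] for p in peers)
    median = usage[len(usage) // 2] if usage else 0
    gallons_saved = max(0, median - customer["monthly_gallons"])
    return gallons_saved // 100 + 50 * challenges_done  # 1 point per 100 gallons saved
```

The more sophisticated analytics mentioned below, for recruiting players and targeting challenges, would layer segmentation and propensity models on top of this same data.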

Fairly simple analytics run behind the scenes in these games to award points and keep score. More sophisticated analytics might be used to recruit players for the game and to target the challenges and rewards to more closely match the water saving needs and opportunities of their neighborhood and household type.

(Image courtesy of ewapee / 123RF Stock Photo)


Identifying Decisions That You Can Automate


Automated decisions are making the news. This year, Tesla cars in autopilot mode have experienced crashes in Florida, Montana and China. (The company contends the drivers were not using the autopilot mode properly.) Meanwhile, on another planet, the Mars Rover can now make its own decisions about which rocks to investigate. Keep your eyes open, because I believe these stories will quickly become more prevalent.

When analytics drives automated decisions, people are faced with another set of decisions about how that capability fits into the work, culture and mission of the organization. It’s akin to when you hire a new employee. What role does the new capability fill? How much autonomy is granted to the new capability? What review and verification processes are in place to ensure safe, productive and profitable work?

As stated in an earlier blog post, it’s unlikely that automation will replace entire jobs. New job descriptions will be written, and existing ones likely rewritten, to specify how humans and automation will interact to fulfill a needed role.

The analytics entering organizational roles for the foreseeable future will be focused on specific tasks, if not built for specific purposes. Finance and investment companies are using analytics extensively for trading and portfolio composition, but those same analytics aren’t likely to make employee benefits decisions without modification.

Because they are purpose built, analytics need to specialize in predictable decisions that they perform repeatedly. This is exactly the sort of dull work that’s best left to automation, since people have a tendency to get tired, bored and distracted doing repetitive work.

At least initially, organizations will want to assign to analytics decisions that carry known and usually low-level consequences. Consequences can be measured by the amount of money at stake, the number of employees or customers affected, or the ease with which an automated decision can be reversed if need be. Analytics can help accurately define the type, scope and severity of consequences associated with decisions.

The metrics of predictability and consequence come together nicely in a video from The Harvard Business Review describing how to decide what decisions you can entrust to automation.

What about frequency of decisions? Some decisions, like short-term financial trading, happen so rapidly that humans can’t make every single call. In these situations, humans move from doing the work to maintaining, tuning and improving the automated systems that do the work. Other decisions, such as whether to acquire another company, happen so infrequently that automating the decision probably isn’t worth the effort. Between these two points lies a spectrum of decision frequency that organizations must also weigh in identifying decisions to automate.

The framework of predictability, consequence and frequency gives organizations the model they need to determine what role automated decisions will play. What decisions would you like to automate in your organization, and how would you score them for predictability, consequence and frequency?
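One way to apply the framework is to score each candidate decision on the three dimensions and combine them. A minimal sketch, with weights that are purely illustrative:

```python
def automation_score(predictability, consequence, frequency):
    """Score a decision from 0 to 1 for automation suitability.

    All inputs are 0.0-1.0 judgments:
      predictability - how repeatable and rule-like the decision is
      consequence    - money, people, and reversibility at stake
      frequency      - how often the decision recurs
    Low-consequence decisions score higher as early automation candidates.
    """
    return 0.5 * predictability + 0.3 * (1.0 - consequence) + 0.2 * frequency

# Short-term trading: predictable, low per-trade consequence, extremely frequent.
print(automation_score(0.9, 0.2, 1.0))   # ~0.89 -> strong candidate
# Acquiring a company: unpredictable, huge consequence, rare.
print(automation_score(0.2, 1.0, 0.05))  # ~0.11 -> leave with humans
```

Scoring your own decision inventory this way turns the closing question above into a ranked to-do list.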

(Image courtesy of forplayday / 123RF Stock Photo)


Mobile Apps for Internet of Things Data Acquisition


The Internet of Things is in many ways a catchall phrase used to describe everything from types of devices, to communications gateways, to new service-oriented business models. IoT devices generally are capable of sensing and communicating. In the consumer sector they include thermostats, door locks, garage door openers, and the like. In the industrial sector there are many sensors used in manufacturing processes, vehicles, heavy equipment, and so on. Sensing and communicating data has traditionally been referred to as data acquisition – a common IoT use case. What is often overlooked is the use of smartphones and tablets for data acquisition. These devices include several sensors, such as for audio, video, and motion.

The following story highlights how the mobile devices that we use every day are becoming integral to the IoT ecosystem.

Recently I was at a cafe with a friend. A former coworker of my friend, whose name is Craig, walked in, so my friend invited him to join us. My friend asked Craig, “Where are you currently working?” Craig answered, “I am working as an independent contractor developing a unique mobile app.”

With the Apple and Google app stores full of apps, and in many cases offering multiple apps that essentially do the same thing, I wondered what app could be new and unique. I quickly found out as Craig described how the app would help mining companies improve how they determine where mineral deposits most likely exist. Easier identification of mineral deposits will accelerate, optimize and lower the cost of mining – a definite game changer.

Determining places to explore and excavate is a combination of manual labor and trial and error. Miners typically pick and scrape away at surfaces in a mine to collect sample material to be examined to determine if the sample contains mineral deposits. If mineral deposits are detected then further exploration at that area would be initiated.

Craig then explained how the app works. Each mineral has a unique signature that can be identified by a spectrometer (from how the mineral reflects light). Photos and videos taken with smartphones and tablets use compression so the signature cannot be detected using standard photo and video apps. The app he developed interfaces directly to the video sensor so it can analyze the reflected light with the needed fidelity to recognize spectral signatures that identify specific areas where desired mineral deposits can likely be found. The locations identified are marked and uploaded to an operations center for further review and for planning.
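A simplified sketch of the matching step: compare a measured reflectance spectrum against a library of known signatures and report the best match. The toy four-band spectra and the cosine-similarity approach are my own illustration of the idea, not Craig’s actual algorithm:

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_mineral(measured, library, threshold=0.95):
    """Match a raw reflectance spectrum against known mineral signatures."""
    best_name, best_score = None, 0.0
    for name, signature in library.items():
        s = cosine_similarity(measured, signature)
        if s > best_score:
            best_name, best_score = name, s
    return best_name if best_score >= threshold else None

# Toy 4-band spectra; a real spectrometer samples hundreds of wavelengths.
library = {
    "gold":   np.array([0.30, 0.45, 0.80, 0.92]),
    "copper": np.array([0.35, 0.30, 0.70, 0.95]),
}
sample = np.array([0.31, 0.44, 0.79, 0.90])
print(identify_mineral(sample, library))  # -> gold
```

The key point from Craig’s description is upstream of this code: the app must read raw, uncompressed sensor data, because standard photo and video compression destroys exactly the spectral detail this matching depends on.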

Learning about this app shows how ingenuity and software running on commercial off-the-shelf smartphones and tablets make them applicable for data acquisition. More use cases that integrate people and mobile apps into the IoT ecosystem will surely ensue.

So the next time you pick up a smartphone or tablet think of the myriad of uses it can be programmed to perform, especially when connected to other devices, systems and people. If you know of clever uses of mobile apps for IoT use cases, please comment.
