Analytics Beyond the World of Dashboards: Part 2


In the first part of this post, I drew a bright line between the world of BI dashboards and the situational intelligence analytics that industries are now deploying to derive value from the variety of data sources at their disposal.

Here are some examples of situational intelligence applications that exemplify these points:

  • Optimizing workforce scheduling—an operator of wind farms faces multiple variables in scheduling crews to perform maintenance and repairs. Assigning the day’s work depends on crew availability and skills, weather conditions, part availability, safety constraints, travel time to and from a turbine, climbing time up and down the turbine, and much more. It used to take managers many hours each day to build work schedules using traditional tools, and those schedules had to be manually revised when, for example, weather conditions suddenly changed. Now optimization software automatically builds the most efficient schedule in minutes, making adjustments on the fly as conditions demand.
  • Predicting failure—a utility faces mounting pressure from regulators and ratepayers after a catastrophic failure of power distribution equipment. How do they determine the true risk inherent in their millions of assets spread over thousands of square miles? They have charts showing the historical performance of their assets, but all assets age differently based on geographic location, relationship to other assets in the network, workload, maintenance record, and more. With predictive analytics, asset planners see the likelihood that an asset will fail plus the consequences should it fail. Together, these two measures give them an accurate gauge of risk for each asset and group of assets. Risk-based decisions (as opposed to gut instinct and incomplete data) drive choices about maintenance and capital expenditures.
  • Detecting anomalies—a railroad has tens of thousands of miles of track to inspect and maintain. Wear and tear, weather conditions, and natural disasters continually affect track conditions. Their visual dashboards showed them on a monthly basis which routes had slow throughput and which sections of track were overdue for inspection or repair. This data is significant, since an increase in system-wide train speed translates into millions of dollars of revenue. Using anomaly detection, they now pinpoint sections of track that warrant inspection and possible repair before they fail or cause train delays. The data analysis is presented on maps that highlight problematic sections of track.
  • Streaming analytics—a construction company needs to know where tens of thousands of vehicles, tools and pieces of equipment are and how they are being used (or abused). By using streaming analytics on the data from sensors placed on trucks and tools, the company pinpoints equipment that is delayed in transit, reassigns unused equipment to other nearby sites, prevents tool theft and loss, and audits vehicle movements to support applications for fuel tax rebates, to name a few.
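The risk measure described in the predicting-failure example (the likelihood of failure combined with the consequence of failure) can be sketched as a simple expected-loss calculation. This is only a minimal sketch; the asset records, probabilities, and field names are all hypothetical:

```python
# Hypothetical sketch: risk = probability of failure x consequence of failure.
# Asset records and field names are illustrative, not from any real system.
assets = [
    {"id": "XFMR-001", "fail_prob": 0.08, "consequence": 500_000},  # substation transformer
    {"id": "POLE-417", "fail_prob": 0.30, "consequence": 12_000},   # wooden pole
    {"id": "CBL-093",  "fail_prob": 0.02, "consequence": 150_000},  # underground cable
]

def risk_score(asset):
    """Expected loss: likelihood of failure times the cost should it fail."""
    return asset["fail_prob"] * asset["consequence"]

# Rank assets so that maintenance and capital budgets go to the riskiest first.
ranked = sorted(assets, key=risk_score, reverse=True)
for a in ranked:
    print(a["id"], round(risk_score(a)))
```

In a real system the failure probabilities would come from predictive models trained on asset age, workload, maintenance history and the like, but the ranking step looks much the same.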

These “real” analytics systems may sound as though they require highly sophisticated and specially educated users to operate. In reality, their users are regular business users who need no specialized training, and they can interact with the analytics to run what-if scenarios, for example.

Dashboards still have a role for many users in many scenarios. But as computing and communications technologies continue to connect the world into an Internet of Things, true analytics systems for prediction, anomaly detection, optimization, and streaming data will take their place at the head of the table.


Situational Intelligence for Supply Chain Logistics: Should You Use a Cloud Data Repository?


A recent customer testimonial video from CEVA Logistics discusses the benefits of having all your big data in one place for analytics.

As Deepak Dodani, Vice President of Global Supply Chain and Transport Solutions for CEVA Logistics says in the video, “to have [all our data] pulled together into one place and to provide intelligence tool sets on top of it provides tremendous flexibility and insight…This data starts to drive key decision making for the future supply chain processes.”

It’s true that analytics encompassing all an organization’s data, instead of individual data silos, gives tremendous insight. That’s a prime value provided by situational intelligence applications.

The question is, how do you get all your big data into one place to perform analytics? It’s big, after all.

One approach is what Deepak describes: gather all the data in the enterprise and move it to a single repository in the cloud. This approach requires a lot of upfront work to create the repository, plus ongoing work whenever you want to add a new data source to your analytics program. A cloud repository is nearly mandatory if you are implementing streaming analytics for your Internet of Things devices. On the positive side, a common cloud repository can give you easier backup and disaster recovery, plus offload some IT management tasks to your cloud provider.

A second approach is to leave all your data sources in place and connect them to a common platform for analytics and visualization. This helps you get your analytics program up and running faster, plus it makes it easier to integrate future data sources into your analytics program. This could be very important as data-creating devices proliferate across the growing Internet of Things. However, leaving your data in place doesn’t necessarily give you easy backup and disaster recovery.

Which approach would you choose and why?


Analytics Beyond the World of Dashboards: Part 1


A recent ComputerWorld article, Why Analytics Is Eating The World, contained the comment, “The best analytical insights come from user-generated dashboards running on top of IT-managed infrastructure.” While there is clearly a need for business users to create their own charts using visualization tools like Tableau, the idea – which is challenged in the article itself – that these dashboards can produce the “best analytical insights” is questionable. Vendors and a professor quoted in the article say that users need to be trained both in using these tools and in data science techniques to correctly analyze the data.

There are a number of things in the article that don’t add up:

  • First, the main benefit of end user data visualization tools is that users don’t need much training, so to recommend that users be trained to correctly use the visualized data runs completely counter to that benefit.
  • Second, the idea that these end user tools do “analytics” has to be taken with a sizable grain of salt. If “analytics” means displaying charts built with basic algorithms, then fine. But for most of the industry, serious analytics, especially of large volumes of diverse data, requires complex algorithms created by people who spend years at school learning how to create them. There’s a huge difference between people doing data analysis of what they see on a screen and software doing data analysis and presenting the results to users.
  • Third, how does the user know whether the data they’re looking at is “correctly” analyzed or not? Regardless of who writes a prediction algorithm, the results are always questionable. That’s not to say there’s no value in analytics, just that it’s critical that the user understands the conditions under which the analysis was performed, the quality of the data used, and the mechanisms used to produce a result. The only way a user can be confident in what they see is by understanding how confident the software is in the analysis it generated.
  • Finally, while it’s useful to have visual dashboards that tell businesses how many products they sold last quarter, companies are realizing that to be competitive they need real-time advanced analytics that provide insight into what’s happening right now and in the future, not yesterday or a month ago. In many cases the visualization of the data is, of course, critical, but the real value to a business lies in the analysis.
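The point about knowing how confident the software is in its own analysis can be made concrete with a toy sketch: a prediction reported together with a confidence interval rather than as a bare number. The function and the numbers here are purely illustrative:

```python
# Hypothetical sketch: report a prediction together with the software's
# confidence in it, so the user can judge how far to trust the number.
import statistics

def predict_with_confidence(samples, z=1.96):
    """Mean of repeated model outputs plus an approximate 95% confidence interval."""
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / len(samples) ** 0.5  # standard error of the mean
    return mean, (mean - z * sem, mean + z * sem)

# Illustrative outputs from repeated runs of some predictive model.
runs = [102.0, 98.5, 101.2, 99.8, 100.5]
pred, (low, high) = predict_with_confidence(runs)
print(f"prediction: {pred:.1f}, 95% CI: ({low:.1f}, {high:.1f})")
```

A wide interval tells the user to treat the number cautiously; a narrow one justifies more trust. Either way, the user sees the conditions behind the result instead of a bare figure.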

In the second part of this post, I’ll share some examples of analytics beyond dashboards.


Smart Meter Deployment and Analytics: Begin with the End in Mind


Sixteen member states of the European Union are currently deploying smart electricity meters. Five member states are deploying smart natural gas meters.  According to a European Commission report, by 2020, 72 percent of meters across the member states will be smart meters.

2020 is still five years away, and the European Commission had originally targeted 80 percent penetration of smart meters by 2020. Shareholders and regulators don’t want to wait years before seeing a return on the investment in smart meters.

If a country is just starting to roll out smart meters, where should they put their first 20 percent of meters to start realizing benefit? Answering that question demands situational intelligence.

Situational intelligence incorporates spatial, temporal and nodal dimensions into analytics. Spatial and nodal concerns for prioritizing smart meter deployments include

  • Where in the service territory does the meter stand, including proximity to other meters to deploy at the same time (route optimization)?
  • Where on the distribution network does the meter lie (network relationship)?
  • What is the age and type of building associated with the meter?
  • What electricity or gas usage is associated with that meter and building?
  • Are the location, network relationship, building type and usage representative of a class of customer or usage that you might want to study (population sampling)?
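One way to act on the criteria above is to fold them into a composite priority score per candidate site. This is only an illustrative sketch; the weights, field names, and normalization of each criterion to [0, 1] are all assumptions:

```python
# Hypothetical sketch: rank candidate meter locations by a weighted score
# over the spatial and nodal criteria listed above. Weights are illustrative.
WEIGHTS = {
    "route_proximity": 0.3,  # near other planned meters (route optimization)
    "network_value":   0.3,  # importance of position on the distribution network
    "usage_level":     0.2,  # electricity or gas usage at the site
    "sample_value":    0.2,  # representativeness of a customer class (sampling)
}

def deployment_priority(site):
    """Composite score in [0, 1]; each criterion is assumed pre-normalized to [0, 1]."""
    return sum(w * site[k] for k, w in WEIGHTS.items())

sites = [
    {"id": "A", "route_proximity": 0.9, "network_value": 0.4, "usage_level": 0.7, "sample_value": 0.2},
    {"id": "B", "route_proximity": 0.3, "network_value": 0.9, "usage_level": 0.8, "sample_value": 0.9},
]
best_first = sorted(sites, key=deployment_priority, reverse=True)
```

Changing the weights shifts the strategy: weighting route proximity favors cheap, fast rollout, while weighting sample value favors early learning about customer classes.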

Once deployed, a small, early subset of smart meters can provide a rich new data source for other applications such as distribution optimization, demand response, energy efficiency program design, revenue protection and more.

In smart meter, situational intelligence, and other analytics projects, it pays to begin with the end in mind. You’re less likely to lose your way and more likely to start realizing returns on your investment.


For Utility Analytics, Situational Intelligence is the New Sweet Spot


A recent Greentech Media article states that situational intelligence vendors such as Space-Time Insight have been landing big utility customers “by putting disparate data streams and stores to use in applications to solve today’s data management challenges — not in years or months, but in weeks, or sometimes even days.”

As the article explains, many utilities are struggling to ensure that their analytics for different data streams are, first of all, accurate and, second, properly correlated to one another.

For example: “Take the situation of a meter that’s not functioning properly: Is it due to a meter hardware failure, a network failure, or because somebody has ripped the meter from its socket? Having access to data streams on each of those possible points of failure, and being able to correlate each in relation to the other, can deliver an answer to questions like these, which separate systems can struggle to provide.”

According to the article, big IT companies struggle to effectively serve the utility analytics market for a couple of reasons. One, the solutions they offer are expensive and take a long time to implement for a customer. Two, they ask utility customers to commit to significant upfront investments with as-yet-uncertain outcomes.

Additionally, big IT companies have the obligation to maintain and perpetuate their installed base of software and hardware, even if those products were never designed or intended to handle the current or future requirements related to the Internet of Things, machine learning, cloud computing and other historic trends remaking the IT sector and society at large. If your product line is built upon SQL, and SQL no longer scales sufficiently to handle your customers’ data streams from the Internet of Things, what do you do?

GTM Research foresees global utility data analytics growing from $1.1 billion in 2013 to nearly $4 billion by 2020. This growth will be largely driven by the flood of new data from connected utility assets, and the desire to turn it from an overwhelming deluge into streams of business value. With that much money at stake and small companies scoring large customer wins, big IT companies may decide that they need to build, or buy, new situational intelligence solutions to stay competitive.


In Analytics, There Are No Black Boxes


You can’t just blindly accept numbers—that’s not scientific. Yet when you rely on black-box analytics, you’re forced to accept that the logic of the unseen algorithms precisely matches your needs.

Opening up the black box gives you three types of confidence in the data that is driving your decisions.

  • Understanding how an analytical value is derived. Even if you can’t do the math yourself and were never trained as a mathematician, you should know what inputs and logic went into creating the analytical results on which you are relying.
  • Modifying or creating analytics to fit your specific needs. Although you may be working with an established analytics vendor or product, you may need to modify existing algorithms or even create your own in order to meet your needs. For instance, you may want to add or change the inputs given to an algorithm, or change the weighting given to the inputs used.
  • Auditing the analytics process. If you are using analytics to make significant, data-driven decisions, odds are you will need to show your math at some point to someone: regulators, investors, the board of directors, insurance companies. Black-box analytics don’t give you this auditability and transparency.

Open source analytics packages are increasingly the norm. R and Spark are two leading examples. Open source allows you to create, understand, modify, and audit analytics to match your specific needs and assure your stakeholders.
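As a toy illustration of what an opened-up box looks like, here is a scoring routine, hypothetical in every detail, whose inputs, weights, and logic are all visible, so its result can be understood, modified to fit specific needs, and audited:

```python
# Hypothetical sketch of "white-box" analytics: the inputs, weights, and
# logic are all visible, so the result can be explained and audited.
def health_score(inputs, weights):
    """Weighted average of named inputs; returns the score plus an audit trail."""
    total = sum(weights.values())
    contributions = {k: weights[k] * inputs[k] / total for k in weights}
    return sum(contributions.values()), contributions

# Illustrative inputs and weights; an operator could reweight or add inputs.
inputs  = {"age_factor": 0.6, "load_factor": 0.8, "repair_history": 0.3}
weights = {"age_factor": 2.0, "load_factor": 1.0, "repair_history": 1.0}

score, audit = health_score(inputs, weights)
# The audit trail shows exactly how much each input contributed to the
# final score, which is what a regulator or board would ask to see.
```

Contrast this with a black box that returns only the final score: the number might be identical, but there is no way to check, tune, or defend it.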


Visualization: Virtual vs Augmented Reality


A cursory look around the web on the topic of virtual reality vs. augmented reality would lead you to believe that there’s a race going on and augmented reality is ahead. But how close is that race? What are the strengths of each contestant and would the race have a different result in the Situational Intelligence space?

First, let’s have a look at the contestants.  In virtual reality, you are immersed in a world created for you. Usually, you get there using a helmet or goggles. It is a world that a developer creates and controls through rules for how you can interact with the virtual environment – think computer gaming.

Image: an Oculus 2 headset (virtual reality)

With augmented reality, you stay in the real world, but see and hear it populated with phantom images, sounds and sensations that you can manipulate, overlay, and try out in the real world.  Usually, you can augment reality using simple devices such as the camera on your phone or tablet computer and an AR application. Heads-up displays in cars and planes are current and increasingly common forms of augmented reality.

Image: an augmented reality example

With situational intelligence, virtual reality is useful for placing you in situations and letting you gather information in places that you wouldn’t, or couldn’t, physically inhabit. For example, you can fly over, or even through, a piece of super-heated equipment or a noxious environment to gather real-time data about how that equipment is performing. And everywhere you look—up, down, under, over—is display space to tell you what’s going on. You are no longer limited to a 2D screen, so you can visually connect the data being generated to the object producing it.

With augmented reality, you can bring Mohammed to the mountain, rather than the other way around, to test how changes in parameters or different placement could affect the performance of a piece of equipment. For example, working with your tablet, you could drag furniture from an online catalog into a real-time picture of your living room to ensure that all your redecorating ideas work and look good. Once you’re happy with your selections, simply click a button to place your furniture order.

I’d say that the race between virtual and augmented reality is currently a tie. Both have a place in situational intelligence, depending on the application.