The Science of Visualization: Pre-attentive Design


Pre-attentive processing is the first impression your eyes form before you consciously analyze something. It refers to the visual properties you notice within roughly a quarter of a second of glancing somewhere. Only after that can you deliberately direct your eyes to differentiate objects by size, color, shape and other features.

Pre-attentive design focuses on increasing the parallel processing your visual system does by mapping one data type to one feature.

For instance, in the Parkwhiz application I can immediately recognize where to find parking in San Francisco. But to get a sense of the relative cost of each parking spot, I need to wait for my attention to kick in and then physically look at each placemark, one by one.

[Image: data by shape]

The difference between these two tasks is what computer scientist Christopher G. Healey and other cognitive scientists call parallel versus serial visual processing. In Parkwhiz, we visually process location data in parallel and price data serially.

[Image: data by color]

Another example of parallel pre-attentive processing is color. See how Google represents typical traffic by color? With pre-attentive processing, I can quickly tell which areas to avoid. But if I wanted to relate this traffic to parking, I would have to serially search the map while my long-term memory tried to retain what I had previously seen on the Parkwhiz map.

Pre-attentive processing turns Big Data into little data

Parkwhiz and Google Maps each help people focus on the subset of data they need to make a choice. But what about when I want to predict the best place to park based on both typical traffic and cost? Merging the two maps would be tricky, since Parkwhiz represents location with a colored shape while Google represents traffic density with color. By itself, neither choice is a problem. But in our enterprise and consumer lives we often have to consider many variables at once.

Pre-attentive design enables rapid decision making

Looking at one map while mentally comparing it to another takes a lot of our mind's capacity. Pre-attentive design can lessen the work by merging the two views. Thankfully, we can take pre-attentive processing to the next level by mapping each visual feature to exactly one data category instead of blending them, thereby decreasing visual noise.

If I could see parking location by shape and traffic flow by color, then I could quickly draw a line in my mind to the fastest route with the cheapest price.

[Image: data by both]

For example, this crude visualization layers Parkwhiz on top of Google traffic. I changed the Parkwhiz placemarks to gray, so that parking location is represented just by shape and not by color. Then I adjusted the Google image to show only red and yellow, since free-flowing traffic is implied anywhere that is not red or yellow. Now, I can quickly scan for low-cost parking options that are not adjacent to red or yellow lines.
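
To make that concrete, here is a minimal sketch of the one-feature-per-variable idea in Python with matplotlib. Everything in it is made up for illustration: the coordinates, prices and congestion levels are invented, not Parkwhiz or Google data. Traffic carries the color channel; parking carries shape and size in neutral gray.

```python
# A minimal, hypothetical sketch of "one visual feature per variable":
# traffic gets the color channel, parking gets shape/size in neutral gray.
# All coordinates, prices, and congestion levels below are fabricated.
import matplotlib.pyplot as plt

# Synthetic parking spots: (x, y, hourly price in dollars)
parking = [(1.0, 2.0, 8), (2.5, 1.5, 14), (3.0, 3.2, 22), (1.8, 3.8, 10)]

# Synthetic road segments with congestion: ((x0, y0), (x1, y1), level)
roads = [((0.5, 1.0), (3.5, 1.0), "heavy"),
         ((2.0, 0.5), (2.0, 4.0), "slow"),
         ((0.5, 3.0), (3.5, 3.0), "free")]

fig, ax = plt.subplots()

# Color channel carries traffic only; free-flowing roads stay unobtrusive.
traffic_color = {"heavy": "red", "slow": "gold", "free": "lightgray"}
for (x0, y0), (x1, y1), level in roads:
    ax.plot([x0, x1], [y0, y1], color=traffic_color[level], linewidth=4)

# Shape/size channel carries parking; gray keeps it out of the color channel.
xs, ys, prices = zip(*parking)
ax.scatter(xs, ys, s=[p * 20 for p in prices], marker="s",
           color="dimgray", zorder=3)
for x, y, p in parking:
    ax.annotate(f"${p}", (x, y), textcoords="offset points",
                xytext=(6, 6), fontsize=9)

ax.set_title("Parking (gray squares, sized by price) over traffic (line color)")
plt.show()
```

In a real map the markers would sit on geographic tiles, but the channel separation is the same: a viewer can scan gray squares and red lines independently, in parallel.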

Visualization is science, not just aesthetics

With data visualization, function truly gives rise to form. Dr. Healey points out that search speed is a function of the speed of resource allocation and the amount of competition for access to visual short-term memory (VSTM). If less color information is being sent to my VSTM, then my speed in allocating resources goes up.

Conscious application of cognitive science makes visualization useful, not just attractive.


Register Now for SI World, the Industry’s Leading Conference on Situational Intelligence


SI World, the leading conference on situational intelligence, will be held on Wednesday, October 28 at the Sheraton New Orleans hotel. Executives, technical managers and business users in both the private and public sectors attend SI World to understand revolutionary approaches to situational intelligence, data visualization and analytics for the Internet of Things. Registration for SI World is now open.

Why attend SI World?

  • SI World is co-located with Energy Central’s Utility Analytics Week. Who doesn’t like saving on travel time and expense budgets?
  • Attendees can learn from those who have gone before them to improve their own situational intelligence initiatives.
  • SI World offers the opportunity to rub elbows with the brightest minds in visual analytics and build invaluable connections.
  • The SI World opening reception will be held at the famous Maison Bourbon Jazz Club on Bourbon Street.

Register now and discover insights that will help you advance your organization’s analytics initiatives, and improve business outcomes.


Situational Intelligence Unifies the Utility Internet of Things


Utilities might have up to 400 discrete systems that automate some portion of their processes for operating the grid, serving their customers and complying with thousands of regulatory requirements. These systems have names such as Customer Information System, SCADA, Meter Data Management System, Outage Management System and Work Order Management System.

In addition, for decades utilities have used sensors on critical power equipment in the grid and power plants to generate operational data. These systems are islands of sensors and electromechanical actuators built for narrowly defined missions, and they are typically not connected to the Internet for security reasons. Still, their use of sensors across the grid qualifies utilities as pioneers in the connected economy of the Internet of Things.

As the Internet of Things continues to grow, the likes of Amazon, Facebook, Twitter and Apple are setting a high bar for data-driven customer insight and service. This forces utilities to think beyond their discrete functional systems to improve their overall level of performance and service.

To unify their discrete systems, utilities need new tools and capabilities to manage, analyze and visualize the nearly instantaneous transfer and access of data. The real and unintended consequences of managing and utilizing real-time data from the utility's sensors, discrete systems and third-party sources challenge the stability and security of the utility business and its ability to reliably serve changing customer needs.

Situational Intelligence is one of the critical capabilities that enable utilities to unify their data and derive value from their Internet of Things.

The supply chain of data that feeds utility processes and the underlying analytics has not progressed uniformly, most likely will not, and will have discontinuities for the foreseeable future. So situational intelligence solutions need the ability to function and learn as a mesh network, connecting sources of data and information in many forms and at many speeds, from the utility's sensors, manual processes and third-party sources, to feed analytics.

Analytics, in turn, support the visualization of information in a form that promotes an intuitive feel for how the utility grid is performing and helps focus the operations staff on the priorities of the hour.

By unifying their discrete systems to better understand the grid overall and better focus on the priorities of the hour, utilities can securely position themselves for more favorable comparison with Amazon, Facebook, Twitter, and Apple.

Mr. Marshall is the President of Coastal Partners Inc. He served as Executive Vice President of Utility Operations at Kansas City Power & Light Company (KCP&L), a subsidiary of Great Plains Energy Inc. (GPE), and Executive Vice President of Utility Operations at GPE.


Utility 2.0 Requires Situational Intelligence


In the 13th PwC Annual Global Power & Utilities Survey, 94 percent of respondents predicted important changes or a complete transformation of the utility business model. Some in the industry have already created a name for this transformation: Utility 2.0.

The outlines of this new utility model are starting to emerge. A recent Utility Dive story contains interesting charts showing economic growth decoupling from energy demand. In other words, it’s now possible to grow the economy without adding to the existing energy supply. Some of this comes from improved energy efficiency, and some from a switch to less energy-intensive industries and processes.

At the same time, distributed, renewable energy sources such as residential solar panels are replacing energy acquired from the utility.

The combination of lower energy demand and increased distributed energy sources results in stagnant or negative load growth for utilities. This is a problem for these geographically limited monopolies, who have historically based their business model on selling kilowatt-hours of energy to users.

To compound matters, the Internet of Things continues to infiltrate many corners of our society with digital energy-related devices such as smart meters and smart thermostats that increase the amount of data and communications traffic utilities handle.

The end result of all this transformation is that the energy distribution network is becoming much more dynamic. It used to be a one-way flow of power from the utility to the consumer. Now, it’s a bi-directional flow of energy, communications, and data between utilities, consumers, service providers and others.

Managing more variability in demand and production, more data and more communications related to a fundamental commodity such as energy requires sophisticated real-time analytics like those provided by situational intelligence applications.

Leading global utilities such as E.ON, Ergon, RWE, and Sacramento Municipal Utility District are deploying these applications to balance supply and demand, detect distribution anomalies in real time, make the most of existing assets and limited capital budgets, and provide higher quality service to customers.

Utilities are a large and fundamental portion of the modern economy, and have been for more than a century. They may not change as rapidly as other core industries such as telecommunications or publishing, but their transformation will be just as disruptive. Situational intelligence helps ease the way to Utility 2.0.


Situational Intelligence as a Risk Management Tool


The evolution of the electric grid, and the power industry as a whole, has certainly changed the understanding and nature of risk. For many years the greatest risk utilities faced was financial, related to project risk in building and operating assets. Over time, with the convergence of technology, consumerism and policy, the number of risks, with both positive and negative consequences, has grown exponentially.

Utilities still have project risk and financial risk, but they also face heightened risks in cyber and physical security, reputation, customer retention, workforce attrition, regulatory compliance, organized markets, stranded assets, and emergency response and resiliency. These risks create a multifaceted, multidimensional matrix of opportunities and consequences. Visibility into how risks relate to each other, directly or indirectly, and the ability to anticipate them are the keys to right-sizing mitigation.

Risk avoidance is too expensive and too restrictive for an organization seeking to innovate and meet evolving customer needs. Effective risk management and decision-making is a much better way to capture the value of new technologies and expanded customer offerings.

Policy adherence, regulatory compliance and basic transactions need to be fulfilled, but they reside in the background, out of customers' view. Customers aren't interested in back-office functions unless they don't work. Executing those capabilities well is the price of entry for a successful business now.

So how is it possible to do all of these things well? Since there are many pieces of information that support decision making in this multidimensional, multifaceted matrix of risk and opportunity, it is important to have the information correlated and presented quickly and comprehensively to help manage the present and predict the future.

Situational intelligence, especially visual situational intelligence, provides information that isn't one-dimensional. It can present a multidimensional set of information that looks at risks and opportunities comprehensively, allowing consideration of many options for mitigation.

The digital communications component of the modern electric grid has created two-way information flow, exponentially more data, and the ability to correlate disparate data into a full picture of the past, present and future. With it, utilities can visually incorporate and correlate financial, operational and environmental data for an integrated view of risk and opportunity.

A strategy that deploys situational intelligence to manage data and the emerging Internet of Things will move utilities from network operators to technology platform providers. A well-designed and well-deployed technology platform will ensure compliance and fulfillment of core functions while expanding utilities' ability to capture and deliver the opportunities customers value most.

John Di Stasio is President of the Large Public Power Council.


Analyzing IoT Data: Coming to an Organization Near You?


The Internet of Things (IoT), a network of connected devices that can send and receive data over the Internet, is a hot topic. On the consumer front, IoT buzz has centered around health monitoring devices (think activity trackers such as Fitbit, cardiac monitors, diabetes monitors, child anti-kidnapping devices) and "smart" home devices like thermostats, appliances and the like. On the business front, any tagged or sensor-enabled piece of equipment or business asset can be monitored and analyzed. This might include a sensor on a pressure valve on a piece of drilling equipment, a tagged piece of construction material, or even food moving to market.

You can monitor these IoT-enabled "things" for theft, environmental conditions and so on. You can also analyze the data flowing from them. For instance, a popular example of IoT-enabled analysis is predictive maintenance. An oil rig might have a number of components generating data that streams from the rig: temperature, pressure, humidity, viscosity of lubricants, how many times a part moves, and so on. Back at home base, a model can be built from the characteristics of parts that have failed in the past (e.g., a decision tree that produces a rule such as: if pressure exceeds value X and temperature exceeds value Y, then the probability of failure is 80%). Based on these models and new data coming from the rig, alerts can be generated as to when certain parts should be replaced.
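
As a rough sketch of that idea (not anyone's production system), the Python snippet below fits a small decision tree on fabricated pressure and temperature readings. The feature names, thresholds and the 80% figure echo the hypothetical rule above and are assumptions for illustration only.

```python
# A hedged sketch of the failure-prediction idea above, on fabricated data.
# Thresholds, feature names, and readings are illustrative, not from any real rig.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Fabricated historical sensor readings: columns are pressure, temperature.
X = rng.uniform(low=[50, 20], high=[150, 120], size=(500, 2))

# Fabricated ground truth mimicking the rule in the text: parts fail far
# more often when both pressure and temperature run high.
failure_prob = np.where((X[:, 0] > 110) & (X[:, 1] > 90), 0.8, 0.05)
y = rng.random(500) < failure_prob

model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X, y)

# Inspect the learned rules -- the analog of "if pressure > X and temp > Y".
print(export_text(model, feature_names=["pressure", "temperature"]))

# Score a new reading streaming in from the rig.
new_reading = [[120.0, 95.0]]
print("P(failure) =", model.predict_proba(new_reading)[0][1])
```

The printed tree is the machine-learned analog of the human-readable rule; in practice the model would be trained on real failure histories and scored against live sensor streams to trigger replacement alerts.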

This is just one example of what can be done. The list of IoT enabled analytics is getting longer by the day.

TDWI is seeing increased interest in IoT data among organizations. For instance, in my most recent Best Practices Report on next-generation analytics, we asked respondents what kinds of data they are analyzing now and what they expect to be analyzing three years from now. The figure below illustrates some interesting results for machine-generated and/or IoT data and for real-time streaming data (which might also be IoT data). While usage of both kinds of data today is under 20%, usage looks set to more than double over the next three years (bringing the total to 50% of respondents), if users stick to their plans.

[Figure: percentage of respondents analyzing machine-generated/IoT data and real-time streaming data, today vs. in three years]

(source: TDWI Next Generation Best Practices Report 2014, n=328)

What does this mean for your organization? It means you should start thinking now about how IoT can impact your business. It might be in operational intelligence, situational intelligence or asset management. It might be in the quantified workplace, with smart buildings or workers wearing sensors that track their movements. Clearly, many organizations are still early in their analytics journey, and IoT can seem overwhelming. The strategy can be a stepwise one; it need not (and probably should not) happen overnight. You might begin by tagging and tracking a few assets. The point is to start learning now how IoT will affect your processes and culture, since that may be the biggest hurdle.

Fern Halper is Research Director for Advanced Analytics at TDWI.


Analytics as an Organizational Culture Challenge


For their 2013 report, "The Emerging Big Returns on Big Data," Tata Consultancy Services surveyed executives at more than 500 companies with one billion dollars or more in annual revenue about their efforts to invest in and profit from big data analytics.

Respondents across industries consistently ranked two issues as barriers to realizing ROI from Big Data:

  • Getting business units to share data across organizational silos
  • Building high levels of trust between the data scientists who present insights on Big Data and the functional managers

These cultural issues consistently outranked more technical issues such as "determining which Big Data technologies to use," "reskilling the IT function to be able to use the new tools and technologies of Big Data," and "putting our analysis of Big Data in a presentable form for making decisions."

Clearly, culture affects an analytics project’s opportunity for success. Getting all stakeholders to recognize and act on the project’s importance is one way to address this cultural barrier head on. How can you elevate your analytics project’s strategic priority?

In a recent Energy Central webinar, Erick Corona from Pacific Gas & Electric identified the two most important components to creating an analytics culture: a leader who “gets it” and employees who can “make it happen.”

Leaders who get it are convinced about the value of data and applying analytics as a way to do business. These leaders are consistent and courageous. They create an open culture that encourages curiosity and exploration of data, and engage in their own exploration by looking outside their industry for benchmarks and role models.

Employees who can make analytics happen, according to Corona, work well in teams, view working with data as an opportunity for improvement and marry that data with their expertise in their specific industry and role.

Once you have the right leaders and employees in place, there are several options for how to organize your culture, including creating an analytics center of excellence.

As the business guru Peter Drucker is credited with saying, "Culture eats strategy for breakfast." Put culture at the top of your list when making analytics a priority in your organization.
