What Does The Dynamic Utility 2.0 Grid Look Like?


In a previous post I discussed how utilities are evolving a new business model—Utility 2.0—to handle a more dynamic, bidirectional power grid. What does that grid look like? Since a picture is worth 1,000 words, here are two charts that capture the Utility 2.0 dynamic.

Stagnant load growth

Load Growth Curve

The rate of electricity load growth in the U.S. has declined steadily since 1950, as shown by the green line in the chart above from the U.S. Department of Energy. This means that utilities can count on diminishing revenue growth from their traditional core business of delivering kilowatt hours to customers. Load growth dipped into negative territory during the Great Recession of 2008.

As load growth has declined, growth in gross domestic product (GDP) has stayed fairly consistent. In the 1980s, electricity load growth rates dropped below GDP growth rates. In other words, a major economy like the U.S. can keep growing even as its electricity consumption grows more slowly.

Greater load variability

Duck Curve

As overall load growth has declined, daily load volatility has increased. The chart above from the California Independent System Operator shows the growing disparity between afternoon and evening consumption. With the continued growth of energy efficiency and rooftop solar power, customers are using less power from the grid during the day. When evening falls and solar production drops off, customers return to drawing power from the grid.
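The duck curve dynamic can be sketched numerically: net load is gross customer demand minus behind-the-meter solar output, which sags midday and ramps steeply into the evening. The figures below are illustrative round numbers, not CAISO data.

```python
# Illustrative sketch of the "duck curve": net load = gross load - rooftop solar.
# Hourly megawatt values are made-up round numbers, not CAISO data.

gross_load = {  # MW of customer demand, by hour of day
    6: 20000, 9: 22000, 12: 23000, 15: 23500, 18: 26000, 21: 24000,
}
solar_output = {  # MW of behind-the-meter solar generation
    6: 500, 9: 4000, 12: 7000, 15: 6000, 18: 1000, 21: 0,
}

# Net load is what the grid itself must supply at each hour.
net_load = {h: gross_load[h] - solar_output[h] for h in gross_load}

# The evening ramp: how much grid supply must pick up between 3 pm and 6 pm
# as solar fades while demand rises.
evening_ramp = net_load[18] - net_load[15]
print(net_load)
print(f"3-hour evening ramp: {evening_ramp} MW")
```

The steep `evening_ramp` value is exactly the volatility the text describes: generation that must come online quickly as solar production drops off.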

Managing low growth, high volatility with situational intelligence

Utilities and grid operators currently use the real-time visual analytics of situational intelligence to predict, analyze and manage short-term volatility on the grid. Analytics allows these organizations to optimize their grid and associated distributed energy resources, and minimize their operations and capital outlays.

This improved financial performance helps maintain revenues and profits in the face of stagnant load growth. Those profits, in turn, fund investment in new energy resources such as energy storage, electric vehicle infrastructure and utility-scale renewables.


Situational Intelligence Broadens the Appeal of Microgrids


Energy generation and distribution isn’t just for utilities anymore.

In a previous post about microgrids, I mentioned New York University and the teaching hospital of Tohoku Fukushi University in Sendai, Japan as two notable microgrids. In both cases, microgrids helped the facilities survive and function after a major catastrophe (Hurricane Sandy and the 2011 Tōhoku earthquake and tsunami, respectively). Other public sector organizations, including military bases and jails, are also generating and distributing energy through microgrids.

The private sector is also getting into microgrids. According to a 2015 report from Deloitte, “a solid majority (55 percent) of businesses say they generate some portion of their electricity supply on-site, up from 44 percent in 2014.” That’s growth of 25 percent in a single year. In particular, two-thirds of technology, media, and telecommunications companies report generating energy on-site.

Although they are generating and distributing energy, none of these organizations are utilities. Their core competencies lie elsewhere. Managing energy is something that they do to lower operating costs and increase reliability of service. That means that they need an easy way to run their energy operations with minimal staff and effort.

Situational intelligence could help these customer-owned microgrids, in two main ways.

One, situational intelligence applications unite disparate data sources scattered across large organizations. This allows energy planners and operators in non-utility organizations to draw on all relevant data to solve their energy-related challenges, while avoiding the tedious and time-consuming work of locating, cleaning and correlating data sets by hand.

Two, situational intelligence applications provide analytic and automation capabilities that enable small staffs to operate complex systems such as microgrids. That helps keep operating costs low, which is part of the purpose of customer microgrids.
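As a toy sketch of that data-fusion idea, the snippet below correlates meter readings with weather conditions by a shared feeder key, then flags a simple risk condition. All field names and values (`feeder_id`, `kw`, `wind_mph`) are hypothetical, invented for illustration.

```python
# Minimal sketch of correlating two disparate data sets by a shared key --
# the kind of joining a situational intelligence application automates.
# All field names and values are hypothetical.

meter_readings = [
    {"feeder_id": "F1", "kw": 120.0},
    {"feeder_id": "F2", "kw": 95.5},
]
weather = {
    "F1": {"wind_mph": 35},   # high wind near feeder F1
    "F2": {"wind_mph": 8},
}

def correlate(readings, weather_by_feeder):
    """Merge each meter reading with the weather at the same feeder."""
    merged = []
    for r in readings:
        w = weather_by_feeder.get(r["feeder_id"], {})
        merged.append({**r, **w})
    return merged

combined = correlate(meter_readings, weather)

# Flag feeders where high wind coincides with load -- a candidate risk.
at_risk = [c["feeder_id"] for c in combined if c.get("wind_mph", 0) > 30]
print(at_risk)
```

A real application would do this across dozens of systems with messy keys and timestamps; the value is that the correlation happens automatically rather than by hand.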

It’s not that large public and private organizations don’t need utilities anymore. The utility grid still provides primary and backup power, plus offers a channel to participate in the wider energy market through surplus energy sales. However, the increasing availability and ease of advanced analytics, distributed energy generation and industrial automation through the Internet of Things make microgrids a viable way to realize increased reliability at lower costs.


Shedding New Light on Power Outages


According to EPRI, the U.S. economy loses $104–$164 billion per year to unplanned power outages, even under normal operating and external conditions and even after utilities have implemented an outage management system (OMS). This translates into thousands of hours of powerless living for consumers across the country, which is sometimes a mere inconvenience but other times a life-threatening situation.

The challenge for utilities is that while an OMS provides crucial functionality, such as determining how to safely reset protective devices to restore power, these systems alone are clearly insufficient to significantly limit the economic impact of power outages.

The problem is complex since power outages are multi-faceted events that are bigger than a single system. That’s where situational intelligence can help. By drawing data from multiple disparate sources such as smart meters, protective devices, customer information, workforce management and weather feeds, situational intelligence applications provide utilities with the broader view needed to detect power outages earlier and restore power faster.
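For illustration, here is a minimal sketch of one such cross-source inference: grouping smart-meter "last gasp" signals by upstream transformer to estimate where an outage likely sits. The schema, device IDs and threshold are hypothetical, not any vendor's actual logic.

```python
from collections import Counter

# Sketch: infer the likely extent of an outage from smart-meter "last gasp"
# messages, grouped by upstream transformer. IDs and schema are hypothetical.
last_gasps = [
    {"meter": "M1", "transformer": "T7"},
    {"meter": "M2", "transformer": "T7"},
    {"meter": "M3", "transformer": "T7"},
    {"meter": "M9", "transformer": "T2"},  # could be a single bad meter
]

def suspect_transformers(gasps, threshold=2):
    """Transformers with at least `threshold` offline meters are suspects."""
    counts = Counter(g["transformer"] for g in gasps)
    return sorted(t for t, n in counts.items() if n >= threshold)

print(suspect_transformers(last_gasps))  # T7 has three offline meters
```

Layering in protective-device states, crew locations and weather feeds would turn this single signal into the broader view the text describes.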

Utilities benefit from applying this unified view to improve how they prevent, detect, assess and respond to outages. Early insight into the start of a potentially larger outage (and the chance to limit its spread), a better understanding of the cause of an outage (with correspondingly more accurate restoration-time estimates), and clearer prioritization of which critical customers to restore first are just a few examples of how situational intelligence adds value to the way utilities respond to outages today.

If you’re curious, read about Outage Intelligence, a situational intelligence application that provides valuable analysis for utilities to reduce the breadth and duration of outages.


Connected Cars Are Valuable (Even If They Aren’t Electric)


According to McKinsey, today’s cars—gasoline and electric—run on about 100 million lines of programming and the computing power of 20 personal computers. All that tech is internally focused on the car itself. But increasingly, car owners, businesses and governments are turning their attention to how cars can connect to the outside world to improve safety, performance and the in-car experience.

Such a mobile computing platform, operating on a network of roads and contending with real-time changes in terrain, weather, traffic and operating status, is a perfect environment for situational intelligence applications.

For example, many cars use GPS and sensors to provide predictive analytics about remaining driving range based on the current amount of gasoline or battery charge. As a more advanced example, in a previous post I described how Tesla is using situational intelligence to combat range anxiety by ensuring drivers can easily remain in range of a charging station.
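At its simplest, that range calculation is just remaining energy divided by recent consumption. The function below is a minimal sketch with illustrative numbers, not any manufacturer's actual algorithm.

```python
# Naive remaining-range estimate from current charge and recent consumption.
# Real systems also fold in terrain, weather, speed and driving style.

def remaining_range(energy_left_kwh, recent_kwh_per_mile):
    """Estimated miles remaining at the recent consumption rate."""
    if recent_kwh_per_mile <= 0:
        raise ValueError("consumption rate must be positive")
    return energy_left_kwh / recent_kwh_per_mile

miles = remaining_range(30.0, 0.25)  # 30 kWh left at 0.25 kWh/mile
print(miles)  # 120.0
```

The same formula works for gasoline by substituting gallons and miles per gallon.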

By connecting to external sources of data, individual cars could use situational intelligence applications to enhance performance and experience, such as

  • Predicting duration of consumables such as oil, brake and steering fluids, antifreeze and wiper blades
  • Identifying early warning signs of malfunctioning systems
  • Gasoline and charging optimization (“When and where should I fuel next?”)
  • Electricity charging station reservations

By connecting with each other as well as external sources of data, groups of cars could also contribute to and benefit from situational intelligence applications. Possibilities include

  • Accurate, real-time measurement for traffic rate of flow
  • Real-time weather mapping based on sensor data for multiple cars
  • Proximity safety in dense traffic environments
  • Car platooning and other throughput improvements
  • Corporate fleet analytics and management
  • Verifying outdoor advertising exposure

And of course, there's the ultimate in connected vehicles: the self-driving car.

Business Insider predicts that by 2020, 75 percent of cars shipped globally will be built with the necessary hardware to connect to the internet.

As you can see, you do not need to buy a Tesla or other electric vehicle to benefit from situational intelligence while on the road.



Resolving Different Conclusions from the Same Data


In the era of big data analytics, there may still be room for human input and judgement.

A recent Harvard Business Review article discusses the very real likelihood of reaching different conclusions from the same data. The article recounts how multiple teams of analysts were given the same question to answer and the same data set to research. Of the 29 teams working the problem, 20 found a statistically significant relationship that answered the question. Nine teams found no significant relationship.

In the end, the teams “converged toward agreement” that there was “a small, statistically significant relationship,” the cause of which was “unknown.”

This phenomenon could be helpful. If you have the luxury of multiple teams, you can generate a more thorough investigation and debate. This phenomenon could also be bad, an endless sort of analysis paralysis.

Big data only magnifies this problem. Imagine multiple teams working with multiple data sets, each of which is relevant to the answer, but none of which is sufficient by itself.

How can you tackle this?

Aside from compromise or consensus answers, the article mentions averaging different conclusions as another possible approach.

In big data analytics, you might substitute multiple algorithms for multiple teams. Ensemble methodologies have gained strong traction recently. For example, the Netflix Prize was won by an ensemble methodology that included restricted Boltzmann machines (RBM). It's fair to say that ensembles of regression trees, such as boosted trees (BT), are the most popular methodology for classification; Amex, for example, uses BT for fraud detection and credit worthiness.
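As a toy illustration of the ensemble idea, the sketch below averages the predictions of three stand-in models, just as you might average the conclusions of three teams. The models are invented placeholders, not RBMs or boosted trees.

```python
# Toy ensemble: average the outputs of several simple models into one
# prediction. The member models are made-up stand-ins for illustration.

def model_a(x): return 2.0 * x          # toy linear fit
def model_b(x): return 2.2 * x - 0.5    # toy alternative fit
def model_c(x): return 1.8 * x + 0.7    # toy alternative fit

def ensemble(x, models):
    """Equal-weight average of the member models' predictions."""
    return sum(m(x) for m in models) / len(models)

pred = ensemble(5.0, [model_a, model_b, model_c])
print(pred)
```

Real ensembles weight members by validation performance and use far stronger base learners, but the averaging principle is the same.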

Outside the application of analytics, business considerations might provide additional, deciding constraints for sorting out multiple approaches. Feasibility, budget or timeline for implementation, safety, regulatory constraints and other considerations could be the deciding factor when choosing an algorithm. For example, a financial company could use BT to train its analytics at scale, but once in production it might switch to simple regression-based classification to stay in compliance with regulations.

Having data supporting your conclusions is usually better than having no data. Better yet is a thorough examination of methods behind your analytical approach to deriving and applying value from big data.


The Science of Visualization: Receptor-driven Design for Augmented Reality

Tinklepaugh AR UI

Color brings beauty to our eyes, whether from the wings of a monarch butterfly or the broad brush strokes of a Van Gogh painting. Color also allows us to assign meaning and organization to items. At some point, most people have to ask how they should use color, whether they are animating a cartoon character, painting an accent wall or, in my case, making a graphical user interface.

Here I will explain how I would go about using color for utilities-specific augmented reality applications.

The use of color rests on how our eyes and brains process light and detail. When selecting interface colors, I ask myself: What colors should I use, and how do I maximize readability while minimizing distraction?

It helps to think about how the visual system processes color. In the eye, there are two types of receptors that process light: rods and cones.

Tinklepaugh rods cones

Rods are bad for color but excel in low light and peripheral vision. Cones are great for color and fine detail.

Color perception arises partly from the activity pattern of three types of retinal cones tuned to different wavelengths of light: short, medium and long. These cones work in combination to send signals to the lateral geniculate nucleus and visual cortex that determine the color we perceive.

Your visual cortex processes most of its information from the red and green receptor cones gathered in a small indent in the back of your eye, called the fovea, and more cortical space is devoted to processing red and green. What is the takeaway? Since blue receptors are scarce in your fovea, your brain works less to process blue. Furthermore, rods also respond to blue light, meaning even less energy is devoted to perceiving it.

Receptor-driven design

These variances in how we process light and color lead car designers to two opposing dashboard color philosophies: blue and red.

Tinklepaugh car dashboards

Red wavelengths affect mainly cones, leaving the rods unsaturated, which preserves night vision. On the other hand, red light lands squarely on your fovea, so you spend more visual cortex resources processing it at high acuity. With blue dashboards, your cones don't have to resolve as much detail, so you use fewer visual cortex resources. The trade-off is that your rods are processing light from two sources, the road and the dashboard, and are therefore working harder.

Cortical magnification

Hold up just one finger and look at it: your brain increases magnification in your visual cortex, using more cones and fewer rods. Now look at all five fingers on your hand: your brain lowers magnification, consuming fewer resources in your visual cortex and relying on fewer cones and more rods.

Tinklepaugh fingers

Interestingly, if you hold up both hands in front of you, with all five fingers extended on the right and only the index finger on the left, your visual cortex activates far more, and dedicates more total volume, to that single left finger than to the right hand with all five fingers extended.

So, how does any of this apply to Augmented Reality? Let’s take a look.

Tinklepaugh AR UI

Decreasing cortical magnification and acuity.

Here’s an interface that utility workers might use to assess linear assets in the field. The colors are pleasing, modern, unobtrusive–but that’s not the point of the colors. The color design helps field users visualize information more effectively and effortlessly by drawing attention to only what matters at present.

Remember that rods are most sensitive to light and dark changes, shape and movement, and place the smallest demand on the visual cortex. So let's put as many UI elements in our peripheral vision as we can, unless they represent the most important data at the moment.

Contextual activation of receptors

Let's make all our buttons and elements blue or white where we can, so they are less taxing on our visual systems. We use green and red very sparingly, since those wavelengths land right on our fovea. Red alerts us to where a problem has been reported via data uploaded to our system. Green directs our attention to the start and end of where we think our linear asset is experiencing trouble. We can drag, drop and slide the placemarks around as needed to better approximate and update the data source in real time, allowing asset planners to better diagnose which corrective steps to take.
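The rules above could be encoded as a tiny palette policy: blue/white defaults, with red and green reserved for the alert and the trouble-boundary markers. The role names and hex values below are hypothetical, chosen only to illustrate the policy.

```python
# Sketch of the receptor-driven color policy described above.
# Role names and hex values are hypothetical.

PALETTE = {
    "default": "#E8F1FF",   # pale blue/white: low-effort routine elements
    "alert": "#D0312D",     # red: the reported problem location
    "boundary": "#2E8B57",  # green: start/end of the suspected trouble span
}

def element_color(role):
    """Pick a color by UI role, falling back to the low-effort default."""
    return PALETTE.get(role, PALETTE["default"])

print(element_color("alert"))
print(element_color("navigation_button"))  # unknown role -> default
```

Centralizing the policy in one function keeps the "red and green are scarce" rule enforced everywhere rather than left to each screen's designer.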

Now that you understand more about how your brain works with light and detail, you can start to notice how products and programs around you are using color to do more than just look pretty.



How Can Utilities Maximize Their Assets?


Electric utilities today are grappling with enormous changes in the way energy is produced, distributed and consumed wrought by renewable and distributed energy sources, smart meters, empowered consumers, changing regulatory models and more.

Accommodating these changes has led to a huge investment in new utility assets that must be integrated and managed alongside a vast portfolio of legacy assets. The range of assets operated by a typical utility spans dozens of categories, from wooden poles to smart meters to high-voltage transformers. To put this in context, the assets a utility needs to manage can add up to tens of millions within a single operational territory.

To efficiently manage this ever-changing asset portfolio, utilities need insight into how their assets are used, and that requires solutions that bridge the gap between the data available in enterprise applications and the physical assets in the field. This type of intelligence allows organizations to analyze the available data to know where to invest their time, money and skills to reduce risk and operational costs.

One example of how utilities gain this type of insight into assets is the new Asset Intelligence 4.0 application. With the latest enhancements, Asset Intelligence gives utilities complete transparency into operational status across the organization, ultimately providing the resources they need to manage their valuable assets and make informed decisions at a moment's notice.

If you’re curious, read more about the new version of Asset Intelligence.