Prescriptive Analytics, meet Contextual Augmented Reality



How should humans and machines work together? Should they inform one another from their embodied and digital perspectives? Should machines be proactive or reactive? Do we query them, or can they query us? If so, how?

Check out this Augmented Reality prototype.

The prototype explores a prescriptive operating system and was inspired by my team at Futures Design Lab in San Francisco. We hatched an idea to use artificial intelligence and reality computing to help humanity. This blog builds on that idea to imagine how we might work interactively with machines, or distributed personal computers, via Augmented Reality.

What if your AR headset knew how you approach problems? What if it learned which applications you use at different parts of the day, in relation to the inputs and outputs on your communication channels and schedule? For example, my work is often distributed across a word processor, spreadsheet, calendar, design tool, email, chat application, data repository, and on and on… Wouldn’t it be great if I didn’t have to open an application at all, and instead the parts of the applications I need worked harmoniously and independently of me?

Today let’s consider a hypothetical new way to work with a fun operating system concept we will call YOU. YOU could be a contextual Augmented Reality interface. It might provide the options you want based on your own behavior. Beyond that, it may analyze your behavior and prescribe, or at least recommend, decision options to optimize how you manage your tasks.



When you put on your Augmented Reality headset, YOU would be with you the moment you move your hand. Using advanced computer vision algorithms, it positions UI elements at your fingertips for the actions you take most frequently at that time of day, independent of which device you are on. No need to pick up a phone or log in to your PC; it is all right there in front of you.



You look at your Calendar by touching it with your other hand. You don’t need to remember how to activate it, because YOU learns your actions. In other words, your interface is performing multivariate testing on your behavior to discover which interaction schema you are biologically and emotionally predisposed to. Once you activate the calendar, your schedule populates. You see that you have satellite analysis to do, because you work for NASA.
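Under the hood, that kind of learning could start out very simply. Here is a toy sketch, in Python, of counting which actions a user takes in each hour of the day and surfacing the most frequent ones for that hour; the class and action names are entirely hypothetical:

```python
from collections import Counter, defaultdict

class ActionModel:
    """Toy model: learn which actions a user takes most often per hour of day."""

    def __init__(self):
        # One frequency counter per hour of the day (0-23).
        self.by_hour = defaultdict(Counter)

    def observe(self, hour, action):
        # Record one observed action at the given hour.
        self.by_hour[hour][action] += 1

    def suggest(self, hour, k=3):
        # Return up to k actions, most frequent first, for this hour.
        return [action for action, _ in self.by_hour[hour].most_common(k)]

model = ActionModel()
for hour, action in [(9, "calendar"), (9, "email"), (9, "calendar"), (14, "spreadsheet")]:
    model.observe(hour, action)

print(model.suggest(9))   # ['calendar', 'email'] — calendar seen twice at 9 a.m.
print(model.suggest(14))  # ['spreadsheet']
```

A real system would weight recency, device context, and much more, but even this frequency count captures the core idea: the interface adapts to you rather than the other way around.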



This is where a new level of options gets presented to you. YOU knows you look at many charts and graphs. YOU interfaces with a satellite reporting system for real-time updates on the roughly 1,100 active satellites in orbit at any given time. YOU would ideally present you with a few different charts and applications to select from, but let’s assume that step has already happened. You chose a treemap depicting the statuses of aging satellites running out of propellant, which, if not refueled, could descend back into the Earth’s atmosphere and burn up. Of all the satellites you could examine, YOU has selected one in particular for review by highlighting it in bright green.



You touch this satellite and a new modeling application pops up showing you the relative angular velocity of a healthy satellite with enough propellant. From here you speak to YOU and tell it to model various outcomes.
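The modeling itself rests on standard orbital mechanics: for a circular orbit, angular velocity depends only on the orbital radius, via ω = √(μ/r³). A minimal sketch (the function name and the 400 km example altitude are illustrative):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's standard gravitational parameter, m^3/s^2
EARTH_RADIUS_KM = 6371.0

def angular_velocity(altitude_km):
    """Mean angular velocity (rad/s) of a circular orbit at the given altitude."""
    r = (EARTH_RADIUS_KM + altitude_km) * 1000.0  # orbital radius in meters
    return math.sqrt(MU_EARTH / r**3)

# A satellite in low Earth orbit at ~400 km circles Earth in roughly 92 minutes.
omega = angular_velocity(400)
period_min = 2 * math.pi / omega / 60
```

A satellite losing propellant can no longer maintain its altitude, so comparing its observed angular velocity against this healthy baseline is one plausible way YOU could flag it for review.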


Prescriptive computation is already happening all around us, but it is subtle. You can notice it in Google Maps or Waze when they offer you a faster route. Enterprise analytics companies are integrating prescriptive features into their products.

Computation is now transcending all the media we experience it in. While YOU is only one vision of the future, it’s clear the current computational paradigm is rapidly changing. The ways you interact with your desktop and mobile devices today are going the way of the typewriter and the pager. We all have a hand in shaping how we interact with data tomorrow. Let’s imagine a world where we don’t have to sit in a chair for eight hours a day to do groundbreaking work.



Putting Analytics in the Hands of End-Users


Anyone Anytime Anywhere

Operationalizing a technology or process (or both) means efficiently getting it into the hands of end-users, who realize value and benefits in their roles and, by doing so, extend that value and those benefits to their organization.

This blog focuses on operationalizing analytics for decision support for humans, which, as you’d expect, accounts for most business decisions. TDWI Research reveals in a recently published best practices report that 75% of an organization’s decisions supported by analytics are made by humans. The entire report, which includes a thorough examination of operationalizing analytics and the interrelated topics discussed in this blog, can be downloaded using this link: “Operationalizing and Embedding Analytics for Action.”

Analytics, simply put, is a category of information processing methods that derives value from data. Analytics is necessary to operate on data that is too complex and voluminous for manual methods. Specific types of analytics perform vastly different functions and generate different outputs, including insightful details, predictions, recommendations, optimized choices, outliers (anomalies), patterns, and trends. While the output of analytics may be interesting, the value and benefits are only realized when specific actions are taken. That means the recipients of the output must be able to consume it, comprehend it, and effectively use it to make decisions and take action. Often, the shorter the time to action, the better.

Until recently, analytics has been confined to IT and data science professionals, impeding organizations from maximizing the value in their data and the return on their investments in analytics. The TDWI Research report also cites the necessity, value, and growing trend of making analytics and its output available to a wide group of people within an organization. Among the survey results is an increasing awareness and willingness by organizations to operationalize analytics, with 88% of survey respondents claiming they have analytics in production or development that could be considered operationalized.

Another impediment is delay: when insights for decision support arrive later than they are actually needed, their value diminishes, or worse, undesirable and potentially preventable consequences occur. The most common cause of delay is the inherently slow manual process required to gather the necessary data, prepare it, have specific personnel run analytics programs, review and otherwise process the output, and then convey the results to the decision makers. Each of these steps can take hours or days, even weeks in extreme cases. Timeliness is therefore an important characteristic of a well-operationalized solution. When appropriate actions are taken faster, gains can be maximized and adverse consequences can be averted or minimized.

The good news is that modern technologies make it possible to put actionable insights from analytics into the hands of end-users with few or none of the delays just discussed. That is, operationalized analytics can achieve a very short, or even zero, time-to-insight.

Making results available in a timely manner can be achieved by offering analytics on a self-service basis and/or making the output continuously available and readily consumable. One example of the latter is displaying intuitive visualizations of analytics output on a monitor wall in an operations control room. Many organizations, however, need to receive and act on insights both inside and outside a control room. Delivering analytics output to people at their desks, on the factory floor, in the field, and wherever else they are is typically accomplished using browser-based applications, mobile devices, and ubiquitous communications networks (e.g., WiFi, 4G LTE, etc.).

Another best practice for operationalizing analytics is to embed analytics into existing business processes and into the visualized output of the operational applications that facilitate those processes. Analytics processing can be hidden in the background so that what end-users receive is seamlessly integrated into the screens and dashboards they’re accustomed to using. When this type of visual blending is not possible in the native application, situational intelligence, with its ability to create composite views, can combine the output of other applications with analytics in a single app window. This latter approach creates a broad and relevant context for decision-makers, enhancing their ability to act quickly and appropriately with confidence.

For the reasons just mentioned, situational intelligence is a powerful and highly effective way to operationalize analytics: this type of enterprise application lends itself to intuitive user interfaces and at-a-glance presentation of analytics results. Tightly integrating visualizations with data and analytics results, especially in browser-based apps, makes insights readily consumable and actionable by anyone, anywhere. As a result, organizations from small start-ups to large global enterprises empower workers, and correspondingly improve their business results, through widespread use of analytics.

As technology marches forward, growing processing power and analytics-specific frameworks such as Spark enable complex analytics jobs to complete quickly, even near-instantaneously in some cases. The ever-present Internet and browser-based user interfaces make analytics with richly visualized results available to anyone, anywhere, on large screens as well as handheld mobile devices, truly putting analytics into the hands of a wide population of end-users. The benefits provided by situational intelligence are accelerating the ability to effectively operationalize analytics.

The age of operationalized analytics, catalyzed by situational intelligence that delivers timely, readily consumable, actionable insights to anyone anywhere, is here. Fasten your seat belt: the pace of taking action driven by analytics is accelerating.

Another option for operationalizing analytics is automation, which is when systems automatically make decisions and initiate actions via direct machine-to-machine communications. In these cases humans are not in the decision making loop. Automation is an important topic that will be addressed in future blogs.


What’s The Big Deal About Distributed Energy Resources?


Energy stories are dominating the news—not just oil prices, but electric cars, home energy storage systems, rooftop solar power, even Supreme Court rulings about demand response.

These devices and programs that move power on and off the distribution grid are collectively known as distributed energy resources (DERs), and they are a hot topic:

  • The DistribuTECH conference happening this month offers 60+ tracks related to DERs, distribution automation, demand response, energy efficiency, renewables integration, microgrids and energy storage.
  • The Supreme Court recently affirmed demand response.
  • New York and California have specific plans for supporting and exploiting DERs.
  • Nevada and Arizona are waging battles over solar energy and net metering.
  • Hawaii is pushing for 100 percent renewable energy by 2045, with 30 percent of homes on Oahu already using solar power.

Why are DERs such a big deal? Because it matters where DERs are located, what they connect to, and how and when they operate in real-time and in the future.

This increased focus on where, when, and how things are operating on the distribution grid is a new mindset for utilities. Originally, the distribution grid was conceived and designed for a one-way flow of power from the utility’s central generation to consumers. DERs turn that pattern on its head.

The location of DERs matters because of their impact on the distribution grid. DERs like electric vehicles and energy storage devices draw large amounts of power from the grid. When the grid is overloaded like this, voltage drops below acceptable levels, which leads to flickering lights, momentary outages, and eventually blackouts. Other DERs, such as rooftop solar panels, can put too much power onto the grid. When this happens, voltage rises above acceptable levels, which leads to burned-out equipment and eventually power outages.
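The mechanism behind both voltage excursions is essentially Ohm's law along the feeder: current flowing one way sags the voltage, current flowing the other way raises it. A deliberately oversimplified single-phase, resistance-only sketch (all numbers are illustrative, not from any real feeder):

```python
def feeder_voltage(v_source, current_a, resistance_ohm):
    """End-of-feeder voltage for a simple radial feeder (Ohm's law sketch).

    Positive current = load drawing power from the grid;
    negative current = a DER injecting power back into the grid.
    """
    return v_source - current_a * resistance_ohm

NOMINAL_V = 240.0
FEEDER_R = 0.05  # ohms; assumed feeder resistance for illustration

heavy_load = feeder_voltage(NOMINAL_V, 200.0, FEEDER_R)      # EVs charging: 230 V, voltage sags
solar_backfeed = feeder_voltage(NOMINAL_V, -200.0, FEEDER_R)  # rooftop solar: 250 V, voltage rises
```

Both results fall outside a typical ±5% service band around 240 V, which is exactly why the location and size of DERs on a given feeder matter to the utility.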

The time of day when DERs operate matters because of their impact on prevailing assumptions about power consumption throughout a typical day. Historically, business hours have been the period of heaviest power consumption, with afternoons and early evenings seeing peak use. The current generation and distribution system has been designed around this consumption pattern. DERs disrupt it. Solar power is most productive during sunny business hours, which means far less need for central generation. As soon as the sun goes down, however, solar goes away and suddenly lots of central generation is needed online in a hurry, at the busiest time of day for energy use.
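This effect is often called the "duck curve." A toy calculation with made-up hourly numbers shows how subtracting solar output from demand produces a steep evening ramp that central generation must cover:

```python
# Illustrative hourly values (MW) from noon to 8 p.m. for a hypothetical utility.
hours  = list(range(12, 21))
demand = [500, 510, 520, 540, 560, 600, 650, 640, 620]  # total consumption
solar  = [200, 210, 200, 180, 140,  80,  20,   0,   0]  # solar fades at sunset

# Net load: what central generation must actually supply.
net_load = [d - s for d, s in zip(demand, solar)]

# Hour-to-hour ramp: how fast central generation must come online.
ramps = [b - a for a, b in zip(net_load, net_load[1:])]
max_ramp = max(ramps)  # steepest ramp lands in the early evening
```

With these numbers, net load climbs from 300 MW at noon to 640 MW by evening, and the steepest single-hour ramp (110 MW) hits just as the sun disappears, precisely when overall demand is near its peak.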

On the other hand, batteries and electric vehicles are charged most cost-effectively when rates are low, usually at night. Wind farms can be quite productive at night, which aligns well with energy storage, but wind is by its nature intermittent. Get enough large devices charging on a calm night, and suddenly night isn’t necessarily the period of lowest demand and cheapest power.

Analyzing where, when and how things are operating is the unique strength and benefit of situational intelligence. DERs are one reason why situational intelligence has gained such a hold in the utilities space.

Soon you’ll see news stories about utilities adopting new business models and revenue streams based on making the increasingly dynamic distribution grid work smoothly and fairly for all participants, instead of just selling kilowatt-hours to users. Just remember, you read it here first.