Improving Your Operations with Data Visualization Applications


Operational and Cost Efficiency

Visual analytics is an effective way to understand the stories in your data and take appropriate actions. Effective data visualizations powered by visual analytics enable you to easily and interactively dive deep into your data to identify opportunities to improve efficiency and productivity, cut costs, and optimize resources, all of which are at the crux of increasing operational efficiency. Hence, operations leaders want to quickly understand what is in their data so they can initiate and monitor improvement programs and make the best possible decisions for the situations they regularly face.

But there’s a small catch: while it’s tempting to believe that putting powerful data visualization authoring software in the hands of business users will result in useful solutions, this is rarely a recipe for success. Creating effective data visualizations requires expertise; the authoring software does not magically surface insights on its own. Human expertise is needed to implement the kind of functionality that surfaces insights and conveys them intuitively, so they are actionable for making operational improvements.

Basic tables and charts are easy to create; however, solving problems and enabling data discovery that leads to root causes and opportunities to improve operations is an entirely different matter. Spreadsheets and most visualization software make it easy to create pretty charts and to combine tables and charts into dashboards, but they do not fully meet the needs of operations leaders. Let’s face it: if spreadsheets alone were sufficient, you’d have all you need to effectively run your business operations.

Questions that you should ask yourself are:

Do one or more simple visualizations or dashboards containing multiple simple visualizations solve real business operations problems?

Do the visualizations surface buried insights and make them readily comprehensible and actionable?

Is it possible to clearly visualize activity on a map and see changes unfold over time?

Is it possible to synchronize components within a dashboard to update when data is selected, filtered and sorted?

Of these capabilities, how easily can they be implemented (if at all)?
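
The synchronization question in particular can be made concrete. One common way to link dashboard components is an observer pattern over shared selection state; the sketch below uses hypothetical class names and is not tied to any particular visualization product.

```python
# Minimal cross-filtering sketch (hypothetical classes, not a product API):
# dashboard components subscribe to a shared selection state, so a filter
# applied in one component updates all of the others.

class SelectionState:
    """Shared filter state that notifies subscribed components on change."""
    def __init__(self):
        self._filters = {}
        self._subscribers = []

    def subscribe(self, component):
        self._subscribers.append(component)

    def set_filter(self, field, value):
        self._filters[field] = value
        for component in self._subscribers:
            component.refresh(self._filters)

class Component:
    """A chart or table that re-renders whenever the shared state changes."""
    def __init__(self, name):
        self.name = name
        self.active_filters = {}

    def refresh(self, filters):
        self.active_filters = dict(filters)

state = SelectionState()
chart, table = Component("chart"), Component("table")
state.subscribe(chart)
state.subscribe(table)

state.set_filter("region", "West")   # selecting in one view...
print(table.active_filters)          # ...updates the other: {'region': 'West'}
```

Real dashboard frameworks wrap this same idea in events and callbacks; the point is that someone has to design and implement the linkage, which is exactly the kind of application work discussed here.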

The answers to these questions expose the necessity for specific functionality that transcends data visualizations alone; application software is required to deliver it, or to be specific, data visualization applications. This is one of the reasons that expertise is required: applications must be implemented to fully deliver on the promise of visual analytics. That expertise spans art, skill and knowledge that typical operations personnel do not possess. Business users rarely understand their organization’s data landscape or how to model and transform their data for visualization, and they often don’t have the time to learn and hone the expertise needed to implement data visualization applications, regardless of how simple modern data visualization development tools are to use.

Full service offerings that use best-of-breed visual analytics are a great way to obtain the needed combination of expertise and visual analytics to enable you to achieve the objective set forth in this post – to improve all aspects of operational efficiency.


Why We Need Situational Intelligence, Part 3


In Part 1 and Part 2 of this series I addressed why situational intelligence is a natural and essential method of decision-making that is especially apropos for real-time business operations. Inherent in my argument is an optimistic belief that people make the best decisions and take the best actions with the information at hand. That is the crux of the matter: the information at hand, and how accurate and actionable it is. What information is available to decision makers? Does it contain insights? Is it current? Is it clear, or is interpretation and/or further analysis necessary before the information is actionable? Is it reliable? How comprehensive is it? Correspondingly, how much uncertainty shrouds the information, the decision, the action and the outcome? What are the risks of making a bad decision (including no decision)?

Ideally the answers to the preceding rhetorical questions should all be encouraging. But how can these attributes of data and insights for decision-making be assured, especially when decisions are made by different people, when decisions needed are for unplanned situations, and when timeliness is important? Systematized decision-making aided by technology-generated intelligence is a way to assure that accurate insights are derived from data and actionable by decision makers. As discussed in the preceding blogs (and other blogs too), advanced analytics and visual analytics are essential building blocks for analytics that support operational decision-making. Data must be transformed into insights and intelligence. The insights must also be transformed so they are readily comprehended at-a-glance and are actionable.

Another key consideration is having a broad composition of data for analysis. The more data from relevant sources within the enterprise, from the IoT and from external sources, the more insights analytics can derive. Accessing external data enriches intra-enterprise data sources with relevant context that is useful when decision makers require supplemental information, such as when the insights brought forward are not immediately actionable. In such cases further discovery helps decision makers gain the understanding and confidence needed to make a decision. This is where additional data sources and the corresponding added context facilitate interactive data exploration, so that decision makers can make timely and favorable decisions. Sources and types of external data include weather, traffic, news, spot market prices and social media.

Having live connections to data sources ensures that decisions are made using the most up-to-date data, and also enables interactive exploration of underlying data to deeply understand and resolve complex multifaceted situations. A single system that maintains live connections to data sources yields another benefit – it helps organizations bridge their data silos and unify their data assets.

Here at the end of this blog series, situational intelligence now sounds easy, and somewhat obvious too – connect to relevant data sources, apply analytics, make the resulting insights and underlying data available to decision makers with intuitive visualizations so they can consistently make the best possible decision in any situation. If you use an off-the-shelf solution to implement situational intelligence, getting started is also relatively simple. Decide for yourself. What does your situation require?

If you have experiences, thoughts, opinions on this topic, please comment and share them.


Why We Need Situational Intelligence, Part 2



In my previous blog I addressed the need for situational intelligence (SI) as an approach to decision-making that combines insights with relevant context to create the big picture we need to make the best possible decisions with the lowest risk. I concluded that blog by promising to explain why and how various technologies such as data access and fusion, analytics with machine learning, artificial intelligence and visual analytics come together to support situational intelligence.

Situational Intelligence itself is not a technology, nor can you use just one technology to create it. Rather, a situational intelligence approach requires a combination of integrated technologies. The main types of technologies are listed below. Seamlessly integrating these technologies creates actionable insights that are especially applicable for real-time operational decision-making.

  • Live connections to data, both at rest and in motion, in a variety of formats and structures (including no structure at all). Access to multiple, disparate sources of data provides the context for new, deeper insights. Connecting directly to data creates great efficiency and savings because data access and preparation often consume as much as 80% of the effort of making data-driven decisions.
  • Analytics, big data, and streaming foundational technologies (such as Spark, Hadoop, SAP HANA) that are inherently scalable and enable high-performance execution of analytics and processing of large datasets. These foundations are typically distributed and use in-memory processing so that complex software executes and generates answers and insights as quickly as possible. Streaming message brokers such as Kafka and Internet of Things (IoT) platforms are also necessary to manage streams of data that can be passed through streaming analytics for real-time insights and/or to be stored for inclusion in subsequent applications of advanced analytics that derive deeper insights.
  • Advanced analytics and streaming analytics that derive insights from the data at rest and data in motion, respectively. Because situations inherently occur at specific times and locations, the ability to correlate spatial and temporal relationships increases the insights that can be derived. Similarly, the ability to correlate entity-to-entity relationships increases the insights by revealing actual and likely ripple effects. Altogether these analytics make it possible to identify the what, when, where, why and how of situations that happened or may happen. In addition, machine learning allows the analytics to adapt to your data and to your use cases.
  • Visual analytics is essential to complete the transformation of data into actionable insights. Intuitive renderings of the relevant data and the insights derived by analytics help users comprehend and act on data at-a-glance. Including output from visual analytics in alerts via email and SMS is a powerful way of notifying people about critical matters and focusing their attention on acute situations and the decisions to be made.
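
As a concrete, deliberately simplified illustration of the streaming-analytics building block above, the sketch below flags anomalous readings against rolling statistics. In practice such logic would run behind a broker like Kafka or a framework like Spark; the window size and threshold here are illustrative assumptions.

```python
# Minimal streaming-analytics sketch: maintain rolling statistics over a
# window of recent readings and flag values that deviate sharply.
# Illustrative only: a production system would consume from a message
# broker such as Kafka rather than iterate over a Python list.
from collections import deque
from statistics import mean, pstdev

class StreamingAnomalyDetector:
    def __init__(self, window=20, threshold=3.0):
        self.window = deque(maxlen=window)  # bounded rolling window
        self.threshold = threshold          # z-score cutoff (assumption)

    def observe(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        anomalous = False
        if len(self.window) >= 5:  # need a little history before judging
            mu, sigma = mean(self.window), pstdev(self.window)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.window.append(value)
        return anomalous

detector = StreamingAnomalyDetector()
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.3, 55.0]  # last one spikes
flags = [detector.observe(r) for r in readings]
print(flags)  # only the final spike is flagged
```

The same detector can run inside a streaming pipeline for real-time insights, or over stored data later for the deeper batch analytics mentioned above.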

In summary, situational intelligence is an approach that combines data and analytics, including visual analytics, to aid human decision-making. Insights from advanced analytics and streaming analytics are combined with relevant underlying data to create context so that decision-makers have a complete understanding of each situation and make decisions that lead to the best possible outcome.


Why We Need Situational Intelligence, Part 1



Why do we need situational intelligence?

When we lack information, or information is not easily consumed or comprehended, then our decisions are compromised. We don’t have the right information, at the right time, or in the right form to lead to the best possible outcome.

At its core, situational intelligence handles all of the relevant data needed to derive insights that will guide your decisions. Today’s environment typically means large volumes of disparate data pouring too quickly into over-burdened systems, leaving executives and analysts alike wondering if they can believe the data they see. Some insights can be derived only by sifting through this ever-growing mountain of data looking for hidden correlations. Correctly correlating and analyzing all of the necessary data and correctly presenting results, recommendations and predictions is the biggest differentiator of situational intelligence over traditional analytics.

Unlike typical business intelligence or stand-alone analytics solutions, with situational intelligence we receive valuable details, recommendations and predictions that typically result in enhanced competitive advantage through:

  • Cost savings
  • Increased efficiencies, productivity and performance
  • Increased revenues
  • Improved client engagement that raises satisfaction
  • Better understanding of exposure that facilitates better management of risk

To put this into a concrete example, consider why airports need flight and ground traffic control systems. There is the obvious answer: to know where airplanes, vehicles and people are, both in the air and on the ground, in order to safely and efficiently stage their movement. Managing traffic at an airport requires context, such as current and forecasted weather, to achieve the best possible safety and efficiency. Even a seemingly simple matter such as a broken jetway has many consequences that affect the context of ground control, fueling, cabin cleaning, luggage, passenger relocation, food service, etc. Now, imagine the complexity of handling a crisis such as an airplane needing to make an unplanned emergency landing.

Managing an operation with as much complexity, interdependencies and consequences as an airport requires the staff in the operations control center to have a live, real-time, big-picture view of everything that is happening and, ideally, also what is most likely to happen. As you surely recognize, keeping track of so much fast-changing information in a person’s head alone is impossible and prone to errors and omissions.

Clearly having as much relevant and easily comprehensible information as possible provides the context that we naturally seek to guide our decisions and actions. In a follow-on blog I will explain why and how various technologies such as flexible data access, analytics, machine learning, artificial intelligence and visualization should be seamlessly integrated to create and deliver situational intelligence that is truly actionable.

 

(Image courtesy of Flickr)


Putting Analytics in the Hands of End-Users



Operationalizing a technology or process (or both) means efficiently getting it in the hands of end-users who will realize value and benefits in their roles and by doing so extend the value and benefits to their organization.

This blog focuses on operationalizing analytics for decision support for humans, which, as you’d expect, accounts for most business decisions. TDWI Research reveals in a recently published best practices report that 75% of an organization’s decisions supported by analytics are made by humans. The entire report, which includes a thorough examination of operationalizing analytics and the interrelated topics discussed in this blog, can be downloaded using this link: “Operationalizing and Embedding Analytics for Action.”

Analytics, simply put, is a category of information processing methods that derives value from data. Analytics is necessary to operate on data that is too complex and voluminous for manual methods. Specific types of analytics perform vastly different functions that generate different outputs that include: insightful details, predictions, recommendations, optimized choices, outliers (anomalies), patterns and trends. While the output of analytics may be interesting, the value and benefits are only realized when specific actions are taken. That means the recipients of the output from analytics must be able to consume it, comprehend it, and effectively use it to make decisions and take action. Often the shorter the time to action the better.

Until recently, analytics has been confined to IT and data science professionals, impeding organizations from maximizing the value in their data and from their investments in analytics. The TDWI Research best practices report also cites the necessity, value, recognition and trend of making analytics and its output available to a wide group of people within an organization. Among its survey results is an increasing awareness and willingness by organizations to operationalize analytics, with 88% of survey respondents claiming they have analytics in production or development that could be considered operationalized.

Another impediment is delay: when insights become available after they are actually needed for decision support, their value diminishes, or worse, undesirable and potentially preventable consequences occur. The most common cause of delay is the inherently slow manual process required to gather the necessary data, prepare it, have specific personnel run analytics programs, review and otherwise process the output, and then convey the results to the decision makers. Each of these steps can take hours or days, even weeks in extreme cases. Timeliness is therefore also an important characteristic of a well-operationalized solution. When appropriate actions are taken faster, gains can be maximized and adverse consequences can be averted or minimized.

The good news is that modern technologies make it possible to put actionable insights from analytics into the hands of end-users with few or none of the delays just discussed. That is, operationalized analytics can result in a very short or zero time-to-insights.

Making results available in a timely manner can be achieved by making analytics available on a self-service basis and/or making the output continuously available and readily consumable. One example of making output continuously available and consumable is displaying intuitive visualizations of analytics output on a monitor wall in an operations control room. Many organizations also need to receive and act on insights and analytics output outside of a control room. Delivering analytics output to people at their desks, on the factory floor, in the field and wherever they are is typically accomplished using browser-based applications, mobile devices, and ubiquitous communications networks (e.g., WiFi, 4G LTE, etc.).

Another best practice for operationalizing analytics is to embed analytics into existing business processes and into the visualized output of the operational applications used to facilitate those processes. Analytics processing can be hidden in the background so that what end-users receive is seamlessly integrated into the screens and dashboards they’re accustomed to using. When this type of visual blending is not possible in the native application, situational intelligence, with its ability to create composite views, can combine the output of other applications with analytics in a single app window. This latter approach creates a broad and relevant context for decision-makers, enhancing their ability to act quickly and appropriately with confidence.

For the reason just mentioned, situational intelligence is a powerful and highly effective way to operationalize analytics, because this type of enterprise application lends itself to intuitive user interfaces and at-a-glance presentation of information and analytics results. Tightly integrating visualizations with data and analytics results, especially in browser-based apps, makes insights readily consumable and actionable for anyone, anywhere. As a result, organizations from small start-ups to large global enterprises empower workers and correspondingly improve their business results with widespread use of analytics.

As technology marches forward, processing power and analytics-specific frameworks such as Spark enable complex analytics processing software and jobs to be completed fast, even instantaneously in some cases. The ever-present Internet and browser-based user interfaces make analytics with richly visualized results available to anyone, anywhere, on large screens as well as on handheld mobile devices, truly putting analytics into the hands of a wide population of end-users. The benefits provided by situational intelligence are accelerating the ability to effectively operationalize analytics.

The age of operationalized analytics catalyzed by situational intelligence that delivers timely and readily consumable actionable insights to anyone anywhere is here.  Fasten your seat belt, the pace of taking action driven by analytics is accelerating.

Another option for operationalizing analytics is automation, which is when systems automatically make decisions and initiate actions via direct machine-to-machine communications. In these cases humans are not in the decision making loop. Automation is an important topic that will be addressed in future blogs.


What if My Data Quality Is Not Good Enough for Analytics or Situational Intelligence?



You may feel that the quality of your data is insufficient for driving decisions and actions using analytics or situational intelligence solutions.  Or, you may in fact know that there are data quality issues with some or all of your data.  Based on such feelings or knowledge, you may be inclined to delay an analytics or situational intelligence implementation until you complete a data quality improvement project.

However, consider not only the impact of delaying the benefits and value of analytics, but also that you can actually move forward with your current data and achieve early and ongoing successes. Data quality and analytics projects can be done holistically or in parallel.

“How?” you ask. Consider these points:

  • Some analytics identify anomalies and irregularities in the input data. This, in turn, helps you in your efforts to cleanse your data.
  • Some analytics, whether in a point solution or within a situational intelligence solution, recognize and disregard anomalous data. In other words, data that is suspect or blatantly erroneous will not be used, so the output and results will not be skewed or tainted (see this related post for a discussion about: “The Relationship Between Analytics and Situational Intelligence“). This ability renders data quality a moot point.
  • It is a best practice to pilot an analytics solution prior to actual production use. This allows you to review and validate the output and results of analytics before widespread implementation and adoption. Pilot output or results that are suspect or nonsensical can then be used to trace irregularities in the input data. This process can play an integral part in cleansing your data.
  • Some analytics not only identify data quality issues but also calculate a data quality score that relates to the accuracy and confidence of the output and results of the analytics. End-users can therefore apply judgement if and how to use the output, results, recommendations, etc. Results with low data quality scores point to where data quality can and should be improved.
  • Visualization is a powerful tool within analytics to spot erroneous data. Errors and outliers that are buried in tables of data stand out when placed in a chart, map or other intuitive visualization.
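
A data quality score of the kind mentioned above can be as simple as the fraction of completeness and plausibility checks a record passes. The field names and rules below are assumptions made for this sketch, not any product’s built-in checks.

```python
# Illustrative data-quality scoring: score each record 0..1 by how many
# completeness and plausibility checks it passes. Field names and rules
# are hypothetical, chosen for the example.
def quality_score(records, required=("meter_id", "timestamp", "reading")):
    scored = []
    for rec in records:
        # completeness: every required field must be present
        checks = [rec.get(field) is not None for field in required]
        # plausibility: a meter reading should be non-negative
        reading = rec.get("reading")
        checks.append(reading is not None and reading >= 0)
        scored.append((rec, sum(checks) / len(checks)))
    return scored

records = [
    {"meter_id": "M1", "timestamp": "2016-01-01T00:00", "reading": 42.0},
    {"meter_id": "M2", "timestamp": None, "reading": -5.0},  # suspect
]
scores = [score for _, score in quality_score(records)]
print(scores)  # [1.0, 0.5]: the clean record scores 1.0, the suspect one 0.5
```

End-users can then apply judgement, for example using only records above a chosen score, while low-scoring records point to where data quality should be improved.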

You can be pleasantly surprised at how much success you can achieve using data that has not been reviewed, scrubbed or cleansed. So set aside your concerns and fears that your analytics or situational intelligence implementation will fail or have limited success if you do not first resolve data quality issues.

Instead, flip such thinking around and use analytics as one of the methods to review and rectify data quality. In other words, integrating analytics into your efforts to assess and cleanse your data is a great way to leverage your investment in analytics and get started sooner rather than later.

What are you waiting for?  Get started exploring and deriving value from your data no matter the status of its quality.


Operational Analytics, Business Intelligence and The Internet of Things



The Internet of Things (IoT) is rapidly changing the way business operations are monitored and managed. Connected devices detect and communicate the status of essentially any aspect of manufacturing, warehousing and distribution. Many of these same devices can also receive commands, such as to open or close a switch or valve. As this digital transformation pervades operations, the speed at which adjustments and corrections can be made to improve processes, throughput and cost efficiency keeps increasing.

The increased speed of process throughput and improvement now exceeds the capabilities of traditional Business Intelligence (BI) systems offering “descriptive analytics” that are inherently retrospective. The traditional BI modus operandi was to review the output from analyses and then take corrective measures. The cycle time typically spanned days to more than a month. Nowadays with IoT, the cycle time is reduced to mere hours, minutes or even seconds.

This sea change poses challenges for BI solutions that were not designed for fast cycle times, much less immediate real-time processing of streaming data. Just about every operation today is awash in data and crunched for time.

The data problem will continue to pose ever greater challenges because:

  • The Internet of Things is expanding, which means that smart sensors will soon be almost everywhere, creating additional streams of continuous data.
  • New technology will measure data at ever finer intervals, such as synchrophasors used in the transmission and distribution of electricity that measure voltage up to 30 times a second.
  • Lean operational processes, such as Kanban and flow, improve operations and just-in-time production and inventory, and generate large volumes of data in the process.
  • Digital customer service increases the number of touch points between customers and vendors, generating still more data.
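
To make the synchrophasor point above concrete, a quick back-of-the-envelope calculation shows how fast such measurements accumulate (the fleet size is an illustrative assumption, not a figure from any deployment):

```python
# Back-of-the-envelope volume for the synchrophasor example: 30 measurements
# per second per device. The device count is a hypothetical assumption.
samples_per_second = 30
seconds_per_day = 24 * 60 * 60           # 86,400 seconds in a day
devices = 1_000                          # illustrative fleet size

samples_per_device_per_day = samples_per_second * seconds_per_day
fleet_samples_per_day = samples_per_device_per_day * devices

print(samples_per_device_per_day)  # 2592000 samples per device per day
print(fleet_samples_per_day)       # 2592000000 samples across the fleet
```

At roughly 2.6 million samples per device per day, even a modest fleet produces billions of measurements daily, which is exactly the scale that overwhelms retrospective BI cycle times.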

For all this data to make an immediate impact on your operations, you need to be able to capture it, normalize it, and in many cases analyze it immediately.

This is where traditional BI solutions fall down. BI was not and is not designed for real-time analytics of large volumes of high-velocity data. It enables users to ask questions by querying their data, but leaves it to the user to convert the responses into usable and actionable answers and then decide how to apply them. More specifically, BI systems were originally designed for producing data and reports, organized and visualized in presentable formats (e.g., tables, graphs). This was and still is very useful and valuable, but it’s not the same as enabling ongoing, and in many cases real-time, operational process management.

To take a data-driven approach to improving operational efficiency, what you need is a more comprehensive analytics approach that integrates and analyzes multiple sources of data both in batches and in real-time to deliver insights that you can act on immediately to drive and/or fully automate business operations.

The need for a more comprehensive solution that transcends the now limiting capabilities of BI systems has been met by a new category of enterprise software solutions referred to as “situational intelligence” (SI). Situational intelligence is a superset of BI capabilities that adds analysis of operational systems with purpose-built advanced analytics that can consume any type of data: internal, external, structured, unstructured, big, streaming and more.

With access to all this data and an understanding of its contribution to the big picture, situational intelligence illuminates the what, where, when, why and how of every asset and situation to provide context needed to make fast and confident business decisions that lead to optimal actions and outcomes.

I strongly recommend that organizations not only adopt and operationalize advanced analytics, but do so within the context of SI solutions to thrive and survive as the IoT transformation continues to unfold.

That’s a bold statement, I know. In coming posts I’ll discuss specific use cases to show how situational intelligence optimizes operations, helps handle uncertainties that arise, and detects and corrects anomalies as they occur.


Platforms Make Visual Analytics More Accessible


Analytics software and the computing power to run it are becoming increasingly affordable, yet organizations thus far have not availed themselves of analytics. A recent Forrester Research report states that 88% of organizational data is not being analyzed. The lack of analysis for data coming from the Internet of Things (IoT) is more acute: according to a 2015 IoT report from McKinsey & Co., only 1% of IoT data is used for any actionable purpose.

Simply put, there’s a lot of opportunity out there for analytics software. What’s the best way to bring analytics into your organization?

There are generally two ways to implement software solutions. The first way is to identify a specific need and then develop a specific product, or so-called “point solution”, that addresses and solves that need. The second way is to develop a platform that provides resources to run multiple solutions. Once a platform is available in the marketplace, both the platform vendor and a community of developers can create and market solutions.

Solutions that run on a platform possess the following advantages over point solutions:

  • With a platform, innovation happens faster and for a larger number of consumers and industries, especially if the platform achieves strong support from a community of developers who can expand their business opportunities.
  • Organizations that use a platform can experiment with ways that a solution or a variant of a solution can be used for another purpose. Once a technology solution is operationalized and people begin to realize its benefits, end users naturally begin to form ideas about additional features and capabilities (just ask any product manager, the typical recipient of feature wish lists). Advanced analytics, with its predictive capabilities, is particularly susceptible to this type of wish list explosion. After all, if an analytics solution can predict which assets are likely to have the shortest or longest remaining useful life, it should be able to predict other likelihoods too. Because a platform makes it relatively easy to customize and extend solutions, it is possible to clone an application and alter it to support a similar but different use case.

Because there’s so much opportunity out there for analytics, a platform adds greater value to your organization. A recent survey and report from Salesforce states that high-performing organizations are three times more likely than others to be deriving value from analytics in more than 10 different use cases. One platform supporting 10 different use cases is much more cost-effective and efficient than acquiring or developing, and then maintaining, 10 different point solutions.

Making analytics more accessible through platforms seems like the fastest way to start tapping into the 88 to 99 percent of data that is not being analyzed.

 


Democratizing. Operationalizing. Institutionalizing. Systematizing.


Every person in your organization comes from a different background, different training and different perspective. Without some sort of social and operational glue, chaos could reign when these different people work together on a common task. This is especially true when working on analytics.

The glue that holds analytics organizations together has four components:

  • Democratizing means the ability to spread the use and benefits of a solution throughout the organization. It is a way to remove silos and other barriers, or at least bridge them and foster more collaboration and correspondingly more productivity.
  • Operationalizing means the ability to integrate a new tool, skill or approach into organizational processes. Successfully operationalizing something necessitates communicating its rationale and business value, making it easily available and training personnel in how to effectively employ it.
  • Institutionalizing means the ability to enshrine consistent tasks, processes, policies and other approaches as “the way we get things done in our organization.”
  • Systematizing is similar to institutionalizing. It means having consistent and systematic processes and methods of conducting business; in other words, a collection of related, institutionalized approaches.

Situational intelligence lends itself to democratizing, operationalizing, institutionalizing and systematizing in some important ways. Intuitive visualization of analytics results makes them accessible to more people, smoothing the way to democratic use and enshrinement as “the way we get things done.” Because spatial-temporal-nodal analytics delivers relevant context, a broad range of workers, from the back office to the field, can participate in and benefit from analytics. The focus on users taking action based on analytics (rather than on users performing analytics) means situational intelligence moves quickly into operations.

When all the different line workers, field workers, knowledge workers, managers and executives in your organization directly see and benefit from new and improved analytics processes, those processes are more likely to become institutional, systematic and part of your organizational culture. Interpersonal and interdepartmental collaboration rises, and everyone enjoys getting things done efficiently, optimally, anytime and anywhere.


The Relationship Between Analytics and Situational Intelligence


This blog contains categories and posts for both situational intelligence and analytics. Outside of this blog, these terms are sometimes used interchangeably, so I thought it would be worthwhile to describe the relationship between them. Generally speaking, analytics is a component of a situational intelligence solution.

Generically, analytics is a broad term for a type of computational software. More specifically, it describes algorithms, models and an entire category of software applications. Specific types of analytics generate outputs that range from insightful details about past events to recommended responses to predicted future events.

Because we typically use analytics to obtain a detailed understanding of past events and to predict future events, the output of the analytics must be readily understood and acted upon. This is one of the reasons why analytics algorithms and models are generally embedded within an application program that makes the output available and actionable to users and to other systems.

Algorithms and models operate on data, so analytics must somehow have access to systems and sources of data (generally in predetermined formats). This is another reason that analytics is embedded within an application program – to seamlessly integrate data access with analytical capabilities.

Analytics can be delivered in several different forms: as native algorithms (e.g., as an R package), as specific models and point solutions, and as a salient component of “intelligence” solutions, such as situational intelligence solutions.

Situational intelligence is the latest generation of intelligence solutions. It accesses data from many systems and sources and then, depending upon the use case and solution, correlates, analyzes and presents the results of analytics in contextually relevant and intuitively actionable ways.

As an example, consider a situational intelligence solution that generates an optimal work schedule based on rules, constraints and changing conditions. While the solution may use stochastic optimization, an analytical method, to generate the optimized output, the complexity of this particular analytical method is hidden from end users, who receive output familiar and actionable to them: schedules and work orders.
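As a rough illustration of this pattern (a toy sketch, not any vendor's actual implementation), the analytical method, here a simple randomized hill-climbing heuristic standing in for stochastic optimization, can be wrapped in a function whose only visible output is a familiar artifact, a per-worker work order list. All names and numbers below are hypothetical.

```python
import random

def optimize_schedule(jobs, workers, iterations=2000, seed=42):
    """Toy stochastic optimization: randomized hill climbing that assigns
    job durations (in hours) to workers, minimizing the makespan (the
    latest finish time). The caller sees only the resulting schedule."""
    rng = random.Random(seed)

    def makespan(assignment):
        loads = [0] * workers
        for duration, w in zip(jobs, assignment):
            loads[w] += duration
        return max(loads)

    # Start from a random assignment, then repeatedly try moving one
    # randomly chosen job, keeping any move that does not worsen the makespan.
    best = [rng.randrange(workers) for _ in jobs]
    best_cost = makespan(best)
    for _ in range(iterations):
        candidate = best[:]
        candidate[rng.randrange(len(jobs))] = rng.randrange(workers)
        cost = makespan(candidate)
        if cost <= best_cost:
            best, best_cost = candidate, cost

    # Present the result as end users expect it: work orders per worker.
    schedule = {w: [] for w in range(workers)}
    for i, w in enumerate(best):
        schedule[w].append(f"job-{i} ({jobs[i]}h)")
    return schedule, best_cost

schedule, makespan_hours = optimize_schedule([4, 2, 7, 3, 5, 1], workers=2)
```

The point of the sketch is the interface, not the algorithm: a planner receives `schedule` as a list of work orders and never needs to know that an iterative stochastic search produced it.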

Embedding analytics within intuitive and easy-to-use application software removes barriers to use and consumption. This approach also extends and operationalizes analytics throughout the organization by delivering information that people can consume and comprehend at a glance, when and where they need it, to drive and affirm decisions and actions. It spreads the power of analytics beyond the IT “glass house” and into the hands of the people taking action to achieve organizational goals.

Seamlessly integrating analytics into situational intelligence applications that elegantly handle data input and output makes analytics accessible to more people and drives more favorable outcomes. This method of embedding analytics is among the best ways to democratize and operationalize it, and it also clarifies the relationship between analytics and situational intelligence.
