Improving Your Operations with Data Visualization Applications


[Image: operational and cost efficiency]

Visual analytics is an effective way to understand the stories in your data and take appropriate action. Effective data visualizations powered by visual analytics let you easily and interactively dive deep into your data to identify opportunities to improve efficiency and productivity, cut costs, and optimize resources, all of which are at the crux of increasing operational efficiency. Operations leaders therefore want to quickly understand what is in their data so they can initiate and monitor improvement programs and make the best possible decisions for the situations they regularly face.

But there’s a catch – while it’s tempting to believe that putting powerful data visualization authoring software in the hands of business users will result in useful solutions, this is rarely a recipe for success. Creating effective data visualizations requires expertise; the authoring software itself does not magically surface insights. Human expertise is required to implement the kind of functionality that surfaces insights and conveys them intuitively enough to act on for operational improvements.

Basic tables and charts are easy to create. However, solving problems and enabling data discovery that leads to root causes and opportunities to improve operations is an entirely different matter. Spreadsheets and most visualization software make it easy to create pretty charts and to combine tables and charts into dashboards, but they do not fully meet the needs of operations leaders. Let’s face it: if spreadsheets alone were sufficient, you’d already have all you need to run your business operations effectively.

Questions that you should ask yourself are:

Do one or more simple visualizations or dashboards containing multiple simple visualizations solve real business operations problems?

Do the visualizations surface buried insights and make them readily comprehensible and actionable?

Is it possible to clearly visualize activity on a map and see changes unfold over time?

Is it possible to synchronize components within a dashboard so they update when data is selected, filtered and sorted?

How easily can these capabilities be implemented, if at all?

The answers to these questions expose the need for specific functionality that transcends data visualizations alone; application software is required to deliver it – to be specific, data visualization applications. This is one of the reasons expertise is required: applications must be implemented to fully deliver on the promise of visual analytics. That expertise spans art, skill and knowledge that typical operations personnel do not possess. Business users rarely understand their organization’s data landscape or how to model and transform their data for visualization, and they often don’t have the time to learn and hone the expertise needed to implement data visualization applications, regardless of how simple modern data visualization development tools are to use.

Full service offerings that use best-of-breed visual analytics are a great way to obtain the needed combination of expertise and visual analytics to enable you to achieve the objective set forth in this post – to improve all aspects of operational efficiency.


Mobile Apps for Internet of Things Data Acquisition


[Image: gold, bronze, copper and iron nuggets]

The Internet of Things is in many ways a catchall phrase used to describe everything from types of devices, to communications gateways, to new service-oriented business models. IoT devices generally are capable of sensing and communicating. In the consumer sector, IoT devices include thermostats, door locks, garage door openers, and so on. In the industrial sector there are many sensors used in manufacturing processes, vehicles, heavy equipment, and the like. Sensing and communicating data has traditionally been referred to as data acquisition – a common IoT use case. What is often overlooked is the use of smartphones and tablets for data acquisition. These devices include several sensors, such as microphones, cameras and motion sensors.

The following story highlights how the mobile devices that we use every day are becoming integral to the IoT ecosystem.

Recently I was at a cafe with a friend. A former coworker of my friend, Craig, walked in, so my friend invited him to join us. My friend asked Craig, “Where are you currently working?” Craig answered, “I am working as an independent contractor developing a unique mobile app.”

With the Apple and Google app stores full of apps, and in many cases offering multiple apps that essentially do the same thing, I wondered what app could be new and unique. I quickly found out as Craig described how the app would help mining companies improve how they determine where mineral deposits most likely exist. Easier identification of mineral deposits will accelerate, optimize and lower the cost of mining – a definite game changer.

Determining places to explore and excavate is a combination of manual labor and trial and error. Miners typically pick and scrape at surfaces in a mine to collect sample material, which is then examined to determine whether it contains mineral deposits. If deposits are detected, further exploration of that area is initiated.

Craig then explained how the app works. Each mineral has a unique signature that can be identified by a spectrometer, based on how the mineral reflects light. Photos and videos taken with smartphones and tablets use compression, so the signature cannot be detected using standard photo and video apps. The app he developed interfaces directly with the video sensor so it can analyze the reflected light with the fidelity needed to recognize spectral signatures that identify specific areas where desired mineral deposits can likely be found. The identified locations are marked and uploaded to an operations center for further review and planning.
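As a rough illustration of the matching step (a sketch, not Craig’s actual implementation), the following Python snippet compares a measured reflectance spectrum against a small library of reference signatures using cosine similarity. The mineral names, wavelength grid and every numeric value here are invented for the example:

```python
import numpy as np

# Hypothetical reference library: reflectance sampled at a fixed set of
# wavelengths. All values are illustrative placeholders, not real spectra.
REFERENCE_SIGNATURES = {
    "gold":   np.array([0.30, 0.35, 0.55, 0.80, 0.90]),
    "copper": np.array([0.25, 0.30, 0.45, 0.85, 0.70]),
    "iron":   np.array([0.40, 0.42, 0.45, 0.50, 0.55]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare spectral shape while ignoring overall brightness."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_mineral(measured: np.ndarray, threshold: float = 0.98):
    """Return the best-matching mineral, or None if nothing matches well."""
    best_name, best_score = None, 0.0
    for name, reference in REFERENCE_SIGNATURES.items():
        score = cosine_similarity(measured, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name if best_score >= threshold else None), best_score

# A raw sensor reading (again, invented numbers for the example).
sample = np.array([0.29, 0.36, 0.54, 0.79, 0.91])
print(identify_mineral(sample))  # -> ('gold', 0.99...)
```

A production app would, of course, work on uncompressed sensor output at far higher spectral resolution, but the matching logic would be similar in spirit.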

This app shows how ingenuity and software can make commercial off-the-shelf smartphones and tablets applicable to data acquisition use cases. More use cases that integrate people and mobile apps into the IoT will surely ensue.

So the next time you pick up a smartphone or tablet, think of the myriad uses it can be programmed to perform, especially when connected to other devices, systems and people. If you know of clever uses of mobile apps for IoT use cases, please comment.


Use Visual Analytics to Get Started with the IIoT


Industrial IoT (IIoT) applications bring about many opportunities to increase operational efficiency by presenting personnel with timely insights into their operations. Visualizing IIoT data using visual analytics is a proven way to facilitate insight-driven decisions. At the very least, your IIoT initiative will start by integrating IIoT connectivity, visual analytics and other system components. To ensure early and ongoing success, follow the best practice of starting small, attaining quick wins, and then increasing scope and/or scale.

The first step is to connect devices and systems and use visual analytics to create a simple visualization of your IIoT data. If the IIoT devices are mobile or geographically separated, then an appropriate visualization would be to display the location of the devices on a map such as shown above. This is an effective way to verify connections and validate successful integration.
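As a minimal sketch of this first step, the snippet below drops a few device locations onto an interactive map using the open-source folium library; the device IDs and coordinates are made up for the example:

```python
import folium

# Hypothetical last-known positions reported by connected devices.
devices = [
    {"id": "truck-101", "lat": 37.7749, "lon": -122.4194},
    {"id": "truck-102", "lat": 37.3382, "lon": -121.8863},
    {"id": "truck-103", "lat": 38.5816, "lon": -121.4944},
]

# Center the map roughly on the fleet and add one marker per device.
fleet_map = folium.Map(location=[37.9, -121.9], zoom_start=8)
for device in devices:
    folium.Marker(location=[device["lat"], device["lon"]],
                  popup=device["id"]).add_to(fleet_map)

fleet_map.save("device_map.html")  # open in a browser to verify connectivity
```

Seeing every connected device appear where you expect it is a quick, visual confirmation that the integration works.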

The second step is to collect and intuitively visualize your IIoT data. At this point you can identify issues and make operational efficiency improvements. As an example, a freight trucking business can see on a map where and when its trucks are moving at slower than expected speeds. This information can be used to change routes on the fly to maximize on-time deliveries. As this example highlights, connecting to IIoT data streams and visualizing the data facilitates operational efficiency improvements.
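A bare-bones version of that check might look like the sketch below, which flags GPS readings where a truck is traveling well below the speed expected for its route segment; the column names and the 80% threshold are assumptions for illustration:

```python
import pandas as pd

# Hypothetical GPS readings already joined with the expected speed
# for each route segment; the values are illustrative.
readings = pd.DataFrame({
    "truck":        ["T1", "T1", "T2", "T2"],
    "timestamp":    pd.to_datetime(["2016-05-01 08:00", "2016-05-01 08:05",
                                    "2016-05-01 08:00", "2016-05-01 08:05"]),
    "speed_mph":    [58, 31, 62, 60],
    "expected_mph": [60, 60, 65, 65],
})

# Flag readings where actual speed falls below 80% of the expected speed;
# these are candidates for on-the-fly rerouting.
slow = readings[readings["speed_mph"] < 0.8 * readings["expected_mph"]]
print(slow)
```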

The third step is to correlate data from different systems and data sources, including time series data from devices at different locations. Visualizing data correlated by time and location makes it possible to create comprehensive big-picture views that reveal details about what happened and is happening – where, when, why and how. Using the trucking example, areas where driving speeds are consistently slower than expected are highlighted by the red lines on the map above. This information is used to refine future routes, schedules and delivery commitments.

The fourth step is to apply advanced analytics to the IIoT data to generate insights for inclusion in the visualizations. Returning to the trucking example, advanced analytics can recommend the optimal average speed to minimize fuel costs based on the weight of the load each truck is carrying. Visualizing each truck with color coding to highlight the biggest offenders makes the analytics results actionable at a glance, so that operations managers and drivers can improve driving efficiency. In the image above it is easy to see the yellow and red truck icons that represent trucks traveling outside of the optimal speed range.
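One possible way to turn such a recommendation into the green/yellow/red coding described above is sketched below. The optimal-speed formula and the band widths are invented for the example; a real model would be fitted to actual fuel and load data:

```python
def optimal_speed_for_load(load_lbs: float) -> float:
    """Placeholder model: heavier loads get a lower recommended speed."""
    return 65.0 - 5.0 * (load_lbs / 40000.0)

def icon_color(actual_mph: float, load_lbs: float) -> str:
    """Map a truck's deviation from its optimal speed to a map-icon color."""
    deviation = abs(actual_mph - optimal_speed_for_load(load_lbs))
    if deviation <= 3:
        return "green"   # within the optimal range
    if deviation <= 8:
        return "yellow"  # moderately outside the range
    return "red"         # biggest offenders, worth a dispatcher's attention

# Illustrative fleet snapshot: (truck, current mph, load in pounds).
for truck, mph, load in [("T1", 62, 20000), ("T2", 75, 38000), ("T3", 58, 10000)]:
    print(truck, icon_color(mph, load))
```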

Having completed these steps you are positioned to leverage your IIoT infrastructure and expand on your competency by combining visual analytics, data correlation and advanced analytics in innovative ways to address business problems and facilitate operational efficiencies that would not otherwise be possible. Future blog posts will cover such combinations and the corresponding operational efficiencies.


Success Stories



Business analysts, data scientists, authors and bloggers sometimes describe data mining and analytics as “finding the story in your data.” It’s a good metaphor, although I want to extend the notion from a single story to multiple stories, because there are actually a multitude of stories in your data. Each story can be told based on the types of data your organization collects and the analytics applied. Stories in your data can be about:

  • Your operations, including specific facilities, machinery and assets
  • Your sales activity
  • Your vendors and suppliers
  • Situations and events that affect or impact your organization
  • Your customers, their buying journey, their interaction with your organization and its website
  • If your organization has connected to the Internet of Things (IoT), then stories can also be told about the things, their environments, and their end-users, including how they use and interact with your products

Analytics can tell stories in many different ways; the types of analytics you use determine the story that gets told. Knowing which stories you are interested in, and which you most want to benefit from, should influence your criteria for choosing analytics. One type of storytelling, let’s call it situational intelligence, is a comprehensive fact-based story about what happened, with details about where, when, why and how. Such stories contribute to understanding situations and root causes, which can drive decisions and actions to prevent future occurrences. Situational intelligence stories are therefore extremely valuable for understanding situations so you can take prompt and appropriate action to achieve success.

Your stories become enriched with more useful and actionable detail as more analytics are applied. The following brief, high-level examples illustrate this point. Analytics enrich stories by identifying outliers, groups, patterns, and top and bottom performers. Forward-looking analytics (aka predictive analytics) enrich stories by foretelling what is most likely to happen, such as future demand. Other types of analytics identify and describe relationships between entities, enriching stories with insights into ripple effects. Analyzing relationships enables telling detailed stories about the magnitude of situations – what else is or might be affected, and to what extent. Analytics can even supply a successful story ending by considering all possible outcomes and choosing the best or optimal one. It should now be clear how multiple analytics can be applied and combined to tell very rich and detailed stories – actionable stories, in fact, because the stories in your data drive investigation, decisions, actions and the best possible outcomes.

In addition to applying and combining different analytics to research the stories and compose compelling content (metaphorically speaking), there is another primary method of enriching stories – with data. After all, the data available to the analytics is the foundation and critical component. It is generally the case that the more data available, the more comprehensive the story that can be composed. And not just more data from the same source (e.g., five years of historical data versus two), but data from complementary and supplementary sources. Your own data sources can and should include data streams from your website and/or your integration with the Internet of Things (IoT). You can and should also augment your data with relevant external sources, such as web services that provide weather, traffic, spot market prices, exchange rates, etc. Using a multitude of data sources (referred to as broad data) is similar to how a journalist researching a story will seek out all practical and relevant sources of information to tell the most accurate and complete story possible.
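As a small illustration of this kind of data fusion, the sketch below uses pandas to attach the nearest-in-time weather observation to a set of internal event records; the tables, columns and values are invented stand-ins for real sources:

```python
import pandas as pd

# Internal events (e.g., deliveries) and an external weather feed;
# both tables are illustrative stand-ins for real data sources.
events = pd.DataFrame({
    "event_time": pd.to_datetime(["2016-06-01 09:10", "2016-06-01 14:40"]),
    "site":       ["plant-A", "plant-B"],
})
weather = pd.DataFrame({
    "obs_time":   pd.to_datetime(["2016-06-01 09:00", "2016-06-01 14:30"]),
    "temp_f":     [61, 74],
    "conditions": ["fog", "clear"],
})

# merge_asof pairs each event with the most recent weather observation,
# enriching the internal story with external context.
enriched = pd.merge_asof(events.sort_values("event_time"),
                         weather.sort_values("obs_time"),
                         left_on="event_time", right_on="obs_time")
print(enriched)
```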

Let’s take this analogy one step further – consider a story about a customer’s use of a product that your organization produces and sells. The [usage] story can be told just by data captured from the customer’s interaction with your organization. If the customer is using a web-based service and/or an IoT device, your organization can receive an ongoing stream of data. If all of the data about the customer is fused with data from another system, such as the ecommerce system that captures aspects of the customer’s buying journey, then the resultant story becomes richer. Your organization can use such stories to understand and predict how the customer might acquire new products, what else the customer might need or want to buy, and how they might interact with your organization in the future. A more compelling story – a success story; one that can drive personalized experiences and sales offers that have a high likelihood of achieving customer retention and increased revenue.

More and more organizations are applying analytics to gain a competitive advantage using the actionable stories in their data. You too should embrace analytics to ensure your own success stories.


Why We Need Situational Intelligence, Part 3


In Part 1 and Part 2 of this series I addressed why situational intelligence is a natural and essential method of decision-making, especially apropos for real-time business operations. Inherent in my argument is an optimistic belief that people make the best decisions and take the best actions with the information at hand. That is the crux of the matter – the information at hand, and how accurate and actionable it is. What information is available to decision makers? Does it contain insights? Is it current? Is it clear, or is interpretation and/or further analysis necessary before the information is actionable? Is it reliable? How comprehensive is it? Correspondingly, how much uncertainty shrouds the information, the decision, the action and the outcome? What are the risks of making a bad decision (including no decision)?

Ideally the answers to the preceding rhetorical questions should all be encouraging. But how can these attributes of data and insights for decision-making be assured, especially when decisions are made by different people, when decisions are needed for unplanned situations, and when timeliness is important? Systematized decision-making aided by technology-generated intelligence is a way to ensure that accurate insights are derived from data and are actionable by decision makers. As discussed in the preceding blogs (and others too), advanced analytics and visual analytics are essential building blocks for supporting operational decision-making. Data must be transformed into insights and intelligence, and the insights must in turn be transformed so they are readily comprehended at a glance and are actionable.

Another key consideration is having a broad composition of data for analysis. The more data from relevant sources within the enterprise, from the IoT and from external sources, the more insights analytics can derive. External data enriches intra-enterprise data sources with relevant context that is useful when decision makers require supplemental information, such as when the insights brought forward are not immediately actionable. In such cases further discovery helps decision makers gain the understanding and confidence needed to make a decision. This is where additional data sources, and the added context they bring, facilitate interactive data exploration so that decision makers can make timely and favorable decisions. Sources and types of external data include weather, traffic, news, spot market prices and social media.

Having live connections to data sources ensures that decisions are made using the most up-to-date data, and also enables interactive exploration of underlying data to deeply understand and resolve complex multifaceted situations. A single system that maintains live connections to data sources yields another benefit – it helps organizations bridge their data silos and unify their data assets.

Here at the end of this blog series, situational intelligence now sounds easy, and somewhat obvious too – connect to relevant data sources, apply analytics, make the resulting insights and underlying data available to decision makers with intuitive visualizations so they can consistently make the best possible decision in any situation. If you use an off-the-shelf solution to implement situational intelligence, getting started is also relatively simple. Decide for yourself. What does your situation require?

If you have experiences, thoughts, opinions on this topic, please comment and share them.


Why We Need Situational Intelligence, Part 2



In my previous blog I addressed the need for situational intelligence (SI) as an approach to decision-making that combines insights with relevant context to create the big picture we need to make the best possible decisions with the lowest risk. I concluded that blog by promising to explain why and how various technologies such as data access and fusion, analytics with machine learning, artificial intelligence and visual analytics come together to support situational intelligence.

Situational Intelligence itself is not a technology, nor can you use just one technology to create it. Rather, a situational intelligence approach requires a combination of integrated technologies. The main types of technologies are listed below. Seamlessly integrating these technologies creates actionable insights that are especially applicable for real-time operational decision-making.

  • Live connections to data, both at rest and in motion, in a variety of formats and structures (including no structure at all). Access to multiple, disparate sources of data provides the context for new, deeper insights. Connecting directly to data creates great efficiency and savings, because data access and preparation often consume as much as 80% of the effort of making data-driven decisions.
  • Analytics, big data, and streaming foundational technologies (such as Spark, Hadoop, SAP HANA) that are inherently scalable and enable high-performance execution of analytics and processing of large datasets. These foundations are typically distributed and use in-memory processing so that complex software executes and generates answers and insights as quickly as possible. Streaming message brokers such as Kafka and Internet of Things (IoT) platforms are also necessary to manage streams of data that can be passed through streaming analytics for real-time insights and/or stored for inclusion in subsequent applications of advanced analytics that derive deeper insights (a minimal consumer sketch follows this list).
  • Advanced analytics and streaming analytics that derive insights from the data at rest and data in motion, respectively. Because situations inherently occur at specific times and locations, the ability to correlate spatial and temporal relationships increases the insights that can be derived. Similarly, the ability to correlate entity-to-entity relationships increases the insights by revealing actual and likely ripple effects. Altogether these analytics make it possible to identify the what, when, where, why and how of situations that happened or may happen. In addition, machine learning allows the analytics to adapt to your data and to your use cases.
  • Visual analytics is essential to complete the transformation of data into actionable insights. Intuitive renderings of the relevant data, and of the insights derived by analytics, help users comprehend and act on data at a glance. Including output from visual analytics in alerts via email and SMS is a powerful way of notifying people about critical matters and focusing their attention on acute situations and the decisions to be made.
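As a minimal sketch of the streaming piece – assuming a Kafka broker on localhost, a topic of JSON-encoded sensor readings, and the kafka-python client – a consumer that applies a simple threshold rule in-stream might look like this:

```python
import json
from kafka import KafkaConsumer  # kafka-python client (assumed installed)

# Assumed setup: a local broker and a topic of JSON sensor readings.
consumer = KafkaConsumer(
    "sensor-readings",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# A trivial in-stream rule: alert when a reading crosses a threshold.
# Real deployments would hand the stream to richer streaming analytics
# (e.g., Spark) and archive it for subsequent advanced analytics.
for message in consumer:
    reading = message.value
    if reading.get("temperature_c", 0) > 90:
        print(f"ALERT: device {reading.get('device_id')} overheating: {reading}")
```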

In summary, situational intelligence is an approach that combines data and analytics, including visual analytics, to aid human decision-making. Insights from advanced analytics and streaming analytics are combined with relevant underlying data to create context so that decision-makers have a complete understanding of each situation and make decisions that lead to the best possible outcome.


Why We Need Situational Intelligence, Part 1


[Image: air traffic control]

Why do we need situational intelligence?

When we lack information, or information is not easily consumed or comprehended, then our decisions are compromised. We don’t have the right information, at the right time, or in the right form to lead to the best possible outcome.

At its core, situational intelligence handles all of the relevant data needed to derive insights that will guide your decisions. Today’s environment typically means large volumes of disparate data pouring too quickly into over-burdened systems, leaving executives and analysts alike wondering whether they can believe the data they see. Some insights can be derived only by sifting through this ever-growing mountain of data looking for hidden correlations. Correctly correlating and analyzing all of the necessary data, and correctly presenting results, recommendations and predictions, is the biggest differentiator of situational intelligence over traditional analytics.

Unlike typical business intelligence or stand-alone analytics solutions, with situational intelligence we receive valuable details, recommendations and predictions that typically result in enhanced competitive advantage through:

  • Cost savings
  • Increased efficiencies, productivity and performance
  • Increased revenues
  • Improved client engagement that raises satisfaction
  • Better understanding of exposure that facilitates better management of risk

To put this into a concrete example, consider why airports need flight and ground traffic control systems. There is the obvious answer – to know where airplanes, vehicles and people are, both in the air and on the ground, in order to safely and efficiently stage their movement. Managing traffic at an airport requires context, such as current and forecasted weather, to achieve the best possible safety and efficiency. Even a seemingly simple matter such as a broken jetway has many consequences that affect ground control, fueling, cabin cleaning, luggage, passenger relocation, food service, etc. Now imagine the complexity of handling a crisis such as an airplane needing to make an unplanned emergency landing.

Managing an operation with as much complexity, interdependencies and consequences as an airport requires the staff in the operations control center to have a live, real-time, big-picture view of everything that is happening and, ideally, also what is most likely to happen. As you surely recognize, keeping track of so much fast-changing information in a person’s head alone is impossible and prone to errors and omissions.

Clearly having as much relevant and easily comprehensible information as possible provides the context that we naturally seek to guide our decisions and actions. In a follow-on blog I will explain why and how various technologies such as flexible data access, analytics, machine learning, artificial intelligence and visualization should be seamlessly integrated to create and deliver situational intelligence that is truly actionable.

 

(Image courtesy of Flickr)


Putting Analytics in the Hands of End-Users


[Image: anyone, anytime, anywhere]

Operationalizing a technology or process (or both) means efficiently getting it into the hands of end-users who will realize value and benefits in their roles and, by doing so, extend those benefits to their organization.

This blog focuses on operationalizing analytics for human decision support, which, as you’d expect, accounts for most business decisions. TDWI Research reveals in a recently published best practices report that 75% of an organization’s decisions supported by analytics are made by humans. The entire report, which includes a thorough examination of operationalizing analytics and the interrelated topics discussed in this blog, can be downloaded using this link: “Operationalizing and Embedding Analytics for Action.”

Analytics, simply put, is a category of information processing methods that derives value from data. Analytics is necessary to operate on data that is too complex and voluminous for manual methods. Specific types of analytics perform vastly different functions and generate different outputs, including insightful details, predictions, recommendations, optimized choices, outliers (anomalies), patterns and trends. While the output of analytics may be interesting, the value and benefits are only realized when specific actions are taken. That means the recipients of analytics output must be able to consume it, comprehend it, and effectively use it to make decisions and take action. Often, the shorter the time to action, the better.

Until recently, analytics was confined to IT and data science professionals, impeding organizations from maximizing the value in their data and the return on their analytics investments. The TDWI Research report also cites the necessity, value, recognition and trend of making analytics and its output available to a wide group of people within an organization. Among the survey results is an increasing awareness and willingness by organizations to operationalize analytics, with 88% of survey respondents claiming they have analytics in production or development that could be considered operationalized.

Another impediment is delay: gaps between when insights are available for decision support and when they are actually needed diminish the value of the insights or, worse, allow undesirable and potentially preventable consequences to occur. The most common cause of delay is the inherently slow manual process required to gather the necessary data, prepare it, have specific personnel run analytics programs, review and otherwise process the output, and then convey the results to the decision makers. Each of these steps can take hours or days, even weeks in extreme cases. Timeliness is therefore an important characteristic of a well-operationalized solution. When appropriate actions are taken faster, gains can be maximized and adverse consequences can be averted or minimized.

The good news is that modern technologies make it possible to put actionable insights from analytics into the hands of end-users with few or none of the delays just discussed. That is, operationalized analytics can result in a very short or zero time-to-insights.

Making results available in a timely manner can be achieved by offering analytics on a self-service basis and/or making the output continuously available and readily consumable. One example of the latter is displaying intuitive visualizations of analytics output on a monitor wall in an operations control room. Many organizations also need to receive and act on analytics output both inside and outside a control room. Delivering analytics output to people at their desks, on the factory floor, in the field and wherever they are is typically accomplished using browser-based applications, mobile devices, and ubiquitous communications networks (e.g., WiFi, 4G LTE, etc.).

Another best practice for operationalizing analytics is to embed analytics, and its visualized output, into the existing business processes and operational applications used to facilitate those processes. Analytics processing can be hidden in the background so that what end-users receive is seamlessly integrated into the screens and dashboards they’re accustomed to using. When this type of visual blending is not possible in the native application, situational intelligence, with its ability to create composite views, can be used to combine the output of other applications with analytics in a single app window. This latter approach creates a broad and relevant context for decision-makers, enhancing their ability to act quickly and appropriately with confidence.

For the reasons just mentioned, situational intelligence is a powerful and highly effective way to operationalize analytics, because this type of enterprise application lends itself to intuitive user interfaces and at-a-glance presentation of information and analytics results. Tightly integrating visualizations with data and analytics results, especially in browser-based apps, makes insights readily consumable and actionable by anyone, anywhere. As a result, organizations from small start-ups to large global enterprises empower workers and correspondingly improve their business results with widespread use of analytics.

As technology marches forward, processing power and analytics-specific frameworks such as Spark enable complex analytics jobs to complete quickly, even near-instantaneously in some cases. The ever-present Internet and browser-based user interfaces make analytics with richly visualized results available to anyone, anywhere, on large screens as well as handheld mobile devices, truly putting analytics into the hands of a wide population of end-users. The benefits provided by situational intelligence are accelerating the ability to effectively operationalize analytics.

The age of operationalized analytics, catalyzed by situational intelligence that delivers timely, readily consumable, actionable insights to anyone anywhere, is here. Fasten your seat belt: the pace of taking action driven by analytics is accelerating.

Another option for operationalizing analytics is automation, which is when systems automatically make decisions and initiate actions via direct machine-to-machine communications. In these cases humans are not in the decision making loop. Automation is an important topic that will be addressed in future blogs.


Demystifying Analytics


[Image: black box]

Unlike many other enterprise data processing solutions, analytics is viewed as a mysterious black box. Historically some analytics and models were so complex that only experienced IT professionals could execute analytics jobs on costly large computing platforms. IT personnel, database administrators, software developers, and other skilled personnel were necessary to interpret the output of analytics to arrive at the proverbial “answer.” In such cases the time-to-answer could be hours, days and even weeks.

The air of mystery especially envelops analytics that derive likely outcomes (a.k.a. “predictive analytics”). Predictive results are actually likely outcomes derived by processing large volumes of data with specific mathematical and statistical methods. Nevertheless, some say colloquially that predictive analytics can predict the future. Statements like this add to the mysticism about analytics.

Today’s vocabulary of analytics increases the aura of mysticism: Hadoop, big data, artificial intelligence, machine learning, data science, stochastic optimization, etc.

Because of this mysticism and the seeming ability to predict the future, people also have a notion that analytics is an elite category of software available only to large enterprises with large budgets, extensive IT infrastructures and dedicated teams.

The good news is that specialized terminology, rarified skill sets and expensive machinery no longer confine analytics to elite glass houses. As the simplicity of analytics becomes more commonplace, the aura of mysticism evaporates. The image of an esoteric technology for an elite few fades, giving way to adoption by a broad range of workers and end-users.

Powerful yet affordable commodity hardware and other technological advances make it possible for organizations regardless of size, budget and personnel to obtain and run analytics, consume and act on the results, and realize the many benefits. Software advances such as distributed high performance computing platforms and alternatives to traditional relational databases, to name a few, bring analytics within the grasp of any organization. Advances in data visualization remove the need to post-process analytics results into readily consumable and actionable answers, whether the answers are recommendations, predictions or other forms of insight. Ubiquitous communications, modern browsers and applications that take advantage of HTML5 greatly simplify the ability to deploy software solutions of any type and complexity, including analytics.

This demystification and accessibility comes just in time. For organizations to benefit from their data, personnel must be able to receive, comprehend and act on the alerts, recommendations, predictions and other insights generated by analytics. Without this operationalization of analytics, data threatens to flood the organization without providing any value.

As analytics become more accessible, their use and results will be embedded in many applications, which in turn hides complexity and helps make analytics ubiquitous. I eagerly look forward to many new and innovative uses of analytics, and the resulting business and societal benefits.


What if My Data Quality Is Not Good Enough for Analytics or Situational Intelligence?



You may feel that the quality of your data is insufficient for driving decisions and actions using analytics or situational intelligence solutions.  Or, you may in fact know that there are data quality issues with some or all of your data.  Based on such feelings or knowledge, you may be inclined to delay an analytics or situational intelligence implementation until you complete a data quality improvement project.

However, consider not only the impact of delaying the benefits and value of analytics, but also that you can actually move forward with your current data and achieve early and ongoing successes. Data quality and analytics projects can be done holistically or in parallel.

“How?” you ask. Consider these points:

  • Some analytics identify anomalies and irregularities in the input data. This, in turn, helps you in your efforts to cleanse your data.
  • Some analytics, whether in a point solution or within a situational intelligence solution, recognize and disregard anomalous data. In other words, data that is suspect or blatantly erroneous will not be used, so the output and results will not be skewed or tainted (see the related post “The Relationship Between Analytics and Situational Intelligence” for a discussion). This ability renders data quality a moot point.
  • It is a best practice to pilot an analytics solution prior to actual production use. This allows you to review and validate the output and results of analytics before widespread implementation and adoption. Pilot output or results that are suspect or nonsensical can then be used to trace irregularities in the input data. This process can play an integral part in cleansing your data.
  • Some analytics not only identify data quality issues but also calculate a data quality score that reflects the accuracy and confidence of the analytics output. End-users can therefore apply judgment about whether and how to use the output, results, recommendations, etc. Results with low data quality scores point to where data quality can and should be improved (a toy sketch of this scoring idea follows this list).
  • Visualization is a powerful tool within analytics for spotting erroneous data. Errors and outliers that are buried in tables of data stand out when placed in a chart, map or other intuitive visualization.
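A toy version of the anomaly screening and scoring ideas above might look like the sketch below, which uses a robust median-based rule so that a single blatant error cannot hide itself by inflating the statistics; the cutoff and column name are assumptions:

```python
import pandas as pd

def screen_readings(df: pd.DataFrame, column: str, cutoff: float = 3.5):
    """Split rows into clean and suspect using a robust (median/MAD) z-score,
    and return a crude quality score: the fraction of rows judged clean."""
    median = df[column].median()
    mad = (df[column] - median).abs().median()  # robust spread, unlike std
    robust_z = 0.6745 * (df[column] - median).abs() / mad
    clean, suspect = df[robust_z <= cutoff], df[robust_z > cutoff]
    return clean, suspect, len(clean) / len(df)

# Illustrative meter readings with one blatantly erroneous value.
data = pd.DataFrame({"meter_kwh": [10.2, 9.8, 10.5, 9.9, 10.1,
                                   980.0, 10.3, 9.7, 10.0, 10.4]})
clean, suspect, score = screen_readings(data, "meter_kwh")
print(f"quality score: {score:.2f}")  # 0.90 – one suspect row out of ten
print(suspect)                        # the 980.0 reading
```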

You may be pleasantly surprised at how much success you can achieve using data that has not been reviewed, scrubbed or cleansed. So set aside your concerns that your analytics or situational intelligence implementation will fail or have limited success if you do not first resolve data quality issues.

Instead, flip such thinking around and use analytics as one of the methods to review and rectify data quality. In other words, integrating analytics into your efforts to assess and cleanse your data is a great way to leverage your investment in analytics and get started sooner rather than later.

What are you waiting for? Get started exploring and deriving value from your data, regardless of its quality.
