Re-imagining the Future of Asset Maintenance


Asset failure, or more accurately, avoiding asset failure, is big business, as it should be. For asset-intensive industries, asset failure can mean revenue loss, customer dissatisfaction, brand degradation, even regulatory fines. So improving the means by which asset failure is avoided is as important as the asset’s day-to-day production.

Many companies continue to take a break/fix approach to asset repair, or rely on cyclical preventive maintenance, where pre-set characteristics of general asset types determine when maintenance is performed. Some are considering Condition-Based Maintenance (CBM), where a parameter of an asset is monitored and repair is performed when that parameter indicates a problem or imminent failure, based on statistical models for that type of asset. But greater business benefits come from Predictive and Prescriptive Maintenance, often powered by machine learning: these approaches look at the state of each individual asset, predict its probability of failure into the future, and optimize maintenance and repair schedules based on that prediction along with other constraints.
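To make the prediction step concrete, here is a minimal, purely illustrative sketch of a failure-probability model. The sensor features, labels, and 30-day horizon are invented for the example; they are not drawn from any particular maintenance system.

```python
# Illustrative only: a toy failure-probability model trained on
# hypothetical sensor readings (vibration, temperature, lubricant pressure).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical history: one row of [vibration, temperature, pressure] per
# asset-day, labeled 1 if that asset failed within the following 30 days.
X = rng.normal(loc=[0.5, 70.0, 30.0], scale=[0.1, 5.0, 3.0], size=(1000, 3))
y = (X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 0.2, 1000) > 2.1).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Probability that this specific asset fails within 30 days, given its
# current condition -- the per-asset view that generic CBM rules do not give.
current_reading = np.array([[0.62, 78.0, 27.5]])
print("P(failure within 30 days):", model.predict_proba(current_reading)[0, 1])
```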

ARC Advisory Group recently updated its Maintenance Maturity Model to note the availability and benefits of these more sophisticated analytic approaches. They noted that moving from preventive maintenance to predictive and prescriptive models can deliver 50 percent savings in labor and materials, which has a ripple effect from improvements in shipping times to customer satisfaction. They also observe that new technologies in the industrial internet of things (IIoT) enable inexpensive, real-time asset monitoring. Measuring vibration, heat, lubricants, and other asset conditions in real time is essential for the enterprise to adopt Predictive and Prescriptive Maintenance. Creating a ‘digital twin’, or software model of the asset, gives analytics software a basis for comparing ideal and observed measurements.

Doesn’t CBM provide many of the same benefits? Perhaps to a lesser extent, but there is no reason to settle for CBM. In CBM, analytics examine the current state of the asset and raise an alarm when failure appears likely. However, not every condition that appears to be heading toward failure actually leads to one in that specific asset, and true asset maintenance optimization occurs only when an enterprise can reliably tell the difference. Avoiding unnecessary maintenance extends asset life at a fraction of the cost.

Predictive maintenance powered by machine learning should allow you to ‘see over the hill,’ beyond the current condition, to determine the most probable future outcome for each individual asset. The combination of machine learning and IIoT could prove to be the missing link in smart and effective asset maintenance.


Image courtesy 123RF: Copyright: http://www.123rf.com/profile_wi6995


CrateDB SQL Database Puts IoT and Machine Data to Work


Space-Time Insight joined Crate.io in the launch of CrateDB 1.0, an open source SQL database that enables real-time analytics for machine data applications. We make extensive use of machine learning and streaming analytics, and CrateDB is particularly well-suited for the geospatial and temporal data we work with, including support for distributed joins. It allows us to write sensor data at more than 200,000 rows per second and to query terabytes of data. Typical relational databases can’t handle anywhere near the rate of ingestion that Crate can.

Crate handles and queries geospatial and temporal data particularly well. We also get image (BLOB) and text support, which is important for our IoT solutions, as they are often used to capture images on mobile devices in the field and provide two-way communication between people and machines. Crate is also microservice-ready — we’ve Dockerized our IoT cloud service, for example.
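As a rough illustration of the kind of geospatial and temporal writes involved, here is a minimal sketch using the CrateDB Python client. The table, column names, and values are hypothetical, and the connection URL assumes a CrateDB node running locally.

```python
# Minimal sketch (not production code) of writing geo-tagged sensor data
# to CrateDB via its Python DB-API client.
from crate import client

conn = client.connect("http://localhost:4200")  # assumes a local node
cursor = conn.cursor()

# Hypothetical table: an id, a timestamp, a geo point, and one measurement.
cursor.execute("""
    CREATE TABLE IF NOT EXISTS sensor_readings (
        sensor_id STRING,
        ts TIMESTAMP,
        location GEO_POINT,
        vibration DOUBLE
    )
""")

# GEO_POINT accepts [longitude, latitude]; timestamps can be epoch millis.
cursor.execute(
    "INSERT INTO sensor_readings (sensor_id, ts, location, vibration) "
    "VALUES (?, ?, ?, ?)",
    ("pump-17", 1480550400000, [-122.03, 37.33], 0.42),
)
```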

Finally, our SI Studio platform uses Java and SQL and expects an SQL interface, so choosing Crate made integration straightforward and allowed us to leverage existing internal skill sets.

Read more at Crate.io, Space-Time Insight and The Register.


UK Smart Meter Roll Out: It’s All About The Data



Despite some 2.6 million smart meters already installed in the UK, it is the data infrastructure that is delaying the wider roll out, according to a recent BBC article. This IT project is needed to support the volume of data anticipated from the government-backed smart meter roll out.

The chart below shows how many meters have been installed since 2012. Higher volumes of data are already being collected, which reinforces the need for this important IT project to be up and running as soon as possible.

[Chart: cumulative UK smart meter installations since 2012]

(Chart and data available from the UK Department of Energy & Climate Change)

With news that the data infrastructure launch is pushed back until the autumn, what impact will this have?

How much data will smart meters generate?

A quick calculation: if each of the roughly 53 million meters across the UK were read once a month, that would be around 53 million reads per month. By contrast, smart meters that record data every 15 minutes produce 96 reads a day per meter, so 53 million smart meters would generate thousands of times more data. This is obviously a rough estimate, but it gives an indication of what the energy companies would be dealing with, and it doesn’t include status messages from the meters, which add to the mass of data being generated.
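For anyone who wants to check the arithmetic, the back-of-the-envelope version (using the rough figures above) looks like this:

```python
# Rough estimates only, taken from the figures in the paragraph above.
meters = 53_000_000

monthly_reads = meters * 1                # one manual read per meter per month
reads_per_day = 24 * 60 // 15             # a read every 15 minutes = 96 per day
smart_reads_per_month = meters * reads_per_day * 30

print(monthly_reads)                      # 53,000,000
print(smart_reads_per_month)              # ~152.6 billion
print(smart_reads_per_month // monthly_reads)  # 2,880x more readings
```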

Why is this so important, if smart meters are just about automating billing and putting an end to manual meter reading? Because there is a lot more value in meter readings and status messages than billing alone.

The benefits of smart meters are clear for consumers: tracking how much energy you are using, monitoring the effect of changes that you have made to your energy consumption, and receiving accurate bills without having to submit a meter reading.

When applied properly, data helps energy companies manage supply and demand much more easily. Energy companies benefit from analysing the data collected from smart meters to enable new rates and business models, implement demand response programs, manage solar panels more effectively and improve support for electric vehicles, to name just a few possibilities.

To benefit from the thousands-fold growth in meter data, energy companies need analytics that locate the problems and opportunities hidden inside this massive amount of data. Smart meter analytics must be intelligent enough to do the heavy lifting for users, not just make it somewhat easier for users to browse among millions of meters. Increasingly, analytics for data sets of this size need the intelligence and autonomy to make decisions independently.

Once the IT infrastructure is in place, UK energy companies can start pursuing the new value within smart meter data, analysing it to make better business decisions. All 53 million UK meters likely won’t be changed out by 2020, but that shouldn’t stop UK energy providers from using the smart meter data they already have, or will have soon.

(Image courtesy of rido / 123RF Stock Photo)


Mobile Apps for Internet of Things Data Acquisition



The Internet of Things is in many ways a catchall phrase used to describe everything from types of devices, to communications gateways, to new service-oriented business models. IoT devices generally are capable of sensing and communicating. In the consumer sector, IoT devices include thermostats, door locks, garage door openers, and so on. In the industrial sector, there are many sensors used in manufacturing processes, vehicles, heavy equipment, and the like. Sensing and communicating data has traditionally been referred to as data acquisition – a common IoT use case. What is often overlooked is the use of smartphones and tablets for data acquisition. These devices include several sensors, such as for audio, video and motion.

The following story highlights how the mobile devices that we use every day are becoming integral to the IoT ecosystem.

Recently I was at a cafe with a friend. A former coworker of my friend, Craig, walked in, so my friend invited him to join us. My friend asked Craig, “Where are you currently working?” Craig answered, “I am working as an independent contractor developing a unique mobile app.”

With the Apple and Google app stores full of apps, and in many cases offering multiple apps that essentially do the same thing, I wondered what app could be new and unique. I quickly found out as Craig described how the app would help mining companies improve how they determine where mineral deposits most likely exist. Easier identification of mineral deposits will accelerate, optimize and lower the cost of mining – a definite game changer.

Determining places to explore and excavate is a combination of manual labor and trial and error. Miners typically pick and scrape away at surfaces in a mine to collect sample material, which is then examined to determine whether it contains mineral deposits. If mineral deposits are detected, further exploration in that area is initiated.

Craig then explained how the app works. Each mineral has a unique signature that can be identified by a spectrometer, based on how the mineral reflects light. Photos and videos taken with smartphones and tablets use compression, so the signature cannot be detected using standard photo and video apps. The app he developed interfaces directly with the video sensor so it can analyze the reflected light with the fidelity needed to recognize spectral signatures, identifying specific areas where desired mineral deposits are likely to be found. The locations identified are marked and uploaded to an operations center for further review and planning.
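The core idea, as described, amounts to matching an uncompressed spectral sample against known mineral signatures. Below is a purely hypothetical sketch of that matching step; the signature values and mineral names are invented for illustration and do not reflect the actual app.

```python
# Hypothetical spectral matching: compare a raw sample against reference
# mineral signatures using cosine similarity. All values are made up.
import numpy as np

reference_signatures = {
    "gold":   np.array([0.12, 0.35, 0.61, 0.80, 0.72]),
    "copper": np.array([0.30, 0.55, 0.48, 0.40, 0.25]),
    "iron":   np.array([0.65, 0.52, 0.33, 0.21, 0.18]),
}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(sample):
    """Return the mineral whose reference signature is most similar."""
    scores = {name: cosine(sample, sig) for name, sig in reference_signatures.items()}
    name = max(scores, key=scores.get)
    return name, scores[name]

sample = np.array([0.14, 0.33, 0.58, 0.77, 0.70])  # hypothetical raw sensor read
print(best_match(sample))  # -> ('gold', ~0.9996)
```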

This app shows how ingenuity and software running on commercial off-the-shelf smartphones and tablets make them well suited to data acquisition use cases. More use cases that integrate people and mobile apps into the IoT will surely follow.

So the next time you pick up a smartphone or tablet, think of the myriad uses it can be programmed to perform, especially when connected to other devices, systems and people. If you know of clever uses of mobile apps for IoT use cases, please comment.


Use Visual Analytics to Get Started with the IIoT


Industrial IoT (IIoT) applications bring many opportunities to increase operational efficiency by presenting personnel with timely insights into their operations. Visualizing IIoT data using visual analytics is a proven way to facilitate insight-driven decisions. At a minimum, your IIoT initiative will start by integrating IIoT connectivity, visual analytics and other system components. To ensure early and ongoing success, follow the best practice of starting small, attaining quick wins and then increasing scope and/or scale.

The first step is to connect devices and systems and use visual analytics to create a simple visualization of your IIoT data. If the IIoT devices are mobile or geographically separated, an appropriate visualization is to display the location of the devices on a map, as in the sketch below. This is an effective way to verify connections and validate successful integration.
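A minimal sketch of this first step, assuming device positions arrive as simple (id, latitude, longitude) records: a bare scatter plot is enough to verify that devices are connected and reporting, although a production system would use a proper map layer.

```python
# Illustrative only: plot reported device positions to confirm connectivity.
import matplotlib.pyplot as plt

devices = [                      # hypothetical (id, latitude, longitude) reports
    ("sensor-01", 51.507, -0.128),
    ("sensor-02", 53.483, -2.244),
    ("sensor-03", 55.953, -3.188),
]

lats = [lat for _, lat, _ in devices]
lons = [lon for _, _, lon in devices]

plt.scatter(lons, lats)
for name, lat, lon in devices:
    plt.annotate(name, (lon, lat))
plt.xlabel("longitude")
plt.ylabel("latitude")
plt.title("Connected IIoT devices (illustrative)")
plt.show()
```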

The second step is to collect and intuitively visualize your IIoT data. At this point you can identify issues and make operational efficiency improvements. As an example, a freight trucking business can see a map with the locations and times where its trucks are moving at slower-than-expected speeds, and use this information to change routes on the fly to maximize on-time deliveries. As this example highlights, connecting to IIoT data streams and visualizing the data facilitates operational efficiency improvements.

The third step is to correlate data from different systems and data sources, including time series data from devices at different locations. Visualizing data correlated by time and location makes it possible to create comprehensive big-picture views that reveal details about what happened and is happening, where, when, why and how. Using the trucking example, areas where driving speeds are consistently slower than expected are highlighted by red lines on the map. This information is used to refine future routes, schedules and delivery commitments.

The fourth step is to apply advanced analytics to the IIoT data to generate insights for inclusion in the visualizations. Returning to the trucking example, advanced analytics can recommend the optimal average speed for each truck to minimize fuel costs based on the weight of the load it is carrying. Visualizing each truck with color coding to highlight the biggest offenders makes the analytics results actionable at a glance, so operations managers and drivers can improve driving efficiency: truck icons colored yellow or red represent trucks traveling outside the optimal speed range, as in the sketch below.
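A toy version of that color-coding rule might look like the following; the optimal speed band and the five-mph tolerance are invented for illustration.

```python
# Illustrative color-coding rule: how far is a truck from its optimal band?
def truck_color(speed_mph, optimal_low, optimal_high):
    """Green = within band, yellow = slightly outside, red = far outside."""
    if optimal_low <= speed_mph <= optimal_high:
        return "green"
    deviation = min(abs(speed_mph - optimal_low), abs(speed_mph - optimal_high))
    return "yellow" if deviation <= 5 else "red"

# Example: a loaded truck with an assumed optimal band of 55-62 mph.
for speed in (58, 66, 45):
    print(speed, "->", truck_color(speed, 55, 62))   # green, yellow, red
```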

Having completed these steps you are positioned to leverage your IIoT infrastructure and expand on your competency by combining visual analytics, data correlation and advanced analytics in innovative ways to address business problems and facilitate operational efficiencies that would not otherwise be possible. Future blog posts will cover such combinations and the corresponding operational efficiencies.


Industrial Internet Of Things: End Up On The Winning Side



Operations and technology executives take notice when experts such as McKinsey project $11.1 trillion in economic value by 2025 as a result of linking the physical and digital worlds.  That’s a tremendous amount of economic value in a very short time, even if the experts might be a little off in their estimates.

The impact of the internet over the last 25 years certainly supports predictions of disruption and promise as the Internet of Things (IoT) and Industrial IoT (IIoT) continue to connect the physical and the digital. Organizations that transform themselves using IIoT can become giants; those who lag or fail in their execution may become mere memories. How do you ensure you and your organization land on the right side of this disruption?

Operational data is not a new phenomenon

Mentions of IIoT pepper nearly every operations- or technology-related conference these days. Many traditional control system vendors are relabeling their offerings as part of the IIoT movement. While industrial control systems remain critical components of operations in many industries, simply rebranding existing systems is certainly not going to deliver the trillions in economic value that McKinsey and others predict. That magnitude of value creation comes only from truly transformative changes to how companies and industries operate.

Inherent risks in embarking on transformative change

Any large organization that can greatly benefit from the promise of an IIoT world has a number of existing critical assets, control systems, IT systems, processes, and skilled people that are essential to their operation.  Many industries have equipment and systems that have been acquired over several decades. Displacing all of these existing operational assets with a sparkly new, end-to-end IIoT-enabled operation is risky and typically not economically practical. Mergers, acquisitions, large IT projects and other attempts at transformative change fail at an astonishing rate. Estimated failure rates range from 30% according to the optimists to 70% from the pessimists.

If you are trying to create transformative change while relying on existing systems, processes, and people, you inevitably will face execution risks related to:

  • Lack of interoperability and openness of existing control and IT systems
  • Poor data quality in dependent systems
  • Lack of scalability, both technically and economically, of these systems
  • Insufficient internal talent, expertise, and bandwidth to manage a large project that touches the operations and IT sides of the business
  • Security exposures as you open up systems that have traditionally been on a closed loop system
  • Poorly defined objectives and accountability
  • Striking the wrong balance between building versus buying IIoT systems, ending up with either a solution that isn’t fit for purpose or a solution that exceeds cost and timeline estimates and doesn’t scale.
  • Difficulty maintaining balance of schedule, cost, ROI and executive support

How to end up on the winning side of IIoT

The risks and complexity make getting started with an IIoT initiative seem daunting. But with this sort of disruptive change, playing the laggard is not an option. How do you improve your odds of success? Here are a few recommendations:

Think big but start small – Think big about how your organization can use new data sources and analytics to improve operations and service uptime, but start small by first tackling a discrete, well-defined problem. Deliver value quickly, and then consider another problem to tackle or extend the first solution to solve other related problems.

Clarify the problem, solution and accountability – Ensure the problem, solution requirements and dependencies are clearly understood.  Appoint a clear, accountable owner for the project who has organizational support.

Prioritize vendors that have “skin in the game” – Many software, hardware and communications vendors will happily sell you the parts of an IIoT solution–platform access, software licenses, sensors, access points, gateways, network access, and servers–but leave you to sort out how to assemble these parts into something that solves your problem and provides value. Prioritize vendors who provide ongoing service with lower up-front costs. This lets you ensure the service delivers on its promised value before you have committed too much funding.

Challenge traditional thinking in your organization – What got you here won’t get you there! Clearly for many industries existing levels of security, reliability and regulatory compliance must not be compromised. However, that shouldn’t mean that new approaches such as cloud computing, internet connectivity, open source software, and commoditized hardware should be dismissed.  These will be required in many cases to realize the potential value of IIoT solutions.  Many companies use these technology solutions successfully today while balancing the associated risks.

Get Started – Don’t get stuck in analysis paralysis – Obviously it is important to ensure a problem, the solution and potential value for solving it are well understood.  It is also critical to assess risks and get necessary organizational buy-in.  Once you have done that, get started, learn and improve.  The opportunity is immense and those who lead with successful IIoT solutions will have tremendous efficiency advantages in their respective industries.

Allan McNichol is the former CEO of GOFACTORY and Managing Director for Intelligent Energy

(Image courtesy zurijeta / 123RF Stock Photo )


Three Rules For User-Centered IoT Analytics


In a recent TechTarget article, Maribel Lopez of Lopez Research says that manufacturing may have a head start in implementing Internet of Things (IoT) solutions, but she still sounds skeptical about IoT in general.

IoT “is a lot of talk and not a lot of action,” she says. “First of all, the phrase ‘IoT’ is meaningless because it doesn’t talk about anybody doing anything that’s useful. Just connecting your stuff is not enough.”

How do you make IoT solutions that are more action than talk? Lopez cites three rules for user-centered analytics in the Internet of Things:

  1. Be relevant to users:  Pushing data to users just because it’s possible is not helpful. Information presented to users needs to be relevant to a task or situation that needs attention. For instance, reporting that vibration in manufacturing equipment is within acceptable limits is of little use. Such information requires no action from the user.
  2. Do the work for users: Performing analysis for users is more useful than equipping users to perform their own analysis. Business intelligence tools may make a table of vibration data  easier to manipulate and visualize, but that manipulation and visualization work takes users away from their main task of operating the equipment. As Lopez says, “Saying that the vibration [of manufacturing equipment] is out of range is interesting yet not sufficient.”
  3. Be timely for users: Presenting users with exception data in context with time to react has far more value to users. That keeps users on task and ahead of potential issues. As Lopez says, “Saying the vibration is out of range and if it continues for the next two hours, it’s going to shut down the plant — that’s more interesting.”

Situational Intelligence abides by these rules by turning big data into little data, focusing users on the events or conditions that require attention. It’s not looking at all the data that counts; it’s looking at the right data at the right time.
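A drastically simplified sketch of that ‘right data at the right time’ filter, with an invented vibration limit and an invented two-hour reaction window, might look like this:

```python
# Toy exception filter: surface a reading only if it is out of range AND
# trending toward a critical level within the reaction window (rule 3).
def needs_attention(vibration, trend_per_hour, limit=1.0, reaction_hours=2.0):
    if vibration <= limit:
        return False                      # rule 1: in-range data needs no action
    critical = 1.5 * limit                # invented shutdown threshold
    hours_until_critical = (critical - vibration) / max(trend_per_hour, 1e-9)
    return 0 < hours_until_critical <= reaction_hours

print(needs_attention(vibration=0.8, trend_per_hour=0.1))  # False: within limits
print(needs_attention(vibration=1.2, trend_per_hour=0.2))  # True: critical in ~1.5h
```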


Analytics 2016: A Look Ahead


At the end of 2013, Gartner predicted that by 2017, one in five market leaders would cede dominance to a company founded after 2000 because they were unable to take full advantage of the data around them. They also predicted that by 2016 one in four market leaders would lose massive market share due to digital business incompetence.

Has this come to pass? Consider:

Gartner may not be prophetic, but it looks like they at least identified a major trend. So what will you need to take full advantage of big data in 2016 and remain, or become, a leader?

How much data are we talking about?

Big Data is a well-worn term by now, but how much data is big? Jeremy Stanley, the CTO of Collective, cites this IDC graph in his presentation “The Rise of the Data Scientist.”

[Graph: IDC forecast of the growth of the digital universe]

By comparing points on the curve corresponding to 2016 and 2017, we can see that about 2,000 exabytes of data will be added to the digital universe in 2016.

2,000 exabytes equals two trillion gigabytes. This is roughly equivalent to the entire digital universe in 2012. Or, for a silly spatial comparison: if 2,000 exabytes were divided among one-inch-long, one-gigabyte USB sticks, those sticks would stretch 31.6 million miles, reaching nearly from Earth to Venus.
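The arithmetic behind that comparison is easy to verify (decimal units throughout):

```python
# Quick check of the figures above.
exabytes = 2_000
gigabytes = exabytes * 1_000_000_000      # 1 EB = 1 billion GB -> 2 trillion GB
print(gigabytes)                          # 2,000,000,000,000

inches = gigabytes                        # one 1-inch stick per gigabyte
miles = inches / 63_360                   # inches in a mile
print(round(miles / 1_000_000, 1))        # ~31.6 (million miles)
```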

Where will all this data come from?

In a blog post from micro-controller manufacturer Atmel, we can see that approximately five billion connected devices will join the Internet of Things in 2016.

[Chart: consensus estimates of connected device growth, via Atmel]

This is based on a consensus estimate drawn from firms such as Cisco, Ericsson, Gartner, IDC and Harbor Research.

Like the projected growth in data, the 2016 growth in connected devices will equal the entire universe of devices just a few years ago.

What kinds of data will we see?

It’s worth reflecting on what type of data IoT devices generate, because the types of data influence the types of analytics. Those additional five billion devices will provide data that:

    • allow manufacturers to follow their product through the supply network to the end-consumers
    • communicate when and where they are being used and how often
    • communicate when they need to be refilled, replenished, repaired, or replaced
    • alert when they are operating under distress and may fail
    • provide transportation and logistics operators with more granularity in managing their cargo and fleets
    • provide convenience to the people who deployed them (such as automatically adjusting the thermostat to a comfortable climate when a person is within 15 minutes of their home)

Because devices are connected and communicating, they deliver a stream of data. This becomes time-series data, since when data is recorded, sent and received yields useful insight into the data itself and the people and activities that generate that data.

Because devices are out in the world and not trapped on a desktop or in a data center, their location matters. This becomes GIS data, since where a device is, what it’s near, and what it’s connected to on a network yields useful insight into the data itself and the people and activities that generate that data.
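In practice, a single reading from such a device carries a timestamp and a location alongside the measured value, which is what makes both time-series and GIS analytics possible. The field names below are hypothetical.

```python
# A minimal, hypothetical record combining time-series and GIS attributes.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DeviceReading:
    device_id: str
    recorded_at: datetime    # drives time-series analytics
    latitude: float          # drives GIS / spatial analytics
    longitude: float
    value: float

reading = DeviceReading(
    device_id="thermostat-42",
    recorded_at=datetime(2016, 1, 15, 8, 30, tzinfo=timezone.utc),
    latitude=37.77,
    longitude=-122.42,
    value=21.5,              # e.g. degrees Celsius
)
print(reading)
```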

Time series and GIS data require new repositories and analytics that many organizations don’t yet have. This will become a challenge for companies in 2016. (The implications of new data types are a big topic that we’ll be exploring further in 2016.)

How will we handle and analyze all that data?

In his book The Singularity Is Near, Ray Kurzweil argues that we’re drawing close to the point when a $1,000 computer will have the same cognitive power as a human brain.

[Chart: Kurzweil’s projection of computing power available for $1,000 over time]

In 2016, a $1,000 computer will surpass a mouse brain. (You didn’t realize that a mouse brain does so much, did you?) The $1,000 human brain is just a couple years away at current rates.

We’re already at the point where, for many analytical tasks, we require computerized brains to do our heavy data integration and computational work. Think of weather modeling or financial markets or piloting aircraft and spacecraft.

What software will run on these more powerful computers? A Forbes article by Louis Columbus summarizes trends in big data analytics through 2020, including this graph:

[Chart: big data analytics market growth by category through 2020, via Forbes]

In 2016, the big data analytics market will grow by approximately $1 billion across five main categories: real-time analysis and decision-making, precise marketing, operational efficiency, business model innovation, and enhanced customer experience. The analysis of transactional, time-series and GIS data applies across all five domains.

Are you ready for 2016?

Like Gartner, these other studies are not necessarily prophetic, but they do point to the overall trend. The opportunity awaits in 2016 for you to apply increasingly affordable computing and analytics power to correlating, analyzing and visualizing new types of data to generate new insights, new opportunities and new revenues, thereby avoiding the fate of the eclipsed companies listed at the start of this post.

How do you take advantage of this opportunity in 2016?

Start small and move fast in testing use cases of data-driven changes that make an impact in your operational efficiency and your relationships with prospects and customers. That covers three of the five categories of analytics listed in the Forbes article. Increased operational efficiency generates savings that you can apply to further data-driven initiatives. Improved relationships with prospects and customers increase top-line revenues and bring you market visibility. With increased top-line revenues and bottom-line savings, you’re on your way to data-driven business improvement.

Why do you need to do this? Your customers expect it and your shareholders require it, mainly because your competitors are already doing it.

(Ron Stein contributed to this post)


Operational Analytics, Business Intelligence and The Internet of Things



The Internet of Things (IoT) is rapidly changing the way business operations are monitored and managed. Connected devices detect and communicate the status of essentially any aspect of manufacturing, warehousing and distribution. Many of these same devices can also receive commands, such as to open or close a switch or valve. As this digital transformation spreads through operations, adjustments and corrections that improve processes, throughput and cost efficiency can be made ever faster.

The increased speed of process throughput and improvement now exceeds the capabilities of traditional Business Intelligence (BI) systems offering “descriptive analytics” that are inherently retrospective. The traditional BI modus operandi was to review the output from analyses and then take corrective measures. The cycle time typically spanned days to more than a month. Nowadays with IoT, the cycle time is reduced to mere hours, minutes or even seconds.

This sea change poses challenges for BI solutions that were not designed for fast cycle times, much less immediate real-time processing of streaming data. Just about every operation today is awash in data and crunched for time.

The data problem will continue to pose ever greater challenges because:

  • The Internet of Things is expanding, which means that smart sensors will soon be almost everywhere, creating additional streams of continuous data.
  • New technology will measure data at ever finer intervals, such as synchrophasors used in the transmission and distribution of electricity that measure voltage up to 30 times a second.
  • Lean operational processes, such as Kanban and flow, improve operations and just-in-time production and inventory, and generate large volumes of data in the process.
  • Digital customer service increases the number of touch points between customers and vendors, generating still more data.

For all this data to make an immediate impact on your operations, you need to be able to capture it, normalize it, and in many cases analyze it immediately.
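A toy sketch of what “analyze it immediately” can mean in code: keep a short sliding window per sensor and flag readings that deviate sharply from the recent mean. The window size and the three-sigma rule are illustrative choices, not a prescription.

```python
# Illustrative streaming check: flag readings far outside the recent window.
from collections import defaultdict, deque
from statistics import mean, pstdev

windows = defaultdict(lambda: deque(maxlen=30))   # ~1 second of 30 Hz data

def ingest(sensor_id, value):
    """Return True if this reading looks anomalous against its recent window."""
    window = windows[sensor_id]
    anomalous = False
    if len(window) >= 10:
        mu, sigma = mean(window), pstdev(window)
        anomalous = sigma > 0 and abs(value - mu) > 3 * sigma
    window.append(value)
    return anomalous

for v in [230.1, 230.2, 229.9, 230.0, 230.1, 230.2, 230.0, 229.8, 230.1, 230.0, 245.0]:
    print(ingest("phasor-7", v))    # False for steady readings, True for the spike
```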

This is where traditional BI solutions fall down. BI was not and is not designed for real-time analytics of large volumes of high-velocity data. It enables users to ask questions by querying their data, but leaves it to the user to convert the data-out responses into usable, actionable answers and then decide how to apply them. More specifically, BI systems were originally designed for producing data and reports, organized and visualized in presentable formats (e.g., tables, graphs). This was and still is very useful and valuable, but it’s not the same as enabling ongoing and, in many cases, real-time operational process management.

To take a data-driven approach to improving operational efficiency, you need a more comprehensive analytics approach that integrates and analyzes multiple sources of data, both in batches and in real time, to deliver insights you can act on immediately to drive and/or fully automate business operations.

The need for a more comprehensive solution that transcends the now limiting capabilities of BI systems has been met by a new category of enterprise software solutions referred to as “situational intelligence” (SI). Situational intelligence is a superset of BI capabilities that adds analysis of operational systems with purpose-built advanced analytics that can consume any type of data: internal, external, structured, unstructured, big, streaming and more.

With access to all this data and an understanding of its contribution to the big picture, situational intelligence illuminates the what, where, when, why and how of every asset and situation to provide context needed to make fast and confident business decisions that lead to optimal actions and outcomes.

I strongly recommend that organizations not only adopt and operationalize advanced analytics, but do so within the context of SI solutions to thrive and survive as the IoT transformation continues to unfold.

That’s a bold statement, I know. In coming posts I’ll discuss specific use cases to show how situational intelligence optimizes operations, helps handle uncertainties that arise, and detects and corrects anomalies as they occur.


Can Connected Cars Learn to Increase Their Operational Efficiency?


You become more adept at driving familiar routes over time. You learn the best times to leave to avoid traffic, which highway lanes move faster, and where large puddles form after heavy rainstorms. Could your connected car also become more adept over time?

Using the spatial, temporal and network analysis of situational intelligence, a connected car could learn your normal commuting routes and times, correlate those with weather records and your fuel-efficiency history, and offer suggestions for how to drive the route more fuel-efficiently. For connected hybrid cars, such data could be used to optimize when to draw on battery charge versus gasoline for better fuel efficiency. Similar analytics might make alternate suggestions if you prefer to make the trip as quickly as safety allows, regardless of fuel efficiency.

Connected cars could perform their own version of air conditioning demand response. On hot days, cars could analyze the forecasted outside temperature along your route, or draw on outside temperature readings from other connected cars ahead of you. Those other cars could also share their operating performance related to specific stretches of road. Your car would then combine this with your interior comfort preferences and your current fuel supply and usage.

Based on all this information, your car could control your air conditioner in much the same way your utility controls air conditioning during a demand response event, pre-cooling the cabin on flat stretches of road and then throttling back cooling while driving up hills. This would mean better fuel efficiency for you without sacrificing comfort while you drive.

Demand response is just one way that connected cars could learn from one another to increase operational efficiency. Crowd-sourced predictive maintenance might be another way.

Suppose that past data shared between connected cars shows that drivers in your city with your make and model of car tend to need their water pump replaced after 57,000 miles of in-city driving, regardless of highway driving miles. Based on this data, your car could warn you of a potential pump failure before it happens.
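The warning logic in that scenario could be as simple as the sketch below; the 57,000-mile figure comes from the example above, while the 2,000-mile warning margin is invented.

```python
# Illustrative crowd-sourced maintenance rule for the water pump example.
def water_pump_warning(in_city_miles, peer_failure_miles=57_000, warn_margin=2_000):
    if in_city_miles >= peer_failure_miles:
        return "Water pump past typical failure mileage -- service soon."
    if in_city_miles >= peer_failure_miles - warn_margin:
        return "Water pump approaching typical failure mileage -- schedule service."
    return None

print(water_pump_warning(55_500))   # approaching the crowd-sourced threshold
print(water_pump_warning(40_000))   # None: no warning yet
```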

Your car could even, with your permission, share this water pump information with others. Your mechanic might receive the update and order the part ahead of your next appointment, saving you time at the shop and preventing a future breakdown while out on the road. The car manufacturer could use this water pump information to improve product design and reliability and inform dealers of potential issues. Makers of after-market car parts might pay to receive this sort of reliability information to help them design and market products.

Connection, analytics and machine learning are still on the horizon for the average car, but that horizon is fast approaching. I think we’ll find great new conveniences and operating efficiencies as that horizon draws near.
