We’ve all heard predictions about the billions of things — and dollars — that will make up the IoT mega-trend by 2050. But those numbers don’t tell the full story. To unlock the true potential of IoT, you must overcome data challenges even more than problems surrounding the “things” themselves.
These are best described as “last mile” data problems, ranging from extracting data out of devices and remote platforms to interpreting analytics in ways that drive productivity and peak performance. Whether you’re talking about a connected home or an industry-scale solution, there’s often a disconnect between collecting data and exposing it in a useful, explorable form.
1. Demand Interactivity

Think back to when Steve Jobs first introduced the iPhone. His argument for a large touchscreen was that every app needed its own user interface. A similar argument applies to analytics.
Every question we ask of data needs its own chart and visual perspective, which is especially true when it comes to the exploding amounts of sensor data that form the foundation of IoT. Unfortunately, most IoT applications ship with one-size-fits-all views, or “dead-end dashboards.” They answer a pre-determined set of questions, and nothing more.
Tools need to be far more flexible to accommodate a user’s needs. “Drillability” is crucial to making IoT data useful. For example, you may be able to use an IoT application’s data—say, for a broken engine—to predict the frequency and type of future failure events. But what if you wanted to look at specific parts that fail? Interactivity and shareability are key to answering this natural follow-up question.
Ideally, users will have casual, in-depth conversations with their data—and with other data explorers—so everyone can uncover the permutations and patterns that create change.
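To make “drillability” concrete, here is a minimal sketch in Python over hypothetical engine failure records (every field name and value is invented for illustration): a first aggregation answers the dashboard-level question, and a second pass drills into the parts behind the most common failure type.

```python
from collections import Counter

# Hypothetical failure-event records from an IoT application's engine sensors.
# Field names (engine_id, failure_type, part) are illustrative only.
events = [
    {"engine_id": "E1", "failure_type": "overheat", "part": "coolant pump"},
    {"engine_id": "E2", "failure_type": "overheat", "part": "fan"},
    {"engine_id": "E1", "failure_type": "vibration", "part": "bearing"},
    {"engine_id": "E3", "failure_type": "overheat", "part": "coolant pump"},
]

# Dashboard view: how often does each failure type occur?
by_type = Counter(e["failure_type"] for e in events)

# Drill-down: within the most common failure type, which parts actually failed?
top_type, _ = by_type.most_common(1)[0]
by_part = Counter(e["part"] for e in events if e["failure_type"] == top_type)
```

A dead-end dashboard stops at `by_type`; an interactive tool lets the user pivot to `by_part` (and beyond) without going back to engineering.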
2. Integrate Your Data

These in-depth questions are closely related to the second key to IoT success: integration. It’s not just interactive data analysis that provides answers; it’s also combining IoT data with additional context.
Let’s start with a consumer example, like combining your Fitbit data to look for a link between your exercise regimen and your sleep patterns.
Now, imagine discovering enterprise-level insights by blending disparate data. Sensors embedded in a jet engine can help predict when it might need service; acting on those predictions could preempt failures and save billions of dollars – even lives. And by integrating that sensor data with other information, such as maintenance costs, it could also inform better budgeting decisions by product or region.
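As a sketch of that kind of blend, the snippet below joins hypothetical sensor-predicted service events with an equally hypothetical finance table to roll maintenance budgets up by region. Every name and figure is invented for illustration.

```python
# Predicted service events per engine (from sensor analytics), plus a
# finance table of cost per service event. All values are illustrative.
predicted_service = [
    {"engine_id": "E1", "region": "EMEA", "events": 2},
    {"engine_id": "E2", "region": "APAC", "events": 1},
    {"engine_id": "E3", "region": "EMEA", "events": 3},
]
cost_per_event = {"E1": 40_000, "E2": 55_000, "E3": 40_000}

# Blend the two sources: roll predicted maintenance cost up by region.
budget_by_region = {}
for row in predicted_service:
    cost = row["events"] * cost_per_event[row["engine_id"]]
    budget_by_region[row["region"]] = budget_by_region.get(row["region"], 0) + cost
```

Neither source answers the budgeting question on its own; the insight only appears once the sensor data and the financial context are joined.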
3. Iterate with Imperfect Data

We live in a world where “perfect data” is increasingly an oxymoron. However well your data may be stitched together, it’s likely stored in a source you can’t connect to, missing key elements, or not formatted for deep analysis. These drawbacks apply equally to IoT applications, especially while there is no consensus on standards to support device interoperability.
Rather than having incomplete data paralyze your business, you must iterate toward the right answers. In particular, this applies to companies without large volumes of data to mine. Several organizations are focusing on simple streams of sensor-based data to drive projects that uncover simple insights and deliver early adoption of analytics. These smaller efforts face lower entry barriers and build momentum to take on larger challenges.
As you iterate, you learn that “good enough” data is usually sufficient to answer most, if not all, questions directionally. Moreover, a better understanding of data gaps helps you fix process issues that improve how your data is captured and ingested—and move you closer to actionable insights.
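A tiny illustration of answering a question “directionally” despite gaps, using invented sensor readings (`None` marks a missing reading): you can report both your data coverage and the trend without waiting for perfect data.

```python
# "Good enough" weekly sensor readings with gaps (None = missing reading).
# Values are invented for illustration.
weekly_temps = [71.0, None, 73.5, 74.0, None, 76.2]

present = [t for t in weekly_temps if t is not None]

# Coverage quantifies the data gap, which points at the capture/ingest
# process issue to fix next iteration.
coverage = len(present) / len(weekly_temps)

# Directional answer: is the metric trending up across the readings we have?
trending_up = present[-1] > present[0]
```

The gaps don’t block the directional answer, and measuring them tells you exactly where the ingestion process needs fixing.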
4. Embrace Data Gravity
Do you track website clickstreams or measure consumer sentiment? If so, you have external data, and it’s being generated and stored in the cloud. Why? Lower overhead, fast startup time, and infinite scalability. Constellation Research estimates that by 2020, 60% of mission-critical data will reside outside a business’s walls—that’s more than half of your data generated externally, in just three years.
What does this mean for IoT analytics? Traditionally, business data was generated behind the corporate firewall—so it made sense to have on-premises data warehouses, administrators, and analysis tools. Today, your organization must embrace platforms that conform to data gravity, conducting and managing analytics where the data sits. When you see how quickly cloud-hosted tools can work with data that already lives in the cloud, data gravity begins to make sense.
But moving to cloud-based BI doesn’t mean jumping in all at once. Remember that data gravity influences where your analytics should live. So if your data is stored across cloud and on-premises systems, your analytics need to support a hybrid deployment. Cloud services are there to support your business, not to be an all-or-nothing proposition.
5. Don’t Think Tools, Think Platform
IoT data is often heterogeneous and lives across multiple relational and non-relational systems, from Hadoop clusters to cloud warehouses to NoSQL databases. So, you have to dispel the notion that a one-size-fits-all IoT tool traverses the entire journey from data to insight.
For great analysis, you need robust data prep and enrichment, scalable storage, a catalog to aid with governance, and—to top it off—an intuitive end-user analytics platform to drive insight. Modern organizations combine the best solutions in an agile stack that can be reconfigured as needs evolve. These needs are a function of user personas, volumes, frequency of access, speed of data, and more. This stack is architected for the use case, and forms the foundation of your data strategy. Its flexibility will ultimately drive technology choices.
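As a toy illustration of a stack built from multiple tools rather than one, the sketch below enriches rows from a relational store (`sqlite3` standing in for a cloud warehouse) with readings from a CSV sensor export. All table, column, and device names are invented.

```python
import csv
import io
import sqlite3

# Source 1: a CSV sensor export (io.StringIO stands in for a file on disk).
sensor_csv = io.StringIO("device_id,reading\nD1,0.91\nD2,0.87\n")
readings = {row["device_id"]: float(row["reading"])
            for row in csv.DictReader(sensor_csv)}

# Source 2: a relational table (sqlite3 stands in for a cloud warehouse).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE devices (device_id TEXT, site TEXT)")
conn.executemany("INSERT INTO devices VALUES (?, ?)",
                 [("D1", "Plant A"), ("D2", "Plant B")])

# The stack's job: enrich warehouse rows with the sensor readings —
# two tools, one pipeline, reconfigurable as needs evolve.
enriched = [
    (site, readings[device_id])
    for device_id, site in conn.execute("SELECT device_id, site FROM devices")
]
```

No single tool here covers the whole journey from data to insight; the value comes from how the pieces are composed, which is exactly the platform argument.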
So what’s the one takeaway for the analytics leader or CIO on the cusp of an IoT analytics initiative? Define a platform vision. Thinking this way might seem overwhelming at first, but remember that you can grow the stack to suit your needs: build it block by block, achieving the plan piecemeal. Many decisions and actions are reversible, and you can course-correct as you learn more. With a platform approach to analytics, you’ll see measurable results in no time.