The convergence of IoT and big data promises tremendous new business value and opportunities for enterprises across all major industries. Critical to unlocking the value of all this newly created IoT data are data science and IoT analytics.

As IoT big data scales exponentially, enterprises are looking to accelerated IoT analytics to handle the torrents of data and deliver zero-latency exploration of IoT data that has traditionally been stored and left in the dark.

The Convergence of IoT and Big Data

Within the past decade, IoT has grown from a niche market into a business necessity. The interconnectivity of conventional computers and mobile devices sparked interest in connecting previously non-internet-enabled physical devices and everyday objects, from automobiles to toaster ovens.

Emergent fields such as the Industrial Internet of Things (IIoT) and the Internet of Wearable Things (IoWT) demonstrate how IoT has become an intrinsic part of the economy, from manufacturing to consumption.

Concurrent with the explosive growth of IoT is big data. As enterprises wrangle with new technologies aimed at capturing the volume, velocity, variety, and veracity of their data, they are also discovering new value through IoT data analytics.

Business Intelligence (BI) and data visualization solutions continue to deliver new insights and new ways of solving old problems. As enterprises capture greater value from their ever-growing data warehouses and lakes, they recognize IoT as the next great frontier in big data analytics.

IoT and big data are now inseparable. The convergence of these two technologies compounds their business value and opens opportunities for new business use cases that drive innovation across every industry. New sensors, instruments, and an array of other connected devices now constantly stream invaluable IoT data about the world around us.

By 2020, an estimated 4.4 trillion GB of data will be generated per year, and the value of that data will only continue to grow. The challenge today is performing IoT data analytics and visualization at that scale.

Latency Kills IoT Data Analytics

With the exponential growth of data streams come new IoT data challenges: analyzing and visualizing that data at scale. Popular spreadsheet tools max out at around a million rows. Mainstream IoT analytics applications can usually handle greater scale, but their query times leave much to be desired.

Data scientists and analysts can wait five minutes, or even five hours, for a query to return. This is because the scale of data, thanks largely to the unprecedented growth of IoT, has dramatically outpaced the growth of CPU computing power.

This is critically important in data science and IoT, as data scientists and analysts become reluctant to run additional queries because of the wait. Their data exploration is disrupted, their train of thought interrupted by another trip to the coffee machine while yet another query slowly churns to life.

It is frustrating and totally antithetical to the analytical process. Hypotheses go untested. Data goes unexplored.

As a consequence of these wait times, data engineers have devised ways of working around the limits of CPU processing. They take averages of the data, or they sample small percentages of datasets and extrapolate results.

However, this approach is ill-suited to the world of IoT data, where great value lies in individual location and time events, events that would otherwise be washed away by averaging and sampling.
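
To see why, consider a minimal NumPy sketch (the readings, the spike, and the thresholds below are invented purely for illustration) of how averaging and sampling wash away exactly the kind of single event IoT analytics exists to find:

```python
import numpy as np

# One million sensor readings at a steady baseline, plus a single
# anomalous spike -- the kind of one-off event that matters in IoT.
readings = np.full(1_000_000, 20.0)   # baseline temperature in degrees C
readings[637_042] = 95.0              # one critical overheat event

# Pre-aggregation: the spike vanishes into the average.
print(readings.mean())                # ~20.000075, indistinguishable from baseline

# Sampling: a 1% random sample misses the event 99% of the time.
sample = np.random.choice(readings, size=10_000, replace=False)
print((sample > 90).any())            # almost always False
```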

As the constraints of CPUs become evident, a growing number of forward-thinking data engineers and data scientists are turning to graphics processing units, or GPUs, which deliver dramatic acceleration at the massive scale of IoT big data.


The Shift from CPU to GPU

Acceleration represents a fundamental shift in the realm of data science and analytics. Within this world, mainstream analytics programs still reign supreme, but not for much longer. These tools include the familiar BI and data visualization solutions, as well as analytics tools for Geographic Information Systems (GIS).

These are feature-rich tools, primarily designed to deliver self-service reporting dashboards, drill-down, and visualization capabilities to large numbers of employees.

Existing analytics tools typically rely on CPU-bound processing technology and require complicated, expensive system architectures and data pipelines to support them. Even then, these analytics tools are slow, especially for IoT big data analysis.

Data scientists and analysts have grown accustomed to lengthy query times, from five minutes to five hours or even longer.

As IoT big data scales exponentially, these query times grow even longer, and the CPU hardware footprint becomes prohibitively expensive and complex. These hindrances are why enterprises now look to accelerated analytics.

GPU acceleration can provide 1,000x the speed of conventional queries, at a fraction of the hardware footprint.

This is because GPUs are designed to rapidly render high-resolution images and video through parallel operations on multiple sets of data. GPUs now power GPU databases, which we cover in much greater detail on our Introduction to GPU Databases page.
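
As a rough illustration of that parallelism (this is not any particular GPU database engine, just a sketch assuming a CUDA-capable GPU with the CuPy library installed), the same aggregation a CPU grinds through serially can be dispatched across thousands of GPU cores at once:

```python
import numpy as np
import cupy as cp  # assumes a CUDA-capable GPU with CuPy installed

# 100 million simulated sensor readings.
cpu_values = np.random.random(100_000_000).astype(np.float32)

# CPU: NumPy reduces the array on a handful of cores.
cpu_mean = cpu_values.mean()

# GPU: the same reduction runs as a parallel kernel across thousands of cores.
gpu_values = cp.asarray(cpu_values)   # copy host -> device
gpu_mean = cp.mean(gpu_values)        # parallel reduction on the GPU
cp.cuda.Stream.null.synchronize()     # wait for the kernel to finish

print(float(cpu_mean), float(gpu_mean))
```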

Accelerated Analytics for VAST IoT Data

With the shift from CPU- to GPU-based analytics solutions, enterprises are unlocking new business use cases around data that was simply too large, or streaming too fast, to analyze.

Combined with the inherent spatiotemporal (location and time) component of IoT data, and the ability of GPUs to plot billions of points and render complex polygons, the business value of accelerated analytics for IoT use cases grows exponentially.

IoT business analytics use cases can be characterized by the acronym VAST: Volume & Velocity, Agility, and Spatiotemporal.

Volume & Velocity

IoT data has tremendous volume and velocity. Data now streams in from a rapidly growing number of IoT sensors, clickstreams, server logs, transactions, and the telematics data generated by moving objects such as mobile devices, cars, trucks, aircraft, satellites, and boats.

Often this data pours in at millions of records per second or more. Tables of IoT streaming data often range from tens of millions to tens of billions of rows.

Sometimes hundreds of billions.

Agility

IoT data is massive: it simply overwhelms traditional CPU architectures, forcing ever-expanding hardware footprints. To compensate for these limitations, engineers downsample, index, or pre-aggregate the data.

This is completely antithetical to the value proposition of IoT analytics use cases, which often require the agility to identify a single spatiotemporal event amidst billions of others, not the average of a thousand events or a sample of a billion.
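
Here is a minimal pandas sketch of the kind of needle-in-a-haystack question agility implies (the table layout, column names, and thresholds are invented for illustration): retrieve the one event itself, not a summary of it.

```python
import pandas as pd

# A toy telematics table: device, timestamp, position, and speed.
events = pd.DataFrame({
    "device_id": [17, 17, 42, 42],
    "ts": pd.to_datetime(["2019-06-01 08:00", "2019-06-01 08:01",
                          "2019-06-01 08:00", "2019-06-01 08:01"]),
    "lat": [37.77, 37.78, 37.70, 37.71],
    "lon": [-122.42, -122.41, -122.45, -122.44],
    "speed_kmh": [48.0, 131.0, 52.0, 55.0],
})

# Agility means asking for the single event: one overspeed record
# inside a bounding box and a one-hour window.
hit = events[
    (events["speed_kmh"] > 120)
    & events["lat"].between(37.75, 37.80)
    & events["lon"].between(-122.43, -122.40)
    & events["ts"].between("2019-06-01 08:00", "2019-06-01 09:00")
]
print(hit)  # exactly one row -- invisible in an average or a sparse sample
```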

Spatiotemporal Data

At least 80 percent of the data records created today contain spatiotemporal data, and for IoT data that percentage is even higher. Plotting these points is computationally intensive in CPU-based analytics tools, and rendering them is nearly impossible.

GPUs were designed to render graphically intensive video games, so they can plot and render millions of spatiotemporal data points in milliseconds.
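
As a sketch of the per-point math involved (again assuming a CUDA-capable GPU with CuPy installed; a real renderer inside a GPU database does far more), projecting ten million latitude/longitude points into Web Mercator coordinates takes a single parallel pass:

```python
import cupy as cp  # assumes a CUDA-capable GPU with CuPy installed

# Ten million random points on the globe (illustrative data),
# latitude clamped to +/-85 degrees to keep the projection finite.
n = 10_000_000
lat = cp.random.random(n) * 170.0 - 85.0   # degrees
lon = cp.random.random(n) * 360.0 - 180.0  # degrees

# Web Mercator projection, computed for every point in parallel:
# x = lon in radians, y = ln(tan(pi/4 + lat/2)).
x = cp.radians(lon)
y = cp.log(cp.tan(cp.pi / 4.0 + cp.radians(lat) / 2.0))
cp.cuda.Stream.null.synchronize()          # all ten million points projected
```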

About the Author

Cesar Agostino

Hardware and Software Design Leader at 4i Platform.
