The Internet of Cat Toys

One cat, a radio collar, and a night on the town – this little adventure turned into an entertaining article by Andy Greenberg in Wired magazine (8 August 2014) about the creative use of a feline investigator to find weak points in the security of neighborhood Wi-Fi networks. A sensor-toting cat is not where most of us imagined the Internet of Things (IoT) would go, but, while funny and surprising, this example captures a deeper truth about truly innovative technology: it transforms the future, but not always in the ways you expect.

So much is being said about IoT sensors deployed in so many different situations that you may be wondering: is the Internet of Things an amazing future or just hype? This question is an easy one to answer, because in this case the future is already here. IoT is a large-scale transformation in how we observe our world and how we make smart, quick, data-driven decisions, and it's already happening. We are in the early days of this growing phenomenon. Already, thanks to widespread deployment of sensors, things talk to servers and things talk to other things directly. There are many examples of sensor monitoring being used in new situations. Some are flashy and imaginative, such as unmanned robots exploring the ocean surface or methods to manufacture sensor-embedded fabric for "smart shirts" and more. Another twist is the new online service dweet, which lets users monitor raw data reported by machines and, for a fee, creates a private group with access to it.

As exciting as these examples are, it is not just novelty or imagination that determines success. Not surprisingly, in almost all cases it is mainly economic forces that drive which ideas actually make it into production. A prime example of economic advantage is the manufacture of "smart components". Not only do manufacturers employ increasing numbers of sensors to monitor factory equipment in order to control quality, but they are also building components capable of reporting back on their performance in the field. Parts used in giant energy-generating turbines or in jet engines carry a wide array of sensors, some of which report back to the manufacturer so that it can provide customers (utility companies, airlines) with performance profiles, including levels of wear. This information enables customers to fine-tune performance and schedule predictive maintenance more effectively. These "smart components" thus offer added value to the customer compared with mute parts, which in turn gives the manufacturer a competitive edge.

Sensors are already in place, and new sensor technologies are coming, but in some ways the biggest frontier is in discovering how to handle the volume of data they collect. Just measuring something doesn't change the way you do things if you cannot also collect, store, and access the data in reasonable ways for analysis. This problem has happened before. In 1978, an early NASA environmental monitoring satellite, Seasat, collected so much remote sensor data about ocean and atmospheric conditions during its short 106-day lifespan that it could not all be analyzed. Much of the information sat dormant for years until computing technology improved enough to deal with it. There's a lesson here: a similar challenge is happening today, as the exponential expansion of new types of sensors in the IoT world creates a flood of information at a level never seen before.

How can you listen effectively to the Internet of Things?

It is clear that something big is happening; we can see that in many ways. But what isn’t clear is what the details will be. We can see the fundamental forces driving the change, but we likely will be surprised by how exactly it all turns out, at least in some cases. We can, however, pay attention to the fundamentals and thus be prepared to take advantage of valuable innovations when they do come.

When you look beyond the imaginative ways people are thinking about using sensors, you see that it's the practical considerations of how to handle the deluge of data that can make or break these endeavors. Part of the fundamental preparation for an expanding IoT is a data pipeline from sensor to analysis. As the sensor measures a parameter, there's a need for a proximal collection step that sets up the channel to communicate the observation. Qualcomm, for example, is building communications technology to embed in sensor devices. Next comes the need to move the huge amounts of data to the location where central analysis will take place. To address this need, Cisco is developing communications technology to move data from source to storage and analysis.

Figure: Internet of Things data pipeline
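To make the first steps of this pipeline concrete, here is a minimal Python sketch of how a single sensor reading might be packaged as a time series data point before being handed to a transport layer. The metric name, tags, and read_sensor function are invented for illustration and are not taken from any particular product.

```python
import json
import time

def read_sensor():
    """Stand-in for a hardware read; returns a temperature in degrees Celsius."""
    return 21.7  # illustrative fixed value

def make_data_point(metric, value, tags):
    """Package one reading as a time series data point: metric, timestamp, value, tags."""
    return {
        "metric": metric,
        "timestamp": int(time.time()),  # Unix epoch seconds
        "value": value,
        "tags": tags,
    }

# Proximal collection step: read locally, serialize, and hand off to whatever
# transport (MQTT, HTTP, a message queue) carries the data toward central storage.
point = make_data_point("turbine.bearing.temperature", read_sensor(),
                        {"site": "plant-7", "unit": "3"})
print(json.dumps(point))
```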

Up to now, the central analysis function has had a bit less attention, but that's changing. A lot of information from the IoT is time series data, so it's particularly important to have excellent new tools for building reliable, scalable, and high-performance time series databases. Some companies offer to do so as a service, either specialized to particular areas, such as the environmental monitoring data platform from the newly renamed Planet OS, or as more broadly applicable services for storage and analysis of time series data. Others meet the challenge with NoSQL, Hadoop-based time series databases built in-house using a combination of enterprise and open source tools such as Apache HBase, OpenTSDB, Grafana, InfluxDB, and the enterprise NoSQL database from MapR (MapR-DB). MapR-DB is integrated into the regular file system, making it extremely convenient to manage.
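For a sense of what feeding such a time series database looks like in practice, here is a minimal sketch that writes one data point to OpenTSDB over its HTTP API. It assumes an OpenTSDB instance listening on localhost; the metric name and tags are illustrative, not drawn from any real deployment.

```python
import requests  # third-party HTTP client: pip install requests

# Assumed local OpenTSDB instance; host, port, metric, and tags are illustrative.
OPENTSDB_URL = "http://localhost:4242/api/put"

data_point = {
    "metric": "sensor.temperature",
    "timestamp": 1407456000,               # Unix epoch seconds
    "value": 21.7,
    "tags": {"site": "plant-7", "unit": "3"},
}

resp = requests.post(OPENTSDB_URL, json=data_point, timeout=5)
resp.raise_for_status()  # OpenTSDB replies 204 No Content on success
```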

As the IoT grows, there's a growing need for storage and access technologies that can keep abreast of the demand to handle high-velocity machine data, whether from sensors or from more traditional feedback from servers. Recently, a test of ingestion rates showed that great performance is possible with the right tools and design. This test used open source extensions developed by MapR to modify OpenTSDB, combined with MapR-DB running on fast machines. The results? Ingestion rates as high as 100 million data points per second were achieved.
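That test relied on a purpose-built design, but even at more ordinary scales one common way to raise ingestion rates is to batch many data points into each write rather than sending them one at a time. The sketch below illustrates that batching pattern against OpenTSDB's HTTP API, which accepts a JSON array of points; the host, batch size, and simulated sensor feed are assumptions made for the example, not the design used in the MapR benchmark.

```python
import time
import requests

OPENTSDB_URL = "http://localhost:4242/api/put"  # assumed local test instance
BATCH_SIZE = 50                                 # illustrative batch size

def simulated_readings(n):
    """Stand-in for a real sensor feed: yields (metric, value, tags) tuples."""
    for i in range(n):
        yield ("sensor.temperature", 20.0 + (i % 10) * 0.1, {"unit": str(i % 4)})

def send_batch(points):
    """Write many data points in one request; /api/put also accepts a JSON array."""
    resp = requests.post(OPENTSDB_URL, json=points, timeout=10)
    resp.raise_for_status()

batch = []
for metric, value, tags in simulated_readings(200):
    batch.append({"metric": metric,
                  "timestamp": int(time.time()),
                  "value": value,
                  "tags": tags})
    if len(batch) >= BATCH_SIZE:
        send_batch(batch)   # amortize request overhead across many points
        batch = []
if batch:
    send_batch(batch)       # flush the remainder
```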

With these new technologies on hand to handle the fundamentals of machine data flow at scale, including the ability to build effective, high-performance time series databases, the possibilities for the IoT become wide open: smart shirts, smart cars, and now, it seems, smart cat collars. It's exciting to watch as a new frontier opens up, in part because we don't know exactly what's in store. We do know there will be hugely valuable advances in medicine, in better handling of natural resources, and in transforming manufacturing both as a process and as a reporting conduit for the performance of the products being made. Perhaps the most exciting IoT application will be one that no one at present has even considered – that's the unexpected nature of revolutionary shifts in technology.

And for that matter, who knows what your cat will do next?

Get the open source extensions to OpenTSDB from the MapR App Gallery, or download them from GitHub.

Read more about the 100 million data points per second design.

Follow the author on Twitter: @Ellen_Friedman

Streaming Data Architecture: New Designs Using Apache Kafka and MapR Streams

Download for free.