Trust and the Internet of Things

One of the main things we are focusing on at Dev Jam is scalability. The goal of OpenNMS has always been to become the de facto management platform for everyone, and as more and more things become connected to the Internet, scalability will be a huge issue. It’s one thing to monitor a drink machine and quite a different thing to monitor millions of them.

The main performance bottleneck in OpenNMS has always been the storage of time series data. While much of it can be addressed via hardware, we realized we needed a write-optimized system that could perform at scale. The result of that work, led by Eric Evans, is NewTS, the New Time Series database built on Cassandra.

As a left-leaning libertarian, I’m really concerned about the privacy issues of the Internet of Things. One of the reasons I work on OpenNMS is to ensure that the best platform for managing this data is also one that can be privately owned. It will be the end user’s choice on how that data is stored and shared.

Yesterday Apple announced a number of new initiatives at their WWDC keynote. These included HomeKit, a home automation platform, and HealthKit, a platform for gathering personal biometric data. Conspicuously absent from all the new shiny was any concern about privacy. An example given was an integration between HomeKit and Siri, where you could tell Siri “I’m going to bed” and certain actions would happen. What people tend to forget is that almost all of Siri’s processing is done on Apple servers, so if you tell Siri you are going to bed, you are also telling Apple and who knows who else. The potential for 1984-like abuse is there, and it will be up to Apple users to either trust Apple or do without.

But does this have to be the case? Do users have to give up privacy or blindly trust in third parties in order to access useful technology?

I don’t think so.

The technology is there, and I believe eventually society will demand it.

One of the oldest examples of such technology is public key encryption. It’s a beautiful system, and while the encryption aspect is important, so is the ability to digitally “sign” things. What’s brilliant about it is that no personal information has to be given up: if one trusts the key, one can trust the signature even without knowing who signed it, and third parties can verify the signature as well.
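The sign-then-verify idea can be shown with a toy RSA example (tiny textbook primes, for illustration only; real signatures use vetted libraries and keys thousands of bits long). The signer transforms a message digest with the private exponent; anyone holding the public exponent can check the result without learning anything about the signer beyond the key itself:

```python
import hashlib

# Toy RSA signature sketch -- educational only, never use key sizes like this.
# Classic textbook parameters: p = 61, q = 53.
n = 61 * 53    # public modulus (3233)
e = 17         # public exponent
d = 2753       # private exponent: e * d == 1 (mod lcm(60, 52))

def _digest(message: bytes) -> int:
    """Hash the message and reduce it into the RSA modulus."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    """Apply the private exponent to the digest -- only the key holder can."""
    return pow(_digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone with the public key (n, e) can check the signature."""
    return pow(signature, e, n) == _digest(message)

msg = b"I'm going to bed"
sig = sign(msg)
print(verify(msg, sig))          # the genuine message verifies
print(verify(b"tampered", sig))  # a modified message does not
```

Note that verification needs only the public values `(n, e)`: trust in the key is enough, no identity required.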

I am eagerly awaiting the release of the Angel open source health sensor. I have a strong interest in tracking my personal health data, but I also don’t want to share it with a third party unless I choose to do so. None of the popular sensors, to my knowledge, let you keep that data private while still getting any use out of it.

So I’m constantly on the lookout for companies and products that “get it”, that understand my desire to share only the information I choose. I want to be marketed to; show me cool things I’m interested in, but on my terms.

One company I recently came across is called Personagraph. On the surface it appears to be another analytics company, but if you dig deeper you can find that they have a strong interest in personal privacy. I learned about them on one of my recent trips to Silicon Valley, but I had to dig to find references that reflect what I was told.

Of their three main products, I’m really drawn to PG Protect. The idea is as follows:

Let’s say that I have an application that collects several hundred metrics about me. For example, purely as a thought experiment, assume that I like to watch The Big Bang Theory in the nude while eating popsicles. I might not want to share the nudity aspect with anyone, but I am okay with letting people know I like the show and that I like popsicles. In a traditional marketing application, I would need to reveal all of these things to a third party, which would in turn sell them to interested companies.

But what if, instead of individual metrics, some sort of aggregate score were created from my likes and interests, and only that anonymous score were presented to the third party? They might be looking for people who score high on a number of metrics, and they could be matched with me without knowing exactly which metrics I trigger. I could then be presented with offers and ads, and the choice of whether to engage would be up to me. Speaking of “engage”, a product like Personagraph’s PG Engage could be used by the company doing the marketing to see how well their campaign was doing, without having to know any personal information about me.
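A minimal sketch of that idea might look like the following. To be clear, the metric names, weights, and threshold here are all my invention for illustration; I have no idea how PG Protect actually computes its scores. The point is only that the raw metrics stay with the user, and just a single aggregate number crosses the trust boundary:

```python
# Hypothetical aggregate-score matching -- an illustration of the idea,
# not Personagraph's actual algorithm.

# These metrics live on the user's device and are never shared directly.
my_metrics = {
    "watches_big_bang_theory": 1.0,
    "likes_popsicles": 1.0,
    "watches_tv_nude": 1.0,  # the private detail that never leaves the device
}

# An advertiser publishes the weights for its campaign.
campaign_weights = {
    "watches_big_bang_theory": 0.6,
    "likes_popsicles": 0.4,
    "prefers_cable_tv": 0.5,
}

def aggregate_score(metrics: dict, weights: dict) -> float:
    """Weighted sum over the campaign's metrics; computed user-side,
    so individual metric values are never revealed."""
    return sum(w * metrics.get(k, 0.0) for k, w in weights.items())

score = aggregate_score(my_metrics, campaign_weights)
matched = score >= 0.8  # the advertiser sees only this match decision
print(matched)
```

The advertiser learns that I crossed its threshold, but not which metrics got me there; the metric it never asked about (and the ones I’d rather keep private) never leave my side of the computation.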

Pretty cool, huh? What I like about their products is that they provide trust without asking you to take it on faith.

Unfortunately, it’s hard to get the privacy angle from their website. I did find a little YouTube ad that touches on it:

but the real meat can be found in this white paper. If you care about privacy, I encourage you to check it out as well as to make sure the companies providing the products and services you use are aware that privacy is important to you.