Gartner Smackdown on Open Core

I’ve been saying for years now that open source is more than just a marketing term, and I’ve probably bored most of my three readers by harping on it.

While it may not seem to matter to most people, it is dreadfully important to my business that open source not become synonymous with fauxpen source (or, to use the more politically correct term, open core). It is hard enough to educate the market on the value of open source software (if it is free, how can it be good?) without having the waters muddied by the confusion brought on by the hybrid open source/commercial software model.

But then again, I am not a powerful voice in open source (at least based on my Twitter followers) so why should people listen to me?

However, as far as analysts go, few are held in as high esteem as Gartner. Today, Michael Coté sent me a link to a Gartner blog post by Brian Prentice called “Open-Core: The Emperor’s New Clothes”.


There is a gem of a quote in every paragraph, so just go read the article, but here are some of my favorites:

You’ll soon realize that the fabric making up the garb of their stated innovation is a fabrication. They’ll then be exposed for exactly who they are – a good old fashion software vendor. Just like every other one you’ve come to know.

I have asked, on numerous occasions, for someone from the open core front to explain to me the difference between their model and the traditional software model of having an open API to add functionality while keeping the best parts of the application closed. HP was able to create a huge environment around OpenView this way, without claiming to be open source (even though they use “open” in their name).

Furthermore I have personally been told by one such open-core provider that the reason a new feature, which was clearly of value to all users, was only being provided in the paid-for, proprietary version was that they “had investors they needed to satisfy.”

This sounds familiar. (grin)

Open source projects such as OpenNMS can provide a powerful, scalable and flexible solution. While I never want to lead with it when talking to potential clients, it also has a large cost benefit. The problem with the open core model is what Gartner calls the “super-size trigger”.

Besides, what you already know is that this type of functional separation creates what Gartner refers to as a “super-size trigger.” The minute you require a feature only available in the full version then the entirety of your commitment needs to be scaled up and re-costed to the full-cost offering.

Since open source is not a marketing term, these hybrid vendors are only trying to ride the hype. Brian addresses that, too:

This is where the hype starts to creep in. The idea that a functionally complete, proprietary solution is somehow unique because it was built atop an open source base fails to recognize the fact that many proprietary solutions are being built using open source components

But what about the communities around these open core projects? Aren’t they different than a commercial software product?

Even the very definition of “community” is being adapted to suit the open core narrative. What has largely interested the corporate IT world is the concept of a community as a collection of code contributors working outside a normal project/company structure. But now open core providers are extending the term community to include users and even resellers. That, of course, is what we’ve all been calling a software ecosystem for the last twenty years. Same old, same old – just co-opted terminology used to describe it.

I have no real problem with open core as long as they don’t use the term open source. It is simply another commercial software business model. But when they refer to it as open source it causes confusion in the marketplace for my business, and that’s what bothers me. From the article:

Be clear, there’s nothing nefarious going on with open core. It’s just that there’s just nothing particularly new or innovative going on either.

I believe that all commercial software companies will start focusing on creating a much larger “software ecosystem” which will include access to some source code, new APIs, social networks, etc. But it will not represent an open source business.

It’s nice to see Gartner putting it out there in such a straightforward and blunt manner.

OpenNMS at 10: Still Open … Still Free

I’m sitting in my office, which once housed all three of the OpenNMS Group founders, drinking some Copperline Amber while listening to the “tap tap tap” of the drums as the guys in the next room play Rock Band on our HD projector.

The reason? Ten years ago the OpenNMS Project was registered as project 4141 on Sourceforge.

Not many open source projects can claim 10 years, and I am both delighted and humbled that our community has not only allowed OpenNMS to survive but to thrive.

To celebrate, we threw a little party. Here’s a picture:

Back row:

Jay Aras (MA, USA), Donald Desloge (NC, USA), Brad Miesner (NC, USA), Mike Davidson (NC, USA), Me (NC, USA), Brian Weaver (NC, USA), David Hustace (NC, USA), Matt Brozowski (NC, USA), Jeff Gehlbach (GA, USA)


Alex Finger (France), Craig Miskell (New Zealand), Bill Ayres (OR, USA)

iMacs: Matt Raykowski and Mike Huot (MN, USA), Johan Edstrom (CO, USA), Antonio Russo (Italy), Alejandro Galue (Venezuela), Klaus Thielking-Riechert (Germany), DJ Gregor (OH, USA)

Front Row:

Seth Leger (NC, USA), Ben Reed (NC, USA) and Larry Karnowski (NC, USA)

It is truly a nationwide and worldwide effort (it was hard to pick a time for the picture, as it was Tuesday evening for the Europeans and Wednesday morning for Craig).

As I have often mentioned, I didn’t start OpenNMS, so it was nice to have Brian Weaver, the original architect of the product, come out for the day. He told me that the project started on 1 July 1999, so it is actually several months older, but as we have a firm date on Sourceforge I figured we’d stick with that.

Also in attendance were Mike Davidson and Larry Karnowski, two of the original developers, as well as Ben Reed and Seth Leger (though the latter two now work for the company).

Here’s a picture of the gang from May of 2002, when OpenNMS version 1.0 was released. See if you can pick out the people who are still around.

In addition to beer and pizza, David’s wife made an amazing cake:

and DJ Gregor sent us, overnight, some Jeni’s Splendid Ice Cream.

Yum. Here’s to the next 10.

Fun With Billing

I bought my iPhone back in November, which was the last month I would have a Sprint bill.

So I thought.

Since December, every month like clockwork I get a multipage statement in the mail letting me know I have a credit for $1.24.

If we assume it costs, conservatively, $0.50 to send that bill, they’ve already spent almost twice what they owe me to let me know about the credit.

In other billing news, our newest insurance provider, Guardian, has informed my insurance broker that we haven’t paid our last two bills. As someone who is incredibly anal-retentive about paying bills I can explain: we haven’t gotten any.

And, in order to sign up to pay your bill online, you have to have information that is only sent with the paper bill, which I haven’t received.


Is it any wonder that the adoption of paperless billing has stalled? If we can’t trust our suppliers to deal with paper bills correctly, why would we trust them to do it electronically? With the huge penalties for missing, say, a credit card payment by even one day, entering into an agreement to get only an electronic reminder can be a bit scary.

I wish they would adopt the process my eye doctor uses to verify appointments. Two days before the appointment they send out an e-mail. In the e-mail is a link to confirm or reschedule the appointment. If you don’t click on it, they call you the next day.

If Citibank or Chase would implement a system where they would send a paperless statement (or let you know that it is ready) and if you didn’t verify receipt within a certain period of time they’d send out the paper statement, I’d sign up in a heartbeat. Technologically it wouldn’t be hard to implement.
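The fallback scheme I have in mind is simple enough to sketch. Here is a minimal Python illustration of the logic; the class, field, and function names are my own invention for this example, not any real billing system’s API, and the one-week grace period is an assumption:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sketch: e-mail a statement notice, and only mail a paper
# copy if the customer never confirms receipt within a grace period.

GRACE_DAYS = 7  # assumed confirmation window before falling back to paper


@dataclass
class Statement:
    customer: str
    issued: date
    confirmed: bool = False


def confirm(stmt: Statement) -> None:
    """The customer clicked the confirmation link in the e-mail."""
    stmt.confirmed = True


def needs_paper_copy(stmt: Statement, today: date) -> bool:
    """Mail a paper statement only once the grace period has lapsed
    without a confirmation."""
    return (not stmt.confirmed
            and today - stmt.issued > timedelta(days=GRACE_DAYS))


s = Statement("Tarus", issued=date(2010, 4, 1))
print(needs_paper_copy(s, date(2010, 4, 5)))   # still inside the window
print(needs_paper_copy(s, date(2010, 4, 10)))  # window lapsed, send paper
confirm(s)
print(needs_paper_copy(s, date(2010, 4, 10)))  # confirmed, no paper needed
```

The point is that the paper statement becomes the exception rather than the rule, while nobody silently loses a bill.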

Can you tell I’m a little overwhelmed with paperwork lately? As OpenNMS has grown I’ve had to spend more and more time on administrative tasks and less time playing with software. I’m not sure I like it.

At least I got to help a client figure out a notification issue that was giving him trouble yesterday, and a lot of the paperwork comes from new business, which is always great, so I guess I shouldn’t complain.

OpenNMS in the Cloud

One of the things I hate is the buzzword du jour, be it virtualization, “devops” or “the cloud”. It’s not that there isn’t some nugget of truth in all of the press surrounding such things, but one of the reasons I got into open source in the first place was its focus on results and not fluff.

With a commercial software product it is very difficult to determine if it is the right solution to a particular problem without buying it. With open source software, there is no licensing cost and thus it is possible to easily try it out before making a commitment to use it. Thus the focus is on usefulness and not a flyer saying “we’re the best”.

This isn’t to say that the open source world is completely free of fluff and posturing. With the prevalence of venture-backed open core companies, their ultimate goal is not the proliferation of robust open source code but to be purchased for a large multiplier. The best way for them to create perceived value is to latch on to the latest buzzword, as if to say “hey – you need a piece of this – better hurry up and buy us,” and it is a strategy that has worked well in a number of cases. I just don’t like calling it open source.

So I have been pretty quiet on the use of OpenNMS in “the cloud”. This isn’t to say that we don’t manage cloud resources, but the management challenges of cloud-based services aren’t much different than “normal” ones. The power and flexibility of OpenNMS make it as useful in the cloud as elsewhere.

In fact, one of the major players in cloud computing, Rackspace, uses OpenNMS to manage its Cloud Files system.

We are happy to announce that we are working with another major company, BT (British Telecom Group), on developing a trusted cloud management platform called the Cloud Service Broker. In the words of John Gillam, Programme Director at BT Global Services:

The Cloud Service Broker TM Forum Catalyst provides an excellent opportunity to address the barriers to cloud adoption for enterprise customers. Whilst enterprises wish to lever value from the cloud, they are apprehensive over losing control, citing areas of concern such as IT Governance, application performance, runaway costs, inadequate security and technology lock-in. The CSB addresses this by matching cloud services to each enterprise’s needs, enforcing the right policies, and then showing how this can be backed up by an ongoing service level agreement. We believe developments of this nature will be of primary importance in future cloud services.

We will be presenting our work at the TMForum’s Management World conference in Nice, France, this May. In addition to BT’s offering, we will be demonstrating integration with products from Comptel, Square Hoop and Infonova in order to deliver a complete cloud services platform.

Open Data and Open Source

We hit a snafu this week that kind of brings into sharp focus what Tim O’Reilly was talking about at the Open Source Business Conference last week.

We are toward the end of a project for Papa Johns Pizza where we will be using OpenNMS to monitor all of their stores. This will require our remote monitor to be placed in 2500+ locations, and they wanted a way to display the status of those monitors on a map.

Without spending too much time thinking about it, we just grabbed the Google Maps API and ran with it.

Unfortunately, we didn’t realize that using the Google Maps API has an associated cost. It’s based on page views and can get very expensive. Part of it has to do with the fact that Google likes to make money (of course), but I was also told that there are licensing issues involved as well. No key is required if the server is publicly accessible, but that tends not to be the case for OpenNMS servers.

In Tim’s talk he demonstrated that while the struggle used to be over closed vs. open applications, with the advent of companies like Google creating huge, closed databases, the new struggle will be closed vs. open data. It was just funny to run smack up against that less than a week later.

Papa Johns uses MapQuest for their web site and they have a license, so in the near term we’ll simply add MapQuest as an option along with Google Maps. I want to be able to offer a free version as well.

There are some free alternatives, such as OpenStreetMap. As we have always viewed OpenNMS as a network management application platform, we want to offer as much choice as possible. We’re looking at using Mapstraction to enable support for multiple map backends (including Google Maps and MapQuest) and hope to have that done for version 1.8.
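To illustrate the kind of abstraction a library like Mapstraction provides, here is a rough Python sketch of a pluggable map-backend pattern: callers speak one small interface, and each provider plugs in behind it. The class names and URL formats below are illustrative assumptions, not actual OpenNMS or Mapstraction code:

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of a pluggable map-backend interface. Swapping
# providers becomes a one-line configuration change instead of a rewrite.


class MapProvider(ABC):
    @abstractmethod
    def marker_url(self, lat: float, lon: float) -> str:
        """Return a provider-specific URL showing a marker at a point."""


class OpenStreetMapProvider(MapProvider):
    def marker_url(self, lat, lon):
        return f"https://www.openstreetmap.org/?mlat={lat}&mlon={lon}"


class MapQuestProvider(MapProvider):
    def marker_url(self, lat, lon):
        # Illustrative URL shape only, not MapQuest's real parameters.
        return f"https://www.mapquest.com/?center={lat},{lon}"


PROVIDERS = {
    "openstreetmap": OpenStreetMapProvider,
    "mapquest": MapQuestProvider,
}


def get_provider(name: str) -> MapProvider:
    # One config setting picks the backend; the rest of the application
    # only ever talks to the MapProvider interface.
    return PROVIDERS[name]()


print(get_provider("openstreetmap").marker_url(35.9, -79.0))
```

In JavaScript, Mapstraction plays exactly this role in the browser, which is what lets us offer Google Maps, MapQuest, and a free option side by side.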