Nagios Spreads FUD About OpenNMS

It was brought to my attention that Nagios Enterprises has decided to go after OpenNMS by publishing a document called “How Nagios Compares to OpenNMS“. I am always flattered when companies feel the need to compare themselves to our project, especially when Nagios is considerably better known than OpenNMS, at least according to a quick look at Google hits (5.3 million to 453 thousand just now). I assume they feel that positioning themselves against our product is useful.

And I guess it could be, if they weren’t so wrong.

I would not pretend to make such a list from the OpenNMS viewpoint, since I don’t know all that much about Nagios, and what I do know comes from replacing it at a number of our customers’ sites. But let me correct some misunderstandings presented by this document.

Almost every missing check mark on the OpenNMS side is wrong. OpenNMS can’t monitor “Web Transactions”? What about the Page Sequence Monitor, which we use to ensure the performance of complex production websites? It doesn’t have “Google Maps Integration”? Well, not just Google Maps, but Mapquest and Open Street Map as well. It can’t do Active Directory Authentication? Please, I set it up all the time.

I really chuckled when I saw that the product can’t do Data Export, Advanced or Scheduled reporting, since I just spent a week doing just that at a large bank based in Chicago (one that caters to institutions and wealthy individuals).

With the JasperReports integration, OpenNMS can generate amazing reports, and since we’ve extended it to mine JRobin/RRDtool data directly, we were able to report on a huge virtual machine server farm at the bank, covering things such as CPU utilization, process memory utilization and load average, in a format that could be exported to Excel without the need to even glance at a graph. The whole thing was fully automated, including report generation, distribution and even provisioning the devices to be included in it.

The kicker is that OpenNMS can run any Nagios check script, even over NRPE (although for reliability I strongly recommend the Net-SNMP extend function over that protocol), so I fail to understand where OpenNMS falls short in the “Custom Plugins” department.
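As a sketch of what I mean by the extend approach (the script name and path here are hypothetical examples, not anything from the Nagios document):

```conf
# /etc/snmp/snmpd.conf (Net-SNMP)
# Expose an existing Nagios-style check script through the SNMP agent.
# "check_raid" and its path are made-up examples.
extend check_raid /usr/local/nagios/libexec/check_raid.sh
```

The script’s output and exit code then show up under NET-SNMP-EXTEND-MIB (nsExtendOutput1Line and nsExtendResult), where any SNMP-capable poller, OpenNMS included, can test them without needing a separate agent protocol.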

They even get our pricing wrong, even though we publish it online. The package they refer to as “1 year of standard support and one week of consulting” maps to our Greenlight Project, which is $23,000, not $30,000 (that is the price of the Greenlight Plus Project, which includes two weeks of on-site services).

But if you are choosing to use Nagios XI based on price, I think you should go with it. OpenNMS is designed to be a network management application platform, and as such has a much wider scope than Nagios, which, let’s face it, is at its heart a script management interface. I’m not sure how one provisions devices in Nagios, but since that was left off of their chart I must assume it isn’t a key feature, whereas it plays a huge role when you are trying to manage a network of any size, such as the one at Towerstream. Considering the scale at which OpenNMS is useful, our prices are a bargain when truly compared to competitors’ products.

Two years ago at the Netways Monitoring Conference I saw a presentation from Audi where they implemented Nagios. It took them over a year. We have done the same at similar sites in less than three weeks. If you have more than one thousand devices, you are going to be very unhappy with the performance of Nagios, whereas OpenNMS has a track record of monitoring tens of thousands of devices on a single server for numerous companies over several years.

And finally, every bit of OpenNMS code is published under an open source license. Nagios XI is not.

On Thursday, The OpenNMS Group turns seven. We don’t have a million customers, but considering how awesome our customers are, I know that I couldn’t find a million of their caliber.

And never once have we resorted to this kind of FUD to promote our products.

Ready Player One

Note: This is a somewhat long review of the book Ready Player One by Ernest Cline. The short version is that if you are over 40 and you self-identify as a geek, you’ll love this book. Even if you are not over 40, you should check it out, but it will really resonate if you were a teenager during the 1980s. The following is as spoiler-free as I can make it, but if you are a purist, you might want to skip it.

I was first introduced to Ready Player One on the blog of Patrick Rothfuss. Usually, that is enough to make me at least check it out, but it was also item number three in Entertainment Weekly’s top ten list, and that surprised me since EW isn’t exactly known for its coverage of science fiction/fantasy.

Now before you start teasing me about reading EW (or even referring to it as “EW”) I don’t do it to keep up with the latest antics of Britney Spears (she’s such a little scamp, isn’t she?). In my job it is sometimes a good idea to have some handle on popular culture, so now I know about that nice young man named Ted Situation who lives in New Jersey with his sister Snookie and they do that charity work on the coast. Plus, I keep it in the bathroom, and I find that the articles are the perfect length for the amount of time I spend there. Assuming I’m eating right, I can get through an issue in about a week, which is how often it is published.

Anyway, the basic plot is as follows: it is the year 2045, and the world is not a pleasant place. World economies have collapsed, the environment is a wreck and energy is scarce. Most people escape the drudgery of their lives in an online simulation called the OASIS (think Second Life crossed with World of Warcraft with a dash of Gibson’s Cyberspace). The creator of the OASIS is videogame designer James Halliday, who quite naturally is also one of the world’s richest men.

When the story opens, Halliday has just died. Having no heirs, he has placed his entire fortune, including a majority stake in the company that owns the OASIS, into an Easter Egg (extra, undocumented code put into a game or application) hidden in the simulation itself. The first person to find it gets it. The story follows one such egg hunter (or “gunter”) as he searches for clues, all of which are based on things from the 1980s.

With that premise, I expected a nice, nostalgic little stroll down memory lane.

I did not expect nostalgia to punch me in the gut.

The book brought up memories of things I haven’t thought about in decades. For example, Halliday’s first video game console was an Atari 2600. My first video game console was an Atari 2600 – which I still have, by the way (and in the original box). At one point in the novel the plot involves Dungeons and Dragons, specifically a module called Tomb of Horrors. I can remember playing Tomb of Horrors. I didn’t remember it before reading the book, but as the scene was described I was thinking to myself “isn’t that the module with the sphere of annihilation at the end of the first corridor?” and sure enough, there it is, in the next paragraph.

I can also remember coming home from school and turning the antenna toward Charlotte to bring in this UHF station that carried Japanese shows (yes, kiddies, back in the day television came in over the air and not in on a little wire). One I remember involved a giant gold robot/rocketship named Goldar and his wife, a silver robot/rocketship named Silvar. Apparently that was The Space Giants. There was also an animated show involving a World War II era battleship flying through space. That, apparently, was Space Battleship Yamato.

There were also copious references to 80s music and movies, all of which really resonated with me. At one point the video game Tempest is referenced, and I was once part owner of a Tempest machine when I was at Harvey Mudd.

Cline even uses the term “open source” on a number of occasions. The bad guy in the novel is the IOI corporation, a services provider that has made a lot of money in the OASIS. From the book:

Like most gunters, I was horrified at the thought of IOI taking control of the OASIS. The company’s PR machine had made its intentions crystal clear. IOI believed that Halliday never properly monetized his creation, and they wanted to remedy that. They would start charging a monthly fee for access to the simulation. They would plaster advertisements on every visible surface. User anonymity and free speech would become things of the past. The moment IOI took it over, the OASIS would cease to be the open-source virtual utopia I’d grown up in.

Now the OASIS is in no way an open source product or platform. It just isn’t, but I so much prefer someone misinterpreting the term to mean “freedom” instead of “well, all I have to do is just expose the code”. The heroes in the book do embody a lot of what is sometimes called The Open Source Way in their behavior, goals and interactions with others.

Cline claims that the first ever Easter Egg can be found in the game Adventure for the Atari 2600. He states that back then, game designers were never recognized or given credit for their creations. This changed when Warren Robinett hid his name in Adventure.

There is a secret room in the game. In order to get to it, a number of things must happen. First, you have to retrieve a tiny, one-pixel object hidden in a maze. Second, you have to bring a number of objects (I think it is three) into the room next to the hidden room. When you do this, the objects will start to flash. The flashing has nothing to do with the Easter Egg; it is an artifact of the console’s processor being so slow that it couldn’t refresh more than two objects at a time. It is the same reason that the aliens in Space Invaders sped up as you killed them: with fewer aliens left to draw, the processor could move them faster.

Once the barrier on the side of the room is flashing, and you have the “grey dot”, you can pass into a room that looks like this:

That was taken from my Atari Flashback machine – I didn’t want to have to dig out the old CRT television to hook up the original one I have.

Since so much of my enjoyment came from the fact that I lived through this time period, I am not sure how younger people will find the book. At one point in time I thought I’d figured out a plot point that would have really disappointed me (think deus ex machina) but I was wrong. I think the story stands enough on its own that geeks of all ages will enjoy it.

Ohio LinuxFest 2011

Just a reminder that I will be speaking (along with a lot of people much more interesting than me) at the Ohio LinuxFest the second weekend in September. On a side note, September 10th will mark 10 years since I started working on OpenNMS.

In any case, the organizers are offering a prize to the 1000th person to register for the conference. The Enthusiast level is free, so if you can be in Columbus, Ohio, you should check it out.

Lovin' Me Some SOGo (#noapple)

I really want to thank “jm k” for sending me a note a while back on the SOGo project.

One of my big complaints about Android has always been that one must rely too much on hosting your data at Google to get the most benefit out of it. I have a nonnegotiable requirement to be able to synchronize my contacts and calendars across devices, and for the moment Apple doesn’t force me to use iCloud. However, I want to move away from iOS, so some sort of sync solution is required.

Any solution must be multi-platform. Most of the guys in the office have iPhones and MacBooks. Jeff has an Android phone and runs Fedora on his laptop. I run Ubuntu on my work desktop and, for the moment, OS X on my MacBook Air and the desktop at home.

Enter SOGo. SOGo is an open source “groupware” (how I hate that term) solution that enables one to manage calendars and contacts through a webUI, as well as desktop and mobile devices. The webUI also includes an IMAP connector that lets you access an IMAP server (à la SquirrelMail), although one of the gotchas that hit me was that I couldn’t send mail unless the “To:” address was in my contacts.

For those that think open source can’t be beautiful, the webUI is very clean and attractive. It’s also all AJAX-y so you can manage your information as if you were using a native app (i.e. right click on a contact to bring up a menu that lets you update, delete, etc.).

But the real power lies in its sync capabilities. It implements both CalDAV and CardDAV protocols, which are becoming more widespread, and it is now possible for me to sync up most of my worlds.

Getting started isn’t super easy, however. Jeff did most of the work getting it installed on one of our Debian Squeeze servers (they supply packages) and while it is easy to get the software on the machine, getting it configured is another matter. It is pretty important to use LDAP for user management, and since we don’t have tons of LDAP experience there was a learning curve.

Being the boss has its benefits, so I pretty much sat back and complained a lot. However, I was able to help in getting the Apple stuff to sync, and especially in the case of the OS X Address Book the procedure borders on ritualistic.

In the hope that someone else will find this useful, here’s how I got the Address Book to sync.

Launch the Address Book, go to Preferences -> Accounts and add a new CardDAV account. Put in your server name, username and password and hit “Next”. This will cause the application to verify its connection to SOGo.

In our case, it failed. My belief is that it is due to the fact that the path to the SOGo DAV share doesn’t start at root.

After a lot of trial and error, I found a solution. After you create the account, look in the

~/Library/Application Support/AddressBook/Sources

directory and you should see a subdirectory for the account profile, named with a long unique identifier.

Descend into that directory and you’ll see a file called “Configuration.plist”. You’ll have to edit that file and make three changes:

Make sure “haveWriteAccess” is set to 1 (true), “isSharedABAccount” is set to 0 (false), and the “servername” string is the fully qualified URL of your user’s DAV share on the SOGo server (the port must be stated explicitly, even though one would assume an “https” URL would default to it).
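For reference, here is a sketch of what those entries look like inside Configuration.plist; the host name, port and user path are hypothetical examples, and the full file contains other keys that should be left alone:

```xml
<!-- Excerpt from Configuration.plist showing only the three keys discussed.
     sogo.example.com, the port and "username" are made-up placeholders. -->
<key>haveWriteAccess</key>
<integer>1</integer>
<key>isSharedABAccount</key>
<integer>0</integer>
<key>servername</key>
<string>https://sogo.example.com:443/SOGo/dav/username/</string>
```

You will likely need to restart the Address Book after saving the file so it re-reads the configuration.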

And please use SSL for the connection – this is a lot of personal data you’ll be putting up there.

Anyway, hats off to the SOGo team as well as to Jeff for getting this running. Check out their demo if you are interested (username and password both “sogo1”). Now all that we have left is to set up a Firefox sync server for bookmarks and passwords and we should have synchronization covered.

Google and Motorola

I was happy this week to read about Google’s acquisition of Motorola’s mobile phone business. As a long time Apple fanboy who is now trying to move away from their products, I still desire their high quality. Since Google is the driving force behind the only serious competitor to iOS, by owning a handset manufacturer they will be in a better position to make hardware specially tailored to Android.

This doesn’t mean I’ll end up with a Google phone. I think HTC makes some nice gear as well, but my guess is that there will now be more choice.

I’ve come under some criticism for my #noapple decision. I’ve spent nearly ten years working in free software and that tends to cause people to develop a prejudice that I’m somehow anti-business or a communist. When I complain that Apple is making too much money, that their margins are undeserved, I’m told that this is just the way capitalism works.


Capitalism depends on the proper functioning of markets. Properly functioning markets demand easy entry and exit. When there is excess profit in a market, competitors move in, produce more goods which drive down the price until profit disappears. If profit should ever go negative, companies will leave the market, reducing the number of goods and causing prices to rise.

Unfortunately, there are a great many industries that don’t have easy entry and exit. Take aircraft manufacture. There is a reason that there are only two main producers of commercial aircraft in the world – it takes a tremendous amount of money just to get started, and once invested, it is hard to leave. The deregulated banking industry in the US has created a small number of “super” banks that are “too big to fail” which causes its own entry/exit issues (if you want to get your bile up, check out Dan Ariely’s article on compensation of bank officers versus market cap).

As much as we’d love to think of the computer software industry as being a free market, software patents and proprietary hardware block easy entry and exit. Take the iPad. Apple designs a nifty little device that generates a ton of consumer demand. This creates excess profit, which causes Samsung to create a competing product in the Galaxy tablet. In a normally functioning market, this should both inspire innovation and lower prices. Thus the end users benefit.

However, in a world of software patents, Apple blocks the sale of the Samsung product in the courts. Consumers get no options, prices don’t change, and you either get to spend too much money on the Apple product or go without. This is in the best interest of Apple, preserving its margins, but monopolies are the antithesis of liberal market economies.

The Google/Motorola deal was north of US$12B, so I think it is safe to say that the mobile handset market is pretty hard to enter and exit. In a world of giant companies, only a Google can take on an Apple at this level. I don’t expect either of them to act in my own best interest, but Google has shown time and time again a willingness to err on the side of openness, whereas Apple is now working hard to consolidate and close its entire production stack (iOS, the A4/A5 ARM processors, the possible move to Sharp for displays, etc.).

I run The OpenNMS Group as a for-profit company. We focus on open source software not out of any zealotry, but because it makes the most sense for our clients. I’m all for making money, but I want to compete fairly in the marketplace. Luckily, the internet and commodity hardware make it possible for us to compete with products from much bigger companies. I don’t ever want to see that go away. I think Google’s move will better position itself against Apple (from both a product and patent perspective) and that will benefit everyone.

OpenNMS Training in September

The biggest complaint I hear about open source in general is the lack of documentation. Writing good documentation is hard, and with complex tools like OpenNMS, sometimes a more hands-on, interactive approach is better.

That is why we periodically offer training. We have two courses scheduled for September.

The first is the week of 12-16 September and will be held at the OpenNMS HQ in Pittsboro, North Carolina.

The second is the week of 26-30 September and will be held in Fulda, Germany, the site of our Users Conference back in May. We are happy to be able to offer this in conjunction with our partner, NETHINKS.

Both courses will be taught by me and David Hustace, so if the thought of listening to me talk for hours on end frightens you, take solace in the fact that David teaches half of it.

Hope to see you there.

Choosing a Desktop (#noapple)

Once again I have to apologize for the light blogging. That will change this week I believe. I have been busy with other writing projects as well as switching to a Linux desktop. It took longer than I thought it would.

As I mentioned earlier, I am working to move away from Apple products. While I wouldn’t classify myself as a rabid fanboy, I do self-identify with the fanboy label, so this is a pretty big decision for me.

Before I go any further, I realize that the three people who read this blog are technologists and probably have some of their self-identity, if not self-worth, tied into their technology choices. My move away from Apple is a personal decision based on a number of factors, but I won’t be teasing or disparaging those who decide to stick with using Apple products. In addition, I’m going to discuss some even more controversial choices I made when it comes to using Linux, and again these are choices that work for me and do not mean that those who choose differently “suck”, to use the vernacular.

To recap, with the release of OS X Lion I’ve seen my beloved Apple move to even more strongly lock down the user experience than ever before. I believe this is in preparation to get rid of OS X altogether, and to move even MacBooks to iOS. As I have tied up a lot of information in somewhat proprietary Apple formats, I felt it was time to move, now or never.

I have two initial goals. First, I want to move to a Linux desktop environment, and I want to do it in such a way that I don’t give up anything I had when running OS X. This is important: I am working under the hypothesis that open source desktop solutions can compete with Apple feature for feature. Now I am not expecting to be able to upgrade the firmware on my iPhone 4 using Linux, but I want the same convenience I’ve come to expect from OS X, as well as a pretty and polished interface.

Second, I want to do this with minimal hardware changes. My target system is an early 2009 24-inch iMac with 4GB of RAM and an NVIDIA graphics card.

My plan was to remain on OS X Snow Leopard and just switch everything to FOSS applications. This didn’t work for a number of reasons. While the UNIX basis for OS X makes it possible to port most of the open source tools I want, they often don’t fit very well and seem to clash with the O/S, unlike the native apps. Plus, there’s the whole “in for a penny, in for a pound” aspect: if I am serious about changing, I should just do it.

When it comes to operating systems, the most “free” distro out there is Debian. I run Debian on more than half of my servers. Unfortunately, native Debian is a poor choice for a desktop, especially on proprietary hardware like my iMac. While I have no doubt that I could get things to a useable state with Debian, one of my stated goals is ease of use, and from the desktop standpoint Debian ain’t it.

So that left Fedora and Ubuntu. Both are projects controlled by corporate interests (Red Hat and Canonical) and most of my freetard friends prefer Fedora from a “freedom” standpoint. Also, Red Hat is just a few miles down the road from the OpenNMS offices, so I have a soft spot for them. I decided that my first choice would be Fedora 15.

Note: Before y’all start bringing up Mint and Pinguy, remember that I really want ease of use over time, and for this I am leaning toward distros sponsored by larger organizations than those two, even if they are based on Ubuntu.

The next thing I had to decide was which desktop option I should choose: GNOME or KDE. Fedora supports GNOME 3, but the last time I used a Linux Desktop (circa 2001/2002) I liked KDE. In reading about GNOME 3 vs. KDE 4, it seems that most people prefer KDE. Remember, I wasn’t vested in either, it is just that I had fond memories of using Konsole back in the day.

Getting Fedora installed was not easy. I believe most of my issues arose from the NVIDIA card, but by using the “basic graphics” option I was able to get the installer to run.

On a side note, I am a huge fan of the Apple Time Machine backup solution. It has saved my hide more than once, and I knew that I could futz around with the disk and partitions all I wanted and still get my system back.

The installer was pretty easy. I especially liked the option to encrypt a partition at install (versus trying to figure it out later). I set up four partitions – swap, /boot, root and /home. I figured having a separate /home partition will make things easier in the future with upgrades, etc.

Once I got Fedora 15 installed, I rebooted and saw a sight that would plague me for days: a little blinking “underline” cursor. The system would not even try to boot.

At this point I had kind of violated my “ease of use” goal, so I decided to drop Fedora and download Ubuntu. The problem was that Ubuntu wouldn’t even get to the installer part – it just sat at the blinking underline as well.

Between a lot of searching online and fiddling with grub options (specifically “nomodeset”) I managed to get Fedora to boot. I then uninstalled the nouveau NVIDIA driver and downloaded the proprietary one from the RPMFusion repository.
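For anyone fighting the same blinking cursor, this is roughly what the fix looks like on a GRUB 2 system (file locations and the regeneration command vary by distribution, so treat this as a sketch):

```conf
# /etc/default/grub
# Append "nomodeset" so the kernel skips kernel mode setting at boot,
# which works around broken framebuffer output on some NVIDIA cards.
GRUB_CMDLINE_LINUX="nomodeset"

# Afterwards, regenerate the grub configuration:
#   Fedora:        grub2-mkconfig -o /boot/grub2/grub.cfg
#   Ubuntu/Debian: update-grub
```

For a one-time test, you can also press “e” at the boot menu and add nomodeset to the end of the kernel (linux) line before booting.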

Here’s where I hit another snag. Fedora has recently upgraded to the 3.0 Linux kernel (although they number it 2.6.40). RPMFusion has yet to upgrade all of its packages to support this kernel, so I had to play around with the kernel modules that build on the fly versus the prebuilt options (akmod vs. kmod).

Another side note: while at this point I was extremely frustrated, I was also having some fun. It has been awhile since I had to learn about the internals of my operating system.

Finally, I got KDE to boot. It was different than I remembered, but I picked things up pretty easily. Here’s where I hit two more snags.

I launched Konsole and the first thing that struck me was that it was ugly. The default font looked like something created by a 9-pin Epson printer from 20 years ago. I was so used to having beautifully smooth, anti-aliased fonts that it was a shock to see something so much worse, and it violated one of my rules that I shouldn’t have to give anything up to switch to Linux.

But I figured that would be a problem I could address later. My next task was to get my Apple Magic Mouse to work over Bluetooth.

This was pretty easy in KDE. There was a little Bluetooth configuration widget. It saw my mouse and paired flawlessly. In a minute or so I had a listing for my mouse and a little green dot next to it.

The only problem was that it wouldn’t do any of the sort of things that you expect from a mouse, such as moving the mouse pointer.

More searching seemed to indicate that the kernel upgrade broke Magic Mouse support. By this time my Screw It meter had pegged, and in a fit of pique I wiped the system and restored OS X. Then I went home.

I had spent a couple of days playing with this (backups and restores took up most of that time; luckily I had my laptop so I could actually get some real work done) and I was ready to give up.

I couldn’t stop thinking about it, however, and as I tossed and turned trying to sleep that night I decided that I hadn’t given Ubuntu a fair shot. I remembered that I had some issues installing 64-bit Ubuntu on Apple hardware in the past, so maybe I should try 32-bit. It violated my rule about giving things up, but I figured it would only cost me some time and a blank CD to try.

I went into the office on Friday, downloaded the 32-bit version of 11.04 (Naughty Nightnurse) and booted to it.

Same blinking cursor. Grrrr.

When I downloaded the image I noticed one labelled “alternate” on the website. Upon searching I found that this was an alternate installer that might address my graphics card problem. I downloaded the 64-bit version and lo and behold it worked.

The Ubuntu installer is different from the Fedora one, especially in the disk layout section. Whereas Fedora just has a checkbox next to “encrypt this volume”, Ubuntu requires you to first create the partition and then create another encrypted volume that ties to it. Once I got past that part I found that Ubuntu adds the option to just encrypt your home directory on a per-user basis (à la FileVault), which is cool, but since I had already chosen to encrypt the whole /home partition it yelled at me about the swap partition being unencrypted, a security risk (passphrases could be cached in swap). So I hit the option to also encrypt swap and moved on.
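Under the hood, the encrypted-swap option boils down to entries like these; the device name is a hypothetical example, and the installer writes the real equivalents for you:

```conf
# /etc/crypttab -- re-key swap with random data on every boot,
# so nothing cached in swap survives a reboot. /dev/sda3 is made up.
cryptswap /dev/sda3 /dev/urandom swap,cipher=aes-cbc-essiv:sha256

# /etc/fstab -- mount the mapped device as swap
/dev/mapper/cryptswap none swap sw 0 0
```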

When I got to the part about installing grub, I couldn’t remember which partition was /boot, so I backtracked through the installer to see where it was. Unfortunately, this caused the newly encrypted swap partition to get corrupted, and nothing I could do through the installer would let me delete it. I couldn’t delete the partition itself without deleting the encrypted volume, and it wouldn’t let me do that due to the corruption.


At this point I put the iMac under my arm and went home. Using an older Fedora boot disk I was able to remove the Linux partitions (I tried booting back to OS X, but the Apple Disk Utility has serious issues dealing with ext4 partitions). I then redid the Ubuntu install, choosing just to encrypt my home directory, and it completed without incident.

Reboot … and I’m staring at a blinking cursor once again. Not giving in to despair this time, I added the “nomodeset” option to grub and voilà, I was looking at an Ubuntu desktop.

It complained that 3D was not enabled and thus the “Unity” interface could not load, but it was extremely easy to install the proprietary NVIDIA driver from the additional hardware installer, which also removed the nouveau driver. Crossing my fingers, I rebooted once more.

Let me say this, Ubuntu is freakin’ gorgeous.

Once I had the proper hardware driver, Unity came up in all of its glory and I really liked it. Remember, I don’t really have a horse in the race of KDE vs. GNOME, and the Unity Launcher reminded me of the OS X dock. The purple and orange theme was easy on the eyes while distancing itself from the muted pastel blues favored by Apple. The fonts were beautifully smooth, and I loved how extremely easy it was to install new software, and that installing something like Thunderbird brought along a theme that fit in well with the rest of the system.

I still had a weekend’s worth of work before I could even come close to having what I needed to make Ubuntu my working desktop, but after spending four days playing with installing a Linux desktop I was extremely happy to have something I could live with, and something I could use as the basis for testing out the rest of my #noapple hypothesis. I am still running OS X on my MacBook Air and my iMac at home, but I think I’ll wait until 11.10 Onanistic Oliphant is released before upgrading those. It’s only a couple of months away and that will give me time to get real comfortable with Ubuntu.

I’ll write more about individual issues and applications later.