This Blog Can Now Vote

It’s hard to believe but this blog is now 21 years old, having started back on this day in 2003. In the beginning I only had the one reader, but now I’m up to a whole three readers! I’m hoping by the time it turns 30 I can get to four.

I am writing this from the French Quarter in New Orleans, where I am attending a meeting. I was trying to think of a topic for this auspicious anniversary, and my go-to was to complain, once again, about how the death of Google Reader also killed blogs. Instead I thought I’d just mention a few of the events I’ll be attending in the first part of 2024.

Next week I’ll be heading to Berlin for FOSS Backstage. I’ve been to Germany many times and love visiting that country, and this will be my first visit to Berlin, so I am looking forward to it. I’m speaking on open source business models, a topic I’m passionate about. It should be a lot of fun.

April will find me in Seattle for the Open Source Summit. I won’t be speaking, but I will be in the AWS booth and would love to meet up if you happen to be attending as well.

Finally, in May I’ll be in Paris for the inaugural Open Source Founders Summit. If you run a commercial open source company, or are thinking about starting an open source company, consider applying to attend this conference. Emily Omier is bringing together pretty much the brain-trust when it comes to open source business (and me!) and it promises to be a great discussion on how to make money with open source while remaining true to its values.

The Brightspeed Saga

While I love living “out in the country” I often envy my urban friends for their network connectivity. When I moved out to the farm in 1999 the only “high speed” access was satellite, and even that required a modem and a phone line. I was overjoyed when Embarq finally deployed DSL to my house, and while 5Mbps down might not seem like much these days, it seemed heaven-sent back then.

Jump forward 20 years and Embarq became Centurylink, which is now Brightspeed. I had a pretty poor opinion of Centurylink (or, as I called them, CenturyStink) but high hopes for Brightspeed when they bought Centurylink’s ILEC business in our area. They have been disappointing, however. Here is that story.

Both my wife and I work from home, and when our DSL circuit is working it works well enough for us to get our jobs done. At 11Mbps down and 640kbps up it doesn’t even qualify as “broadband”, but it is a trade-off I’m willing to make in order to live where I live.

Starting back in early November we began to have issues with the quality of the DSL connection. Quality issues are always frustrating since the support technicians at the provider never seem to have the tools to properly measure it. Instead they just tell me the circuit is “up” so I should be satisfied, even though I tell them that while it is up, it is unusable.

The issue was high latency and packet loss. Latency is a measure of the time it takes information to travel through the network, and packet loss indicates that some of that information never makes it to its destination. The protocols used in networking automatically deal with packet loss by sending the information again, but the more this happens the worse the experience is for the user. Things that handle packet loss gracefully, like e-mail, web pages and chat, just seem very slow, while anything that requires a steadier flow of information, like video or gaming, just doesn’t work at all.

Having done network monitoring for much of my professional life, I monitor the quality of my DSL circuit by attempting to reach the 8.8.8.8 IP address, which is a highly available DNS server run by Google. Here is a recent graph:

Now normally the graph should be green and pretty much centered around 45ms. This one was all over the place. I asked my neighbor to execute a ping to the same IP address and her connection was working fine, so I assumed it was an issue specific to me.
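
Since I mentioned the monitoring, here is a rough sketch of that kind of probe in Python, in case any of my three readers want to try it at home. It assumes the Linux ping utility and its summary output format; my actual monitoring uses a dedicated tool, so treat this as illustrative:

```python
#!/usr/bin/env python3
"""Rough circuit-quality probe: ping 8.8.8.8 and report latency and loss."""
import re
import subprocess

TARGET = "8.8.8.8"  # Google's highly available public DNS server
COUNT = 20          # number of probes per sample

# Run the system ping and capture its summary (assumes Linux ping output).
out = subprocess.run(
    ["ping", "-c", str(COUNT), TARGET],
    capture_output=True, text=True, check=False,
).stdout

# Summary lines look like:
#   20 packets transmitted, 19 received, 5% packet loss, time 19028ms
#   rtt min/avg/max/mdev = 42.1/45.3/61.0/4.2 ms
loss = re.search(r"([\d.]+)% packet loss", out)
rtt = re.search(r"rtt [^=]+= ([\d.]+)/([\d.]+)/([\d.]+)/([\d.]+) ms", out)

if loss and rtt:
    print(f"loss: {loss.group(1)}%, avg latency: {rtt.group(2)} ms, "
          f"jitter (mdev): {rtt.group(4)} ms")
else:
    print("no ping summary returned; the circuit may be down entirely")
```

Run on a schedule and graphed, the average and mdev numbers make “up but unusable” visible in a way the provider’s line tests apparently were not.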

Trying to get support from Brightspeed was very frustrating. As I mentioned above they just tend to tell me everything is okay. I even reached out to one of my LinkedIn contacts who is an executive vice president at Brightspeed for help, and I think he was responsible for a ticket being opened on November 7th in the “BRSPD Customer Advocacy (Execs)” queue. I really appreciated the effort but it didn’t help with getting my issue resolved.

Just after Thanksgiving I called again and I was told that the problem was that my modem was wrong and doesn’t work with the DSL circuit we have, even though it was the model they sent to me and it had been working fine up until November. In any case they said they would send me a replacement and it would arrive in two days.

Ten days later I call back and ask, hey, where is my modem? They told me it was still “in process” but since I’ve been waiting so long they’ll overnight one to me. It shows up the next day and of course doesn’t fix the issue, but it has newer software than my other modem and reports the status of the DSL circuit. On the web page of the device this is usually represented by the word POOR in dark red, but sometimes it would improve to MARGINAL in a slightly lighter red. I call and explain this to Brightspeed, and after dealing with this for over two months they agree to send out a technician in two days.

When I finally get the e-mail confirming the appointment, it is for the following Thursday, January 5th, eight days away. That also happens to be when we are closing on a new house, so I can’t be here to meet with them. For some appointments they show up early, so I didn’t change it right away, but when they hadn’t shown up by Tuesday I decided to reschedule it.

I went to the e-mail and clicked on the link to reschedule and got sent to the Centurylink site. Of course they wanted me to confirm my account, but nothing I typed in (phone number, e-mail, or account number) worked, because I no longer have a Centurylink account. (sigh)

[Note: it looks like this has been corrected, finally, on the Brightspeed website. Not sure about links in e-mails]

In the process I did find out that my Brightspeed account number ends in “666” so perhaps that is indicative of something.

I eventually ended up calling support once again. I believe it would take the average caller about six minutes to reach a human through their system, as it prompts you for a variety of things before allowing you to speak to a person, but I had been calling for over two months so I could speedrun the thing in about four and a half minutes by pressing buttons before each prompt finished.

The person I talked to about rescheduling the appointment kept me on hold for about 30 minutes before telling me that the whole dispatch system for technicians was down and that she would call me back in two hours. She never did.

The next day I made one more attempt to reschedule the appointment, but was told that the next available appointment was so far out in the future that I should just keep it, since the technician won’t need to enter the house. I left a long letter taped next to the demarcation box on my house with a detailed description of the problem, and hoped for the best.

Unfortunately, they sent out Brandon. To my knowledge there are only two technicians assigned to our rather large county: Brandon and Elton. I much prefer working with Elton since Brandon doesn’t really seem to be the kind of person who does a deep dive into the problem, but I recently learned that Elton has moved into the back office and wasn’t doing service calls anymore.

As I feared, Brandon marked the issue as closed without fixing it. Once again into the support phone queue, where I was told that he had run a test “for five minutes” and my circuit was fine. (sigh)

I did get a text asking if my problem was resolved, to which I said “NO!” and I was later contacted by a person from Brightspeed to follow up. After a very long conversation she offered to send someone else out, and that person arrived yesterday.

Philip, who is based out of Wake County (one county to the east of us), showed up promptly at 8am and within ten minutes had diagnosed a grounding issue with the wires coming to our house. In about 45 minutes he had repaired it, but he warned me that there was also an outage in the area, which would explain my now 900ms ping times (but no packet loss). I trusted him that it would eventually resolve, and about 30 minutes later things were much, much better.

You can see where the network was bad before Philip showed up, the gap where he was working on the system, and then the return to a more expected quality of service.

It still isn’t perfect. I’m seeing a lot of jitter from time to time, indicated by the spikes, but for the most part the user experience is fine. Yesterday I was able to participate in our departmental weekly video call without issue for the first time in months.

And that’s what really bothers me the most. For nearly three months Brightspeed was gaslighting me that my service was fine when, as most IT professionals would expect, it turned out to be a physical layer problem. In retrospect it makes sense, since we’ve been having an especially wet winter and that would have amplified the grounding issue.

I figure I spent between 40 and 60 hours actively involved in getting this addressed, and that is time I’ll never get back.

Of course it could be worse. The local newspaper published a story about a community in Chatham County that was without service from Brightspeed for a total of 51 days. At least our connection was usable enough that it only required a few trips to the public library for access during important deadlines.

There is some good news in that same newspaper issue: attempts are being made to help those of us in rural areas get broadband. Some of you may be thinking Starlink, but I was on their waiting list for two and a half years without getting my equipment, and when they pushed delivery out to late 2023 I just gave up and asked for my deposit back.

I am not a huge “we need to regulate everything” kind of guy, but broadband has become a service so important, and one the free market has so clearly failed to provide, that I would welcome government involvement in getting this issue addressed. But so far the communications lobby has been strong enough to prevent any kind of oversight, so I won’t hold my breath.

“Help Me” Easy Button

As my parents got older, I found myself wanting to create a tech company focused solely on making technology available to the elderly in a fashion that is easier for them to understand. Perhaps this will all go away with AI and digital assistants, or when my generation that grew up with tech gets older, but I have watched them struggle sometimes with mobile phones and TV remotes, even ones that are supposed to be simple, and I realize there is probably a market for such solutions.

While I don’t have capital for such an undertaking, I do have access to open source software and hardware, and I have a device idea that shouldn’t require heroic effort to create.

Remember the “Easy Button” from Staples?

Staples Easy Button

What I want is something about the same size, but when you press it, it will send a notice to an app on my phone.

My mother died last year and we recently bought a new home in part because it has a basement apartment where my father can live. He will have his own space but we’ll be close enough to help him out if he needs it.

The idea for this button came to me when I was thinking about what would happen if he needed some help but, for whatever reason, could neither call out loudly enough for us to hear him nor get to a phone. Unless he was severely incapacitated he should be able to press a big button, and since I almost always have my phone with me, all I would need would be an app that could send me a notice.

Since my three readers are very smart (not to mention devilishly attractive) you have probably thought of existing services (remember the “I’ve fallen and I can’t get up” Life Call ad from 15 years ago?), but older people can be extremely proud and they hate being reminded of their age. I’m pretty certain my father would resist carrying around a device on a lanyard, but he wouldn’t mind having a button nearby “just in case”.

The feature set would be pretty short:

  • A big button (‘natch)
  • Some indication that the button has been pressed (light or buzzer)
  • Settings:
    • A way to name the button
    • Configure the Wi-Fi connection
    • Configure a list of users to contact when the button is pressed

For ease of use, the first generation of such a device would not be battery powered, but if it were, there would need to be a way to make sure the battery was charged.

While I have worked with Raspberry Pi boards I have not done anything with the Pi Zero, but I assume this would be a perfect application for it. The original Easy Button could be repurposed for this device, but there are also a ton of options on Amazon that could work as well.

A bit harder would be the app software, as I am led to believe that getting notices in the background on mobile devices can be tricky, and I don’t want to have the app running all the time. I have enough skills that I could make something that would send an e-mail (which would remove the need for a separate app), but I’m hoping for a solution with more reliability and less latency. It would be nice to have it send a notice no matter where I am, but if it were easier to only work when my phone was on the same local network as the button, that would be acceptable (I’m trying to figure out a solution that wouldn’t involve a server).
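
To make the idea concrete, here is a minimal sketch of the button side in Python, assuming a Pi running the gpiozero library. The GPIO pins and the webhook endpoint are made up for illustration; whatever service sits behind that URL would fan the notice out to the configured list of contacts:

```python
#!/usr/bin/env python3
"""Sketch of the 'help me' button: wait for a press, send a notice."""
import time

import requests                   # pip install requests
from gpiozero import LED, Button  # included with Raspberry Pi OS

BUTTON_PIN = 17  # hypothetical wiring
LIGHT_PIN = 27   # hypothetical "press registered" indicator light
WEBHOOK_URL = "https://example.com/notify/dads-button"  # hypothetical endpoint

button = Button(BUTTON_PIN)
light = LED(LIGHT_PIN)

while True:
    button.wait_for_press()
    light.on()  # show that the press was registered
    try:
        # The service behind the URL turns this into phone notifications.
        requests.post(WEBHOOK_URL, data="Help button pressed!", timeout=10)
    except requests.RequestException:
        light.blink()  # signal that the notice did not go out
    time.sleep(5)  # crude debounce so one press doesn't spam everyone
    light.off()
```

Hosted push services (ntfy is one example) advertise exactly this POST-to-phone use case, which might sidestep the background-notification problem, though I haven’t tried them.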

Anyway, just putting this down as a placeholder for when I have some free time to pursue it, but I also figure someone may have done this already and by posting this I’ll find out about it.

Farewell Dark Sky

I am always optimistic with the New Year, but 2023 has already brought me one disappointment: the death of the Dark Sky weather app.

My friend Ben introduced me to Dark Sky many years ago. Unlike most weather apps, Dark Sky focused on micro-forecasts. It would tell you when rain was imminent, how strong it would be, and how long it would last. It was amazingly useful. When I was at Lollapalooza back in 2017 a torrential downpour hit Chicago – so strong that it shut the festival down. But I remained relatively dry because Dark Sky warned me it was coming with about a 10 minute lead time. That allowed me to run to the subway and get out of the weather before the skies opened up.

In 2020, Apple bought Dark Sky and as of yesterday the Dark Sky app no longer works. The functionality has been added, or shall I say buried, within the default Apple weather app.

For a company like Apple that prides itself on UI/UX you would think they would do a better job of it. This is a screenshot of the Dark Sky app just before midnight on the last day of 2022.

With one click you see the current temperature, the rain outlook, and a timeline for how long the rain will last.

In the Apple Weather app you have to open it, scroll down until you find the precipitation widget (and not the precipitation map), click on that and you can kind of figure out the rain forecast if you look hard enough. Here is the prediction for Wednesday.

I mean, I’ll get used to it, but it is sometimes hard to say goodbye to something you’ve used for years. If the app were open source there is a chance it would live on, but when we opt to use proprietary software we also cede a lot of choices to the software vendor. So it goes.

Goodbye, Dark Sky, you’ll be missed.

“Run-of-the-Mill Person”

I just noticed that my Wikipedia page has been deleted (the old version is still on the Internet Archive).

I’m not sure how I feel about this. When I was first made aware of its existence oh so many years ago I was both flattered and a little embarrassed, mainly because I didn’t think I rated a page on Wikipedia. But then I got to thinking that, hey, pretty much anyone should be able to have a page on Wikipedia as long as it adheres to their format guidelines. It’s not like it takes up much space, and as long as the person is verifiable as being a real person, why not?

I am certain I would have been okay with my page being deleted soon after it was created, but once you get used to having something, earned or not, there is a strong psychological reaction to having it taken away. From what I can tell the page was created in 2010, so it had been around for nearly 12 years with no one complaining.

The most hurtful thing was a comment about the deletion from EdwardX from London:

Nothing cited in the article counts towards WP:GNG, and I can find nothing better online. Run-of-the-mill person.

Really? Was the “Run-of-the-mill person” comment really necessary? (grin)

I’m still happy about what I was able to accomplish with OpenNMS and building the community around it, even if it was run-of-the-mill, and I plan to promote open source and open source companies for the remainder of my career, even if that isn’t Wikipedia-worthy.

Nineteen Years

Nineteen years ago my friend Ben talked me into starting this blog. I don’t update it as frequently any more for a variety of reasons, mainly because more people interact on social media these days and I’m not as involved in open source as I used to be, but it is still somewhat of an achievement to keep something going this long.

My “adventures” in open source started out on September 10th, 2001, when I started a new job with a company called Oculan to work on their open source monitoring platform OpenNMS. In May of 2002 I became the lead maintainer on the project, and by the time I started this blog I’d been at it for several months. Back then blogs were one of the main ways an open source project could communicate with its community.

The nearly two decades I spent with OpenNMS were definitely an adventure, and this site can serve as a record of both those successes and those struggles.

Nineteen years ago open source was very different than it is today. Today it is ubiquitous: I think it would be rare for a person to go a single day without interacting with open source software in some fashion. But back then there was still a lot of fear, uncertainty and doubt about using it, with a lot of confusion about what it meant. Most people didn’t take it seriously, often comparing it to “shareware” and never believing that it would ever be used for doing “real” things. On a side note, even in 2022 I had one person make the shareware comparison when I brought up Grafana, a project that has secured nearly US$300 million in funding.

Back then we were trying to figure out a business model for open source, and I think in many ways we still are. The main model was support and services.

You would have thought this would have been more successful than it turned out to be. Proprietary software costing hundreds of thousands if not millions of dollars would often require that you purchase a maintenance or support contract running anywhere from 15% to 25% of the original software cost per year just to get updates and bug fixes. You would think that people would be willing to pay that amount or less for similar software, avoiding the huge upfront purchase, but that wasn’t the case. If they didn’t have to buy support they usually wouldn’t. Plus, support doesn’t easily scale. It is hard finding qualified people to support complex software. I’d often laugh when someone would contact me offering to double our sales, because we wouldn’t have been able to handle the extra business.

One company, Red Hat, was able to pull it off and create a set of open source products people were willing to purchase at a scale that made them a multi-billion dollar organization, but I can’t think of another that was able to duplicate that success.

Luckily, the idea of “hosted” software gained popularity. One of my favorite open source projects is WordPress. You are reading this on a WordPress site, and the install was pretty easy. They talk about a “five minute” install and have done a lot to make the process simple.

However, if you aren’t up to running your own server, it might as well be a five-year install process. Instead, you can go to “wordpress.com” and get a free website hosted by them and paid for by ads being shown on your site, or you can remove those ads for as little as US$4/month. One of the reasons that Grafana has been able to raise such large sums is that they, too, offer a hosted version. People are willing to pay for ease of use.

But by far the overwhelming use of open source today is as a development methodology, and the biggest open source projects tend to be those that enable other, often proprietary, applications. Two Sigma Ventures has an Open Source Index that tries to quantify the most popular open source projects, and at the moment these include Tensorflow (a machine learning framework), Kubernetes (a container orchestration platform) and of course the Linux kernel. What you don’t see are end user applications.

And that to me is a little sad. Two decades ago the terms “open source” and “free software” were often used interchangeably. After watching personal computers go from hobbyists to mainstream we also saw control of those systems move to large companies like Microsoft. The idea of free software, as in being able to take control of your technology, was extremely appealing. After watching companies spend hundreds of thousands of dollars on proprietary software and then being tied to those products, I was excited to bring an alternative that would put the power of that software back into the hands of the users. As my friend Jonathan put it, we were going to change the world.

The world did change, but not in the way we expected. The main reason is that free software really missed out on mobile computing. While desktop computers were open enough that independent software could be put on them, mobile handsets to this day are pretty locked down. While everyone points to Android as being open source, to be honest it isn’t very useful until you let Google run most of it. There was a time where almost every single piece of technology I used was open, including my phone, but I just ran out of time to keep up with it and I wanted something that just worked. Now I’m pretty firmly back into the Apple ecosystem and I’m amazed at what you can do with it, and I’m so used to just being able to get things going on the first try that I’m probably stuck forever (sigh).

I find it ironic that today’s biggest contributors to open source are also some of the biggest proprietary software companies in the world. Heck, even Red Hat is now completely owned by IBM. I’m not saying that this is necessarily a bad thing; look at all the open source software being created by nearly everyone. But it is a long way from the free software dream of twenty years ago. Even proprietary, enterprise software has started to leverage open APIs that at least give a nod to the idea of open source.

We won. Yay.

Recently some friends of mine attended a fancy, black-tie optional gala hosted by the Linux Foundation to celebrate the 30th anniversary of Linux. Most of them work for those large companies that heavily leverage open source. And while apparently a good time was had by all, I can’t help but think of, say, those developers who maintain projects like Log4j who, when there is a problem, get dumped on to fix it and probably never get invited to cool parties.

Open source is still looking for a business model. Heck, even making money providing hosted versions of your software is a risk if one of the big players decides to offer their version, as to this day it is still hard to compete with a Microsoft or an Amazon.

But this doesn’t mean I’ve given up on open source. Thanks to the Homebrew project I still use a lot of open source on my Macintosh. I’m writing this using WordPress on a server running Ubuntu through the Firefox browser. I still think there are adventures to be had, and when they happen I’ll write about them here.

Nextcloud News

I think the title of this post is a little misleading, as I don’t have any news about Nextcloud. Instead I want to talk about the News app on the Nextcloud platform, and I couldn’t think of a better title.

I rely heavily on the Nextcloud News App to keep up with what is going on with the world. News provides similar functionality to the now defunct Google Reader, but with the usual privacy bonuses you expect from Nextcloud.

Back before social networks like Facebook and Twitter were the norm, people used to communicate through blogs. Blogs provide similar functionality: people can write short or long form posts that get published on a website and can include media such as pictures, and other people can comment on and share them. Even now, when I see an incredibly long thread on Twitter I just wish the author had put it on a blog somewhere.

Blogs are great, since each one can be individually hosted without requiring a central authority to manage it all. My friend Ben got me started on my first blog (this one), which in the beginning was hosted using a program called Movable Type. When their licensing became problematic, most of us switched to WordPress, and a tremendous amount of the Web runs on WordPress even now.

Now the problem was that the frequency with which people posted to their blogs varied. Some might post once a week, and others several times an hour. Unless you wanted to go and manually refresh their pages, it was difficult to keep up.

Enter Really Simple Syndication (RSS).

RSS is, as the name implies, an easy way to summarize content on a website. Sites that support RSS craft a generic XML document that reflects titles, descriptions, links, etc. to content on the site. The page is referred to as a “feed” and RSS “readers” can aggregate the various feeds together so that a person can follow the changes on websites that interest them.
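
To show how little magic is involved, the sketch below does what a reader does for a single feed, using the third-party feedparser library (the feed URL is this blog’s own, which comes up again further down):

```python
#!/usr/bin/env python3
"""What an RSS reader does per feed: fetch the XML, list titles and links."""
import feedparser  # pip install feedparser

feed = feedparser.parse("https://www.adventuresinoss.com/feed")

print(feed.feed.title)          # the site's name, from the feed XML
for entry in feed.entries[:5]:  # the five most recent items
    print(f"- {entry.title}\n  {entry.link}")
```

A reader simply repeats this for every subscribed feed on a schedule and keeps track of which items you have already seen.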

Google Reader was a very useful feed reader that was extremely popular, and it in turn increased the popularity of blogs. I put some of the blame for the rise of the privacy nightmare that is modern social networks on Google’s decision to kill Reader, as it made individual blogs less relevant.

Now in Google’s defense they would say just use some other service. In my case I switched to Feedly, an adequate Reader replacement. The process was made easier by the fact that most feed readers support a way to export your configuration in the Outline Processor Markup Language (OPML) format. I was able to export my Reader feeds and import them into Feedly.

Feedly was free, and as they say, if you aren’t paying for the product you are the product. I noticed that next to my various feed articles Feedly would display a count, which I assume reflected the number of Feedly users who were interested in or who had read that article. Then it dawned on me that Feedly could gather useful information on what people were interested in, just like Facebook, and I also assume that, if they chose, they could monetize that information. Since I had a Feedly account to manage my feeds, they could track my individual interests as well.

While Feedly never gave me any reason to assign nefarious intentions to them, as a privacy advocate I wanted more control over sharing my interests, so I looked for a solution. As a Nextcloud fan I looked for an appropriate app, and found one in News.

News has been around pretty much since Nextcloud started, but I rarely hear anyone talking about its greatness (hence this post). Like most things Nextcloud it is simple to install. If you are an admin, just click on your icon in the upper right corner and select “+ Apps”. Then click on “Featured apps” in the sidebar and you should be able to enable the “News” app.

That’s it. Now in order to update your feeds you need to be using the System Cron in Nextcloud, and instructions can be found in the documentation.
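
For reference, the documented approach boils down to a crontab entry for the web server user that runs Nextcloud’s background jobs every five minutes. The user and path below are typical, but check them against your own install:

```
# edit the web server user's crontab, e.g. crontab -u www-data -e
*/5 * * * * php -f /var/www/nextcloud/cron.php
```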

Once you have News installed, the next challenge is to find interesting feeds to which you can subscribe. The News app will suggest several, but you can also find more on your own.

Nextcloud RSS Suggestions

It used to be pretty easy to find the feed URL. You would just look for the RSS icon and click on it for the link:

RSS Icon

But, again, when Reader died so did a lot of the interest in RSS, and finding feed URLs became more difficult. I have links to feeds at the very bottom of the right sidebar of this blog, but you’d have to scroll down quite a way to find them.

But for WordPress sites, like this one, you just add “/feed” to the site URL, such as:

https://www.adventuresinoss.com/feed

There are also some browser plugins that are supposed to help identify RSS feed links, but I haven’t used any. You can also “view source” on a website of interest and search for “rss”, and that may help out as well.
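
That “view source” trick can even be automated, since sites that advertise a feed do so with a <link rel="alternate"> tag in the page header. Here is a stdlib-only Python sketch (pointed at this blog, but any site will do):

```python
#!/usr/bin/env python3
"""Find the RSS/Atom feed links a web page advertises in its header."""
from html.parser import HTMLParser
from urllib.request import urlopen

class FeedLinkFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Feeds are advertised as:
        #   <link rel="alternate" type="application/rss+xml" href="...">
        if tag == "link" and a.get("rel") == "alternate" \
                and a.get("type", "").endswith(("rss+xml", "atom+xml")):
            print(a.get("title", "(untitled feed)"), "->", a.get("href"))

page = urlopen("https://www.adventuresinoss.com/").read().decode("utf-8", "replace")
FeedLinkFinder().feed(page)
```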

My main use of the News App is to keep up with news, and I follow four main news sites. I like the BBC for an international take on news, CNN for a domestic take, Slashdot for tech news and WRAL for local news.

Desktop Version of News App

Just for reference, the feed links are:

BBC: http://newsrss.bbc.co.uk/rss/newsonline_uk_edition/front_page/rss.xml

CNN: http://rss.cnn.com/rss/cnn_topstories.rss

Slashdot: http://rss.slashdot.org/slashdot/slashdotMain

WRAL: http://www.wral.com/news/rss/48/

This wouldn’t be as useful if you couldn’t access it on a mobile device. Of course, you can access it via a web browser, but there are also a number of phone apps that let you access your feeds natively.

Now to my knowledge Nextcloud the company doesn’t produce a News mobile app, so the available apps are provided by third parties. I put all of my personal information into Nextcloud, and since I’m paranoid I didn’t want to put my access credentials into those apps but I wanted the convenience of being able to read news anywhere I had a network connection. So I created a special “news” user just for News. You probably don’t need to do that but I wanted to plant the suggestion for those who think about such things.

On my iPhone I’ve been happy with CloudNews.

iPhone Version of CloudNews App

It sometimes gets out of sync and I end up having to read everything in the browser and re-sync in CloudNews, but for the most part it’s fine.

For Android the best app I’ve used is by David Luhmer. It’s available for a small fee in the Play Store and for free on F-Droid.

Like all useful software, you don’t realize how much you depend on it until it is gone, and in the few instances I’ve had problems with News I get very anxious as I don’t know what’s going on in the world. Luckily this has been rare, and I check my news feed many times during the day to the point that I probably have a personal problem. The mobile apps mean I can read news when I’m in line at the grocery store or waiting for an appointment. And the best part is that I know my interests are kept private as I control the data.

If you are interested, I sporadically update a number of blogs, and I aggregate them here. In a somewhat ironic twist, I can’t find a feed link for the “planet” page, so you’d need to add the individual blog feeds to your reader.

Order of the Green Polo: Requiescat In Pace

One of the first “group chat” technologies I was ever exposed to was Internet Relay Chat (IRC). This allowed a group of people to get together in areas called “channels” to discuss pretty much anything they felt like discussing. The service had to be hosted somewhere, and for most open source projects that was Freenode.

You might have seen that recently Freenode was taken over by new management, and the policies this new management implemented didn’t sit well with most Freenode users. In the grand open source tradition, most everyone left and went to other IRC servers, most notably Libera Chat.

In May of 2002 when I became the sole maintainer of OpenNMS, there was exactly one person who was dedicated full time to the project – me. What kept me going was the community I found on IRC, in both the #opennms channel and the local Linux users group channel, #trilug.

It was the people on IRC who supported me until I could grow the business to the point of bringing on more people. I still have strong friendships with many of them.

I was reminded of those early days as we migrated #opennms to Libera Chat. At the moment there are only 12 members logged in, and most of those are olde skoool OpenNMS people. I haven’t used IRC much since we switched to Mattermost (we host a server at chat.opennms.com) and with it a “bridge” to bring IRC conversations into the main Mattermost channel. Most people moved to use Mattermost as their primary client, but of course there were a few holdouts (Hi Alex!).

While I was reminiscing, I was also reminded of the Order of the Green Polo (OGP). When David, Matt and I started The OpenNMS Group in 2004, interest in OpenNMS was growing, and there was a core of those folks on IRC who were very active in contributing to the project. I was trying to think of some way to recognize them.

At that time, business casual, at least for men, consisted of a polo shirt and khaki slacks. Vendors often gifted polo shirts with their logos/logotypes on them to clients, and a number of open source projects sold them to raise money. We sold a white one and a black one, and I thought, hey, perhaps I can pick another color and use that to identify the special contributors to OpenNMS.

Green has always been associated with OpenNMS. In network monitoring, green symbolizes that everything is awesome. We even named one of our professional services products the “Greenlight Project”. Plus I really like green as a color.

Then the question became “what shade of green?” For some reason I thought of Tiger Woods who, by this time, late 2004, had won the prestigious Masters golf tournament three times (and would again the next spring). The winner of that tournament gets a “hunter green” jacket, and so I decided that hunter green would be the color.

Also, for some unknown reason, I saw an article about a British knighthood called “The Order of the Garter”. I combined the two and thus “The Order of the Green Polo” was born.

It was awesome.

People who had been active in contributing to OpenNMS became even more active when I recognized them with the OGP honor. They contributed code and helped us with supporting our community, as well as adding a lot to the direction of the project. We started having annual developer conferences called “Dev-Jam” and OGP members got to attend for free so we could spend some face to face time with each other. I considered these men in the OGP to be my brothers.

As OpenNMS grew, we looked to the OGP for recruitment. It was through the OGP that Alejandro came to the US from Venezuela and now leads our support and services team (if OpenNMS went away tomorrow, getting him and his spouse here would have made it all worth it). When you hired an OGP member, you were basically paying them to do something they wanted to do for free. Think of it as like eating an ice cream sundae and finding money at the bottom.

But that growth was actually something that led to the decline of the OGP. When we hired everyone who wanted a job with us, the role of the OGP declined. Dev-Jam was open to anyone, but it was mandatory for OpenNMS employees. Not all employees were OGP even though they were full-time contributors, so there was often pressure to induct new employees into the Order. And, most importantly, as we aged many OGP members moved on to other things. Hey, it happens, and it doesn’t reflect poorly on their past contributions.

We had a special mailing list for the OGP, but instead of discussing OpenNMS governance it basically became a “happy birthday” list (speaking of which, Happy Birthday Antonio!). When OpenNMS was acquired by NantHealth, we had to merge our mail systems and in the process the OGP list was deactivated. I don’t think many people noticed.

Recently it was brought to my attention that associating OpenNMS with the Masters golf tournament through the OGP could have negative connotations. The Masters is hosted by the Augusta National Golf Club and there have been controversies around their membership policies and views on race. It was suggested that we rename the OGP to something else.

One quick solution would be to just change the shade of green to, perhaps, a “stoplight” green. But this got me to thinking that the same logic used to associate the color with racism could apply to the whole “Order of” as well, since that was based on a British knighthood which, much like Augusta, is mainly all male. Plus the British don’t have the best track record when it comes to colonialism, etc.

I think it is time for something totally new, so I’ve decided to retire the Order of the Green Polo. The members of the OGP are all male, and I’m extremely excited that as we’ve grown our company and project we have been able to greatly improve our diversity. I would love to come up with something that can embrace everyone who has a love of OpenNMS and wants to contribute to it, be that through code, documentation, the community, &tc.

OpenNMS has changed greatly over the past two decades, and it has become harder to contribute to a project that has grown exponentially in complexity. As part of my role as the Chief Evangelist of OpenNMS, I want to change that and create easier ways for people to improve the OpenNMS platform, and I need to come up with a new program to recognize those who contribute (and if you want to skip that part and get right to the job thingie, we’re hiring, but don’t skip that part).

To those of you who were in the Order of the Green Polo, thank you so much for helping us make OpenNMS what it is today. I’m not sure if it would exist without you. And even without the OGP mailing list, I plan to remember your birthdays.

Open Source Contributor Agreements

I noticed a recent uptick in activity on Twitter about open source Contributor License Agreements (CLAs), mostly negative.

Twitter Post About CLAs

The above comment is from a friend of mine who has been involved in open source longer than I have, and whose opinions I respect. On this issue, however, I have to disagree.

This is definitely not the first time CLAs have been in the news. The first time I remember even hearing about them concerned MySQL. The MySQL CLA required a contributor to sign over ownership of any contribution to the project, which many thought was fine when they were independent, but started to raise some concerns when they were acquired by Sun and then Oracle. I think this latest resurgence is the result of Elastic deciding to change their license from an open source one to something more “open source adjacent”, which has caused a number of people to take exception (note: link contains strong language).

As someone who doesn’t write much code, I think deciding to sign a CLA is up to the individual and may change from project to project. What I wanted to share is a story of why we at OpenNMS have a CLA and how we decided on one to adopt, in the hopes of explaining why a CLA can be a positive thing. I don’t think it will help with the frustrations some feel when a project changes the license out from under them, but I’m hoping it will shed some light on our reasons and thought processes.

OpenNMS was started in 1999 and I didn’t get involved until 2001 when I started work at Oculan, the commercial company behind the project. Oculan built a monitoring appliance based on OpenNMS, so while OpenNMS was offered under the GPLv2, the rest of their product had a proprietary license. They were able to do this because they owned 100% of the copyright to OpenNMS. In 2002 Oculan decided to no longer work on the project, and I was able to become the maintainer. Note that this didn’t mean that I “owned” the OpenNMS copyright. Oculan still owned the copyright but due to the terms of the license I (as well as anyone else) was free to make derivative works as long as those works adhered to the license. While the project owned the copyright to all the changes made since I took it over, there was no one copyright holder for the project as a whole.

This is fine, right? It’s open source and so everything is awesome.

Fast forward several years and we became aware of a company, funded by VCs out of Silicon Valley, that was using OpenNMS in violation of the license as a base on which to build a proprietary software application.

I can’t really express how powerless we felt about this. At the time there were, I think, five people working full time on OpenNMS. The other company had millions in VC money while we were adhering to our business model of “spend less than you earn”. We had almost no money for lawyers, and without the involvement of lawyers this wasn’t going to get resolved. One thing you learn is that while those of us in the open source world care a lot about licenses, the world at large does not. And since OpenNMS was backed by a for-profit company, there was no one to help us but ourselves (there are some limited options for license enforcement available to non-profit organizations).

We did decide to retain the services of a law firm, who immediately warned us how much “discovery” could cost. Discovery is the process of obtaining evidence in a possible lawsuit. This is one way a larger firm can fend off the legal challenges of a smaller firm – simply outspend them. It made us pretty anxious.

Once our law firm contacted the other company, the reply was that if they were using OpenNMS code, they were only using the Oculan code and thus we had no standing to bring a copyright lawsuit against them.

Now we knew this wasn’t true, because the main reason we knew this company was using OpenNMS was that a disgruntled previous employee told us about it. They alleged that this company had told their engineers to follow OpenNMS commits and integrate our changes into their product. But since much of the code was still part of the original Oculan code base, it made our job much more difficult.

One option we had was to get with Oculan and jointly pursue a remedy against this company. The problem was that Oculan went out of business in 2004, and it took us a while to find out that the intellectual property had ended up at Raritan. We were able to work with Raritan once we found this out, but by this time the other company had also gone out of business, pretty much ending the matter.

As part of our deal with Raritan, OpenNMS was able to purchase the copyright to the OpenNMS code once owned by Oculan, granting Raritan an unlimited license to continue to use the parts of the code they had in their products. It wasn’t cheap and involved both myself and my business partner using the equity in our homes to guarantee a loan to cover the purchase, but for the first time in years most of the OpenNMS copyright was held by one organization.

This process made us think long and hard about managing copyright moving forward. While we didn’t have thousands of contributors like some projects, the number of contributors we did have was non-trivial, and we had no CLA in place. The main question was: if we were going to adopt a CLA, what should it look like? I didn’t like the idea of asking for complete ownership of contributions, as OpenNMS is a platform and while someone might want to contribute, say, a monitor to OpenNMS, they shouldn’t be prevented from contributing a similar monitor to Icinga or Zabbix.

So we asked our community, and a person named DJ Gregor suggested we adopt the Sun (now Oracle) Contributor Agreement. This agreement introduced the idea of “dual copyright”. Basically, the contributor keeps ownership of their work but grants copyright to the project as well. This was a pretty new idea at the time but seems to be common now. If you look at the CLAs for, say, Microsoft and even Elastic, you’ll see similar language, although it is more likely worded as a “copyright grant” or something other than “dual copyright”.

This idea was favorable to our community, so we adopted it as the “OpenNMS Contributor Agreement” (OCA). Now the hard work began. While most of our active contributors were able to sign the OCA, what about the inactive ones? With a project as old as OpenNMS there are a number of people who had been involved in the project but, due to either other interests or changing priorities, were no longer active. I remember going through all the contributions in our code base and systematically hunting down every contributor, no matter how small the contribution, and asking them to sign the OCA. They all did, which was nice, but it wasn’t an easy task. I remember that the e-mail address of one contributor bounced and I finally hunted them down in Ireland via LinkedIn.

Now a lot of the focus of CLAs is on code ownership, but there is a second, often more important part. Most CLAs ask the contributor to affirm that they actually own the changes they are contributing. This may seem trivial, but I think it is important. Sure, a contributor can lie, and if it turns out they contributed something they really didn’t own, the project is still responsible for dealing with that code, but a number of studies have shown that simply reminding someone of a moral obligation goes a long way toward reinforcing ethical behavior. When someone signs a CLA with such a clause it will at least make them think about it and reaffirm that the work is their own. If a project doesn’t want to ask for a copyright assignment or grant, it should at least ask for something like this.

While the initial process was pretty manual, managing the OCAs is now pretty automated. When someone makes a pull request on our GitHub project, automation checks whether they have signed the OCA and, if not, sends them to the agreement.
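
I won’t reproduce our actual integration here, but the core of any such gate is simple enough to sketch in Python. The file name and URL below are illustrative, not what we actually run:

```python
#!/usr/bin/env python3
"""Toy version of a CLA/OCA gate for pull requests."""

SIGNERS_FILE = "oca-signers.txt"           # hypothetical: one GitHub login per line
AGREEMENT_URL = "https://example.com/oca"  # hypothetical signing page

def has_signed(login: str) -> bool:
    """Check the contributor's login against the recorded signers."""
    with open(SIGNERS_FILE) as f:
        signers = {line.strip().lower() for line in f if line.strip()}
    return login.lower() in signers

def check_pull_request(author: str) -> str:
    """Produce the status message a bot would post on the pull request."""
    if has_signed(author):
        return f"Thanks {author}, your OCA is on file."
    return f"{author}: please sign the OCA at {AGREEMENT_URL} before we can merge."

if __name__ == "__main__":
    print(check_pull_request("some-contributor"))
```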

The fact that the copyright was under one organization came in handy when we changed the license. One of my favorite business models for open source software is paid hosting, and I often refer to WordPress as an example. WordPress is dead simple to install, but it does require that you have your own server, understand setting up a database, etc. If you don’t want to do that, you can pay WordPress a fee and they’ll host the product for you. It’s a way to stay pure open source yet generate revenue.

But what happens if you work on an open source project and a much bigger, much better funded company just takes your project and hosts it? I believe one of the issues facing Elastic was that Amazon was monetizing their work and they didn’t like it. Open source software is governed mainly by copyright law and if you don’t distribute a “copy” then copyright doesn’t apply. Many lawyers would claim that if I give you access to open source software via a website or an API then I’m not giving you a copy.

We dealt with this at OpenNMS, and as usual we asked our community for advice. Once again I think it was DJ who suggested we change our license to the Affero GPL (AGPLv3) which specifically extends the requirement to offer access to the code even if you only offer it as a hosted service. We were able to make this change easily because the copyright was held by one entity. Can you imagine if we had to track down every contributor over 15+ years? What if a contributor dies? Does a project have to deal with their estate or do they have to remove the contribution? It’s not easy. If there is no copyright assignment, a CLA should at least include detailed contact information in case the contributor needs to be reached in the future.

Finally, remember that open source is open source. Don’t like the AGPLv3? Well you are free to fork the last OpenNMS GPLv2 release and improve it from there. Don’t like what Elastic did with their license? Feel free to fork it.

You might have detected a theme here. We relied heavily on our community in making these decisions. The OpenNMS Group, as stewards of the OpenNMS Project, takes seriously the responsibilities to preserve the open source nature of OpenNMS, and I like to think that has earned us some trust. Having a CLA in place addresses some real business needs, and while I can understand people feeling betrayed at the actions of some companies, ultimately the choice is yours as to whether or not the benefits of being involved in a particular project outweigh the requirement to sign a contributor agreement.

The Server Room Show Podcast

A couple of weeks ago I had the pleasure to chat with Viktor Madarasz on “The Server Room Show” podcast.

The Server Room Podcast Graphic

Viktor is an IT professional with a strong interest in open source, and we had a fun and meandering conversation covering a number of topics. As usual, I talked too much, so he ended up splitting our conversation across two episodes.

You can visit his website for links to the podcast from a large variety of podcast sources, or you can listen on YouTube to part one and part two.

It was fun, and I hope to be able to chat again sometime in the future.

Note: Viktor is originally from Hungary, as was my grandfather. I tried to make getting some Túró Rudi part of my appearing on the show, but unfortunately we haven’t figured out how to get it outside of Hungary, and we all know that I’d talk about open source for free pretty much any time and any place.