Wednesday, October 30, 2013

Healthcare.gov: Who - or what - is to blame?

I recently read something in the news that shocked me – they said the legacy of a President might be affected by the performance of a website. “Legacy of a President” and “performance of a website” don't even belong in the same conversation, let alone the same sentence. And yet, that was part of the discussion.

Of course they were overstating it a tad, as is the norm for news reporting these days. But the botched launch has had a tremendous effect on the public's perception of a critical component of Obama's presidency, and on the effectiveness of the Affordable Care Act itself. All because of a poorly planned software release.

How could this happen? It's certainly not for lack of funds, with the website (and all the underlying integrations and data structures) reportedly costing at least $375 million, probably more. Multiple large consulting firms, with hundreds of highly paid and very smart architects and developers, worked on this project. And yet the launch can be summed up as nothing less than an epic failure.

There's no easy answer. We watch Dr. Oz and Dr. Phil and expect that all our problems can be solved with a simple pill or a simple platitude, but the problem here runs deep, and it will not be solved easily. Real change requires real effort, and this is one of those situations where actual cultural change must occur if we have any hope of avoiding this kind of problem in the future. There will be plenty of finger pointing; this is Washington D.C., after all, and the political machine must be fed. But the blame lies on both sides.

The people in charge will say the consultants and developers never told them the site wasn't going to be ready: they didn't escalate problems soon enough, and they didn't communicate the seriousness of the issues. The developers will say that they did raise the alarm, but those in charge weren't listening, didn't want to hear it, and changed their minds on the requirements late in the game, unwilling to believe that those late changes couldn't be accommodated.

Odds are, both will be right. Our industry is broken. There is a basic flaw in communication, and it causes far too many projects, small and large, incidental and critical, to miss their release dates, or to hit the date but suffer in performance and effectiveness. This communication failure exists on both sides of the table. I've had the pleasure of sitting in seats on both sides, and I've seen this flaw at work from both perspectives.

The builders of systems (the developers, architects and engineers) fear that they cannot tell those requesting these systems the truth. They fear that if they explain that the requirements they've been given ("Can't you just...?") are far more difficult and complicated than they appear, the requester will simply find someone else who will tell them what they want to hear. If they want the work, they must meet whatever self-created timeline and budget the requester has come up with, and then rely on change management to make up for the misalignment between expectations and reality.

The requesters of systems don't want to hear the truth. They want to be told that they will get whatever they want (and can change their minds at any time) on a timeline and budget that makes them happy. When issues arise during the course of the project, they want to hear that everything can be fixed and brought back on track without sacrifice. They often exhibit selective hearing, ignoring the caveats and focusing only on the positive. Groupthink often takes over, and what in retrospect was obviously a disaster in the making seemed like a good idea at the time.

Over the next few blog posts I'm going to propose some ways to reduce this risk. But the reality is that we must change this culture from both sides if we are ever going to have a chance at getting past these sorts of issues. The problems plaguing Healthcare.gov aren't unique; they are simply the most visible and damaging example of a long-standing pattern. Until the builders believe they can come to the table with an honest assessment of the work, and the requesters both trust that the assessment is true and accept it, we have no hope of consistently getting software projects completed on time and on budget. The outcome of these failures isn't just a bad website. It can damage us all, right up to the President of the United States.

Tuesday, October 15, 2013

Data Never Lies... right?

I hear it a lot these days: Data Never Lies. Whether it's a political debate, a conversation about your fantasy football team, or the latest research into a childhood illness, everyone seems to agree: Data Never Lies.

The issue is that people take raw data and intuit their way to conclusions, decisions, and courses of action, all while spouting "Data Never Lies" as proof positive that they are most certainly right.

At its very core, the statement "Data Never Lies" is accurate. Raw data doesn't lie. Data points are merely moments in time, frozen facts that provide a very specific truth at a very specific time under very specific circumstances. It is currently 72 degrees. That's data. That's a fact. And on its own, it is utterly useless. Bring in other, relevant data and you begin to build a story: at 2 pm on Tuesday, October 8th, 2013, in Ann Arbor, Michigan, it is 72 degrees. Supporting data gives greater detail and makes for a better decision. But how much data is enough? Based on the information provided, Bobby decided it was a perfect day to mow the lawn, but was sadly torn to pieces by a pack of wild coyotes in the front yard. A little more data would have gone a long way.

Not only must there be enough data to build a complete story, but the data must be relevant and in context. Knowing there are wild coyotes in the front yard might seem irrelevant to the weather, but it is highly relevant to the decision in question. Knowing that I had Raisin Bran for breakfast this morning might make for a more compelling story, but it is irrelevant to the decision to mow the lawn. Bad data (data that is not relevant to the decision) can often be difficult to ferret out, and can lead to the wrong conclusions if included.
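To make that concrete, here's a toy sketch in Python (mine, not from any real system; the Reading, ContextualReading, and should_mow names are invented for illustration). A bare temperature reading is a true fact that supports no decision; the same reading carried with its time, place, and decision-relevant hazards does.

from dataclasses import dataclass, field

@dataclass
class Reading:
    value: float              # "It is 72 degrees" -- true, but useless on its own
    units: str = "F"

@dataclass
class ContextualReading(Reading):
    when: str = ""                                # e.g. "2013-10-08 14:00"
    where: str = ""                               # e.g. "Ann Arbor, MI"
    hazards: list = field(default_factory=list)   # decision-relevant facts only
    # Note what's deliberately absent: breakfast choices and other
    # irrelevant-but-interesting data that would only cloud the decision.

def should_mow(reading: ContextualReading) -> bool:
    """Nice weather is necessary but not sufficient for mowing the lawn."""
    nice_weather = 60 <= reading.value <= 80
    safe_yard = "coyotes" not in reading.hazards
    return nice_weather and safe_yard

today = ContextualReading(value=72, when="2013-10-08 14:00",
                          where="Ann Arbor, MI", hazards=["coyotes"])
print(should_mow(today))  # False -- Bobby keeps his limbs

The design choice is the point: the schema itself forces you to decide, up front, which facts belong to the decision and which are just a better story.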

Even all the right data can lead you to the wrong decision. That's because data requires interpretation to have any value in plotting a course, predicting the future, or making a decision. This interpretation requires human intervention, human skills, and human thought. Just look at the current political situation and you can see how flawed our ability to interpret data and make decisions can be. The proof is simple: I could have made that statement in any month of any year of any decade of civilized history, and you would have thought, "I know just what he's talking about." Humans make assumptions, their brains take shortcuts, their intuition is often weak, and they see the data they want to see, based on their own agendas and the baggage they bring with them.

How do you avoid falling prey to the coyotes? The first step is to know what question you're trying to answer, and to understand it before anything else. If you don't understand the question, you'll never understand the answer, and you won't gather enough (or appropriate) data, turning you into a coyote snack.

Don't gather too much data. This doesn't contradict the previous point. Sometimes the goal is a fishing expedition, looking for a question you didn't know you had, and then it's important to look at as much data as possible for trends. But that's a situation where you're searching for a question, not an answer. When you already have a question or a decision to make, too much irrelevant and useless data will cloud and confuse the process. How many times have you seen a movie where one side in a legal battle swamps the other with millions of documents to hide the truth? Don't do it to yourself.

The toughest nut to crack is the correct interpretation of the data. It's important to remove as many preconceived ideas and obstacles as possible that might lead a human to jump to a conclusion or select a course of action based on an agenda rather than the data. Organizations and individuals often give off subtle clues as to the answer they want, and this can drive those interpreting the data to make a bad call. This is organizational groupthink at work. Avoid making up your mind in advance, and learn to recognize when you're gathering data and seeing only what you want to see to get the answer you want, rather than letting the answer flow naturally from the data.

And the next time your spouse tells you that it's the perfect day to mow the lawn because Data Never Lies, remember to ask about the coyotes.

---Michael Crawford