Monday, December 30, 2013

How to Tank a Project 101 - THE DATE

In my last blog post I talked about the problems with communication in the IT industry, and how those problems result in projects that are perceived as failures. Generalities are important, but now it's time to get to some specifics.

The single most important metric for determining the success or failure of a project is THE DATE. You know which one I'm talking about - the date you set for the completion. Completion of what? Yep, that's part of the problem.

While THE DATE might be treated as the most important metric, it's actually one of the worst. It holds no real measure of success or failure, since it provides no understanding of whether or not the project actually met the real goals - solving the problems or making the improvements that were intended. What's worse, THE DATE is often arbitrary, while the goals are not. And yet the goals are often sacrificed for the sake of the date - how does that make any sense?

I think our obsession with THE DATE comes from our educational process. We are taught that hitting the date set by the teacher is all-important, no matter the cost. And it does matter in that situation, since learning is a time-based function. But once we're in the real world, other factors apply. Too often, we ignore these other factors, and continue our obsession with THE DATE.

So how does this all important metric get set? With all the diligence and care you'd expect with something so critical? Usually not.

Does this sound familiar? Someone with a 'chief' or 'vice president' at the start of their title stops you in the hall and says "I read an article in an in-flight magazine about the Cloud - how much time do you think it would take to move our swizzlestick production systems to a model like that?" You look like a deer caught in the headlights of a Mack truck. The sheer number of factors that will alter and influence a project of this type causes your brain to momentarily lock up. You start weighing your career options - the fast food industry can't be that bad, can it? You hem and haw, and the boss senses your fear. Usually they follow up with something like "Don't worry - I won't hold you to it. I'm just trying to get a rough idea." So believing that if everything in the universe aligned perfectly you might make it in 6 months, and knowing that you'd better not say more than 12 if you want to keep your job, you blurt out "around 9 months or so". You part company, and feel pretty good about it. After all, they will go through a process of requirements gathering and goal setting before finalizing the date, right?

Let's pretend this conversation takes place next week, during the second week of January. And before you know it, you see a project pop up on the planning calendar for 'implementation of Cloud architecture for swizzlestick software' with a completion date on the 1st of September. Not October, which would have been at least 9 months from the conversation, not even a general announcement of September, but a specific day that you'll never make. And that's how you derail a project before it starts, because now, best intentions aside, that's THE DATE.
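Just to make that slippage concrete, here's a minimal sketch of the calendar math. The specific dates are my assumption for illustration - the post only says "second week of January":

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months, keeping the day of month.
    (Fine here since day 8 exists in every month.)"""
    m = d.month - 1 + months
    return date(d.year + m // 12, m % 12 + 1, d.day)

hallway_chat = date(2014, 1, 8)            # second week of January (assumed)
honest_nine = add_months(hallway_chat, 9)  # nine actual months out
the_date = date(2014, 9, 1)                # what lands on the planning calendar

print(honest_nine)                         # 2014-10-08
print((honest_nine - the_date).days)       # 37 days lost before work even starts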

This isn't how it plays out every time, of course. Sometimes there's a bit more discussion, sometimes even some basic goal setting and requirements gathering. Some due diligence gets performed. And that's the right time to set a date - you can't run a project without one. But even then, we set THE DATE without consideration for reality.

Let's go with a better scenario. Data is gathered, goals determined, basic requirements understood. And let's assume your team is extremely good at reviewing requirements and giving estimates. Extremely good is going to be 80% accurate at best at this early stage of any project - I think 60% is a far more reasonable number, considering the number of issues that can arise over the course of a project, but we'll give it the benefit of the doubt and go with 80%. That means a 9 month project could be off by nearly 8 weeks - almost 2 months! - and still be within the original estimate window.

Now let's be even more honest: when was the last time your team was 80% accurate with the initial estimate because you finished nearly 8 weeks early? I didn't think so. The reality is that being 80% accurate in estimating means at best you'll hit 9 months, and at worst it will be 11.
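If you want to check that arithmetic yourself, here's a minimal sketch. The 80% figure is this post's assumption, not an industry standard:

```python
WEEKS_PER_MONTH = 4.35  # average weeks in a month

def estimate_window(months: float, accuracy: float) -> tuple[float, float]:
    """Return the (low, high) range in months implied by an accuracy level."""
    slack = months * (1 - accuracy)
    return months - slack, months + slack

low, high = estimate_window(9, 0.80)
print(f"{low:.1f} to {high:.1f} months")               # 7.2 to 10.8 months
print(f"+/- {(9 - low) * WEEKS_PER_MONTH:.1f} weeks")  # +/- 7.8 weeks
```

And since no one ever finishes at the bottom of that window, the practical range is 9 to 10.8 months - which rounds to the "at worst it will be 11" above.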

As if to make it even more difficult for us to hit THE DATE, we refuse to accept any generality. At the very start a go-live day is picked, and that's what will determine success. If you start this 9 month project on February 1st, the go-live WILL be October 31st. That day. Not the 30th, and certainly not December 1st. Since a small slip won't become apparent until months into the project, this specific day will become firmly cemented in everyone's mind. Any slip means failure.

To highlight just how ridiculous this is, consider a project with a fairly solid, tried-and-true estimate - having a baby. Odds are far better than 80% that it will be a nine-month project, and yet no one is foolish enough to pick their child's birthday before they're even pregnant.

In all this focus on THE DATE, we lose sight of the goals. This project was proposed for specific reasons. There are cost savings, or efficiencies, or increased revenue at stake. When THE DATE becomes the most critical metric, we become willing to abandon features and functions, cutting back on the actual deliverables. It's a long-proven fact that people will work to the metric they are rewarded for. Did we get a system or process that still gave us what we wanted in the first place? No, but we hit THE DATE.

Is it possible to still manage a project to a successful completion while treating THE DATE differently? Yes, but it requires being smarter about how we pick THE DATE, and how we go about managing to THE DATE. And that will be the subject of our next blog post!

- Michael Crawford

Wednesday, October 30, 2013

Healthcare.gov: Who - or what - is to blame?

I recently read something in the news that shocked me – they said the legacy of a President might be affected by the performance of a website. “Legacy of a President” and “performance of a website” don't even belong in the same conversation, let alone the same sentence. And yet, that was part of the discussion.

Of course they were overstating it a tad, as is the norm for news reporting these days. But it has had a tremendous effect on the public's perception of a critical component of Obama's presidency, and on the effectiveness of the Affordable Care Act itself. All because of a poorly planned software release.

How could this happen? It's certainly not for a lack of funds, with the website (and all the underlying integrations and data structures) reportedly costing at least $375 million, probably more. There were multiple large consulting firms, with hundreds of highly paid and very smart architects and developers working on this project. And yet, the launch can be summed up as nothing less than an epic fail.

There's no easy answer. We watch Dr. Oz and Dr. Phil and expect all our problems can be solved with a simple pill or a simple platitude, but the problem here runs deep, and is not going to be solved easily. Real change requires real effort, and this is one of those situations where actual cultural change must occur if we have any hope of avoiding this kind of problem in the future. There will be plenty of finger pointing – this is Washington D.C. after all, and the political machine must be fed. But the blame lies on both sides.

The people in charge will say the consultants and developers didn't tell them it wasn't going to be ready. They didn't escalate problems soon enough, and they didn't communicate the seriousness of the issues. The developers will say that they did tell them, but those in charge weren't listening, didn't want to hear it, and changed their minds on the requirements late in the game, unwilling to believe that those late changes couldn't be accommodated.

Odds are, they will both be right. Our industry is broken. There is a basic flaw in communication, and it causes far too many projects, small and large, incidental and critical, to miss their release dates, or to hit the date but suffer in performance and effectiveness. This communication failure exists on both sides of the table. I've had the pleasure of sitting on either side, and I've seen this flaw at work from both perspectives.

The builders of systems – the developers, architects and engineers – fear that they cannot tell those requesting these systems the truth. They fear that if they tell them that the requirements they've been given (“Can't you just...?”) are far more difficult and complicated than they appear, the requester will simply find someone else who will tell them what they want to hear. If they want the work, they must meet whatever self-created timeline and budget the requester has come up with, and find a way through change management to make up for the misalignment between expectations and reality.

The requesters of systems don't want to hear the truth. They want to be told that they will get whatever they want (and can change their mind at any time) based on a timeline and budget that makes them happy. When issues arise during the course of the project, they want to hear that it can be fixed and brought back on track without sacrifice. They often exhibit selective hearing, ignoring the caveats and focusing only on the positive. Group think often takes over, and what in retrospect was obviously a disaster in the making seemed like a good idea at the time.

Over the next few blog posts I'm going to propose some ways to reduce this risk. But the reality is that we must change this culture from both sides if we are ever going to have a chance at getting past these sorts of issues. The problems plaguing Healthcare.gov aren't unique – they're simply the most visible and damaging example of a long-standing problem. Until the builders believe they can come to the table with an honest assessment of the work, and the requesters both trust that this assessment is true and accept it, we have no hope of consistently getting software projects completed on time and on budget. The outcome of these problems isn't just a bad website. It can damage us all, right up to the President of the United States.

Tuesday, October 15, 2013

Data Never Lies....right?

I hear it a lot these days - Data Never Lies. Whether it's a political debate, a conversation about your fantasy football team, or the latest research into a childhood illness, everyone seems to agree - Data Never Lies.

The issue is that people take raw data and interpret and intuit conclusions, decisions, and courses of action, all while spouting the concept that "Data Never Lies" as proof positive that they are most certainly right.

At its very core, the statement Data Never Lies is accurate. Raw data doesn't lie. Data points are merely moments in time, frozen facts that provide a very specific truth at a very specific time under very specific circumstances. It is currently 72 degrees. That's data. That's a fact. And on its own, it is utterly useless. Bring in other, relevant data and you begin to build a story - at 2 pm on Tuesday, October 8th, 2013 in Ann Arbor, Michigan, it is 72 degrees. Supporting data gives greater detail, and makes for a better decision. But how much data is enough? Based on the information provided, Bobby decided it was a perfect day to mow the lawn, but was sadly torn to pieces by the pack of wild coyotes in the front yard. A little more data would have gone a long way.

Not only must there be enough data to build a complete story, but the data must be relevant and in context. Knowing there are wild coyotes in the front yard might seem irrelevant to weather data, but not to the decision in question. Knowing that I had Raisin Bran for breakfast this morning might make for a more compelling story, but it is irrelevant to the decision to mow the lawn. Bad data - data that is not relevant to the decision - can often be difficult to ferret out, and can lead to the wrong conclusions if included.
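To make that concrete, here's a toy sketch of the same raw reading with and without its context. The field names and the "comfortable" threshold are mine, invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    value: float                 # the raw, "never lies" data point
    unit: str
    time: str
    place: str
    context: dict = field(default_factory=dict)  # relevant surrounding facts

obs = Observation(
    value=72, unit="F",
    time="2013-10-08T14:00",
    place="Ann Arbor, MI",
    # Irrelevant to the weather, highly relevant to the decision:
    context={"coyotes_in_front_yard": True},
)

def good_day_to_mow(o: Observation) -> bool:
    """A comfortable temperature alone isn't enough - check the context too."""
    comfortable = 50 <= o.value <= 85
    return comfortable and not o.context.get("coyotes_in_front_yard", False)

print(good_day_to_mow(obs))  # False - poor Bobby only checked the temperature
```

The raw value of 72 was true the whole time. It was the decision built on it, stripped of context, that failed.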

Even all the right data can lead you to the wrong decision. That's because data requires interpretation to have any value in plotting a course, predicting the future, or making a decision. This interpretation requires human intervention, human skills, and human thought. Just look at the current political situation and you can see how flawed our ability to interpret data and make decisions can be. The proof is simple: I could have made that statement in any month of any year of any decade of civilized history and you would have thought "I know just what he's talking about". Humans make assumptions, their brains take shortcuts, their intuition is often weak, and they see the data they want to see, based on their own agendas and the baggage they bring with them.

How do you avoid falling prey to the coyotes? The first step is to know, and truly understand, the question you're trying to answer. If you don't understand the question, you'll never understand the answer. And if you don't understand the question, you won't gather enough (or appropriate) data, turning you into a coyote snack.

Don't gather too much data. This is not a contradiction to the previous point. Sometimes the goal is to be on a fishing expedition, looking for a question that you didn't know you had, and it's important to look at as much data as possible for trends. But that's a situation where you're searching for a question, not an answer. When you already have a question or decision to make, too much irrelevant and useless data will cloud and confuse the process. How many times have you seen a movie where one side in a legal battle swamps the other with millions of documents to hide the truth? Don't do it to yourself.

The toughest nut to crack is the correct interpretation of the data. It's important to remove as many preconceived ideas and obstacles as possible that might lead a human to jump to a conclusion or select a course of action based on an agenda and not on the data. Organizations and individuals often give off subtle clues as to the answer they want, and this can drive those interpreting the data to make a bad call. This is an example of organizational group think. Avoid making up your mind in advance, and learn to recognize when you're gathering the data and seeing only what you want to see to get the answer you want, rather than letting the answer flow naturally from the data.

And the next time your spouse tells you that it's the perfect day to mow the lawn because Data Never Lies, remember to ask about the coyotes.

- Michael Crawford