
5 ways to speed up your Agile adoption

Too many people these days seem to think that adopting an Agile method is quick and easy. Not so! It’s definitely worth it, but the road can be long and hard.

In corresponding with a newbie, I gave a list of five ways they could speed up their adoption. Here they are, with a bit of explanation:

  1. Have a clear project charter. If you don’t know what your project is supposed to achieve, it will take you a lot longer to get there, and a lot of decisions will be muddled. Write up a clear statement of purpose, post it prominently, and keep it up to date when goals change.
  2. Shorten your iterations. An iteration is a regular, fixed-length period where you decide what you’re going to do, go do it, and then measure how much you got done. (By done, I mean 100% done and releasable, what some call “done done”. 98% done equals not done.) Keep iterations as short as possible. I recommend a week. Two weeks can be ok; three or four is risky. Three months is downright nuts.
  3. Release more often. As often as you can, get software out to real users. If you think you’re already doing it as often as possible, you’re probably wrong. Many teams release weekly, some daily, and a few ship several times a day.
  4. Get more data. You wouldn’t drive a car with the windshield painted over, steering by where you think you are, but that’s how a lot of people drive projects. Increase the volume and clarity of real-world data on product impact. Use that to evaluate what you’ve released, and to shape upcoming work. This can include guerrilla user testing, user surveys, usage analytics, customer surveys, user context research, sales data, and just having people over for a beer.
  5. Use an experienced Agile coach. I may be a little biased, but I think a good Agile coach can save a lot of time and trouble. There are a lot of good ways to be Agile, but there are even more bad ways, and it’s hard to tell them apart until you’ve been around the block.

The careful eye will notice a common theme here: improving feedback loops. The more feedback we get, and the faster it comes after our actions, the quicker we learn. That’s the engine that makes Agile approaches superior to plan-driven ones, and we can use that engine to speed up Agile adoption as well.

Hard numbers

Numbers are tricky, especially when you want to use them to understand business problems. A book I read a few years ago and several blog posts more recently have highlighted this for me, so I got the urge to write about the challenges inherent in understanding with numbers and a couple of helpful tips I’ve picked up from the Agile community.

In The Elegant Solution, Matthew E. May devotes a chapter to talking about the importance of picking the right things to measure for your business (Chapter 11, “Run the Numbers”). The Elegant Solution is one of my two favorite business books (together with The Seven-Day Weekend by Ricardo Semler), filled with Lean principles learned from May’s years teaching for Toyota. But while most of the book is quite practical, its practicality kind of unravels in this chapter.

The problem is, the first several examples he gives show just how darned hard it is to get numbers right. It’s hard enough to decide what you want to measure. But the examples go much further, showing how even math PhDs can get correlations and probabilities horribly wrong. The author doesn’t seem to be trying to show how hard it is to get the math right; he merely seems to be arguing that it’s important to get the math right. It just turns out that every example is really about getting the math wrong.

The first examples he gives aren’t even from business, starting with the Monty Hall problem. This problem evokes heated debates, with very smart people giving different answers and standing by those different answers insistently. The vast majority of people get it wrong, including people with advanced study in mathematics. I got it wrong at first, and it took several tries to finally accept the right answer. (In my defense, I did figure it out before getting out of college!)

In another example, May arguably gets it wrong himself:

You’re playing a video game that gives you a choice: Fight the alien superwarrior or three human soldiers in a row. The game informs you that your probability of defeating the alien superwarrior is 1 in 7. The probability of defeating a human soldier is 1 in 2. What do you do? Most people would fight the human soldiers. It seems to make intuitive sense. The odds seem to be in your favor. But they’re not. Your probability of winning three battles in a row would be 1/2 X 1/2 X 1/2, or 1/8. You have a better shot at beating the alien superwarrior. It’s a simple problem of probability. [Matthew E. May, The Elegant Solution, page 158]

Here’s the thing: If the chance of beating each human soldier is purely random and independent of beating any other human soldier, then the probability of beating three in a row is indeed 1/8. But if there’s any skill involved in the game, then the chances are not independent. When I’m playing a video game, I will typically grow through levels of skill where certain kinds of opponents become easy to defeat. When I start playing the game, maybe I almost never defeat “human soldiers”, but after some time playing I reach a level of skill where I almost always defeat them. So if I can defeat the first soldier, I will likely defeat all three; if the probability of defeating the first soldier is 1/2, the probability of defeating all three is arguably nearly 1/2 as well, which is clearly better than the 1/7 chance of beating the “alien super-warrior”.
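To make the difference concrete, here’s a small simulation sketch. The skill model is my own illustrative assumption (a player either has or hasn’t learned to beat soldiers, with the three fights perfectly correlated); the function names are made up for this example:

```python
import random

def independent_trials(n=100_000):
    # May's reading: each soldier fight is an independent coin flip,
    # so winning all three happens with probability (1/2)^3 = 1/8.
    wins = sum(all(random.random() < 0.5 for _ in range(3)) for _ in range(n))
    return wins / n

def skill_correlated(n=100_000):
    # Toy skill model: half of players can reliably beat soldiers and
    # win all three fights; the other half lose. Same 1/2 per-fight odds,
    # but the fights are perfectly correlated.
    wins = sum(1 for _ in range(n) if random.random() < 0.5)
    return wins / n

print(independent_trials())  # close to 0.125 (1/8)
print(skill_correlated())    # close to 0.5, better than the 1/7 alien odds
```

Real games sit somewhere between these two extremes, but the point stands: correlated trials can easily beat the “simple problem of probability” answer.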

I was reminded of this chapter from The Elegant Solution when I saw a flurry of blog posts on a similar problem recently: See Jeff Atwood’s question, his own answer, Paul Buchheit’s response, and the discussion on Hacker News. The problem is just as simple as the Monty Hall problem, and the response is just as heated. Paul Buchheit points out that the simple English statement of the problem can be parsed two different ways, which result in two completely different answers (both of which I verified myself by Monte Carlo simulation!). In another realm, Semyon Dukach suggests that the current financial crisis is due precisely to the difficulty of numerical intuition.
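The two parsings can be simulated directly. Assuming the puzzle in question is the classic two-children problem (my reading of those posts), one parsing conditions on “at least one child is a girl” and the other on “a particular child you happen to meet is a girl”; this sketch shows how differently they come out:

```python
import random

def simulate(n=200_000):
    at_least_one = both_g1 = 0   # parsing 1: "at least one is a girl"
    met_girl = both_g2 = 0       # parsing 2: "the child you met is a girl"
    for _ in range(n):
        kids = [random.choice("BG") for _ in range(2)]
        if "G" in kids:
            at_least_one += 1
            if kids == ["G", "G"]:
                both_g1 += 1
        met = random.choice(kids)  # you bump into one child at random
        if met == "G":
            met_girl += 1
            if kids == ["G", "G"]:
                both_g2 += 1
    return both_g1 / at_least_one, both_g2 / met_girl

print(simulate())  # first value near 1/3, second near 1/2
```

Same English sentence, two defensible formalizations, two different answers. That’s exactly why arguing from intuition gets so heated.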

The examples of success with numbers in The Elegant Solution all come down to identifying very simple metrics underlying the businesses in question. Jim Collins gives similar counsel in Good To Great. But the problems mentioned here show that even given a very simple mathematical statement, you can still get into lots of trouble. In The Goal, Eli Goldratt gives some fascinating advice about how to orient your business toward the true goal of business (I won’t spoil the plot by saying what “the goal” is; it really is worth reading the book). But here again, the whole story line of The Goal shows just how unintuitive those principles are, and how long and painful (though valuable) a process it can be to learn them through real-world experience.

So what do we do? A couple of pieces of advice I’ve picked up over the years might help:

1. Measure, don’t guess. Specify the problem precisely enough to implement a Monte Carlo simulation, and then run it several times. This is the only way I’ve been able to convince myself of the answers to tricky problems like the Monty Hall problem. The more I think about the problem, the less sure I get about the answer. This principle also applies to optimization of code: Start with the simplest thing that could possibly work, and then measure how it performs under realistic conditions. The thing that needs optimization is often not what I guessed it might be.

2. Measure “up”. Mary and Tom Poppendieck talk about this principle in detail in their Lean Software Development books (the phrase is mentioned on page 40 of Implementing Lean Software Development). It is an antidote to local optimization: The more you focus on localized metrics, the more confused you’ll get. As they write, “The solution is to … raise the measurement one level and decrease the number of measurements. Find a higher-level measurement that will drive the right results for the lower-level metrics and establish a basis for making trade-offs.”
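The “measure, don’t guess” tip above is exactly how I settled the Monty Hall problem for myself. Here’s a minimal sketch of that kind of simulation (the function name and structure are my own, not from any of the sources discussed):

```python
import random

def monty_hall(switch, trials=100_000):
    # Simulate the Monty Hall game: car behind one of three doors,
    # player picks one, host opens a door that has neither the car
    # nor the player's pick, player optionally switches.
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=False))  # near 1/3
print(monty_hall(switch=True))   # near 2/3
```

A few runs of this are far more convincing than any amount of arguing, and forcing yourself to specify the host’s behavior precisely enough to code it is half the value.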

What other advice do you keep in mind when numbers get tricky?

“Agile” versus “agile”

There seems to be a lot of confusion these days about whether something or other is really Agile, and what that means. Here’s my take on how to sort that out.

Growing up, I lived in a city called Grand Rapids. As you’d expect, there’s a river running through it. Does the river actually have rapids, and if so, are they truly grand? Do other rivers have rapids more rapid, or perhaps more grand? Those questions are interesting, but from the perspective of the name, it doesn’t matter. A long time ago, somebody thought that was a good name for the place, and it stuck.

Capital-A Agile

A decade ago and more, a bunch of people were working on new software processes. They were very different from what had come before, but they all had something in common. It was hard to put a finger on exactly what that was, but eventually they got together and came up with four value statements and twelve principles. And they came up with a single word: Agile. As in “Agile Manifesto” and “Agile Software Development”.

Was this perfect? No. Was it meant to explain everything about software development for all time? No. Was it a software development process on its own? Definitely not. But it was a declaration of common purpose, a list of things they could agree on.

That’s what capital-A Agile is: a bunch of people seeing that they had something in common, and attempting to say what that common thing was. They gave it a name and a partial definition. And most importantly, they formed a community that is still working out what that means and how best to do it.

Small-a agile

It’s important to note that they weren’t saying that they had an exclusive lock on agility, or even what made software development agile. As with the naming of Grand Rapids, they were pointing at a particular spot in the landscape of ideas and naming it. The word agile has a variety of meanings, and there are a lot of aspects to software development to which you could apply those meanings. They weren’t trying to lay claim to agility as a whole, any more than Grand Rapids is claiming all the rapids in the world, especially the grand ones.

That also means that there are plenty of ways to develop software that aren’t Agile. After all, software got made long before the Agile Manifesto was written. And there are surely ways of being agile that aren’t included in either the Manifesto or in the current practices of the Agile community. Heck, that’s part of why we get together every year and talk so extensively online. A process based on continuous improvement gives you a real taste for continuously improving the process itself.

Saying “that’s not Agile”

So given this, what does it mean when somebody says “that’s not Agile”? To me, it just means that the thing they’re pointing at is a different spot in the landscape of ideas.

Some people get upset when they hear that, because they believe they’re doing well at making software, or because they think they’re being pretty small-a agile. They may or may not be right, but that doesn’t matter. If people in the Agile community say that something isn’t Agile, then it probably isn’t, the same way the city of Grand Rapids gets to decide where the city limits are.

If it bothers you to get told that something isn’t Agile, you have three basic choices:

  • Find out more about what we mean by Agile. We’re generally a friendly bunch, glad to show you around. Join a mailing list, come to an event, or even ask in the comment box below.
  • Persuade us we’re wrong. If Agile is the city we’ve built on the landscape of ideas, it’s a city that’s grown a lot over the years. It in effect started with a number of different little towns growing together. More recently came Lean, but these days it’s getting hard to even tell where the boundaries used to be. We’re very open to new construction, and your idea might be the next big development.
  • Start your own thing. Agile may be the big thing of the moment, but it will eventually be as obsolete as the cavalry charge. Just because we say that something isn’t Agile doesn’t mean that it’s not a good idea. If you’re sure of yourself, do what the Agile founders did: stake out your own territory in the landscape of ideas and give it a name.

Regardless, there’s no need to get upset. Having different approaches or coming from different schools of thought doesn’t mean we don’t share the same goals in the end.

Measuring developer productivity

I just read George Dinwiddie’s interesting take on developer productivity, and I wanted to throw in my own two cents.

You can’t measure it

I agree with a number of others who say that there’s no good measure for developer productivity. There are several basic approaches people use, and all of them have flaws:

  • time spent – This is a classic way to measure productivity. How long did people work? If the number is large, things must be good, right?
  • apparent effort – Although this is even more flawed, it’s very popular. The “if you ain’t sweatin’, you ain’t workin’” metric is a favorite of seagull managers. But it’s easy to manipulate, and even when people are honest, it’s terribly misleading.
  • technical output – This includes things like keystrokes or lines of code produced. As Bill Gates says, “Measuring programming progress by lines of code is like measuring aircraft building progress by weight.”
  • functional output – Instead of counting lines of code, you can count features, through mechanisms like function points. Counting fields and data elements is a lot more work than counting lines of code, but it’s not clear the results are much better.
  • business value – That’s what we’re after, so it seems like it would be great to track this. And you should. But it’s incredibly difficult to assign that value to individual bits of work, and especially to individual players.

Over the years, I have seen a lot of places try to numerically measure how productive their developers are. I’ve never seen anybody have much success, but I have seen a lot of wasted effort. And worse, I’ve seen a lot of harm. Try to measure individual productivity, for example, and you create a disincentive to help others. Since some of your most productive developers are the ones who mentor others and keep them from wasting time or making messes, it’s easy to drastically reduce productivity just by trying to measure it.

But everybody knows

Does this mean that it’s impossible to know how well a team is doing, or who the top performers are? Not at all.

The team knows

On agile projects, it isn’t individuals who are responsible for delivering. It’s teams. If your team is working together in a room, has tight feedback loops, and delivers frequently to end users, everybody will be forced to work together and interact frequently. Every team member will have a good idea of who is a top performer and whether somebody isn’t pulling their weight. They can’t not know. Whether they’ll tell anybody else is a different thing, which leads to my next point.

Embed reporters and distribute power

There are many advantages to having business people, like product managers and business analysts, in the room with developers. Your products will be better designed, better built, and more efficient. But a side benefit is that there will be somebody management trusts to give them an honest opinion on developer productivity. For this to work well, the business representative should not be part of the engineering organization, either culturally or organizationally.

It’s also important to give the team matched responsibilities and power relating to this. Involve the team extensively in interviewing and hiring — and also in firing. Make sure they know they’re responsible for total productivity, and give them the authority to make changes they need in that regard. Hint: if they’re not allowed to change the furniture or order new RAM for their machines, they sure won’t think they can pressure or fire a poor performer.

Focus on delivering value

The key, though, is to get the whole team focused on whatever purpose the team exists for. As frequently as possible, measure key indicators, like sales, usage, or customer satisfaction. And don’t just automatically publish numbers to a web page nobody looks at. Metrics don’t matter unless somebody cares about them. In an Agile context, caring about something is contagious. You should visibly care about the numbers that matter, possibly through a hand-drawn big visible chart. Others will pick up the habit.

If people really care about achieving shared goals, then you won’t have to worry about their performance. They’ll be doing it themselves.
