Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Sat, 13 Nov 2004

Dealing with culture clash

Different disciplines have different cultures. There can be culture clash. How do you deal with that in an Agile project?

A group of us addressed this question at a Scrum Master get-together. (We were Christian Sepulveda, Jon Spence, Michele Sliger, Charlie Poole, and me.)

We focused on three more specific problems:

  • You need to get past cultural conflicts. (But you don't necessarily need to solve them.)

  • People are afraid they'll have to relinquish their disciplinary identities.

  • You can lead people to a cross-functional team, but you can't make them collaborate.

We recommend the following at the beginning of the project:

  1. Try to get the right people on the team. If it later becomes apparent that you didn't, separate the poison. Offer them additional training away from the team, put them on a special project (again, away from the team), help them find another project where they'd be more comfortable, and - as a last resort - suggest that they look for another position.

  2. Have an open forum at the start of the project. Get the issues out in the open.

  3. Use the "Word in a hat" game. Each team member writes down a word or phrase that best describes their main concern about the project, folds the paper, and places it in a hat. The phrases should be something like "rigid customers", "bonuses", or "schedule". The Scrum Master pulls the words out of the hat one at a time, reads each aloud, and starts a discussion that preserves anonymity.

  4. The open forum should result in an internal risk management plan to monitor cross-functional issues the team identified - and then deal with them. This can be simple, like a weekly pizza lunch to review open issues or new ones.

  5. The team should have a clear and common goal that all members can clearly articulate. For example: they should be able to recite the purpose of the project to their CEO should they find themselves in the elevator with her. Another idea is Jim Highsmith's "design the box" exercise. In it, the team designs the packaging of the software and puts it in a common area as a constant visual reminder of the project's ultimate goal.

Throughout the project:

  1. Monitor issues. Address them in retrospectives.

  2. Use "odd pairings". When a task needs doing, have programmers pair with testers, testers with technical writers, technical writers with programmers. This spreads knowledge through the team and helps people sympathize with those in other roles.

  3. Ask "Why?" As team members take on tasks, they should think about how what they're doing helps to achieve the project's goals. By asking "why am I doing this?" the team is less likely to revert to non-agile form or start on wasteful activities.

Clearly, we've only scratched the surface. In particular, I notice that we have nothing specific to offer for the second problem: people who are afraid they'll have to relinquish their disciplinary identities. That problem is near to my heart, because it comes up a lot with testers.

Posted at 19:17 in category /agile

Summary of the cost of change curve

There was a lot of email discussion about my post on the cost of change curve. A restatement of the problem:

Assume a classic waterfall process. On March 15, you release version 1 of your product. On March 20, you start work on version 2. On April 20, an urgent change request comes in. Assume two choices:

  • make the change in version 1 and release a patch.
  • make the change in version 2 and include it in version 2's release.

Some of the work is the same in either case. You have to scour version 1's requirements documents, architectural design documents, design documents, and code for the implications of the change. You have to update each of them. (Remember, we are assuming the kind of project to which the cost-of-change curve applies.) You have to make the change and test it.

So why would version 1 have a substantially higher expected cost?

Here's what people came up with:

  • In version 1, if the work toward the patch doesn't detect misimplementations, nothing will: you've just delivered a defective patch, which has substantial costs (especially in goodwill). In version 2, mistakes made in the change can be caught at many places along the way to the new release. Another way of putting it is that the cost-of-change curve is largely measuring the risk of releasing defects. (The sketch after this list puts rough numbers on that tradeoff.)

  • There is some additional work in version 1 (preparing a patch release, special testing of the patch release, keeping track of which customers have which patches, maintaining multiple version control branches, etc.).

  • In version 2, some of the work can be folded into things you're doing anyway (such as updating requirements documents for other reasons, changing the database schema, running manual test passes, etc.).

  • Disruptive, interrupting work - "context switching" - costs more than doing work you planned on.

  • Money that wasn't budgeted (to make changes in version 1) "costs more" than money that was (to fold changes into version 2).

  • Some of the cost of the change is borne by people outside the development organization. (They're the ones working with inadequate software while waiting for the patch, they're the ones who may have to relearn things, they have to install the patch, etc.) Even in an imperfect market, some of that cost would presumably be reflected back to the development organization. (Echoes here of Genichi Taguchi's cost of quality curve.) In the case of the version 2 release, the recipient's cost of the change is included in the expected cost of any new release.

  • At the end of version 1, there may have been some cleanup that drives the costs back down (tactical hacks fixed, architecture rejiggered, etc.). Even without that, the team may have gotten some down time to refresh themselves. (That is, they're temporarily out of the trap wherein overwork visibly consumes hours but invisibly destroys efficiency.)
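
To make the comparison concrete, here's a minimal sketch in Ruby. Every number in it is invented for illustration; only the shape of the argument comes from the list above: the patch carries extra fixed overhead and a higher chance that an undetected mistake reaches customers, while the version 2 change sheds some work onto things you were doing anyway.

    # Hypothetical expected-cost comparison for one change made two ways.
    # All quantities are invented for illustration.

    shared_work = 10.0  # scour documents, update them, code and test the change

    # Option 1: patch version 1.
    patch_overhead    = 5.0   # build, test, track, and distribute a patch release
    p_defective_patch = 0.15  # nothing downstream catches a mistake
    cost_of_bad_patch = 40.0  # lost goodwill, support calls, a second patch

    version1 = shared_work + patch_overhead + p_defective_patch * cost_of_bad_patch

    # Option 2: fold the change into version 2.
    folded_in_discount = 3.0   # work absorbed by tasks already planned
    p_escaped_defect   = 0.03  # many later chances to catch the mistake
    cost_of_escape     = 40.0

    version2 = (shared_work - folded_in_discount) + p_escaped_defect * cost_of_escape

    printf("patch version 1:     %.1f units\n", version1)  # => 21.0
    printf("fold into version 2: %.1f units\n", version2)  # => 8.2

Nothing above is data. The point is only that the gap between the two totals is mostly overhead and defect risk, which is exactly what the list enumerates.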

The agile methods are, in large part, about driving these costs down, it seems to me (and to others).

Thanks to Alex Aizikovsky, Laurent Bossavit, Todd Bradley, Clarke Ching, Jeffrey Fredrick, Chris Hulan, Chris McMahon, Glenn Nitschke, Alan Page, Andy Schneider, Shawn Smith, Glenn Vanderburg, Robert Watkins, and perhaps others I forgot to record. Because of the underwhelming response to my quirky network invitations thing, I'd concluded that the 120,000 hits my blog got last month were mostly due to two out-of-control news aggregators hitting my site once per minute. A response like this suggests there are real readers out there after all.

Posted at 12:20 in category /misc

About Brian Marick
I consult mainly on Agile software development, with a special focus on how testing fits in.

Contact me here: marick@exampler.com.

 
