Stop me if you’ve heard this one. To deliver more customer value, a software team decides to upgrade the database. They talk to their operations people, who say, “Sorry, we can’t upgrade to the latest version because another team shares the database, and their application won’t support it.” Resigned to their fate, they put the card in the backlog for a faraway day, and shelve any planned features that relied on it.

Not funny? I agree. The sad part is that the word “database” in the story could just as easily be replaced with operating system, application server, VM platform, etc., and the story would ring true for just as many people. If you’ve worked in the trenches of software half as many years as I have, you know that things like this are the hidden friction in software. Often, the teams around an application are coupled together by shared dependencies just as tightly as the code inside it.

DeMarco and Lister famously observed that “the major problems of our work are not so much technological as sociological in nature.” But I wonder if anyone has considered what technology (especially software) and people might have in common. Consider Conway’s Law, which makes such a connection: “organizations which design systems . . . are constrained to produce designs which are copies of the communication structures of these organizations.”

Conway’s insight is that software built by teams will be composed of pieces roughly corresponding to those teams. This is a powerful observation, with the obvious application that software companies should organize into teams congruent with the software they want to build.

But I think we can extrapolate further. We all agree that software with certain properties is easier to change. Its components should have low coupling, high cohesion, well-defined interfaces, clear contracts, and so on. Extrapolating from Conway’s Law, we get a new observation: if we want to produce software with these properties, our teams must also have them. Teams should have strong separation of concerns, few dependencies, clearly defined boundaries, etc. Teams like that are able to evolve their piece of the system independently, making the entire organization more agile.

Another way to say it is that efficient communication between units is crucial to success, whether those units are modules of code or teams in companies.

Does this connection bear out in reality? For one example, take the rising popularity of the “platform” team. Ironically, organizing teams within a software company around an internal platform produces a sociological problem that mimics a classic software design trade-off almost perfectly. It encourages reuse, which is good, but it also enmeshes the platform team in far too many decisions. They become coupled to every team, because they’re like a class imported into every other class in an application: whenever that class changes, every dependent class must change as well.
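To make the analogy concrete, here is a minimal sketch in Python; the class and function names are hypothetical, chosen only to illustrate the shape of the coupling:

    class PlatformClient:
        """Owned by the platform team; every product team calls into it."""
        def fetch(self, key: str) -> str:
            return f"value-for-{key}"

    # Billing, search, and checkout each take a direct dependency on the
    # platform team's class, so any change to it ripples into all of them.
    def billing_report(client: PlatformClient) -> str:
        return client.fetch("invoices")

    def search_index(client: PlatformClient) -> str:
        return client.fetch("documents")

    def checkout_total(client: PlatformClient) -> str:
        return client.fetch("cart")

A change to PlatformClient’s interface forces all three call sites, and the teams behind them, to coordinate, just as a change to the platform forces coordination across the company.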

Another example can be seen in “vertically siloed” companies, where teams are organized by role: finance, product, engineering, support, operations, etc. In this scheme, how many different teams need to be involved to deliver a single customer feature? What does attendance at a project kickoff meeting look like? It reminds me of a class with ten parameters in its constructor.
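For the sake of the analogy, that constructor might look something like this; the parameters simply stand in for the silos and are purely illustrative:

    class CustomerFeature:
        """Delivering one 'simple' feature; every silo shows up as a parameter."""
        def __init__(self, finance, product, engineering, design, qa,
                     support, operations, security, legal, marketing):
            self.stakeholders = [finance, product, engineering, design, qa,
                                 support, operations, security, legal, marketing]

Every new parameter is another meeting invite, another schedule to reconcile, another chance for the whole thing to stall.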

I think it’s worth asking how many dependencies your team has, even if it is cross-functional. You may be surprised how many there are. Try making a “Conway system diagram” of your company. Take a normal system diagram and mark each component with the team or teams that have responsibility for it. Connect all the teams required to enhance, support, or maintain a single customer scenario.
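One rough way to build such a diagram is as a small graph of components and the teams that own them; the names below are hypothetical, just to show the shape of the exercise:

    # Hypothetical ownership map: component -> teams responsible for it.
    ownership = {
        "web-frontend": {"Web Team"},
        "orders-service": {"Orders Team", "Platform Team"},
        "shared-database": {"Platform Team", "DBA Team"},
    }

    # Components touched by one customer scenario, e.g. "place an order".
    scenario = ["web-frontend", "orders-service", "shared-database"]

    # Every team in this set must coordinate to change that single scenario.
    teams_involved = set()
    for component in scenario:
        teams_involved |= ownership[component]

    print(sorted(teams_involved))

If the printed set is uncomfortably large for a single scenario, that is your Conway diagram telling you where the hidden friction lives.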

Minimizing team dependencies isn’t the only application, though. An analog to the Single Responsibility Principle might be that teams should have only one reason to change focus. If a team has too many customer concerns, maybe their attention is fractured and the resultant context-switching is killing them.

A final example might be seen in how bolting a security review phase onto the end of a project so often results in inadequate security in the product. The “bolt-on” nature of such an approach violates the software design principle of “secure by design.” Teams that care about building secure software should build that expertise into the team itself, so that it is present throughout the project lifecycle.

I’ll leave further examples as an exercise for the reader. By looking at team organization through a software lens, you may find key insights that lead to better ways to organize. Conway’s Law is practical, not merely theoretical. Independent, decoupled teams, like independent code modules, are better positioned to produce software that can evolve, and evolvable software allows the organization to deliver customer value more quickly and effectively.